Last Update 4:29 PM February 05, 2023 (UTC)

Organizations | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!!

Friday, 03. February 2023

FIDO Alliance

Biometric Update: Transparency can drive digital ID equity, policy forum panelists say

An event last week on ‘Identity, Authentication and the Road Ahead’, held by the Better Identity Coalition, the FIDO Alliance and the Identity Theft Resource Center, featured a panel discussion among biometrics experts on equity and bias concerns in identity proofing.

The post Biometric Update: <a href="https://www.biometricupdate.com/202301/transparency-can-drive-digital-id-equity-policy-forum-panelists-say" target="_blank" rel="noreferrer noopener">Transparency can drive digital ID equity, policy forum panelists say</a> appeared first on FIDO Alliance.


Tech Republic: Unphishable mobile MFA through hardware keys

Passwords are a mess; MFA can be more of a stopgap than a solution to phishing; and running your own public key infrastructure for certificates is a lot of work. The long-term goal is to move to passwordless credentials that can’t be phished.

The post Tech Republic: <a href="https://www.techrepublic.com/article/mobile-mfa-hardware-keys/" target="_blank" rel="noreferrer noopener">Unphishable mobile MFA through hardware keys</a> appeared first on FIDO Alliance.


Intelligent CISO: Passing on passwords in 2023 (finally)

Passwords are often an organisation’s weakest link. On World Password Day (May 5), Apple, Google and Microsoft jointly announced that they were building in support for passwordless sign-in, leveraging FIDO2, across all of the desktop, browser and mobile platforms that they control. Crucially, they have all emphasised cross-platform functionality will be a high priority in the development of FIDO2-based passwordless features. With the help of its community of identity, security and biometrics experts, the FIDO Alliance has developed and promoted free, open standards that have taken passwordless authentication to the next level.

The post Intelligent CISO: Passing on passwords in 2023 (finally) appeared first on FIDO Alliance.


CyberWire: NIST on phishing resistance

According to NIST Special Publication DRAFT 800-63-B4, a phishing-resistant authenticator offers “the ability of the authentication protocol to detect and prevent disclosure of authentication secrets and valid authenticator outputs to an impostor relying party without reliance on the vigilance of the subscriber.” Two examples of phishing-resistant authenticators are PIV cards for US Federal employees and FIDO authenticators paired with W3C’s Web Authentication API for the private sector.
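The origin binding that makes FIDO authenticators phishing-resistant can be illustrated with a deliberately simplified Python sketch. This is not the WebAuthn wire format, and an HMAC over a shared key stands in for the real per-site asymmetric key pairs; the point is only that the assertion covers a hash of the relying-party ID, so a response captured on a look-alike origin never verifies at the genuine site, and a fresh per-login challenge blocks replay.

```python
import hashlib
import hmac
import secrets

def authenticator_sign(key: bytes, rp_id: str, challenge: bytes) -> bytes:
    """Simplified assertion: MAC over the RP ID hash plus the server's challenge.
    (Real FIDO authenticators sign with a per-site private key instead.)"""
    rp_hash = hashlib.sha256(rp_id.encode()).digest()
    return hmac.new(key, rp_hash + challenge, hashlib.sha256).digest()

def server_verify(key: bytes, rp_id: str, challenge: bytes, assertion: bytes) -> bool:
    expected = authenticator_sign(key, rp_id, challenge)
    return hmac.compare_digest(expected, assertion)

key = secrets.token_bytes(32)        # provisioned at registration
challenge = secrets.token_bytes(16)  # fresh per login attempt, blocks replay

good = authenticator_sign(key, "login.example.com", challenge)
phished = authenticator_sign(key, "login.examp1e.com", challenge)  # look-alike origin

assert server_verify(key, "login.example.com", challenge, good)
assert not server_verify(key, "login.example.com", challenge, phished)
```

Because the origin check happens inside the protocol rather than in the user's head, the scheme does not rely "on the vigilance of the subscriber", which is exactly the property the NIST draft describes.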

The post CyberWire: NIST on phishing resistance appeared first on FIDO Alliance.


Solutions Review: Identity Management and Information Security News for the Week of February 3; Radiant Logic, Guardz, Legrand, and More

Axiad, a passwordless solutions provider, announced it has been appointed to the FIDO Alliance Board of Directors. Karen Larson, Axiad’s Senior Director of Strategic Partnerships and Alliances, will serve as Axiad’s Primary Board Delegate. In her new role as a FIDO Alliance Board Member, Larson will help set direction for the alliance, focused specifically on the authentication needs of the enterprise and public sector.

The post Solutions Review: <a href="https://solutionsreview.com/identity-management/identity-management-and-information-security-news-for-the-week-of-february-3-radiant-logic-guardz-legrand-and-more/" target="_blank" rel="noreferrer noopener">Identity Management and Information Security News for the Week of February 3; Radiant Logic, Guardz, Legrand, and More</a> appeared first on FIDO Alliance.


Business Today: Making the Internet safer and better for all – including our seniors

Andrew Shikiar highlights growing cyber threats and e-scams targeting seniors online. As one of the least technologically savvy groups, the elderly are also among the demographics most vulnerable to, and most targeted by, online scammers. To achieve a more secure Internet for everyone, we must ensure that the elderly’s needs and preferences are considered. Fortunately, there are readily available standards and best practices to help service providers guide users in adopting passwordless authentication.

The post Business Today: <a href="https://www.businesstoday.com.my/2023/02/03/making-the-internet-safer-and-better-for-all-including-our-seniors/" target="_blank" rel="noreferrer noopener">Making the Internet safer and better for all – including our seniors</a> appeared first on FIDO Alliance.


OpenID

2023 OpenID Foundation Board of Directors Election Results

I want to personally thank all Foundation members who voted in the 2023 elections for representatives to the OpenID Foundation board of directors.

Each year Corporate members of the Foundation elect a member to represent them on the board with all Corporate members in good standing eligible to nominate and vote for candidates. Thank you to Ashish Jain, Chief Product Officer at Arkose Labs, and Dima Postnikov, Head of Identity Strategy and Architecture at ConnectID, for nominating themselves for this important board position that requires a substantial investment of time and energy.

I am very pleased to welcome Dima to the board of directors as the 2023 Corporate Representative. Dima’s ongoing support of and contributions to the Foundation, including his active participation in the FAPI WG supporting Open Banking initiatives worldwide and his role as co-chair of the GAIN PoC working group delivering global interoperability for consumer identity, are most appreciated. I look forward to his continued leadership in these areas, as well as on the board of directors, in 2023.

I want to kindly thank Ashish, who served as Corporate Representative in 2021 and 2022 and has been a member of the task force defining the Foundation’s strategy and budget. I welcome Ashish’s continued involvement and contribution to the Foundation, as his guidance and technical expertise have been of great value.

Per the Foundation’s bylaws, three individual board members represent the membership and the community at large. George Fletcher has one year remaining on his two-year term, and I look forward to George’s ongoing leadership in 2023 as he continues to contribute significantly as a member of the task force, executive committee and board of directors.

Nat Sakimura and John Bradley were re-elected to two-year terms as Community member representatives. I want to thank Nat and John for their continued leadership of the Foundation in their positions as officers and members of the board of directors. I look forward to continuing to work with both closely as we execute the 2023 Foundation strategy and grow the Foundation.

Please join me in thanking Nat, John and Dima, as well as all of the board of directors, for their service to the Foundation and the community at large. And thank you to all members for your continued investment of time and membership that drives and supports the Foundation’s strategic initiatives.

Gail Hodges
Executive Director
OpenID Foundation

The post 2023 OpenID Foundation Board of Directors Election Results first appeared on OpenID.

EdgeSecure

Catapulting Telecom into the Modern Age at a Small Liberal Arts College

The post Catapulting Telecom into the Modern Age at a Small Liberal Arts College appeared first on NJEdge Inc.

Improve Cybersecurity Performance While Meeting JIF Insurance Requirements

The post Improve Cybersecurity Performance While Meeting JIF Insurance Requirements appeared first on NJEdge Inc.

GS1

Greg Smith

Greg Smith, EVP, Global Operations and Supply Chain, Medtronic

FIDO Alliance

Tech Radar Pro: Apple says these are the best security keys around now

Whatever security key you choose, it must be FIDO certified, Apple says. Apple has revealed what it believes are the best security keys to add an extra layer of protection to your digital world. The recent release of iOS 16.3 saw Apple add security key compatibility to its iPhone and iPad devices, as well as to its laptops and desktops with the macOS 13.2 update. Now, in a support document, the company has selected its recommendations for the best physical security keys to use with its devices, all of which comply with the standards of FIDO – the foremost alliance on credential security, to which most of big tech is signed up.

The post Tech Radar Pro: <em>Apple says these are the best security keys around now</em> appeared first on FIDO Alliance.


B2B Cybersecurity: Do companies need a Chief Zero Trust Officer?

The FIDO Alliance’s standards let you log in anywhere without a password. Face or fingerprint login is used instead of the old username/password combination. A FIDO login key, sometimes called a “passkey,” makes life easier for users and harder for attackers.

The post B2B Cybersecurity: Do companies need a Chief Zero Trust Officer? appeared first on FIDO Alliance.


CSO: What are passkeys?

The FIDO Alliance is the main body defining specifications for passkeys. All major cloud service providers and passkey infrastructure providers are members. In cooperation with the W3C, FIDO introduced the WebAuthn API.
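To make the WebAuthn API concrete, here is a sketch of the kind of options object a relying party passes to `navigator.credentials.create()` when registering a passkey, shown as JSON for readability. In the actual API, `challenge` and `user.id` are binary buffers rather than strings; the base64url placeholders, the example domain, and the example user are illustrative only. `alg: -7` is the COSE identifier for ES256, and `residentKey: "required"` requests a discoverable credential, i.e. a passkey.

```json
{
  "rp": { "id": "example.com", "name": "Example Corp" },
  "user": {
    "id": "<base64url user handle>",
    "name": "alice@example.com",
    "displayName": "Alice"
  },
  "challenge": "<base64url random challenge from the server>",
  "pubKeyCredParams": [ { "type": "public-key", "alg": -7 } ],
  "authenticatorSelection": {
    "residentKey": "required",
    "userVerification": "preferred"
  }
}
```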

The post CSO: What are passkeys? appeared first on FIDO Alliance.


NIST: Phishing Resistance – Protecting the keys to your kingdom

Phishing refers to a variety of attacks that are intended to convince you to forfeit sensitive data to an imposter. These attacks can take a number of different forms; from spear-phishing (which targets a specific individual within an organization), to whaling (which goes one step further and targets senior executives or leaders). Furthermore, phishing attacks take place over multiple channels or even across channels; from the more traditional email-based attacks to those using voice – vishing – to those coming via text message – smishing. Regardless of the type or channel, the intent of the attack is the same – to exploit human nature to gain control of sensitive information (citation 1). These attacks typically make use of several techniques including impersonated websites, attacker-in-the-middle, and relay or replay to achieve their desired outcome.

The post NIST: Phishing Resistance – Protecting the keys to your kingdom appeared first on FIDO Alliance.

Thursday, 02. February 2023

Oasis Open Projects

Invitation to comment on Electronic Court Filing v4.1 and ECF Web Services SIP v4.1 from the ECF TC

Public review ends March 4th

We are pleased to announce that Electronic Court Filing Version 4.1 and Electronic Court Filing Web Services Service Interaction Profile Version 4.1 from the LegalXML Electronic Court Filing TC [1] are now available for public review and comment. This is the first public review for Version 4.1 of these specifications.

ECF defines a technical architecture and a set of components, operations and message structures for an electronic court filing system, and sets forth rules governing its implementation.

LegalXML Electronic Court Filing Version 4.1 (ECF v4.1) consists of a set of non-proprietary XML and Web Services specifications, along with clarifying explanations and amendments to those specifications, that have been added for the purpose of promoting interoperability among electronic court filing vendors and systems. ECF Version 4.1 is a maintenance release to address several minor schema and definition issues identified by implementers of the ECF 4.0 and 4.01 specifications.

Electronic Court Filing Web Services Service Interaction Profile defines a Service Interaction Profile, as defined in section 5 of the ECF v4.1 specification. The Web Services Service Interaction Profile may be used to transmit ECF 4.1 messages between Internet-connected systems.
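As a minimal sketch of what consuming an ECF-style XML message looks like in practice, the snippet below parses a namespaced filing message with the Python standard library. The element names and namespace URI here are purely illustrative; the real ECF 4.1 XML schemas and sample messages linked below define the authoritative structure.

```python
# Illustrative only: element and namespace names are hypothetical,
# not taken from the ECF 4.1 schemas.
import xml.etree.ElementTree as ET

SAMPLE = """<filing:CoreFilingMessage xmlns:filing="urn:example:ecf:filing">
  <filing:CaseCourt>
    <filing:CourtID>NJ-SUPERIOR-001</filing:CourtID>
  </filing:CaseCourt>
</filing:CoreFilingMessage>"""

NS = {"filing": "urn:example:ecf:filing"}

root = ET.fromstring(SAMPLE)
court_id = root.findtext("filing:CaseCourt/filing:CourtID", namespaces=NS)
print(court_id)  # NJ-SUPERIOR-001
```

Real implementations would additionally validate messages against the published XSDs before processing them.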

The documents and related files are available here:

Electronic Court Filing Version 4.1
Committee Specification Draft 01
07 December 2022

Editable source (Authoritative):
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v4.1/csd01/ecf-v4.1-csd01.docx
HTML:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v4.1/csd01/ecf-v4.1-csd01.html
PDF:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v4.1/csd01/ecf-v4.1-csd01.pdf
XML schemas:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v4.1/csd01/xsd/
XML sample messages:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v4.1/csd01/xml/
Model and documentation:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v4.1/csd01/model/
Genericode code lists:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v4.1/csd01/gc/
Specification metadata:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v4.1/csd01/xsd/metadata.xml
Complete package in ZIP file:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v4.1/csd01/ecf-v4.1-csd01.zip
Additional information about this and any previous public reviews is published in the public review metadata record:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v4.1/csd01/ecf-v4.1-csd01-public-review-metadata.html
************************

Electronic Court Filing Web Services Service Interaction Profile Version 4.1
Committee Specification Draft 01
07 December 2022

Editable source (Authoritative):
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v4.1/csd01/ecf-webservices-v4.1-csd01.docx
HTML:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v4.1/csd01/ecf-webservices-v4.1-csd01.html
PDF:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v4.1/csd01/ecf-webservices-v4.1-csd01.pdf
WSDL files:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v4.1/csd01/wsdl/
WSDL examples:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v4.1/csd01/wsdl/examples/
Complete package in ZIP file:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v4.1/csd01/ecf-webservices-v4.1-csd01.zip
Additional information about this and any previous public reviews is published in the public review metadata record:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v4.1/csd01/ecf-webservices-v4.1-csd01-public-review-metadata.html

How to Provide Feedback

OASIS and the LegalXML Electronic Court Filing TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of its technical work.

The public reviews start 03 February 2023 at 00:00 UTC and end 04 March 2023 at 23:59 UTC.

Comments may be submitted to the TC by any person through the use of the OASIS TC Comment Facility which can be used by following the instructions on the TC’s “Send A Comment” page (https://www.oasis-open.org/committees/comments/index.php?wg_abbrev=legalxml-courtfiling).

Comments should clearly identify which of these two specifications they address.

Feedback submitted by TC non-members for these works and for other work of this TC is publicly archived and can be viewed at:
https://lists.oasis-open.org/archives/legalxml-courtfiling-comment/

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [2], applicable especially [3] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about this specification and the ECF TC can be found at the TC’s public home page:
https://www.oasis-open.org/committees/legalxml-courtfiling/

Additional references:
[1] OASIS LegalXML Electronic Court Filing TC
https://www.oasis-open.org/committees/legalxml-courtfiling/
[2] https://www.oasis-open.org/policies-guidelines/ipr/
[3] https://www.oasis-open.org/committees/legalxml-courtfiling/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#RF-on-Limited-Mode
RF on Limited Terms Mode

The post Invitation to comment on Electronic Court Filing v4.1 and ECF Web Services SIP v4.1 from the ECF TC appeared first on OASIS Open.


ResofWorld

Indians find easy workarounds to watch BBC’s banned documentary on Modi

Links to the video are being shared on Telegram, Reddit and public file-hosting sites.
On January 21, India’s Ministry of Information and Broadcasting banned the sharing of a BBC documentary for “undermining the sovereignty and integrity of India” — and Indians have been looking...

The end of free returns is coming

Across India, e-commerce giants like Myntra and Flipkart are canceling free returns to boost profits.
Sachin Tati, a loyal customer of India’s largest online fashion portal Myntra, is irritated that his favorite shopping app is overhauling its returns policy. Tati says he has used Myntra...

This social network paid users for posts. What could go wrong?

Wildly popular in the Philippines, Lyka collapsed and left users with stores of tokens worth nothing. Now the CEO is plotting a comeback.
In early 2021, in Manila, a social media app called Lyka seemed to appear out of nowhere. Lyka promised the impossible: earn money just for being online. On the app,...

We Are Open co-op

Can we plz talk about privacy and security?

Building the passbolt community in 2023

At the end of last year, passbolt got in touch with us about doing some community work. Pleased to work with the passbolt team again, we’re kicking off 2023 with an organisation working to help people embrace their privacy and security while being realistic about how people collaborate in the digital space.

Passbolt is the open source password manager for teams. They aim to help people collaborate without compromising on password security. We share passwords. It’s just a fact.

TL;DR — learn about our community plans and get involved!

(Image: cc-by-nd Bryan Mathers)

Why: To build communal power

If you know WAO at all, you know that we are proud to be a cooperative that has some…uh…perspectives. We don’t take on just any project — we prefer to work with organisations that are community-focused and trying to make the world a better place. Passbolt is focused on privacy and security, themes that are near and dear to our nerdy hearts.

As the passbolt community grows, they want to ensure that their values are baked into everything they do. They also want to be in a position to say no to corporate overlords and asshats.

WAO members have long been interested in the privacy and security space, and we feel like we have a lot to both learn as well as contribute. We’re good at community building, and a strong community means power. The power to say no, the power to influence change, the power to change the world.

Who: passbolt, WAO and YOU

The majority of the people currently in the passbolt community understand why password security matters and they are people who can help spread the word. They also understand the complexities of internet security and digital identity. These folks care about the ethics of their tech. There are active developers and security savvy people helping out with passbolt support.

We want to figure out how we can grow the community and to include people like…well, us. Internet savvy, privacy aware people. People who are trying to be safe online. People who love open source and want to contribute to a growing and ever more important conversation around online security, data privacy and the like. We want to learn and contribute to this discussion, and this community has something to teach us.

What: a privacy & security focused community

(Image: cc-by-nd Bryan Mathers)

We’re lucky to be able to build together with the passbolt community, instead of from scratch. We are starting by looking at the passbolt community using our patented* Architecture of Participation. This framework helps us make sure we’re covering our bases. We’ve written about applying it many times. You can read all about it in:

An Architecture of Participation for Community growth
Quick wins to improve your Open Source community’s Architecture of Participation
HOWTO: Create an Architecture of Participation for your Charity tech project

Other things we’re thinking about include resources that help ensure that community structures, workflows and processes are clear and being implemented. We’ll help the passbolt team and contributors embrace, encourage and distribute leadership.

* not actually patented because patents are corporate bullshit used to control people, keep people poor and generally stymie innovation

How: the tricky part

(Image: cc-by-nd Bryan Mathers)

With any growing community there’s a delicate balance to be struck between improving things for new community members and ensuring that OG community members remain engaged. We’ve identified some areas that can help make the community experience more streamlined and robust. We’re also thinking about how to engage with current contributors and help them reach their goals.

Improving Documentation & Pathways

Open Source projects are often spread out all over the web and that can be confusing for potential contributors. It’s important to ensure a project is navigable for people who aren’t yet members of the community. Entry points can be varied, but they should signpost and cross link to the information that people are looking for. Getting lost causes frustration, and improving documentation and creating explicit contributor pathways can help keep people involved.

Solid plans

We found that there are 10 main reasons why people contribute to open source projects. A solid engagement plan will consider these motivations and create pathways for people to level up their involvement with the project. No matter the motivation, contributors are more likely to continue using their skills on behalf of the project if they feel valued. Ensuring that contribution is recognised is an important part of a healthy community.

Solid engagement also means consistent engagement, so we’re thinking about the patterns and workflows we need to co-design and document in order to ensure that people feel supported. This includes things like transparent roadmaps, moderation plans or updated community resources.

Making it Super Easy to Contribute

Any time spent in the service of a collective project is an amazing gift to the community that manages it. There are so many ways to contribute to open source. Open source projects need developers, yes, but they also need a wide range of other types of people and skills in order to be a success. Something as simple as providing feedback can change the course of an initiative.

Creating an inclusive community means encouraging all sorts of people to see their skills as valuable and necessary. Lowering the technical bar to contribution can help, and here modularity is key. We need to think about what ‘Minimum Viable Contribution’ might be for this community, and then make it super easy to contribute.

Where: Across the web and at an event near you

(Image: cc-by-nd Bryan Mathers)

We are working to tie together and create spaces to engage with anyone who cares about privacy and security. Passbolt will continue to host spaces for people who are highly technical, while also working to spread good practices and knowledge amongst people who are learning. A community is made up of the individuals who join it, so there’s no way of knowing what this community might become. Our intention, though, is that the passbolt community remains a welcoming and inclusive space.

People from the passbolt community are planning to be at a variety of open source events this year. Next up is FOSDEM! If you’re going to be there, we highly encourage you to say hello!

When: Now!

We know that building a community like this has a lot of ins and outs.

Over the next months, WAO is going to help the passbolt team grow and engage an inclusive and healthy open source community. In addition to making sure that anyone can contribute to this amazing project, we want to help open sourcers, makers, co-ops, activists, edtechies and charity folks have an honest conversation about passwords. The reality is that people share passwords, and we’re not going to stop just because we are told it is a bad practice.

Help us! Comment directly on the board!

We are making and sharing our plans with you, and we want your feedback! Do you have opinions? Do you care about privacy and security? Are you eager to help push an honest conversation? Just looking for a password manager that helps you collaborate? We want to hear from you!

Can we plz talk about privacy and security? was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 01. February 2023

Digital ID for Canadians

CONSULT HYPERION Joins the Voila Verified Trustmark Program as Readiness Advisor

Voilà Verified Program Builds Real Trust in Digital Solutions by Spotlighting World-Class Vetted Digital Identity Solutions.

Vancouver, February 1st – The Digital ID and Authentication Council of Canada (DIACC) is pleased to officially recognize Consult Hyperion as a Readiness Advisor to help vendors prepare for Pan-Canadian Trust Framework™ (PCTF) certification.

A non-profit coalition of over 115 public and private members, the DIACC develops research and opportunities to enable Canada’s confident, safe, and full participation in a global digital economy. 

“One of the PCTF certification goals is to provide certainty and trust in the Canadian market by ensuring that the identity checks will be consistent and trustworthy. Getting Consult Hyperion recognized as a Readiness Advisor will help to prepare vendors to go through this journey,” says Joni Brennan, president of the DIACC. “With the PCTF, and now with Voilà Verified, there is an opportunity to adopt a framework rooted in trust – and to earn compliance recognition. Voilà Verified identifies those who are ‘walking the walk’ and delivering safe and secure access to the global digital economy.”

The DIACC’s PCTF is a publicly available framework for identity solutions that defines client, customer, and individual duty of care. The Voilà Verified program provides a vetting and assessment opportunity where PCTF-compliant solution vendors can earn a public-facing trustmark. The result? Spotlight visibility of trustworthy, safe, reliable, and efficient solutions.

“Consult Hyperion is excited by the vibrant digital identity and authentication market within Canada and recognizes that the PCTF is central to its success. A strong, encompassing framework and a robust certification process are the foundation for any interoperable service,” says Justin Gage, Consult Hyperion’s Digital Identity delivery lead for North America. “Consult Hyperion has over 20 years’ experience designing and implementing globally interoperable payment systems, which are currently used by billions of people on every continent every day. More recently, we have advised several organizations developing PCTF-compliant solutions and look forward to helping them with their successful application to Voilà Verified for accreditation.”

Voilà Verified presents an opportunity to grow provincial-level investments in digital identity solutions. Provincial governments which have launched identity services can now earn a trustmark of their own, and provinces that are on the cusp of entering the digital solution market can do so with confidence by seeking vendors with a Voilà Verified trustmark. 

Ruth Puente, Voilà Verified’s Trustmark Verification Program Manager, says, “We are glad to welcome Consult Hyperion as a Readiness Advisor. It will bring its remarkable expertise in the identity domain to conduct gap analyses against the requirements, helping service providers prepare their applications for VVP Verification.”

“Voilà Verified is inclusive yet diligent in verifying PCTF-compliant solutions. The program was developed in alignment with ISO standards – and empowers informed decision-making in a rapidly growing ecosystem of identity solutions,” said Puente. “Delivering high-quality service, protecting customers, and increasing access to trustworthy solutions are our priorities. We have formed teams of international experts to perform assessments and to oversee the process through an impartial lens.”

“Voilà Verified is a unique opportunity in which I am honoured to share my experience as an advisor and auditor within information security, compliance, and identity,” says Björn Sjöholm, Cybersecurity Entrepreneur of Seadot, and Trustmark Oversight Board Chair.

Vendors are turning to the Voilà Verified program for several reasons, but the leading value proposition is market differentiation. Trustmark holders stand out from competitors by unlocking global business opportunities through international recognition and credibility.

“Voilà Verified puts internationally reputable identity solutions on the map,” says Dave Nikolejsin, the DIACC’s Board Chair. “This is the way forward. With lateral growth of PCTF compliance across sectors – public and private – we establish a common value of trust. Voilà Verified is a monumental stride for Canada to influence a safe and secure global digital economy.”

To learn more about Voilà Verified and access your application package, visit the program overview on the DIACC website or contact voila@diacc.ca.

– 30 –

ABOUT DIACC

DIACC is a growing coalition of public and private sector organizations who are making a significant and sustained effort to ensure Canada’s full, secure, and beneficial participation in the global digital economy. By solving challenges and leveraging opportunities, Canada has the chance to secure at least three percent of unrealized GDP or $100 billion of potential growth by 2030. Seizing this opportunity is a must in a digital society as we work through the COVID pandemic challenges. Learn more about the DIACC mandate.

ABOUT CONSULT HYPERION

Consult Hyperion is an independent strategic advisory and technical consultancy, based in the UK and North America, specialising in secure electronic transactions in the areas of Payments, Identity and Mobility. With over 30 years’ experience, we help organisations across the globe exploit opportunities presented by new technologies, regulatory changes and consumer expectations. We design systems that support mass scale secure electronic payments, fare collection and identity transaction services. We deliver value to our clients by supporting them in delivering on their strategy through digital innovation and unblocking technical challenges. Hyperlab, our in-house software development and testing team, rapidly prototypes new concepts, delivers security critical software for mass deployment, and thoroughly tests the functionality and security of third-party products on behalf of clients.

For more information contact info@chyp.com


Digital Scotland

The Future of Digital in Scottish Education with Ollie Bray, Strategic Director Education Scotland

A webinar talk exploring the progress of digital learning in Scottish Education, and ideas for enhancing and accelerating its adoption. The post The Future of Digital in Scottish Education with Ollie Bray, Strategic Director Education Scotland appeared first on digitalscot.net.

In this talk Ollie Bray, Strategic Director Education Scotland, talks on the Future of Digital in Scottish Education. This is part of a series intended to share insights from keynote leaders from across Scottish Education on where they see Digital Technology in the sector is headed.

From 12m:30s Ollie talks about how Scotland has long been a pioneer of Technology for Learning, being a leader in adopting computers in schools and online learning systems.

But critically, this hasn’t necessarily been accompanied by modernization of the associated teaching practices.

Covid highlighted this, where online learning was simply a process of moving traditional teaching models into a virtual world, not really embracing the capabilities to change how teaching itself is delivered.

With this in mind, at 33m:40s Ollie asks how we can equip young people with the skills to prepare them for online learning, given the sector continues to move in that direction. It can be a challenging experience for them.

He suggests one key methodology for addressing this is Building Digital Communities. He feels the Scottish Curriculum is well designed to lend itself to that, and at 45m:50s he explores the pedagogies that can support the adoption of digital across Scottish schools.

Ollie sees the curriculum reform in Scotland as a great opportunity to develop different types of curriculum experiences, where technology can play a more pivotal role in enhancing teaching and learning in general, not just in computing subjects.

The post The Future of Digital in Scottish Education with Ollie Bray, Strategic Director Education Scotland appeared first on digitalscot.net.


Oasis Open Projects

Genericode Approved as an OASIS Standard

Boston, MA – 1 February 2023 – Members of OASIS Open, the international open source and standards consortium, have approved Code List Representation (genericode) v1.0 as an OASIS Standard, a status that signifies the highest level of ratification. Developed by the OASIS Code List Representation Technical Committee (TC), the format, known as “genericode,” is a […] The post Genericode Approved as

IBM, Publications Office of the European Union, and Others Advance Open Standard for IT-Enabled Code Lists Used in Transportation, Finance, and Other Areas

Boston, MA – 1 February 2023 – Members of OASIS Open, the international open source and standards consortium, have approved Code List Representation (genericode) v1.0 as an OASIS Standard, a status that signifies the highest level of ratification. Developed by the OASIS Code List Representation Technical Committee (TC), the format, known as “genericode,” is a single semantic model for code lists and an accompanying XML serialization that is designed to IT-enable and standardize the publication of machine-readable code list information and its interchange between systems. 

A code list in its simplest form is a set of strings that represent an item or idea. Standardized code lists include country abbreviations, currency abbreviations, shipping container descriptors, and airport codes, while nonstandardized code lists used between trading partners include financial account types, workflow status indicators, and any set of values representing the semantics of related concepts known between the parties involved in information interchange. Genericode, a standardized code list representation, is a complete description of a code list, including alternate codes, and any other associated data. Genericode also describes how new code lists are derived from existing code lists, so that the derivation is repeatable, automatable and auditable. 
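A code list in this simple sense is easy to picture in code. The following Python sketch is purely illustrative (the data and the `lookup` helper are invented for this example, and this is not the genericode XML serialization itself); it models canonical codes, alternate codes, and descriptions, the core of what genericode captures:

```python
# Illustrative model of a code list with alternate codes.
# This sketches the concept only; it is NOT the genericode XML format.

currency_codes = [
    # (canonical code, alternate code, description)
    ("USD", "840", "US Dollar"),      # ISO 4217 alpha and numeric codes
    ("EUR", "978", "Euro"),
    ("CAD", "124", "Canadian Dollar"),
]

def lookup(code):
    """Resolve a canonical or alternate code to its description."""
    for canonical, alternate, description in currency_codes:
        if code in (canonical, alternate):
            return description
    return None

print(lookup("EUR"))  # Euro
print(lookup("124"))  # Canadian Dollar
```

Deriving a new code list from an existing one, as genericode supports, could then be expressed as a repeatable, auditable filter over such entries, for example keeping only the currencies accepted by a given trading partner.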

“Code lists have been used for many years, often published and disseminated in manners that have not been IT-enabled nor standardized. Genericode is needed to ensure that communities, from business sectors to public authorities worldwide, are able to publish, exchange, and process code lists, a key element for business documents and data spaces interoperability, with a standardized IT-enabled representation,” said Andrea Caccia, Chair of the Code List Representation TC. “Since 2007, with the publication as a Committee Specification, genericode has been widely used. With its publication as an OASIS Standard, it’s expected to gain even more traction.” 

“The Publications Office of the European Union publishes European Public Procurement Notices on the TED website. We are pleased to announce the new generation of these notices, eForms, has been restructured using UBL and the associated genericode files,” said Ms. Hilde Hardeman, Director General of the Publications Office of the European Union. “We have made the genericode files that we use in the notices freely available for reuse in EU Vocabularies under the eProcurement Business Collection.” 

Participation in the Code List Representation TC is open to all through membership in OASIS. Key stakeholders (those who use code lists in their business exchanges) including standardization bodies, registration and source authorities, implementers of software and services, business sectors and public authorities, and others are invited to join the group.

Code List Representation FAQ

Media Inquiries: communications@oasis-open.org

The post Genericode Approved as an OASIS Standard appeared first on OASIS Open.


ResofWorld

Uber dodges responsibility for its workers in Chile

To keep its third-party contractor model, Uber is flexing its legal game.
Whether Uber is a transportation company or a technology platform has long been at the heart of the discussion of its legal responsibilities with its third-party contractors. With the array...

Relatives abroad are driving Cuba’s e-commerce boom

Most local residents can’t afford to shop online, owing to limited internet connectivity and a devalued Cuban peso.
Mara Karla Sánchez often feels compelled to send groceries to her mother and grandfather;  they live in the Cuban capital of Havana and often struggle to find basic goods on...

Tuesday, 31. January 2023

Oasis Open Projects

Code List Representation (genericode) V1.0 approved as an OASIS Standard

OASIS is pleased to announce that the call for consent has closed [1] and, effective 30 January 2023, Code List Representation (genericode) V1.0 is an OASIS Standard. Project Administration will now undertake the final tasks of preparing and loading the standard. Code lists have been with us since long before computers. They should be well […] The post Code List Representation (genericode) V1.0

Standard semantic model for code lists and accompanying serialization approved as an OASIS Standard

OASIS is pleased to announce that the call for consent has closed [1] and, effective 30 January 2023, Code List Representation (genericode) V1.0 is an OASIS Standard. Project Administration will now undertake the final tasks of preparing and loading the standard.

Code lists have been with us since long before computers. They should be well understood and easily dealt with by now. Unfortunately, they are not. As is often the case, if you take a fundamentally simple concept, you find everybody has their own unique view of what the problem is and how it should be solved.

The OASIS Code List Representation format, genericode, is a single semantic model of code lists and accompanying XML serialization that can encode a broad range of code list information. The serialization is designed to IT-enable the interchange or distribution of machine-readable code list information between systems.

The ballot was held under the OASIS call for consent procedure [2]. In the ballot, the candidate OASIS Standard received 14 affirmative consents and no objections.

Our congratulations to the members of the TC and to the community of implementers, developers, and users who have brought the work successfully to this milestone.

=== Additional information

[1] Ballot:
https://www.oasis-open.org/committees/ballot.php?id=3747

[2] https://www.oasis-open.org/policies-guidelines/tc-process-2017-05-26#OScallForConsent

The post Code List Representation (genericode) V1.0 approved as an OASIS Standard appeared first on OASIS Open.


Energy Web

Energy Web joins the OPENTUNITY consortium to open electricity ecosystems to decarbonize European…

Energy Web joins the OPENTUNITY consortium to open electricity ecosystems to decarbonize European grids OPENTUNITY’s mission is to create a flexibility ecosystem reducing interoperability barriers and favouring the use of standards in order to decarbonize EU grids and put the end-user in the spotlight. Zug, Switzerland — 31 January 2023 — Energy Web is proud to join 21 partners from 8 count
Energy Web joins the OPENTUNITY consortium to open electricity ecosystems to decarbonize European grids OPENTUNITY’s mission is to create a flexibility ecosystem reducing interoperability barriers and favouring the use of standards in order to decarbonize EU grids and put the end-user in the spotlight.

Zug, Switzerland — 31 January 2023 — Energy Web is proud to join 21 partners from 8 countries across Europe on OPENTUNITY, a new, EU-funded initiative focused on enhancing distributed energy resource interoperability in order to accelerate grid decarbonization. Under the initiative, Energy Web is providing the underlying digital infrastructure for data exchange among the consortium of companies comprising OPENTUNITY.

OPENTUNITY aims to unlock deep flexibility from distributed energy resources by eliminating data silos and establishing standards for data exchange, all with a focus on creating value for end customers. More specifically, the initiative is focused on enabling prosumers and other market participants to more easily provide demand flexibility to grid operators.

Partners under the initiative are focused on conducting thirteen demonstration projects using a common underlying digital infrastructure. These demonstration projects are meant to test hypotheses and prove value for a number of different technology and business use cases with a common theme: distributed energy resources, properly integrated with the grid, can create value for both end customers and energy market participants.

These demonstrations will be conducted in four EU countries: Greece, Slovenia, Spain, and Switzerland. Initial estimates forecast that innovations unlocked via OPENTUNITY could reduce end customers’ energy bills by 30%, primarily by making distributed energy resources a core part of grid planning and operations, with projected savings of 4.98 cts/kWh in Slovenia, 6.9 cts/kWh in Spain, and 6 cts/kWh in Switzerland.

About Energy Web

Energy Web is a global non-profit accelerating the clean energy transition by developing open-source technology solutions for energy systems. Our enterprise-grade solutions improve coordination across complex energy markets, unlocking the full potential of clean, distributed energy resources for businesses, grid operators, and customers. Our solutions for enterprise asset management, data exchange, and Green Proofs, our tool for registering and tracking low-carbon products, are underpinned by the Energy Web Chain, the world’s first public blockchain tailored to the energy sector. The Energy Web ecosystem comprises leading utilities, renewable energy developers, grid operators, corporate energy buyers, automotive, IoT, telecommunications leaders, and more. More information on Energy Web can be found at www.energyweb.org or follow us on Twitter @EnergyWebX

Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union. Neither the European Union nor the granting authority can be held responsible for them. Horizon Europe Grant agreement Nº 101096333.

Energy Web joins the OPENTUNITY consortium to open electricity ecosystems to decarbonize European… was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


ResofWorld

As traditional VC funding slows, revenue-based financing is becoming popular

Direct-to-consumer brands have been the primary driver for revenue-based financing, which is estimated to be a $100 billion opportunity in India by 2025.
Bhavik Vasa is the founder of GetVantage, an alternative funding platform based in Mumbai that provides startups revenue-based financing (RBF), where entrepreneurs don’t need to dilute equity or pay interest....

Scammers steal $117,000 using mobile money transfers every month in Malawi

Malawians have more mobile wallets than bank accounts, but security lapses allow for identity theft and other fraud.
In November 2022, the police station at Malawi’s capital city Lilongwe registered at least two complaints involving fraudsters who had transferred more than 3 million Malawi kwacha (around $2,920 at...

‘iPhones are made in hell’: 3 months inside China’s iPhone city

Workers describe a peak production season marred by labor protests and Covid-19 chaos, right as Apple reconsiders its China supply chain.
Chinese factory laborers call jobs like Hunter’s “working the screws.” Until recently, the 34-year-old worked on the iPhone 14 Pro assembly line at a Foxconn factory in the central Chinese...

FIDO Alliance

Cybersecurity Policy Forum: Identity, Authentication and the Road Ahead

2023 brings a new year and a new Congress – but America is still struggling with many of the same old problems when it comes to digital identity and authentication. […] The post Cybersecurity Policy Forum: Identity, Authentication and the Road Ahead appeared first on FIDO Alliance.

2023 brings a new year and a new Congress – but America is still struggling with many of the same old problems when it comes to digital identity and authentication. Passwords keep getting phished, new account fraud keeps growing, and companies and consumers continue to struggle to prove that they are not a proverbial “dog on the Internet.” It’s becoming a major policy concern – and policymakers are considering a number of new initiatives to better protect people and combat these trends.

On January 25th, the Better Identity Coalition, FIDO Alliance, and the ID Theft Resource Center (ITRC) came together to present a policy forum looking at “Identity, Authentication, and the Road Ahead.”

This policy forum brought together leaders from government, industry, and non-profits to discuss topics including:

- The release of the ID Theft Resource Center Annual Data Breach Report
- The impact of identity-related cybercrime on industry and government over the last year
- The human toll of identity theft – and the need to build inclusive digital identity systems that work for everyone
- What to expect from the new Congress and the Biden Administration in 2023
- Updates on new products and standards like FIDO that can make identity and authentication both more secure and easier to use
- Discussions on what can be done to drive better identity infrastructure in America

The post Cybersecurity Policy Forum: Identity, Authentication and the Road Ahead appeared first on FIDO Alliance.


Blockchain Commons

Musings of a Trust Architect: Data Minimization & Selective Disclosure

I have been struggling for a while to communicate my framing of definitions for Data Minimization and Selective Disclosure, which are privacy-focused data-protection techniques that Blockchain Commons is now incorporating into Gordian Envelope. A few years ago, I supported an RWOT paper on the topic, but it ultimately didn’t match my vision, in part due to a traditional focus on government credenti

I have been struggling for a while to communicate my framing of definitions for Data Minimization and Selective Disclosure, which are privacy-focused data-protection techniques that Blockchain Commons is now incorporating into Gordian Envelope.

A few years ago, I supported an RWOT paper on the topic, but it ultimately didn’t match my vision, in part due to a traditional focus on government credentials that we’re (unfortunately) unlikely to be able to influence.

This year I gave it a new try by authoring an advanced-reading paper for RWOT 11 talking about our use of redaction and noncorrelation in Gordian Envelope. It’s generated an interesting discussion (and upcoming collaborative RWOT white paper) on when correlation is purposeful and when it’s accidental. It’s also provided me with the opportunity to more explicitly write down my thoughts on data minimization and selective disclosure, which I’ve excerpted below to continue my series of Musings of a Trust Architect.

Data Minimization

Data Minimization is the practice of limiting the amount of shared data to the minimum necessary: just enough for parties to successfully transact, accomplish a task, or otherwise meet a goal with each other, while minimizing risks to all parties by omitting unnecessary content.

Though essential as part of security best practices (along with Least Privilege, a topic for a future musing), Minimal Disclosure in particular is mandated practice for maintaining the privacy of people using digital identities, and thus requires special attention when collecting and sharing data, curtailing personal information to only that which is absolutely necessary.

The best practices of Data Minimization guide the design and implementation of personal data protection regulations, such as the General Data Protection Regulation (GDPR) in the European Union. They include:

- Service providers must only collect the minimum amount of personal information necessary to perform a specific task or service.
- Service providers must limit the length of time personal information is retained and must delete it when it is no longer needed.
- Individuals should only share personal information with third parties when it is necessary to perform a task or service.

The GDPR supports these best practices by requiring companies to have a legal basis for collecting and processing personal information and by giving individuals the right to access, correct, and delete their personal information. The GDPR also requires companies to implement appropriate security measures to protect personal information and to notify individuals and authorities in case of a data breach.

Other regulations that require some form of Data Minimization include:

- The California Consumer Privacy Act (CCPA) in the United States, which requires businesses to disclose the categories of personal information that they collect, use, disclose, and sell, and also to obtain consumer consent for the sale of personal information.
- The Payment Card Industry Data Security Standard (PCI DSS), a set of security standards for organizations that handle credit card information, which requires merchants to limit the amount of cardholder data that is stored, processed, or transmitted.
- The Personal Information Protection and Electronic Documents Act (PIPEDA) in Canada, which requires organizations to collect only the personal information that is necessary for the identified purpose.
- The Health Insurance Portability and Accountability Act (HIPAA) in the United States, which requires healthcare organizations to protect the privacy of patient health information and to collect only the minimum necessary information to provide care.
- The Cybersecurity Information Sharing Act (CISA) in the United States, which encourages organizations to share information about cybersecurity threats but also requires companies to protect personal information and to limit data collection to what is necessary to address the threat.

In general, Data Minimization is enacted by policy decisions that limit the amount, duration, and scope of data collection and use.

- Content minimization (amount): collecting only the minimum amount of data necessary.
- Temporal minimization (duration): retaining data only for the minimum amount of time necessary to execute the task.
- Scope minimization (scope): using data only for the specific purpose of the active task.
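As a minimal sketch of how these three policy dimensions might be enforced in practice (the field names, retention period, and purpose string below are illustrative assumptions, not drawn from any regulation):

```python
import time

# Hypothetical sketch of the three minimization dimensions: content
# (which fields), temporal (how long), and scope (for what purpose).
ALLOWED_FIELDS = {"email", "shipping_address"}   # content minimization (amount)
RETENTION_SECONDS = 30 * 24 * 3600               # temporal minimization (duration)
ALLOWED_PURPOSE = "order_fulfilment"             # scope minimization (scope)

def collect(record, purpose):
    """Store only allowed fields, only for an allowed purpose, with an expiry."""
    if purpose != ALLOWED_PURPOSE:
        raise PermissionError("data may only be used for the active task")
    minimized = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    minimized["_expires_at"] = time.time() + RETENTION_SECONDS
    return minimized

stored = collect(
    {"email": "a@example.com", "phone": "555-0100", "shipping_address": "1 Main St"},
    "order_fulfilment",
)
assert "phone" not in stored   # unnecessary content is never retained
```

A real deployment would also need a deletion job that honours `_expires_at`, covering the temporal dimension end to end.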

In the digital credentials ecosystem, data minimization is primarily implemented through policy decisions made by stakeholders: credential issuers ensure that credentials can be presented in a way that enables data minimization, and credential inspectors establish policies in advance regarding the minimum data necessary to accomplish a task, the minimum time data can be stored, and the processes that ensure data is only applied to the task at hand.

Subjects may desire data minimization for various reasons, such as reducing the risk of undesired correlation or providing information gradually (Progressive Trust). Verifiers may also desire data minimization for other reasons, such as avoiding appearing threatening, protecting from “I told you so” situations, avoiding potential collection of “toxic” GDPR data, and reducing the cost of storing information.

ISO (International Organization for Standardization) has several standards related to data protection and privacy that include principles of data minimization. Some examples include:

- ISO/IEC 27001:2013 - Information security management systems (ISMS): provides a framework for managing sensitive information and includes a requirement for organizations to minimize the amount of sensitive information that is collected, used, and retained.
- ISO/IEC 29100:2011 - Privacy framework: provides a framework for protecting personal information and includes a requirement for organizations to only collect personal information that is necessary for the specific purpose for which it is being collected.
- ISO/IEC 27701:2019 - Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management: provides a framework for managing personal information and includes a requirement for organizations to minimize the amount of personal information that is collected, used, and retained.
- ISO/IEC 29151:2012 - Information technology - Security techniques - Data-at-rest protection: provides guidance for protecting data when it is stored and includes a requirement for organizations to minimize the amount of sensitive information that is stored.
- ISO/IEC 27040:2015 - Storage security: provides guidance for protecting data when it is stored and includes a requirement for organizations to minimize the amount of sensitive information that is stored.

Data minimization can be challenging to implement due to a number of factors, such as:

- Difficulty in determining the minimum necessary data: It can be difficult to determine exactly how much data is necessary to accomplish a specific task or goal, especially when dealing with complex systems and processes.
- Balance of data minimization with other goals: Organizations may be torn between the need to collect and retain data for legitimate business purposes and the need to minimize data to protect privacy and security.
- Usage of data silos: Data may be collected and stored across multiple systems and departments, making it difficult to identify and remove unnecessary data.
- Lack of user consent: Data minimization may not be possible if users are not willing to share their personal information, which can make it difficult to implement privacy-enhancing technologies such as selective disclosure.

Selective Disclosure

“Selective Disclosure” is one of a number of privacy-enhancing techniques that go beyond Data Minimization to also protect against correlation. Selective Disclosure allows individuals or organizations to share only specific pieces of information, rather than sharing all of them, and prevents using correlation inappropriately to merge information from different contexts without consent. Selective Disclosure offers approaches that balance the need to share information for legitimate purposes against the need to protect the privacy and security of people and to minimize risks.

GDPR does not mandate the use of Selective Disclosure, but it does give individuals the right to control their personal information; the right to access, correct, and delete their personal information; and the right to object to certain processing activities. The California Consumer Privacy Act (CCPA) also allows individuals to know what personal information is being collected and shared and to opt out of the sale of their personal information.

In addition, some organizations have developed their own standards for Selective Disclosure, such as the Platform for Privacy Preferences (P3P) which provides a mechanism for websites to disclose their data collection and sharing practices and for individuals to set their privacy preferences.

Some requirements for Selective Disclosure include:

- Granularity: Allow individuals or organizations to share only specific pieces of information, rather than all of their personal information. This enables users to share only the information necessary to accomplish a specific task or goal.
- Control: Give users more control over their personal information by allowing them to decide what information they want to share, and with whom they want to share it.
- Transparency: Allow users to see what information is being shared and with whom, which enhances trust and transparency in the sharing process.
- Security: Use cryptographic techniques to secure the information shared, ensuring that only authorized individuals or organizations can access it.
- Privacy: Minimize the amount of personal information that is shared, reducing the risk of data breaches or unauthorized access to personal information.
- Compliance: Help organizations comply with data protection regulations, such as GDPR, by minimizing the amount of personal data collected, used, and retained.
- Auditability: Allow organizations to track and audit information-sharing activities, providing transparency on how the data is being used.
- Flexibility: Allow organizations to adapt the sharing process to different scenarios and use cases, providing necessary flexibility.

More specifically, Selective Disclosure can help address some of the challenges of Data Minimization mentioned above:

- Difficulty in determining the minimum necessary data: By allowing users to share only the specific information needed to accomplish a task or goal, Selective Disclosure can limit data collection to only necessary information, avoiding additional information that is sensitive and potentially toxic.
- Balance of data minimization with other goals: By requiring granularity in Selective Disclosure, the different goals of different groups can be evaluated and addressed.
- Usage of data silos: Selective Disclosure can help with data silos by allowing users to share information from different systems and departments while ensuring that only authorized individuals or organizations can access the information and that it is not shared or stored for longer than necessary. This can also help organizations comply with data protection regulations such as GDPR.
- Lack of user consent: By giving users more control over their personal information, Selective Disclosure allows organizations to obtain user consent for information sharing more effectively. Users can choose what information they want to share and with whom, giving them control over what information an organization has access to. This can also increase trust and transparency between the organization and its users.

Cryptographic Techniques for Selective Disclosure

Cryptographic Selective Disclosure leverages cryptography to allow individuals to selectively share specific pieces of their information while keeping the rest of their information private. It allows parties to prove certain attributes about themselves, without revealing their entire identity.

There are three important approaches to cryptographic Selective Disclosure:

Hash-based Elision (or Redaction): A hash is a cryptographic fingerprint for a set of data: it takes an input and produces a fixed-size output that is unique in practice. In Hash-based Elision, one party (the prover) presents to another party (the verifier) a hash of a piece of personal information without revealing the actual information (redacts it). One advantage of using cryptographic hashes for selective disclosure is that they are a relatively simple and efficient method for hiding personal information: they don't require complex mathematical calculations or a trusted third party. A disadvantage is that, without multiple round-trips, they do not provide any cryptographic guarantee that the prover knows the original data (they may just be passing the hash forward). Another disadvantage is the correlation risk if the same hash is given to multiple parties, which requires techniques like salting.

Zero-Knowledge Proof (ZKP): This cryptographic technique allows one party (the prover) to prove to another party (the verifier) that they possess certain information, without revealing the actual information. ZKPs can be used to prove a wide range of attributes, such as knowledge of a password, possession of a private key, or membership in a group. Compared to a cryptographic hash, a ZKP proves knowledge of a piece of information without revealing the actual information, while a cryptographic hash hides the actual information but not the knowledge of it. A disadvantage of ZKPs is that they can be computationally expensive or require a large amount of communication between the prover and verifier; both can be a bottleneck in scenarios where low latency is a concern or where constrained hardware is being used.

Blind Signature: As the name implies, this is a technique in which a message is "blinded" before it is signed, hiding its content from the signer. The holder can later "unblind" the signature and use it to prove that a trusted issuer certified a certain attribute, such as being over 18 years old, without revealing their name or other personal information, and without the issuer being able to link the presentation back to the signing. This makes it useful in scenarios where it is important to prove that an attribute has been certified without revealing the holder's identity. One disadvantage of blind signatures is that they rely on a trusted third party (the signer), which can be a bottleneck and a centralization or security risk.
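To make the salting technique for hash-based elision concrete, here is a minimal sketch of a salted hash commitment using only Python's standard library (the attribute string is a hypothetical example; a real system would commit to fields of a structured credential):

```python
import hashlib
import secrets

def commit(value: str) -> tuple[str, str]:
    """Commit to a value: return (salt, digest). The digest can be shared
    in place of the value; a fresh salt per disclosure defeats correlation
    across verifiers and guessing of low-entropy values."""
    salt = secrets.token_hex(16)
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return salt, digest

def reveal_and_verify(value: str, salt: str, digest: str) -> bool:
    """Later, the prover reveals (value, salt); the verifier recomputes
    the hash and checks it against the digest received earlier."""
    return hashlib.sha256((salt + value).encode()).hexdigest() == digest

salt, digest = commit("birthdate=1990-04-01")
assert reveal_and_verify("birthdate=1990-04-01", salt, digest)
assert not reveal_and_verify("birthdate=1991-01-01", salt, digest)
```

Note that this only hides the value until disclosure; as the text above observes, on its own it does not prove the prover knew the value when the digest was first presented.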

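As an illustration of a zero-knowledge proof of knowledge, here is a toy non-interactive Schnorr proof (made non-interactive via the Fiat-Shamir heuristic) over a deliberately tiny group; real deployments use elliptic curves or groups of 2048 bits or more:

```python
import hashlib
import secrets

# Toy Schnorr group: p = 2q + 1 with q prime; g generates the
# subgroup of prime order q. (Far too small for real security.)
p, q, g = 2039, 1019, 4

def prove(x: int) -> tuple[int, int, int]:
    """Prove knowledge of x such that y = g^x mod p, without revealing x."""
    y = pow(g, x, p)
    k = secrets.randbelow(q)                     # one-time nonce
    r = pow(g, k, p)                             # commitment
    c = int(hashlib.sha256(f"{g},{y},{r}".encode()).hexdigest(), 16) % q
    s = (k + c * x) % q                          # response
    return y, r, s

def verify(y: int, r: int, s: int) -> bool:
    c = int(hashlib.sha256(f"{g},{y},{r}".encode()).hexdigest(), 16) % q
    return pow(g, s, p) == (r * pow(y, c, p)) % p   # g^s == r * y^c

x = secrets.randbelow(q)                         # the prover's secret
assert verify(*prove(x))
```

The verifier learns that the prover knows x, but nothing about x itself, which is exactly the hash-versus-ZKP distinction drawn above.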
There are also adjacent technologies that may allow the leverage of cryptography for Selective Disclosure, but are not as broadly being investigated today.

Secret Sharing: This technique divides a secret into a number of shares, distributed to parties such that a specified number of shares (a quorum) is required to reconstruct the secret. This can be used as an escrow for Selective Disclosure. A disadvantage is that reconstructing the original secret requires entrusting one party to assemble the quorum.
Secure Multi-Party Computation (MPC): This technique allows multiple parties to jointly compute a function without revealing their inputs to each other. This is another method of escrow, one that solves the problems inherent to secret sharing, but at the cost of multiple rounds of interaction between the parties and computational complexity.
Homomorphic Encryption: This technique allows computations to be performed on ciphertext, producing an encrypted result that decrypts to the same output as if the computation had been performed on the plaintext. As a result, computations can be performed on encrypted data without decrypting it first or revealing the actual data. However, these techniques are extremely computationally expensive (by multiple orders of magnitude).
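The quorum idea behind secret sharing can be sketched with Shamir's scheme, where the secret is the constant term of a random polynomial and any t shares recover it via Lagrange interpolation (a toy sketch, not hardened code):

```python
import secrets

P = 2**127 - 1  # a Mersenne prime; all arithmetic is done mod P

def split(secret: int, n: int, t: int):
    """Split `secret` (< P) into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = split(123456789, n=5, t=3)
assert reconstruct(shares[:3]) == 123456789
assert reconstruct(shares[1:4]) == 123456789
```

Any three of the five shares suffice, while fewer than three reveal nothing about the secret; this is the quorum property described above.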

Monday, 30. January 2023

FIDO Alliance

Recap: 2023 Identity, Authentication and the Road Ahead #IDPolicyForum

By: FIDO staff The identity landscape is set to undergo tremendous transformation in 2023 as lawmakers and regulators alike struggle to help protect individual privacy and improve access to services […] The post Recap: 2023 Identity, Authentication and the Road Ahead #IDPolicyForum appeared first on FIDO Alliance.

By: FIDO staff

The identity landscape is set to undergo tremendous transformation in 2023 as lawmakers and regulators alike struggle to help protect individual privacy and improve access to services and the digital economy. A primary underpinning for what will enable the new identity landscape is strong authentication.

On Jan. 25, the Better Identity Coalition, the FIDO Alliance, and the ID Theft Resource Center (ITRC) co-hosted the Identity, Authentication, and the Road Ahead Cybersecurity Policy Forum in Washington, D.C. to discuss the challenges and opportunities of identity and authentication. 

The full-day event included sessions loaded with data on the current state of data breaches, presentations by government leaders, panels on the state of passkeys and the path toward better identity in 2023 and beyond. A key theme that was often repeated throughout the day, by experts from government and industry alike, was the complexity of the identity landscape and the need for more collaboration and interoperable standards.

“A lot of our ability to make progress on the set of problems starts with a bigger issue, the recognition that identity is critical infrastructure and needs to be treated as such,” Jeremy Grant, Managing Director, Technology Business Strategy at Venable LLP and Coordinator, Better Identity Coalition said during his opening remarks for the event.

“Until we start to think about identity that way we’re going to continue to struggle to address challenges in this space.”

Identity risk continues to grow

In the opening keynote session, Jimmy Kirby, Acting Deputy Director of FinCEN (Financial Crimes Enforcement Network) outlined the identity related issues his agency has seen in recent years.

Kirby said that in recent years financial services have been increasingly migrating towards a primarily online environment. It’s a trend that creates new opportunities for abuse. As a result, FinCEN has been thinking about how it can leverage all of the data that financial institutions send to it to help stem the tide of abuse. He noted that identity-related suspicious activity reports (SARs) submitted to FinCEN grew more than 15% from 2021 to 2022.

According to Kirby, reports of threats at each stage of the customer identification process continue to grow from the proofing and enrollment stage to the authentication stage, including the use of compromised credentials, impersonation and artificial intelligence to conduct illicit finance.

While there are challenges, there are also opportunities.

“We see opportunities for digital identity to address customer identification breakdowns in customer onboarding, account logins, transaction monitoring, as well as in investigations,” Kirby said. “There are a number of features of a digital identity framework that, taken together, have the potential to address threats and spur innovation across all types of financial services.”

FinCEN isn’t the only organization seeing a spike in cybercrime. James Lee, COO of the ITRC (Identity Theft Resource Center) presented data from his organization’s annual data breach report. Among the top line highlights of the report is that there were 1,802 data breaches during the year impacting over 422 million victims.

Lee commented that a prevailing trend was an increase in supply chain attacks as a preferred attack vector over just malware. He also emphatically complained about the lack of information present in many data breach disclosures. Lee said that 66% of data breaches did not include information about the root cause of the attack that led to the breach, or any victim details.

In a panel session, titled “Data Breach Notices Suck,” John Breyault, Vice President, Public Policy, Telecommunications and Fraud at National Consumers League (NCL) lamented the current state of password usage, which inevitably is a root cause for many data breaches.

“I have been doing consumer education work for 15 years now at NCL, and not a day goes by it seems that I don’t tell consumers to not use the same password across multiple accounts,” Breyault said.

Towards the U.S. Government plan on secure digital identity

In a lunchtime keynote, Congressman Bill Foster (IL-11), outlined his view on Congressional efforts to introduce a secure digital identity policy for the U.S. 

Foster emphasized time and again during his keynote that secure digital identity needs to be a bipartisan effort in the U.S. Congress as it’s an issue that impacts all Americans. While he noted that there might be some concerns about the U.S. government having a database of user identities that it issues, he argued that to most people, the real life threat to their privacy comes more from having someone impersonate them online.

The lack of secure digital identity may have also been a factor in the massive volume of fraud experienced by the U.S. government over COVID benefits. At the same time, the fact there wasn’t a secure digital identity scheme in place may have made it more difficult than necessary for some to be able to get benefits. Overall, Foster said that he’s hopeful Congress can put something together.

“It can serve as a gentle reminder that the government does some good in your life,” Foster said. “One of the things that we could do a much better job with is preventing identity fraud, because that’s a real life pain for tens of millions of Americans every year.”

Addressing bias and diversity is a requirement of digital identity

In multiple sessions over the course of the event, the topic of fairness, bias and diversity in relation to digital identity was discussed.

Jordan Burris, VP and Head of Public Sector Strategy at Socure, commented that in his view, bias often comes down to the reality that an identity approach is taken that solves for the majority of the population; as such, the minority, or those who operate on the fringes, are left out of the ecosystem.

Andrew Stettner, Deputy Director for Policy at the Office of Unemployment Insurance Modernization at the U.S. Department of Labor argued that his agency and the entire administration are taking equity in identity very seriously.

“We’re looking at equity in a much more conscious way, for us is a very key element of identification going forward,” Stettner said.

Why FIDO is critical for better identity

A critical element of secure identity is having strong authentication.

In a keynote session, Andrew Shikiar, Executive Director and CMO of FIDO Alliance, outlined the ways that FIDO is playing a role in helping to improve the state of identity today across multiple efforts. He also predicted that FIDO will become increasingly relevant in the year ahead.

“The average person on the street will start to understand what identity verification means, and actually start to understand what digital identity means,” Shikiar said. “That’s a net benefit because the more people understand what their identity means, and the importance of it, the more steps they’ll take to actually protect it.”

Among the FIDO efforts to help improve identity outlined by Shikiar are:

Biometric performance criteria. This is a biometric certification program, where FIDO helps to assess the performance of different biometric components that are critical to identity verification.
Remote Identity Verification. This includes the Document Authenticity (DocAuth) Certification for mobile document verification, with ongoing work into face verification for liveness and selfie-match.

Shikiar also talked at length about passkeys, which bring added usability to FIDO-based strong authentication.

“FIDO Alliance’s mission is to reduce the industry reliance on passwords,” Shikiar said. “Simply put, passkeys stand to take passwords out of play for the vast majority of consumer use cases.”

The passkey future for authentication

In a panel session on passkeys, panelists discussed the benefits and opportunities that passkeys will bring.

Tim Cappalli, Identity Standards Architect at Microsoft, detailed what passkeys enable, including the ability to take a FIDO credential and use it in a similar way to how password managers work today. Passkeys can also be synchronized with a cloud provider and are interoperable across platform vendors, enabling better usability overall.

Panelists emphasized that the promise of passkeys is to more easily enable users to benefit from strong authentication. Christiaan Brand, Product Manager, Identity and Security at Google explained that Google has been supporting FIDO for years, including supporting security key based approaches. In his view, passkeys represent the usability necessary to actually make strong authentication with un-phishable credentials a reality for Google’s users.

Usability was also a theme that Paul Grassi, Principal Product Manager – Identity Services at Amazon, emphasized, since in his view past efforts to get strong authentication adopted haven’t been entirely successful.

“It breaks my heart to say it, but consumers are not adopting security keys, they’re not adopting Google Authenticator, they’re not adopting two-factor,” Grassi said. “We’re excited to see passkeys as that replacement, and to see the adoption numbers skyrocket, reducing friction while increasing security, which is, I think, the goal of any security practitioner.”

The recording of the full event is available here.

The post Recap: 2023 Identity, Authentication and the Road Ahead #IDPolicyForum appeared first on FIDO Alliance.


ResofWorld

India’s tech unions see an opening amid a layoff tsunami

Indian tech workers have so far stayed away from unions, as they see them as blue-collar entities; they also fear backlash from employers.
In October 2022, Rahul, an employee at an Indian edtech firm, received an unexpected email from his employer: He was being asked to resign, and the following week would be...

Digital Scotland

AI in Video Games – Insights from Scottish Experts

Keynote experts from Scotland's gaming industry share insights on the evolving role of AI in games. The post AI in Video Games – Insights from Scottish Experts appeared first on digitalscot.net.

Think, Fight, Feel

As the Guardian writes, AI is playing an increasingly important role in games.

From more intelligent characters through dynamically generating complex story-lines to rendering ever more life-like graphics, the technology is infused throughout the gaming experience.

Scotland boasts a world-class depth of expertise that can play a critical role in advancing the industry as a whole.

Scottish AI Alliance

This includes the Scottish AI Alliance, who recently hosted a podcast on the topic.

The Alliance is tasked with the delivery of the actions outlined in Scotland’s AI Strategy in an open, transparent and collaborative way, and the group provides a focus for dialogue, collaboration and, above all, action on all things AI in Scotland.

In this episode of their podcast they caught up with Gregor Hofer, CEO Speech Graphics, and Matthew Jack, CEO Kythera AI, to find out how they are using AI to improve video game experiences.

Chris van der Kuyl – How is AI used within video games?

Chris van der Kuyl, Chairman of 4J Studios, is a seminal figure in Scotland’s gaming industry, and on this RSE webinar he explores in depth the topic of AI within video games.

At 00:12, Kuyl introduces the wide definition of AI.

He says game designers typically ‘cheat’, because using perfect AI technology would consume all the processing capacity and the game would typically slow down, especially if it’s a massively multiplayer game. Typically game developers use an unsophisticated side of AI which applies basic rules as to what one should do in certain circumstances.

There is also a use of much more sophisticated AI, such as providing the intelligence for a computer controlled character for the player to fight against, so that it feels as much as possible like playing with a real human being.

What are some examples of outstanding use of AI within games?

At 01:45, Kuyl says that the ultimate accolade is when a player can’t tell whether it’s AI software on the other end or a real player, a passing of the Turing Test so to speak.

When you look for specific examples, some of the work being done recently on very sophisticated AI is exemplified in games like chess and Go, where machines are remarkably good and are defeating the world’s best players. (An evolution documented in the AlphaGo movie.)

Companies in the UK are also developing AI algorithms that keep getting better: you don’t tell the machine how to play but instead effectively tell it the rules. Because of the fast speed and processing of these algorithms, within hours and sometimes minutes they turn into the world’s best players without a human telling them anything. Board games with fixed rules are where AI has been executed best so far.

How has the gaming industry pushed AI advancements within other industries?

From 04:32, Kuyl says it’s difficult to state that games alone are advancing AI innovations; instead it is simultaneous progress in parallel across multiple industries and open source communities.

However gaming and the gaming industry is an ideal environment for trying out new algorithms and feeding knowledge back into the communities. In particular Chris believes that what game development is best at is utilizing limited resources; as he mentioned, developers ‘cheat’, referring to their approach of finding the optimal model for achieving game play functionality, plus the huge amounts of user testing gaming delivers.

Thus they provide the best forum for making use of algorithms in the real world and feed that knowledge back into the academic community.

Is it possible for AI engines to adapt themselves using this player feedback?

At 06:43, Kuyl mentions how all good game developers put metrics into games to see data regarding how people actually play the game, where the hotspots are and where improvements can be made over time. However a fine balance has to be struck between the improvements and the game experience, because you can’t make changes so significant that players can no longer beat levels they could easily beat earlier.

Have hardware advancements allowed developers to push the boundaries of AI?

At 08:53, Kuyl talks about how the integration of such sophisticated AI has been made possible by the overall improvement in technology. He talks specifically about GPUs (Graphical Processing Units), which companies like Nvidia and AMD make, that may sound very expensive to people but give the user almost inconceivable processing power. Twenty or thirty years ago this was a power only available in Cray supercomputers.

He also talks about how, in the automotive industry, people have started using autonomous AI algorithms to drive cars. He also goes on to talk about cloud-based AI in massively multiplayer games which, from a player’s perspective, is a player like any other, but is mainly there to gather feedback. Some of that can also be challenging because they have to be careful not to let the user experience be affected.

What are your views on the ‘Paper Clip Theory’?

At 13:29, he says that everyone has to realize that any computer program and any instructed machine will operate as humans have made it to. However with AI, there’s no doubt that you have to be incredibly careful. He goes on to talk about the different schools of thought: the followers of one are extremely scared about the future of AI as it gets stronger, while the others are very optimistic.

He notes that robots are never meant to harm the user, so the same principle should be kept in mind while building systems with AI, to rule out any negative implications. He says the optimization of anything without thinking of the consequences around it can naturally be harmful.

(The “paperclip maximizer” is a thought experiment described by Swedish philosopher Nick Bostrom in 2003. It illustrates the existential risk that an artificial general intelligence may pose to human beings when programmed to pursue even seemingly-harmless goals, and the necessity of incorporating machine ethics into artificial intelligence design.)

Do you think improved AI will hurt future developers or become another tool?

At 20:16, he says that the availability of solutions has really made a significant difference. He doesn’t think reinventing the wheel is necessary, so a lot of recent developments are optimized and shared, making life easier. He also believes that these developments should be democratized to avoid the alchemist paradox, where only one person has all the knowledge and so naturally becomes more valuable.

He references the inventor of Minecraft (Markus Persson) as a great example. He didn’t overly concern himself with it being the best technology but rather that it delivered the best gaming experience for players, and then as the venture grew organizations like Chris’ 4J Studios could worry about the detail of platform optimizations. This is a guiding principle for the type of team members Chris is looking for.

The post AI in Video Games – Insights from Scottish Experts appeared first on digitalscot.net.


Playing to Win: A Digital Nation Action Plan for Scotland’s Gaming Sector

Ambition, ideas and action plan to expand Scotland's gaming industry to an even larger scale of success. The post Playing to Win: A Digital Nation Action Plan for Scotland’s Gaming Sector appeared first on digitalscot.net.

This article begins an Industry Innovation Roadmap for Scotland’s Gaming sector.

Industry Innovation Roadmaps provide a vehicle for a sector to collaboratively develop a shared growth strategy, encompassing market research, product innovation and development of new routes to market.

Career Opportunities

In this STV News special they highlight just one dimension of the sector, sharing the story of Jade MacIntyre who is developing a career as a game streamer.

Jade broadcasts to more than 20k followers of her Facebook live streams playing Call of Duty.

It has proven more lucrative than the law degree she began and is personally much more rewarding for her too. Jade was urged into it by friends and was initially reluctant but decided to give it a go and her success brought sponsors and advertising revenues.

So this gives a small taste of the fact that gaming offers many different facets and career opportunities, not just game programming.

Scotland’s Gaming Industry

From 3m:15s in the video STV move on to describing the gaming industry for Scotland. In the UK it’s now a sector worth £7 billion. Dundee boasts more than 40 companies with thousands of employees, and has plans to open a 4,000 seat capacity Esports arena.

They interview the team at Dundee and Angus College to highlight how the sector acts as a magnet and enabler for students to enter the industry. One very interesting career journey they share is that of Lucas Blakeley, who began in motor racing but had to give it up due to costs, then found success as a virtual F1 Esports racer; that success enabled him to return to real-world racing.

Scottish Enterprise provides industry support for Scotland’s digital sectors, and here provides an overview of the cluster.

The Scottish Games Network reports that Scotland’s industry grew 26% between April 2020 and December 2021, keeping Scotland’s position as the UK’s fourth largest games cluster, following London, the South East and the North West. There are over 2,000 creative staff working on games development in 147 companies, generating over £350 million for Scotland’s economy.

Playing to Win

To the point of our innovation roadmap, there are also challenges and much more room for growth; indeed, given the success of the sector thus far and the ever-expanding global market, further growth is a strategic priority for our whole economy.

The founder of the Scottish Games Network, Brian Baglow, talks with the Herald’s Neil Mackay in this interview, providing a detailed summary of just how successful the sector has been; yet coverage is very low-key, and Scotland should be doing far more to shout about it.

The cost of living crisis is forcing a slowdown in recruitment, and last year the Courier wrote that Scotland risks being left behind.

The article cites the report ‘Playing to Win‘ published by Our Scottish Future, which warns Scotland’s gaming sector now risks being left behind as American, Chinese and Japanese giants dominate the market, and proposes the development of a UK-wide network to leverage economies of scale.

“The National Strategy for Economic Transformation launched earlier this year in Dundee. It does not grasp the potential for greater cooperation with the rest of the UK.

Nor has Westminster woken up to the value of combining Scottish expertise with industry across the country.”

The report offers a first blueprint that can act as an Action Plan for Scotland’s gaming sector, with a number of recommendations to draw out and act on.

Collaboration Network – Building Scotland’s Games Ecosystem

The central theme of building a collaboration network is one that can be pursued with immediate impact.

As Brian Baglow comments to Digit he already leads such work to great effect, and academics at Stirling and Glasgow Universities teamed up on research to help create more successful companies that can compete on a global basis, producing this report.

This research offers another knowledge asset to build an action plan around, providing a number of critical insights that can be acted on:

Companies are geographically scattered – no games hub.
Highly competitive sector.
Difficulty connecting with potential clients, especially internationally.
Difficulty recruiting talent: a skills gap and an access issue, locally and internationally.
Little collaboration between academia and games.
Lack of funding for joint projects.
Weak entrepreneurial culture and mindset – game makers do not identify as entrepreneurs.
Weak networks among the game maker community – weak networks between game makers and other key actors.
The sector is not well understood, which undermines the quality of support – key support needs around the commercial side of running a company.
Lack of role models to inspire future generations.
Lack of mentors to provide guidance.
There is nobody leading the industry (!)

Clearly Scotland’s opportunity is simply one of execution: These are all challenges easily addressed through the resources we have to hand.

The global market is vast in size while our small country punches far above its weight in both legacy and sector capability, meaning it’s not a challenge of building from scratch but rather of more and better joining of dots to yield a further boost for an already supercharged industry.

The report offers the headline theme and objective of the action plan: Building Scotland’s Games Ecosystem.

The post Playing to Win: A Digital Nation Action Plan for Scotland’s Gaming Sector appeared first on digitalscot.net.

Sunday, 29. January 2023

Digital Scotland

Industry Innovation Roadmaps

Industry Innovation Roadmaps provide a vehicle for a sector to collaboratively develop a shared growth strategy. The post Industry Innovation Roadmaps appeared first on digitalscot.net.

A headline feature of our Digital Nation Action Plan is the development of ‘Industry Innovation Roadmaps’. Industry Innovation Roadmaps provide a vehicle for a sector to collaboratively develop a shared growth strategy, encompassing market research, product innovation and development of new routes to market.

Roadmaps collate market research, identify niche opportunities and co-ordinate a shared innovation pipeline for members to co-develop products to meet those opportunities.

Global Best Practices

The approach is based on best practices from around the world, notably Canada.

Canada has made extensive use of Michael Porter’s cluster model to underpin their economic development strategies, a focus on building industry sector collaborations that pool resources and encourage shared innovation that grows the success for all the participating members.

The backbone of these efforts is the creation of ‘Technology Roadmaps’ (TRMs): a common product innovation roadmap and business plan for the whole industry that each business contributes to and can benefit from.

This guide introduces Technology Roadmaps and explains in detail how to use them, and the Canadian Government further explains them here, providing a list of completed roadmap programs for industries such as electric vehicles, intelligent buildings and smart grids among others, where they describe:

“Technology roadmapping brings players together to work together in a far-reaching planning process and opens the door to collaborative research and development (R&D).

Technology Roadmaps (TRM) can play a key role in enhancing innovation. It is a document outlining future market demand and the recommended means to meet this demand. A roadmap does not predict future breakthroughs in science or technology; rather, it forecasts and articulates the elements required to address future technological needs. A roadmap describes a given future, based on the shared vision of the people developing the roadmap and provides a framework for making that future happen technologically.”

A whole book is available that explores their usage in academic detail.

Micro Clusters

Clusters have long been a staple of government’s Economic Development strategies but are often organized by government in a top down fashion around large, broad industries like aerospace, and typically feature mostly large enterprises.

For smaller countries like Scotland, with mostly SMEs, this can be a limited approach, and so Digital Scotland is pioneering an approach of ‘Micro Clusters’: a bottom-up formation of industry groups around very specialized, small but very high-growth niche segments.

The post Industry Innovation Roadmaps appeared first on digitalscot.net.

Friday, 27. January 2023

ResofWorld

Why is the Modi documentary so hard to find? Some blame lies with the BBC

Copyright claims by the BBC are making a bad situation worse.
For just over a week, India’s Ministry of Information and Broadcasting has demanded online platforms remove links to a BBC documentary on Prime Minister Narendra Modi, which explores his role...

FIDO Alliance

GlobeNewswire: NordPass will store passkeys and offer passwordless authentication

LONDON, Jan. 26, 2023 (GLOBE NEWSWIRE) — On Thursday, NordPass, the password management company, announced its plans to provide passwordless online authentication solutions to its users in the upcoming months. With an increasing […] The post GlobeNewswire: NordPass will store passkeys and offer passwordless authentication appeared first on FIDO Alliance.

LONDON, Jan. 26, 2023 (GLOBE NEWSWIRE) — On Thursday, NordPass, the password management company, announced its plans to provide passwordless online authentication solutions to its users in the upcoming months. With an increasing number of websites supporting passkey technology, the company’s customers will soon be able to keep their passkeys in NordPass and access them with biometrics only.

The post GlobeNewswire: NordPass will store passkeys and offer passwordless authentication appeared first on FIDO Alliance.


Security Boulevard: Are you using a FIDO Certified authenticator?

Multi-factor authentication (MFA) gets touted as a significant security improvement over traditional “username + password” authentication. However, not all MFA processes are created equal. As the opportunities narrow for cybercriminals to […] The post Security Boulevard: Are you using a FIDO Certified authenticator? appeared first on FIDO Alliance.

Multi-factor authentication (MFA) gets touted as a significant security improvement over traditional “username + password” authentication. However, not all MFA processes are created equal. As the opportunities narrow for cybercriminals to pick off the low-hanging fruit of password-only systems, they’ve turned their focus to weak MFA.

A growing number of organizations have suffered security breaches despite having MFA in place, thanks to expanding digital systems, more advanced phishing tools, and the continued allowance of passwords as an authentication factor. The past year, which saw Microsoft, Uber and Cisco breached by MFA “prompt bombing,” demonstrates that organizations can’t just deploy any type of MFA and presume they’re safe from breaches.

For these reasons, the federal Office of Management and Budget (OMB) and the Cyber and Infrastructure Security Agency (CISA) have emphasized the need for phishing-resistant MFA, specifically passwordless MFA built around FIDO standards. We’ve examined FIDO standards and what they mean for authentication before, but in this post, we look at one of the most critical elements of the process: FIDO Certified authenticators.

The post Security Boulevard: Are you using a FIDO Certified authenticator? appeared first on FIDO Alliance.


Next Inpact: Passwordless Authentication: BitWarden Acquires Passwordless.dev


Even if the transition will take years, sooner or later the future will be passwordless. Many companies are investing in this transition, including large ones. Last year saw the announcement of passkeys, led by the Apple-Google-Microsoft triad, under the aegis of the FIDO Alliance.

The post Next Inpact: <a href="https://www.nextinpact.com/lebrief/70882/authentification-sans-mot-passe-bitwarden-rachete-passwordless-dev">Passwordless Authentication: BitWarden Acquires Passwordless.dev</a> appeared first on FIDO Alliance.


CSO: How passkeys are changing authentication


Well-implemented passkeys can improve the user experience and make it harder for cybercriminals to launch phishing and other attacks.

Passwords are a central aspect of security infrastructure and practice, but they are also a principal weakness, involved in 81% of all hacking breaches. Inherent usability problems make passwords difficult for users to manage safely. These security and usability shortcomings have driven the search for alternative approaches, known generally as passwordless authentication.

Passkeys are a kind of passwordless authentication that is seeing increasing focus and adoption. They are set to become a key part of security in the coming years. Passkeys represent a more secure foundation for enterprise security. Although they are not foolproof (they can be synced to a device running an insecure OS, for example), they are far more secure than passwords for customers, employees, and partners alike.
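The phishing resistance described above comes from origin binding: a passkey is scoped to the relying party it was registered with, so a look-alike domain can never request it. The following is a deliberately simplified Python model of that check — it is not the real WebAuthn/CTAP data model, and the domain and credential names are hypothetical:

```python
from typing import Optional

# Simplified model of passkey phishing resistance: each credential is
# bound to the relying-party ID (domain) it was registered for, and
# the client only offers it when the requesting origin matches.
credentials = {}  # rp_id -> credential id


def register(rp_id: str, cred_id: str) -> None:
    """Store a credential scoped to a single relying party."""
    credentials[rp_id] = cred_id


def authenticate(requesting_origin: str) -> Optional[str]:
    # The browser derives the RP ID from the actual page origin, so a
    # look-alike phishing domain never maps to the stored credential.
    return credentials.get(requesting_origin)


register("example.com", "cred-123")

assert authenticate("example.com") == "cred-123"  # legitimate site
assert authenticate("examp1e.com") is None        # phishing domain fails
```

Unlike a password, there is nothing for the user to type into the wrong site: the lookup simply fails when the origin does not match.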

The post CSO: How passkeys are changing authentication appeared first on FIDO Alliance.


XDA Developers: Here’s how FIDO security keys will keep your Apple ID secure


FIDO security keys are an additional layer of protection for your Apple ID, and here’s how they keep it secure. The most recent update for iOS, iPadOS, and macOS brought a number of improvements and other changes, but one that stands out is the introduction of support for FIDO Certified security keys for Apple IDs. Apple has since detailed how these keys work with your Apple ID, and how you can use them to secure your account.

The post XDA Developers: Here’s how FIDO security keys will keep your Apple ID secure appeared first on FIDO Alliance.


Payments Journal: 2023 Predictions: Authentication, Digital Identity, and In-Car Payments


As the number of devices and connected services rises, our lives are becoming increasingly digitized. Keeping up with this evolving landscape is vital, and 2023 promises to bring with it a host of new use cases and innovations. New technologies are coming to market that provide a greatly enhanced user experience without compromising on security. Innovative solutions such as SoftPOS are challenging traditional payment methods, while account-to-account (A2A) payments have the potential to shake up the entire payments ecosystem.

We explore some of the key trends in the ecosystem that will have a major impact on the way we live in 2023. From the changing nature of authentication to paying with your car, the ever-digitizing world will continue to transform our lives.

The post Payments Journal: 2023 Predictions: Authentication, Digital Identity, and In-Car Payments appeared first on FIDO Alliance.


Silicon: Authentication: “Apple ID + FIDO key” option enabled


Using a FIDO key as a second authentication factor on an Apple account is now possible, under certain conditions.

The post Silicon: Authentication: “Apple ID + FIDO key” option enabled appeared first on FIDO Alliance.


LionsGate Digital

Heroes of Data Privacy – the practice-oriented data privacy conference in Vienna


Following the success of the 1st edition of Heroes of Data Privacy, the conference will now be even bigger and in-person. Save the date: May 24 and 25, 2023!

VIENNA, AUSTRIA, January 26, 2023 /EINPresswire.com/ — At the dawn of the new era of data privacy, organisations of all sizes and sectors face the challenges posed by new regulations and privacy-enhancing technologies. Actionable insights and knowledge on how to overcome these challenges are much harder to come by.

Heroes of Data Privacy is the European conference for data professionals, legal experts, marketers and technologists that addresses the need for practice-oriented knowledge sharing.

On the fifth anniversary of the implementation of the General Data Protection Regulation (GDPR), Heroes of Data Privacy will bring together the brightest minds in data privacy and data quality in Vienna, from regulators to industry professionals, from marketing leaders to legal experts. In 20+ talks, interviews, panels and breakout sessions, they will discuss current and future trends in data privacy – and share valuable insights on how organisations can overcome the challenges and gain new opportunities.

Confirmed speakers include Ann Cavoukian, the Canadian grande dame of data protection and the inventor of the Privacy by Design concept. Lou Montulli, the inventor of the HTTP cookie, will talk about the origins of his innovation that changed the Internet forever. Also confirmed is lawyer and author Thomas Höppner, who has successfully led several Big Data abuse cases against Apple, Google and Amazon.

In addition, Carolin Loy (Regierungsrätin, Bayerisches Landesamt), Rainer Knyrim (Partner and Founder, Knyrim Trieb Rechtsanwälte), Gregor König (Group Data Protection Officer, Erste Group), Tom Peruzzi (CTO, Virtual Minds), Martin Possekel (Managing Partner, Future Marketing), Stefan Santer (Enterprise Sales Director, Didomi), Matthias Schmidl (Deputy Head of the Austrian Data Protection Authority), Alexandra Vetrovsky-Brychta (President DMVÖ) – and many more – will share experiences and insights.

In more than 20 presentations, interviews, discussion panels and breakout sessions, Heroes of Data Privacy speakers will provide information on current and future trends in data protection. Attendees can expect a wide range of valuable insights on how companies are overcoming data privacy challenges and seizing new opportunities for themselves.

For the very first time, Heroes of Data Privacy will take place over two days as an on-site event. The venue at Vienna’s Marx Palast invites attendees to network with peers and experts from a wide range of industries. Heroes of Data Privacy not only pampers its guests with excellent catering during the program breaks, but also offers a cookie tasting as a culinary highlight. These thematically appropriate sweets are created exclusively for the event by selected Viennese confectioners. For holders of the limited VIP tickets, the program will continue until late evening on the first day.

Save the date: May 24 and 25, 2023! Early Bird tickets are only available until February 15.
https://www.heroesofdataprivacy.com
https://www.youtube.com/watch?v=FtOWLACEkv8

Christine Heeger
JENTIS GmbH
contact@heroesofdataprivacy.com

The post Heroes of Data Privacy – the practice-oriented data privacy conference in Vienna appeared first on Lions Gate Digital.

Thursday, 26. January 2023

Digital Scotland

Global Gaming Market: Industry Trends and Segment Opportunities

A brief overview of the vast worldwide market opportunities presented by the global gaming sector.

The Gaming Industry Innovation Roadmap will document the many different segment opportunities that make up the overall sector. The global gaming industry will be worth $321 billion by 2026, according to PwC’s Global Entertainment and Media Outlook 2022-26.

ExplodingTopics provides a list of the individual driving trends shaping the growth of the industry:

Roblox Helps Scale Indie Gaming – Large-scale platforms like Roblox make it easier for independent studios to create and distribute games.

Increased Diversity In Games – Games like Animal Crossing have helped bring more women into the console, PC and esports gaming world.

New PC Gaming Platforms Challenge Steam – The PC gaming industry is worth approximately $35 billion, and its dominance by Steam is being challenged by new entrants.

Cloud Gaming Services Grow And Expand – Online streaming of games lets you browse thousands of titles and purchase a game instantly without leaving your house. The sector has grown by 400% over the last 5 years.

Next Gen Consoles Battle It Out – The console giants Xbox Series X, PlayStation 5 and Nintendo Switch will largely determine the biggest gaming industry trends for the next few years.

More Remakes And Reboots – Like Hollywood, the gaming industry is starting to realize that remakes and reboots sell well. And, unlike a completely new franchise, they’re less likely to bomb.

Early Access Changes The Game Development Process – Developers use Early Access as a way to get feedback from actual players; feedback that can ultimately change the direction of a game.

Cloud Gaming: Micro-cluster opportunities

The critical point about the roadmap and industry segments is that there is both opportunity for games and also for all the related infrastructure components and the innovations that enhance them.

‘Cloud Gaming’ is a great example of a segment offering both ‘Gaming as a Service’ and technology infrastructure markets. ShadowTech explains what Cloud Gaming is, and Facebook and Google provide guides to the underlying technologies that power it. The global cloud gaming market accounted for $691.6 million in 2021 and is anticipated to expand at a compound annual growth rate of 45.8% from 2022 to 2030.

Identifying and focusing on these specialized niches, rather than trying to address the whole gaming market, is a great way for Scotland to build innovation programs that yield practical, high value success.

For example, Cloud Gaming offers a real-world Scottish success story that highlights the commercial opportunity: Cloudgine, an Edinburgh-based startup, was acquired by Epic in 2018.

As the news coverage highlights, Epic acquired them for the ability to let its Unreal Engine span out and harness cloud resources to expand its processing capacity.

“Cloudgine’s tech uses cloud servers to enable console, PC, and virtual reality games to render content and interactive objects without worrying about the platform. This is a concept that has shown up in a handful of games, like Titanfall and Forza Motorsport 7. In those online shooters and racing games, the developers offload artificial intelligence routines to the cloud.”

The gaming sector is a major driver of technology advances, with the Blockchain and the Metaverse opening up yet more vast universes of new opportunity. Scotland, with its small size but relatively large game development capability, can identify and carve out hyper-scale niche specialisms that cultivate multiple commercial ventures, each with considerable revenue potential.

The post Global Gaming Market: Industry Trends and Segment Opportunities appeared first on digitalscot.net.


Oasis Open Projects

NIEMOpen Initiative for Information Exchange and CSAF Cybersecurity Standard Win OASIS Open Cup Awards


Andrea Caccia, Jason Keirstead, and Vasileios Mavroeidis Named Distinguished Contributors

26 January 2023 — OASIS Open, the international open source and standards consortium, announced the winners of the 2022 Open Cup, which recognizes exceptional advancements within the OASIS technical community. The Open Cup for Outstanding New Initiative was awarded to NIEMOpen, a framework for sharing critical data in justice, public safety, emergency management, intelligence, and security sectors. The Open Cup for Outstanding Approved Standard was awarded to Common Security Advisory Framework (CSAF) v2.0, a widely used open standard for automated security advisories and vulnerability reporting. Also announced were the 2022 OASIS Distinguished Contributors, individuals recognized for their significant impact on the open source and open standards communities: Andrea Caccia, Jason Keirstead, and Vasileios Mavroeidis.

Open Cup Recipients

The 2022 Outstanding New Initiative, NIEMOpen, transitioned to an OASIS Open Project from the U.S. Department of Defense in October. A collaborative partnership between private industry and all levels of governmental agencies, the NIEM framework enables the effective and efficient sharing of critical data as currently demonstrated in the justice, public safety, emergency and disaster management, intelligence, and homeland security sectors. Developing and implementing NIEM-based exchanges allows diverse organizations to leverage existing investments in information systems by building the bridges for interoperability at the data level. NIEMOpen was chosen as the winner in the Outstanding New Initiative category that included finalists Infrastructure Data-Plane Function (IDPF) TC and the Value Stream Management Interoperability (VSMI) TC.

CSAF v2.0, the Outstanding Approved Standard Open Cup recipient, makes it possible for cyber defenders to quickly and automatically assess the impact of vulnerabilities and respond in an automated way. This version of CSAF includes support for the Vulnerability Exploitability Exchange (VEX) profile, which is especially helpful in efficiently consuming software bills of materials (SBOM) data, part of the recent U.S. Executive Order on Improving the Nation’s Cybersecurity. 
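To make the paragraph above concrete, here is a minimal sketch of what a CSAF v2.0 advisory using the VEX profile can look like. The field names follow the published CSAF 2.0 schema, but the publisher, product, tracking ID, and CVE below are hypothetical examples, and a real advisory carries more required metadata:

```python
import json

# Minimal sketch of a CSAF v2.0 advisory using the VEX profile.
# Field names follow the CSAF 2.0 schema; all identifiers, product
# names, and the CVE number are hypothetical examples.
advisory = {
    "document": {
        "category": "csaf_vex",
        "csaf_version": "2.0",
        "title": "Example VEX statement",
        "publisher": {
            "category": "vendor",
            "name": "Example Corp",
            "namespace": "https://example.com",
        },
        "tracking": {
            "id": "EXAMPLE-2023-0001",
            "status": "final",
            "version": "1",
            "initial_release_date": "2023-01-26T00:00:00Z",
            "current_release_date": "2023-01-26T00:00:00Z",
        },
    },
    "product_tree": {
        "full_product_names": [
            {"name": "Example Product 1.0", "product_id": "PROD-1"}
        ]
    },
    "vulnerabilities": [
        {
            "cve": "CVE-2023-00000",
            # VEX: state which products are (not) affected so that
            # downstream tooling can triage automatically.
            "product_status": {"known_not_affected": ["PROD-1"]},
        }
    ],
}

print(json.dumps(advisory, indent=2))
```

Because the status is machine-readable, a consumer holding an SBOM can match its component list against `product_status` without reading prose advisories.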

CSAF v2.0 was chosen from a group of finalists that included:

Architecture Management v3.0 & OSLC Quality Management v2.1
OSLC Lifecycle Integration for Project Management of Contracted Delivery v1.0
Security Algorithms and Methods Threshold Sharing Schemes v1.0
Secure QR Code Authentication v1.0

Distinguished Contributors

Each year, the Distinguished Contributor designation is awarded to OASIS members who have made significant contributions to the advancement of open standards and/or open source projects. This year’s honorees hail from Italy, Canada, and Norway, and exemplify the global commitment and collaborative spirit that is indicative of OASIS members.

Andrea Caccia is an independent consultant and project manager with extensive experience in standard and regulation compliance, electronic invoicing and archiving, data preservation, e-signatures, trust services, blockchain and DLT. Caccia participates in numerous European standardization groups and activities at the European Telecommunications Standards Institute (ETSI), the European Committee for Standardization (CEN), and the International Organization for Standardization (ISO). At OASIS, Caccia is Chair of the Code List Representation TC and is an active member of the Security Algorithms and Methods (SAM), ebXML Messaging Services, and the Universal Business Language (UBL) TCs.

“I am very grateful and honored for this unexpected award from OASIS, where I have always found outstanding and supportive colleagues. I am also very grateful to the OASIS staff, always ready to facilitate our work,” said Caccia.

Jason Keirstead is an IBM Distinguished Engineer and CTO of Threat Management at IBM Security. He has been involved in open technology for decades, making significant contributions to and serving as maintainer of several major open source projects. Keirstead has served on the OASIS Board of Directors since 2018 and currently serves as Co-Chair of the Open Cybersecurity Alliance (OCA), where he enjoys helping to define cybersecurity interoperability. A longtime OASIS member, Keirstead is actively involved in numerous Board Committees and Subcommittees, as well as the Cyber Threat Intelligence (CTI), CSAF, and Collaborative Automated Course of Action Operations (CACAO) for Cyber Security TCs.

“I am both proud and humbled to accept this award. Openness and interoperable standards are what created the internet as we know it, as well as the foundations for all the critical technologies we rely on every day,” said Jason Keirstead. “As technologists, it is important that we continue to build upon that tradition of technology openness and thoughtful collaboration, for the greater good of society – and I feel privileged to have been able to help in those efforts.”

Vasileios Mavroeidis, PhD, a scientist and professor of cybersecurity at the University of Oslo, specializes in the domains of automation and cyber threat intelligence representation, reasoning, and sharing. Mavroeidis is actively involved in European and national research and innovation projects that enhance the cybersecurity capacity of EU authorities and operators of essential services. Since 2021, he has been an appointed member of the European Union Agency for Cybersecurity (ENISA) ad hoc working group on Cyber Threat Landscapes and the Cybersecurity Playbooks task force. Mavroeidis is focused on cybersecurity standardization efforts and has extensive involvement in OASIS. He is currently serving as Chair of the Threat Actor Context (TAC) TC and Secretary of the CACAO TC. In addition, he is engaged in the CTI and the Open Command and Control (OpenC2) TCs.

Mavroeidis said, “First and foremost, I want to thank OASIS for naming me a Distinguished Contributor. It is an award I welcome. I’m a great believer in the value of OASIS standardization activities and their role in enhancing and supporting the European Union’s cybersecurity capacity. My involvement in OASIS has been a rewarding journey, and I look forward to further contributing to the advancement of cybersecurity standardization.”

OASIS congratulates this year’s winners and nominees and thanks them for their willingness to share their time and expertise to help advance OASIS’ work.

About OASIS Open

One of the most respected, nonprofit open source and open standards bodies in the world, OASIS advances the fair, transparent development of open source software and standards through the power of global collaboration and community. OASIS is the home for worldwide standards in cybersecurity, blockchain, privacy, IoT, AI, cryptography, cloud computing, emergency management, and other technologies. Many OASIS standards go on to be ratified by de jure bodies and referenced in international policies and government procurement. 

Media inquiries: communications@oasis-open.org

The post NIEMOpen Initiative for Information Exchange and CSAF Cybersecurity Standard Win OASIS Open Cup Awards appeared first on OASIS Open.


ResofWorld

U.S. and European markets are “very hard to please”: VinFast’s CEO on its global ambitions

Lê Thị Thu Thủy on why Vietnam's Vingroup went from making instant noodles to EVs.
Founded in 2017, Vietnam-based automaker VinFast pivoted to exclusively making electric vehicles just five years after launch. In 2022, it began exporting to the U.S. market, announcing it would set...

Origin Trail

Trusted and extensible product master data hub based on the OriginTrail Decentralized Knowledge…

Trusted and extensible product master data hub based on the OriginTrail Decentralized Knowledge Graph and GS1 standards

Product master data — the foundation for supply chain visibility

Product master data is important in supply chain networks because it provides a consistent, accurate, and up-to-date set of information about the products that are being bought, sold, and moved through the supply chain. This data can be used by various stakeholders in the supply chain, such as manufacturers, wholesalers, retailers, logistics providers, and consumers, to make informed decisions, optimize operations, and improve efficiency.

Having access to reliable product master data also sets the foundation for business partners to generate more advanced insights into their supply chains, including product supply chain visibility, authenticity verification, utilization reporting, and other insights that enable more sustainable supply chain management. For example, product master data, combined with LOT master data and supply chain event data, made interoperable using GS1 standards, is essential to getting a comprehensive end-to-end understanding of what transpired to products as they flowed through the supply chain towards the consumer — and make the necessary business decisions to address any identified issues.

There are several potential challenges that manufacturers face when it comes to sharing product master data with their business partners:

Data quality — product master data shared with business partners needs to be accurate and up-to-date to avoid misunderstandings and making decisions based on outdated or wrong information. Maintaining data quality is therefore one of the key things for manufacturers to consider when setting up product master data-sharing mechanisms.

Data standardization — manufacturers share product master data with numerous business partners, so they need to make sure it is standardized and not in different formats for every business partner. Standardization significantly reduces the effort associated with sharing product master data and ensures everyone understands it in the same way. This is where GS1 standards, specifically the GS1 Global Data Model (GDM) and GS1 Attribute Definitions for Business (ADB), play a crucial role.

Data security — product master data may contain sensitive information about a manufacturer’s product and business operations. Sharing this data with business partners may increase the risk of data breaches or unauthorized access to the data, so manufacturers need to make sure that there is sufficient data security in place.

Data ownership — sharing product master data is often done through third parties that aggregate master data from multiple manufacturers and enable access to their business partners. In these situations, manufacturers might have considerations about handing over the ownership of their data, as this data is inherently valuable.

Some existing frameworks for product master data exchange, such as the GS1 GDSN Master Data Pools where GS1-certified third parties provide infrastructure for sharing product master data between manufacturers and their business partners, help address the first three challenges. However, the data ownership issue still persists, as these third parties aggregate product master data from various manufacturers in centralized systems, making them the de-facto data owners. Additionally, product master data in such systems is relatively isolated from and can’t be linked to other relevant supply chain data, such as LOT master data, supply chain events, IoT data, and others. This prevents manufacturers and their business partners from utilizing product master data to its full potential.

Trusted sharing of product master data with the OriginTrail Decentralized Knowledge Graph and GS1 standards

The OriginTrail Decentralized Knowledge Graph (DKG) is an open-source infrastructure blending two powerful technologies, knowledge graphs and blockchains, to facilitate the trusted exchange of supply chain data. This is done by enabling businesses to transform their data into discoverable, verifiable, and interoperable knowledge assets. In the context of supply chains, a knowledge asset can be a variety of things, for example, a supply chain event, business location, trade document, or in this case product master data.

Turning their product master data into knowledge assets on the OriginTrail DKG, manufacturers essentially create an index of that data that their business partners can search to retrieve the required product information. This innovative and trusted way of exchanging product master data delivers value from multiple angles:

Even though product master data is indexed on the OriginTrail DKG, the actual data can be kept on the manufacturer’s own IT real estate, allowing them to retain full data ownership and simply authorize their business partners to access it. When it comes to data privacy, the OriginTrail DKG allows for a full spectrum of options.

Product master data on the OriginTrail DKG is trusted and verifiable, meaning business partners can be absolutely certain that the data they need was issued by the product manufacturer.

Created knowledge assets are interoperable based on the GS1 Global Data Model (GDM) standard, the same as in a traditional GDSN network, so they can be searched for and understood by all business partners.

Knowledge assets on the OriginTrail DKG can be easily updated at any time, ensuring that the latest product master data is available to business partners at all times.

All of the characteristics above provide a powerful way to manage product master data, but this is only setting the foundation for a variety of other advanced business applications that can make use of it. As product master data in the form of knowledge assets on the OriginTrail DKG doesn’t live in isolation, it can be automatically linked to other relevant supply chain data across business partners. For example, it can be combined with instance/lot master data (GS1 CBV standard), relevant business locations (GS1 GLN Data Model standard), supply chain and consumer/end-user generated events (GS1 EPCIS standard), and other relevant data to provide comprehensive insights into supply chains, ranging from end-to-end product traceability to utilization reports, incident alerts, shipment risk profiling, insurance-related proofs, and many more.
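As a rough illustration of the "discoverable and verifiable" idea, the sketch below packages product master data as a JSON-LD style record with GS1-flavoured attributes and derives a content hash that a business partner could compare against what the manufacturer published. This is illustrative only: it does not show the actual OriginTrail DKG client APIs or the full GDM attribute set, and all values (GTIN, names) are hypothetical:

```python
import hashlib
import json

# Illustrative sketch: product master data as a JSON-LD style record
# (schema.org/GS1-flavoured attributes; all values are hypothetical),
# plus a content hash that lets a partner verify that the data they
# fetched matches what the manufacturer published.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "gtin": "09506000134352",  # hypothetical GS1 GTIN
    "name": "Example Syrup 200ml",
    "manufacturer": {"@type": "Organization", "name": "Example Pharma"},
    "packagingLevel": "EACH",
}


def content_hash(doc: dict) -> str:
    """Canonicalize (sorted keys, no extra whitespace) and hash the record."""
    canonical = json.dumps(doc, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()


# The manufacturer publishes the hash alongside the index entry; the
# data itself can stay on the manufacturer's own infrastructure.
published_hash = content_hash(product)

# A partner re-hashes the data they retrieved and compares:
assert content_hash(product) == published_hash
```

The point of the canonicalization step is that two parties serializing the same record independently arrive at the same hash, so verification does not depend on byte-identical transport.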

Product master data and other supply chain data in the form of knowledge assets on the OriginTrail DKG, utilizing GS1 data standards.

Building such a supply chain data management hub with the OriginTrail DKG and GS1 standards can start off simple, focusing only on product master data, and be extended in stages — it is easy to introduce additional components, i.e. supply chain data, IT systems, and business partners. In fact, to streamline the creation and management of assets on the OriginTrail DKG, Trace Labs developed the Network Operation System (nOS) that makes it easy to connect to IT systems, apply the appropriate GS1 standards to the data, and create discoverable, verifiable and interoperable assets. Once those assets are available, specific business applications that make use of them can be built quickly.

AidTrust: A BSI and Trace Labs solution

Ensuring the transparency and traceability of medicine distribution is important, but it can be challenging. Pharmaceutical supply chains are increasingly complex and often don’t have standardized management controls, leading to issues with security, inventory management, and other aspects of distribution. If product distribution is not properly managed, it can result in suspicious loss, diversion, damaged and wasted products, and uncertainty about whether the medicines reached their intended patients.

AidTrust increases transparency and trust in the distribution of medicines by bringing together BSI’s global presence and supply chain risk management expertise with the OriginTrail DKG, developed by Trace Labs. As a business application built on the OriginTrail DKG, it enables visibility (product flows and utilization, inventory levels, etc.), risk alerts, and real-time decision-making at all stages of the supply chain while protecting the integrity, security, and privacy of data. Ultimately, the goal of AidTrust is to help donor organizations and NGOs demonstrate that donated medicines were handled properly and reached the intended patients, even in challenging environments.

BSI and Trace Labs have been working with the World Federation of Hemophilia (WFH), an NGO distributing donated hemophilia medicine to low- and middle-income countries (LMICs), to deploy AidTrust across 25+ treatment centers in India. The main objective is to better understand product utilization through increased supply chain visibility, particularly after products arrive at the in-country distribution center and are distributed to treatment centers across India. To achieve this, knowledge assets on the OriginTrail DKG are created from existing data locked in IT systems, amended by supply chain data generated by warehouse staff, treatment center staff, and physicians using the AidTrust scanning app:

Product master data (GS1 GDM standard), such as manufacturer, GTIN, product description, and packaging level. This is the foundation that enables all other supply chain data to be linked and understood in a comprehensive manner.

Product LOT master data (GS1 CBV standard), such as LOT numbers and expiry dates.

Supply chain events (GS1 EPCIS standard), such as shipping, receiving, destroying, and dispensing to patients. Some events are generated from data already stored in IT systems, while others are created using data captured by users with the AidTrust scanning app.
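As a rough illustration of what such an event looks like, here is a minimal EPCIS-2.0-style "shipping" ObjectEvent built in Python. The field names follow the GS1 EPCIS 2.0 JSON binding, but the identifiers are placeholder examples rather than real WFH data:

```python
import json
from datetime import datetime, timezone

def make_shipping_event(sgtins, location_gln):
    """Build a minimal EPCIS 2.0-style ObjectEvent for a 'shipping' step.

    Field names follow the GS1 EPCIS 2.0 JSON binding; the identifiers
    used below are illustrative placeholders, not real product data.
    """
    return {
        "type": "ObjectEvent",
        "action": "OBSERVE",
        "bizStep": "shipping",
        "disposition": "in_transit",
        "epcList": sgtins,  # serialized GTINs (EPC URIs) of the shipped units
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "eventTimeZoneOffset": "+00:00",
        "readPoint": {"id": location_gln},  # GLN of the distribution center
    }

event = make_shipping_event(
    ["urn:epc:id:sgtin:0614141.107346.2017"],
    "urn:epc:id:sgln:0614141.00888.0",
)
print(json.dumps(event, indent=2))
```

Events like this, linked to the product and LOT master data above, are what get published as knowledge assets on the DKG.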

These verifiable and interoperable knowledge assets from across the supply chain network power the AidTrust business app, where WFH can log in and get real-time utilization reporting, diversion alerts, inventory overview of treatment centers, and other insights. This enables them to manage their donations much more efficiently and helps ensure products reach their intended patients. After successful deployment in India, BSI and Trace Labs are now in discussion for rollout to further countries.

AidTrust dashboard and scanning application.

👇 More about OriginTrail 👇

OriginTrail is an ecosystem dedicated to making the global economy work sustainably by organizing humanity’s most important knowledge assets. It leverages the open source Decentralized Knowledge Graph that connects the physical world (art, healthcare, fashion, education, supply chains, …) and the digital world (blockchain, smart contracts, Metaverse & NFTs, …) in a single connected reality driving transparency and trust.

Advanced knowledge graph technology currently powers trillion-dollar companies like Google and Facebook. By reshaping it for Web3, the OriginTrail Decentralized Knowledge Graph provides a crucial fabric to link, verify, and value data on both physical and digital assets.

Web | Twitter | Facebook | Telegram | LinkedIn | GitHub | Discord

👇 More about Trace Labs👇

Trace Labs is the core developer of OriginTrail — the open source Decentralized Knowledge Graph. Based on blockchain, OriginTrail connects the physical world and the digital world in a single connected reality by making all different knowledge assets discoverable, verifiable and valuable. Trace Labs’ technology is used by global enterprises in multiple industries, such as pharmaceuticals, international trade, and decentralized applications; for example, companies behind over 40% of US imports, including Walmart, Costco, and Home Depot, exchange security audits with the OriginTrail DKG.

Web | Twitter | Facebook | LinkedIn

Trusted and extensible product master data hub based on the OriginTrail Decentralized Knowledge… was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Digital Scotland

Promote Your Scottish Business With a Podcast

Podcasting offers a highly engaging format for reaching potential customers and enhancing your digital brand. The post Promote Your Scottish Business With a Podcast appeared first on digitalscot.net.

What better way to engage with potential customers than the fastest exploding medium on the Internet? It’s a booming trend with millions listening to a wide spectrum of authors and topics. Ad revenues are expected to reach $2 billion in 2022 and $4 billion by 2024.

So how can you create and promote them? Read on…

Podcast Tools and Services

A quick overview of the podcasting market and tools you can use to create them:

Buzzsprout offers a guide for creating your podcast, and Clipchamp describes how to create a video podcast. Resonic walks through how to set up your podcasting studio.

Tools like Alitu help you easily create high-quality recordings, Acast enables large-scale distribution and advertising sales, Transistor also automates publishing to social networks like Twitter, YouTube and Spotify, and Audiocado enables you to turn your podcasts into videos – Digital Context says video is key to podcasting success.

Integration with Youtube

Co-producing videos and podcasts is a key theme, and YouTube itself is naturally a major distribution market, offering thousands of shows; Ranker lists the best podcasts on YouTube.

Waave explains How to Upload Your Podcast, which Catos also covers in detail. Transferring in the other direction is equally important, such as publishing your YouTube videos as podcasts: Captivate explains How To Podcast On Youtube, and Riverside offers a Six Step Guide for getting started.

Highlighting the potential of this market is the March news that YouTube is considering a Podcasts home page, as the platform gets serious about the medium, including paying podcast authors and hiring its first dedicated podcast exec. In this blog post they announce their first ever podcast, The Upload.

Business Promotion

Of course, the fundamental question for business sponsors is why they should invest in this medium: what can they hope to gain from marketing spend on podcasts? To answer this we can explore some great case studies and best practices.

For example Bluewing describes How 7 Companies are Killing It Through Branded Podcasts, citing examples such as Slack, Shopify and Basecamp, and Westwood details best practices for replicating their success. Backtracks lists another ten examples, and Feedspot offers a directory of 100 of the best, with TEDx as the first. Bluechip corporates like McKinsey publish their own podcast.

Quill offers a comprehensive guide for how to create Branded Podcasts, and Riverside analyzes the best ones to distill what you can learn from them.

A great example of a Scottish business utilizing a podcast to enhance their digital brand is Adarma Security, producing this very slick and cool looking show ‘Cyber Insiders‘:

Our #podcast 'Cyber Insiders' takes a deep dive into the challenges, stresses and motivations of those who protect our banks, high street brands, and critical infrastructure to undercover what it really takes to win against today's #cybercriminals.

to the trailer below. pic.twitter.com/HnmxPCBiKs

— Adarma (@adarma_security) January 25, 2023

Featured Digital Scots: Scottish businesses providing podcast services:

Bespoken Media – Bespoken make podcasts. We bring decades of experience and a passion for audio to deliver your podcasting and storytelling goals. We’ll work with you to make your podcasts, or train you to make your own.

The Big Light – The Big Light Studio are podcasting specialists with an excellent pedigree in broadcasting, and will take care of the whole process from concept through to delivery of your first episodes and beyond.

The post Promote Your Scottish Business With a Podcast appeared first on digitalscot.net.


ResofWorld

Foreign currency shortages are cutting Nigerians off from Apple Music, AliExpress, and more

As Nigeria pushes its naira, startups and creators are getting locked out of international tech platforms.
In July 2021, Lagos-based software engineer Sodiq Lawal was working on a project for a fintech startup when the Central Bank of Nigeria suddenly discontinued the sale of foreign currencies...

Digital Scotland

Zumo co-founder and CEO, Nick Jones: How Web 3 is Set to Revolutionise the Internet

Crypto founder shares key insights on how Web 3 will transform digital businesses, and the evolution required to ensure it's ongoing success. The post Zumo co-founder and CEO, Nick Jones: How Web 3 is Set to Revolutionise the Internet appeared first on digitalscot.net.

Speaking at the Fintech Talents Festival, Zumo co-founder and CEO, Nick Jones, discusses the future of the internet and how Web 3 is set to revolutionise the way we interact online.

To stay relevant, Web 2 companies need to adopt Web 3 practices into their digital business models, such as non-custodial wallets, and wallets as a source of digital ID and source of funds.

Governance and Regulations

Nick also joined a panel at the prestigious 2022 Web Summit, which brought together 70,000+ people and the companies redefining the tech industry.

He described how the Gen Z market is maturing in its Internet knowledge: they are more educated and astute users of crypto services, and more financially literate.

Nick also stresses the importance of governance, highlighting how the industry has been guilty of promoting get rich quick schemes with disastrous consequences. It’s important that these down times act as a force to improve the industry, improving regulations and the role financial providers play as trusted guardians for users.

He sees this as the number one issue for the industry. There is a knowledge gap between the sector and regulators, and there is no global consensus yet on how regulation should operate. Transparency and robust governance structures are the key to achieving the trust that will continually attract new users to the sector and thus sustain successful economic growth for the asset class.

The post Zumo co-founder and CEO, Nick Jones: How Web 3 is Set to Revolutionise the Internet appeared first on digitalscot.net.


Succeed with Stampede – Montpeliers Group Boosts Customer Engagement AND Saves Staff Time

Stampede enables busy Edinburgh brasserie to quickly and easily capture dining experience feedback. The post Succeed with Stampede – Montpeliers Group Boosts Customer Engagement AND Saves Staff Time appeared first on digitalscot.net.

Edinburgh-based Stampede offers hospitality businesses a uniquely powerful digital marketing platform.

They offer an integrated suite of apps to address the full life-cycle of Digital Marketing customer engagement, including table bookings, gift cards, feedback reviews and loyalty cards.

Montpeliers Boosts Customer Feedback and Saves Staff Time

One keynote customer using Stampede is the Montpeliers Group, who share their experiences in this testimonial video.

Tammie Allan, Area Sales Manager for the group, explains that they operate seven venues across the city and have been using the Stampede WiFi for forms and data collection, and had adopted another system for customer feedback reviews.

However, this proved to be high maintenance and hard work for the staff, and critically it wasn’t linked to their table bookings or customer records.

So they moved away from it and instead adopted Stampede’s module, which immediately saved them money and greatly improved their workflows. It’s much easier to use and integrates with their booking system, which is very important as some of their venues are extremely busy, with 700–800 covers per day, so there are a lot of reviews to process.

For customers, the review feedback process is ultra simple. Twenty-four hours after dining they receive an email offering a one-click option to submit feedback, optionally adding comments if desired. Staff can quickly and easily reply, and most importantly it sends Montpeliers insights on how they might improve the dining experience for their customers.

Key Results:

Over 280,000 new customer records.
Over 1,500 reviews.
3 hours saved on reviews management per week.
96% customer opt-in rate.

 

The post Succeed with Stampede – Montpeliers Group Boosts Customer Engagement AND Saves Staff Time appeared first on digitalscot.net.

Wednesday, 25. January 2023

Hyperledger Ursa

Hyperledger Mentorship Spotlight: GVCR: Secure Verifiable Credential Registries (VCR) for GitHub & GitLab

What did you work on? Project name: GVCR: Secure Verifiable Credential Registries (VCR) for GitHub & GitLab My name is Sarvesh Shinde and this is my personal blog that I’m... The post Hyperledger Mentorship Spotlight: GVCR: Secure Verifiable Credential Registries (VCR) for GitHub & GitLab appeared first on Hyperledger Foundation.
What did you work on?

Project name: GVCR: Secure Verifiable Credential Registries (VCR) for GitHub & GitLab

My name is Sarvesh Shinde and this is my personal blog that I’m writing to share my experience of working on the GVCR Project. A little background about the project is really necessary to fully appreciate the objectives of this project. 

Self-Sovereign Identity (SSI) is a digital identity management model in which an individual or a company has the exclusive ownership over their accounts and personal data. A verifiable credential protocol, in turn, forms one of the three pillars of Self-Sovereign Identity, along with the Decentralized Identifiers protocol (DIDs) and Distributed Ledger Technology (or blockchain).
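For a concrete sense of the verifiable credential pillar: a W3C Verifiable Credential is, at its core, a signed document that binds claims to a decentralized identifier. A minimal sketch in Python follows, with placeholder DIDs and claim values (this is the generic W3C data model, not anything specific to the GVCR project):

```python
# A minimal W3C Verifiable Credential, shown as a Python dict.
# The DIDs, dates, and claim values are illustrative placeholders.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:issuer123",       # DID of the issuing party
    "issuanceDate": "2023-01-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder456",       # DID of the credential holder
        "degree": "BSc Computer Science",    # the claim being attested
    },
    # In practice the issuer attaches a cryptographic proof here
    # (e.g. an Ed25519 signature) so any verifier can check integrity.
    "proof": {"type": "Ed25519Signature2020"},
}
```

A registry's job is to make credentials like this discoverable and verifiable, which is where GVCR comes in.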

This project was conceptualized to provision secure verifiable credential registries that utilize GitHub’s data model and API, and to offer exactly the same APIs over any other verifiable credential registry. The project exists as an extension to the DRMan project.

The DRMan project, inspired by the SDKMan, acts as a tool for managing multiple versions of different software development libraries. These libraries form the necessary dependencies for the extended feature modules that reside inside DRMan, including GVCR.

What did you learn or accomplish?

GVCR, along with DRMan, is a command-line utility. The project is written entirely in shell script, which has the distinct advantage of making the tool lightweight and easy to install and use.

As of now, GVCR has been provisioned to utilize GitHub and GitLab as its two git-based registries. The plugin architecture of GVCR allows it to support more VCRs in the future. The APIs of these individual git-based registries are designed as a collection of facade functions that provide the same features on the surface, while accommodating the individual data models of the specific registry under the hood.
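The facade idea can be sketched as follows. GVCR itself is written in shell, but for illustration here is a Python sketch in which each registry backend exposes the same operation behind a single entry point; the class and function names are hypothetical, not GVCR's actual API:

```python
# Sketch of the facade pattern described above: each registry backend
# exposes the same surface API while hiding its own data model.
# (GVCR itself is a shell-script tool; these names are illustrative.)

class GitHubRegistry:
    def store_credential(self, cred_id, payload):
        # Would call the GitHub API (e.g. commit a file to a repo).
        return f"github:{cred_id}"

class GitLabRegistry:
    def store_credential(self, cred_id, payload):
        # Would call the GitLab API, with its own request shape.
        return f"gitlab:{cred_id}"

BACKENDS = {"github": GitHubRegistry(), "gitlab": GitLabRegistry()}

def store(registry, cred_id, payload):
    """Single entry point: the same call regardless of which registry backs it."""
    return BACKENDS[registry].store_credential(cred_id, payload)

print(store("github", "vc-001", {"claim": "demo"}))  # -> github:vc-001
```

Adding a new VCR then only requires a new backend class implementing the same surface, which is what the plugin architecture enables.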

GVCR can be used in a Hyperledger Aries framework as an implementation of a VCR and collaborates with existing Agent and Wallet open source projects in Hyperledger Aries. It can also be used in Hyperledger Indy projects by providing endpoints for cryptographic verification for credential issuers.

In the near future, GVCR is envisioned to leverage Hyperledger Ursa to implement encryption, decryption, and verification functions for verifiable credentials.

I was responsible for the design and implementation of this very GVCR module.

Now, coming to the mentorship program itself, let me give you a rundown of its structure, working mechanism, and decision-making process. The mentorship carefully balanced a hands-off approach towards the design planning and realization work that I undertook with biweekly meetings that acted as an efficient feedback mechanism from the mentors. These biweekly meetings set the tempo of progress and made sure that all participants were aware of their individual tasks at hand.

The mentorship started on June 1st and continued until November 16th. Further, the mentorship was broken into two halves. In general, the first half was more focused on the design aspect of this project while the second half came down to its implementation.

What comes next?

Overall, this mentorship has been a wonderful experience and has enabled me to pursue my career in blockchain. The prospect of a secure, verifiable digital identity and its interoperability with decentralized ledgers brings a new outlook to the future of digital identity, and shows just how important its acceptance is to finally realizing the ultimate goal of exclusively owning our own identities. New technologies are constantly emerging to make this future a reality, and I’m looking forward to contributing my part towards it.

The post Hyperledger Mentorship Spotlight: GVCR: Secure Verifiable Credential Registries (VCR) for GitHub & GitLab appeared first on Hyperledger Foundation.


Next Level Supply Chain Podcast with GS1

When E-Commerce and In-Store Collide: What You Need to Know to Stay Relevant


Simply communicating your brand’s physical presence isn't enough for online shoppers anymore. Your customers want more information – and that could include your product catalog. Join us as we chat with Mike Massey, CEO at Locally, a business that gives access to real-time inventory to nearby shoppers using e-commerce tactics, to hear about how physical retailers can avoid becoming invisible, and why Global Trade Item Numbers (GTINs) are the key to success.
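Incidentally, GTINs carry built-in error detection: the final digit is a GS1 mod-10 check digit computed over the preceding digits, so retailers syncing catalogs can catch mistyped codes. A small sketch, validated against a well-known EAN-13 example:

```python
def gtin_check_digit(body: str) -> int:
    """GS1 mod-10 check digit for a GTIN body (all digits except the last).

    Weights alternate 3 and 1, starting with 3 on the digit immediately
    to the left of the check digit.
    """
    total = sum(
        int(d) * (3 if i % 2 else 1)
        for i, d in enumerate(reversed(body), start=1)
    )
    return (10 - total % 10) % 10

def is_valid_gtin(gtin: str) -> bool:
    """Validate a full GTIN (8, 12, 13, or 14 digits) against its check digit."""
    return gtin.isdigit() and gtin_check_digit(gtin[:-1]) == int(gtin[-1])

print(is_valid_gtin("4006381333931"))  # well-known EAN-13 example -> True
```

The same algorithm covers GTIN-8 through GTIN-14, which is part of why GTINs travel so well between in-store and e-commerce systems.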


MyData

Curious about ethical use of personal data? Meet MyData Awards 2023

MyData Awards 2023 call is now open to recognise and celebrate human-centric services that put the individual at the centre of digital solutions.

Velocity Network

Aon Assessment’s Tarandeep Singh on the Velocity podcast

The post Aon Assessment’s Tarandeep Singh on the Velocity podcast appeared first on Velocity.

OpenID

Second Implementer’s Drafts of Two FAPI 2.0 Specifications Approved

The OpenID Foundation membership has approved the following Financial-grade API (FAPI) specifications as OpenID Implementer’s Drafts: FAPI 2.0 Security Profile and FAPI 2.0 Attacker Model. An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. These are the second FAPI 2.0 Implementer’s Drafts.

The OpenID Foundation membership has approved the following Financial-grade API (FAPI) specifications as OpenID Implementer’s Drafts:

FAPI 2.0 Security Profile
FAPI 2.0 Attacker Model

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. These are the second FAPI 2.0 Implementer’s Drafts.

The Implementer’s Drafts are available at:

https://openid.net/specs/fapi-2_0-security-profile-ID2.html
https://openid.net/specs/fapi-2_0-attacker-model-ID2.html

The voting results were:

Approve – 49 votes
Object – 1 vote
Abstain – 2 votes

Total votes: 52 (out of 259 members = 20% > 20% quorum requirement)

— Michael B. Jones – OpenID Foundation Board Secretary

The post Second Implementer’s Drafts of Two FAPI 2.0 Specifications Approved first appeared on OpenID.

Tuesday, 24. January 2023

Digital ID for Canadians

Spotlight on ADI Association

1. What is the mission and vision of the ADI Association? The mission of the ADI Association is to create a global digital identity framework…

1. What is the mission and vision of the ADI Association?

The mission of the ADI Association is to create a global digital identity framework for creating, managing, and using digital identity, as well as for sharing personal information. This framework addresses some of the most important practical problems in the operation and adoption of decentralized identity technology by pairing an accountable governance framework with a business interoperability layer and a technical specification.

2. Why is trustworthy digital identity critical for existing and emerging markets?

Establishing an effective model for digital identity is a priority that spans industries. The traditional identity model for the digital world has been an account with a password. Service Providers give each of their customers an account and assume that those accounts can be secured. However, accounts are breached and data are compromised. Service Providers often do not even know whether the person who created the account was legitimate. In the physical world, individuals can prove who they are by presenting formal documents like a driver’s license, passport, or birth certificate. In the digital world, individuals cannot definitively prove who they are, and malicious actors can easily masquerade as someone they are not. This gap between physical and digital identity has led to an exponential increase in online fraud as more interactions and financial transactions have gone digital. It has also led to a rapid rise in misinformation and disinformation propagating through online social media.

3. How will digital identity transform the Canadian and global economy? How does your organization address challenges associated with this transformation?

In the physical world, identities are created at birth by a trusted group of people. Parents choose names; hospitals record birth events; and governments create birth records, which are then used by individuals to register at schools, employers, banks, medical facilities, and so on. Everything an individual does in life is tied to that identity, and that identity becomes the accountable party. ADI Association has defined a similar concept for the digital world, called the “Digital Address.” The next generation of federated identity standards should support the issuance of a lifetime digital identity that a consumer can use across the digital services of their choice. It should eliminate passwords and protect personal data. The Accountable Digital Identity (ADI) Association is developing an open standard, based on decentralized identity, for bootstrapping individual identities online. The work will give identity providers and relying parties a framework to establish secure exchanges of identity-based services for their consumers, customers, employees, and partners. The ADI Association focuses on standards that enhance the digital consumer experience while improving cyber security at the same time.

4. What role does Canada have to play as a leader in this space?

As a current leader, Canada has the opportunity to frame how the future gets shaped by digital identity for both online use cases as well as in the physical world.

5. Why did your organization join the DIACC?

The ADI Association believes that this work cannot be done by a single organization, but will only be realized by concerted efforts to create and promote open standards which can be made available to and adopted by all.

6. What else should we know about your organization?

The ADI Association is a diverse coalition of companies and organizations from around the world who have come together to create an interoperable specification and ecosystem for accountable identity. This level of coordination across industries seamlessly connects a global community of businesses, governments, healthcare providers, and other organizations with customers, people, and patients to modernize the way we manage our online digital interactions.


Velocity Network

Credential Engine CEO Scott Cheney joins Velocity board

The post Credential Engine CEO Scott Cheney joins Velocity board appeared first on Velocity.

We Are Open co-op

WAO! We’re almost seven.

Documenting our first IRL meetup since the pandemic From left: John Bevan, Bryan Mathers, Doug Belshaw, Anne Hilliger, Laura Hilliger Last week, four members of We Are Open Co-op, plus one collaborator, descended on an American-style villa in the Netherlands for a meetup. We left with three members, two collaborators, a new pair of socks, some plans, and great memories.
Documenting our first IRL meetup since the pandemic From left: John Bevan, Bryan Mathers, Doug Belshaw, Anne Hilliger, Laura Hilliger

Last week, four members of We Are Open Co-op, plus one collaborator, descended on an American-style villa in the Netherlands for a meetup. We left with three members, two collaborators, a new pair of socks, some plans, and great memories.

Our aims and ‘rules’ for the week, such as they were, could be outlined simply:

Take some time away from client projects to reflect on who we are, where we’ve been, and where we’re going.
Hang out with each other in real life and have some fun.
The co-op will pay for everything. Except spirits.

Well, we managed two of the three. And it didn’t get too messy after some heavy pouring by the bartender at the TonTon Club 🥃

Collaborator pathways & member benefits CC BY-ND Visual Thinkery

After an extended period of dormancy in 2022, we knew that Bryan intended to step down as a director of WAO to become a collaborator. We’re looking forward to actually working with him more this year than last, now that he’s done the “wandering and crying” (his words!) needed to find what’s been calling him.

The image above was drawn by Bryan during a session where we thought about what it means to be a collaborator with, versus a member of, WAO. As with any business, it’s about money and power, although these operate differently in an organisation with a flat hierarchy and consent-based decision making.

Essentially, we would love to grow the co-op by working with more collaborators. The benefit for us includes everything from accessing new clients to learning new skills and discovering different ways of working. We would hope that the benefits of working with WAO are manifold and obvious, but for the avoidance of doubt, it’s autonomy, solidarity, and a pathway to becoming a member.

Highway to membership CC BY-ND Visual Thinkery

Talking of pathways to becoming a member of WAO, we talked about that too! Outlining a policy for this is a bit of a ‘Goldilocks’ strategy, as while the first principle of the International Cooperative Alliance is ‘voluntary and open membership’ there has to be some kind of speedbump. That barrier can’t be too high, though, otherwise we’ll never be able to welcome anybody new.

What we’ve settled on for now isn’t too much different from what went before. To join WAO, you need to:

bring in work from your network
have worked with every current WAO member
mesh with our values

Some of this is subjective. Of course it is. But then being a member of WAO currently also means being a company director so we want to get it right. While things may change in future with more than one class of membership, for now any additional members of WAO are taking on joint legal responsibility.

Introducing the ‘Context Understudy’ CC BY Laura Hilliger of WAO

When you’re a small organisation like our co-op, knowledge management can be both easier and more difficult than when you’re either a larger organisation or a single freelancer. It’s not sustainable for one member to be on every project, but nor is it reasonable to spend time documenting every little thing on every project.

At the meetup, we created the role of the ‘Context Understudy’. The name may change over time, but the idea is that this person is the default partner to co-work with on a project. Just like an understudy in a theatrical play, they can step in as and when needed.

It’s worth noting that you have to be a member to lead a WAO project but you don’t have to be in order to be a Context Understudy. In fact, in terms of the ‘highway to membership’, bringing a project to the co-op and working on it in this way is a perfect fit.

We haven’t got a name we’re completely happy with for other roles on the project. ‘Skills provider’ feels… a bit corporate? Although that’s exactly what other people are bringing. As with everything, it’s a work in progress.

And now for something completely different…

If you’ve scrolled this far, you deserve more fun photos of our meetup. And we hate to disappoint!

WAO! We’re almost seven. was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 23. January 2023

The Engine Room

Applications are open! Learn how to bring a responsible data approach to your work with our cohort-learning programme

Apply now to join The Engine Room's pilot cohort learning programme The post Applications are open! Learn how to bring a responsible data approach to your work with our cohort-learning programme first appeared on The Engine Room.

The amount of data available about almost anything is growing, but if data is not collected, processed and used in thoughtful and critical ways, it increases the risk of harm – especially to marginalised communities.

Many organisations doing evidence-based advocacy handle large amounts of data and need to strengthen their capacity to manage, store, and archive the data responsibly, as well as grow their knowledge on how to disseminate and present this data to external audiences in easily digestible ways.

Today we are launching a pilot of our new cohort learning programme: a collaborative course aimed to equip social justice leaders with the knowledge and tools needed for implementing responsible data (RD) practices and policies for their own internal operations, as well as for their advocacy efforts.

Our approach to cohort learning

Cohort learning is a capacity-building approach that connects peer support with tailored knowledge and vision, built from the research, technical experimentation and community engagement The Engine Room has led over the past years.

Inspired by feminist movements and the “popular education” movement that emerged in Latin America and was widely adopted in other parts of the world, we are proposing a participatory learning approach that will allow participants to share their individual knowledge and enrich each other through collective engagement with the cohort.

Through community-facing activities we will stress the “action/reflection” principle; as the Freire Foundation defines this: “It is not enough for people to come together in dialogue in order to gain knowledge of their social reality. They must act together upon their environment in order critically to reflect upon their reality and so transform it through further action and critical reflection.”

What will the programme look like?

The learning process will consist of engaging sessions with experts from The Engine Room, asynchronous activities such as reading guided material and watching videos, and community building activities where participants facilitate conversations within their organisations and movements.

Once a module starts, participants will be asked to read selected material and watch presentations from the responsible data community. In the third week, participants will engage in a 1.5-hour session with the team, leading to assignments meant to bring their learnings back to their communities for discussion. Throughout the module, participants will have the opportunity to connect online through an immersive virtual environment.

By the end of this learning programme, we hope that members of the cohort will be able to:

Apply responsible data principles to their own work.
Evaluate the relationship between power and the data they collect, and be able to identify and reject data practices that cause harm to the communities they serve.
Continue building connections with others in the field and accompany each other in their journey.

At the end of the program, participants will be connected to our Light Touch Support programme to receive guidance for their organisation to strengthen a responsible data area of their choice.

Programme content

Module 1: Introduction to responsible data management
During weeks 1 and 2, we’ll focus on everyone getting to know each other and establishing some ground rules together for an inclusive space where we can explore ideas, ask questions and learn from each other. We will also start diving into key concepts and principles that drive responsible data management, and reflect on how your organisation may already be using responsible data practices and how you may be able to apply this framework in future.

Module 2: Justice-based approach to data
Given the array of harms that people have caused (and continue to cause) through their use (or misuse) of data, this module considers how to apply justice frameworks to data. The resources cover data justice, data feminism and decolonial approaches to managing and thinking about data through a variety of readings, podcast episodes and videos. We’ll discuss what resonates and how these approaches can be applied in our various contexts.

Module 3: Where to start and what responsible data looks like for an organisation
It’s time to dive deeper into responsible data management. We’ll look at what RD means for your organisation, collective or movement, help you to understand where you’ve taken steps already and where you need to fill in gaps, and start strategising about ways to further apply the RD principles.

Module 4: Becoming RAD
Using what we’ve learned in previous modules, we’ll now focus on practical applications of RD. We’ll dive deeper into the data lifecycle, spend time on topics relevant to data collection such as informed consent, walk you through activities such as storing and analysing data, and lastly discuss ways of sharing your data safely and securely. Through this session, you’ll learn how to start developing your plan for data retention, archiving and deletion and how to map your organisation’s critical information management processes and assets, as well as potential data risks and harms.

Module 5: Moving Forward
RD encompasses every stage of the data process, from data collection to deletion. It’s a lot to work on, so how do you figure out where to start? We’ll explore how to make RD transformations doable by beginning with a few simple changes and developing a plan for identifying and addressing gaps.

Programme team

Bárbara is a Brazilian activist and researcher who is interested in exploring how technology and data can help us achieve social justice. She has worked defending the right to public information and transparency, advocating for open government, supporting women’s rights and mobilising against racism and online abuse. Bárbara co-founded Minas Programam, an initiative designed to share knowledge about technology with girls and women from São Paulo.

Cathy is interested in the intersection between data, technology and humanity. Prior to joining The Engine Room, Cathy worked at GitLab Inc. and Keystone Accountability as a data analyst. She has led and supported projects on a range of issues, from youth employment to rule of law and international health.

Joshua is interested in working with activists across different contexts to find holistic solutions to the injustices in the world. He has expertise in working with regional and global human rights mechanisms, and experience in media advocacy and capacity strengthening. Before joining our team, Joshua was Advocacy and programmes Manager at Iranti.


Paola is a cyber-feminist Chilean activist and creative thinker. Before joining The Engine Room, she led Poderomedia Foundation’s training programme for Latin American journalists and developers, facilitated a global community of activists mapping power networks, and consulted for nonprofits. She works closely with feminist and social justice activists and technologists, and she contributes actively to the organisational security practitioner and cyber-feminist communities.


How to apply

The programme is run in English and offers learning, knowledge-sharing and community engagement opportunities for a cohort of 10 social justice activists.

The programme is open to applicants who:

- belong to a social justice organisation, network, coalition or collective,
- are rooted in their local context,
- collect data about their community for advocacy purposes,
- have questions about how to better handle data in their organisation, and
- are interested in learning more about the intersection between data and justice.

Participants will need to dedicate 2 to 3 hours per week to the programme, over 12 weeks. You can apply by filling out the application form before February 10. If you have any questions about the pilot programme, you can reach us at paola@theengineroom.org or joshua@theengineroom.org.

Photo by Daniil Silantev on Unsplash

The post Applications are open! Learn how to bring a responsible data approach to your work with our cohort-learning programme first appeared on The Engine Room.

MOBI

NADA Joins MOBI to Accelerate Zero Trust Innovations for Information Security and Business Automation

Two of the largest mobility-focused consortia in the world join efforts to co-develop solutions for more robust, resilient, and privacy-preserving connected ecosystems. Los Angeles — 23 January 2023: MOBI (Mobility Open Blockchain Initiative) today welcomes the National Automobile Dealers Association (NADA), an innovation and advocacy group representing over 16,000 franchised new-car dealer members […]

Two of the largest mobility-focused consortia in the world join efforts to co-develop solutions for more robust, resilient, and privacy-preserving connected ecosystems.

Los Angeles — 23 January 2023: MOBI (Mobility Open Blockchain Initiative) today welcomes the National Automobile Dealers Association (NADA), an innovation and advocacy group representing over 16,000 franchised new-car dealer members, into its global community.

The MOBI community aims to develop and accelerate adoption of zero trust Web3 standards and solutions to enable seamless and secure business processes, while safeguarding sensitive business and consumer data. NADA brings a critical dealer viewpoint to this process. MOBI and NADA agree that the widespread adoption of a shared technology-agnostic framework will be critical to future innovations for members of both consortia. NADA and MOBI also believe that compliance with the Federal Zero Trust Strategy issued by the White House in 2022 will be crucial to maintain their competitive edge.

“We are thrilled to team up with NADA to co-develop solutions to help solve pain points for use cases such as Vehicle Registration, Titling, and Dealer Floorplan Audit,” said MOBI CEO and Founder, Tram Vo. “We look forward to collaborating on additional mobility and geolocation applications that improve security while preserving customer privacy.”

“NADA is excited to work with the experts at MOBI on behalf of dealers to ensure that this important technology improves and modernizes the auto retail experience, generates efficiencies, and brings increased convenience for dealers and consumers alike,” said Mike Stanton, President and CEO of NADA.

MOBI anticipates that NADA’s entry into the community at large will play a critical role in igniting a two-way exchange of expertise between the organizations’ respective communities, enabling greater collaboration on the road to more resilient mobility value chains and accelerating related innovations in research, development, and implementation. MOBI welcomes organizations of all sizes, industries, and locations to share expertise, define industry standards, and improve the sustainability, efficiency, and accessibility of mobility services around the world.

About MOBI

Mobility Open Blockchain Initiative (MOBI) is a global nonprofit smart mobility consortium. MOBI and our members are creating blockchain-based standards to identify vehicles, people, businesses, and MOBI Trusted Trip. We are building the Web3 digital infrastructure for connected ecosystem and IoT commerce. For additional information about joining MOBI, please reach out to Griffin Haskins (griffin@dlt.mobi) or visit www.dlt.mobi.

Media Contact: Grace Pulliam, MOBI Communications Manager

Email: grace@dlt.mobi | Twitter: twitter.com/dltmobi

###

The post NADA Joins MOBI to Accelerate Zero Trust Innovations for Information Security and Business Automation appeared first on MOBI | The New Economy of Movement.


Ceramic Network

Upgrade Your Node Before the Ceramic Hard Fork on February 15, 2023

Nodes that do not upgrade will no longer verify anchor commits from the network.

Ceramic’s first hard fork will take place on February 15, 2023.

All Ceramic nodes operating on Ceramic mainnet must upgrade to (at least) version 2.18.0 of the @ceramicnetwork/cli package by February 15, 2023.

The ‘HistorySync’ release will result in changes to Ceramic’s anchoring system; this work will enable the upcoming release of ComposeDB. These changes will allow nodes to discover and sync existing data, for a given data model, from other nodes on the network.

Nodes that do not upgrade by February 15th will no longer verify anchor commits from the network, which will lead to the corruption and loss of all writes due to CACAO timeout errors. If you are operating a Ceramic mainnet node, please let us know if you have any concerns about upgrading on the forum.

Protocol Changes

Two main changes will affect how anchoring (time-stamping) works:

- There will no longer be simple, or regular, transactions to put Merkle tree roots onto Ethereum. Instead, an anchor smart contract will call an “anchor” function resulting in an “anchor event” that Ethereum RPC clients can subscribe to. This will make it easier for Ceramic nodes to discover when anchors happen and to sync the history of anchor events.
- The second change is designed to make the system more resilient to block reorgs on Ethereum. Ceramic anchor commits will no longer include the specific blockNumber and blockTimestamp that the anchor event was included in, as it’s possible the transaction could later be reorged into a new block. Instead, Ceramic will only use the transaction hash and will look up which blockNumber the transaction was included in from Ethereum directly.

Both of these updates require changes to the js-ceramic node implementation, so that the node still knows how to verify and apply anchor commits with the new behavior. Nodes that haven’t upgraded will experience failures when applying anchor commits. Data update commits that are not anchored within the CACAO timeout period will then fail to apply, since the node will not be able to verify that the update was created while the CACAO (the capability that gave the app permission to write into the stream on behalf of the user) was valid. Over time this can corrupt the state of streams, invalidating writes that at first appeared to have been applied successfully.
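The reorg-resilience change can be sketched in a few lines of JavaScript. This is an illustrative sketch only, not the js-ceramic API: the function and field names (resolveAnchorProof, txHash, the receipt shape) are assumptions, and the RPC client is stubbed in memory rather than querying Ethereum.

```javascript
// Illustrative sketch only: not the js-ceramic API; names are assumptions.
// It shows why storing only the transaction hash in an anchor commit is
// reorg-resilient: the block number is looked up at verification time
// instead of being frozen into the commit.

// Stub Ethereum RPC client backed by an in-memory map (a real node would
// query an RPC provider).
function makeStubRpc(txIndex) {
  return {
    getTransactionReceipt: (txHash) => txIndex.get(txHash) ?? null,
  };
}

// Resolve the timestamp proof for an anchor commit that carries only txHash.
function resolveAnchorProof(anchorCommit, rpc) {
  const receipt = rpc.getTransactionReceipt(anchorCommit.txHash);
  if (!receipt) throw new Error("anchor transaction not found");
  return { blockNumber: receipt.blockNumber, blockTimestamp: receipt.timestamp };
}

// Before a (simulated) reorg, the anchor tx sits in block 100.
const chain = new Map([["0xabc", { blockNumber: 100, timestamp: 1700000000 }]]);
const rpc = makeStubRpc(chain);
const commit = { txHash: "0xabc" }; // no blockNumber baked into the commit
console.log(resolveAnchorProof(commit, rpc).blockNumber); // 100

// After a reorg the same tx lands in block 101; the commit still verifies.
chain.set("0xabc", { blockNumber: 101, timestamp: 1700000012 });
console.log(resolveAnchorProof(commit, rpc).blockNumber); // 101
```

Had the commit embedded blockNumber 100, the reorg would have invalidated it; resolving the block at verification time keeps the proof valid.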

Please get in touch with us on the forum for assistance or if you have any other questions!


Velocity Network

SAP’s Praveen Rao joins the expert panel at our India ecosystem launch event

The post SAP’s Praveen Rao joins the expert panel at our India ecosystem launch event appeared first on Velocity.

Friday, 20. January 2023

Oasis Open Projects

Invitation to comment on Electronic Court Filing v5.01 and ECF Web Services SIP v5.01

Electronic Court Filing defines a technical architecture and a set of components, operations and message structures for an electronic court filing system. The post Invitation to comment on Electronic Court Filing v5.01 and ECF Web Services SIP v5.01 appeared first on OASIS Open.

Second public review ends February 2nd

We are pleased to announce that Electronic Court Filing Version 5.01 and Electronic Court Filing Web Services Service Interaction Profile Version 5.01 from the LegalXML Electronic Court Filing TC [1] are now available for public review and comment. This is the second public review for Version 5.01 of these two specifications.

ECF defines a technical architecture and a set of components, operations and message structures for an electronic court filing system, and sets forth rules governing its implementation.

Electronic Court Filing Version 5.01 (ECF v5.01) consists of a set of non-proprietary XML and Web Services specifications developed to promote interoperability among electronic court filing vendors and systems. ECF v5.01 is a minor release that adds new functionality and capabilities beyond the scope of the ECF 5.0, 4.0 and 4.01 specifications that it supersedes.

Electronic Court Filing Web Services Service Interaction Profile defines a Service Interaction Profile (SIP), as defined in section 7 of the ECF v5.01 specification. The Web Services SIP may be used to transmit ECF 5.01 messages between Internet-connected systems.

The documents and related files are available here:

Electronic Court Filing Version 5.01
Committee Specification Draft 02
06 December 2022

Editable source (Authoritative):
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v5.01/csd02/ecf-v5.01-csd02.docx
HTML:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v5.01/csd02/ecf-v5.01-csd02.html
PDF:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v5.01/csd02/ecf-v5.01-csd02.pdf
XML schemas and Genericode code lists:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v5.01/csd02/schema/
XML example messages:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v5.01/csd02/examples/
Model and documentation:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v5.01/csd02/model/
UML model artifacts:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v5.01/csd02/uml/
Complete package in ZIP file:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v5.01/csd02/ecf-v5.01-csd02.zip
Public review metadata record:
https://docs.oasis-open.org/legalxml-courtfiling/ecf/v5.01/csd02/ecf-v5.01-csd02-public-review-metadata.html
************************

Electronic Court Filing Web Services Service Interaction Profile Version 5.01
Committee Specification Draft 02
06 December 2022

Editable source (Authoritative):
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v5.01/csd02/ecf-webservices-v5.01-csd02.docx
HTML:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v5.01/csd02/ecf-webservices-v5.01-csd02.html
PDF:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v5.01/csd02/ecf-webservices-v5.01-csd02.pdf
WSDL schemas:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v5.01/csd02/schema/
XML WSDL examples:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v5.01/csd02/examples/
Complete package in ZIP file:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v5.01/csd02/ecf-webservices-v5.01-csd02.zip
Public review metadata record:
https://docs.oasis-open.org/legalxml-courtfiling/ecf-webservices/v5.01/csd02/ecf-webservices-v5.01-csd02-public-review-metadata.html
***************************

How to Provide Feedback

OASIS and the LegalXML Electronic Court Filing TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of its technical work.

The public reviews start 19 January 2023 at 00:00 UTC and end 02 February 2023 at 23:59 UTC.

Comments may be submitted to the TC by any person through the use of the OASIS TC Comment Facility which can be used by following the instructions on the TC’s “Send A Comment” page (https://www.oasis-open.org/committees/comments/index.php?wg_abbrev=legalxml-courtfiling).

Comments should clearly identify which of these two specifications they address.

Feedback submitted by TC non-members for these works and for other work of this TC is publicly archived and can be viewed at:
https://lists.oasis-open.org/archives/legalxml-courtfiling-comment/

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [2], applicable especially [3] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about this specification and the ECF TC can be found at the TC’s public home page:
https://www.oasis-open.org/committees/legalxml-courtfiling/

Additional references:
[1] OASIS LegalXML Electronic Court Filing TC
https://www.oasis-open.org/committees/legalxml-courtfiling/
[2] https://www.oasis-open.org/policies-guidelines/ipr/
[3] https://www.oasis-open.org/committees/legalxml-courtfiling/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr/#RF-on-Limited-Mode
RF on Limited Terms Mode

The post Invitation to comment on Electronic Court Filing v5.01 and ECF Web Services SIP v5.01 appeared first on OASIS Open.


OSLC Tracked Resource Set v3.0 Project Specification 02 approved by the OSLC Open Project

The Tracked Resource Set protocol allows a server to expose a set of resources in a way that allows clients to discover that set of resources, to track additions to and removals from the set, and to track state changes to the resources in the set. The post OSLC Tracked Resource Set v3.0 Project Specification 02 approved by the OSLC Open Project appeared first on OASIS Open.

Project Specification 02 is ready for testing and implementation

OASIS is pleased to announce that OSLC Tracked Resource Set Version 3.0 from the Open Services for Lifecycle Collaboration Open Project [1] has been approved as an OASIS Project Specification.

Managing change and configuration in a complex systems development lifecycle is very difficult, especially in heterogeneous environments that include homegrown tools, open source projects, and commercial tools from different vendors. The OSLC initiative applies World Wide Web and Linked Data principles to enable interoperation of change, configuration, and asset management processes across a product’s entire application and product lifecycle.

The Tracked Resource Set protocol allows a server to expose a set of resources in a way that allows clients to discover that set of resources, to track additions to and removals from the set, and to track state changes to the resources in the set. The protocol does not assume that clients will dereference the resources, but they could do so. The protocol is suitable for dealing with sets containing a large number of resources, as well as highly active resource sets that undergo continual change. The protocol is HTTP-based and follows RESTful principles.
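The core of that tracking idea can be sketched as a client replaying a change log against a local mirror. This is a hedged illustration, not the OSLC client API: the event shape (a uri plus a Creation/Modification/Deletion type) is simplified from the spec’s RDF representation, and a real client would fetch the server’s base set and change log over HTTP.

```javascript
// Hedged sketch of the Tracked Resource Set idea: a client mirrors a
// server's resource set by replaying its change log. The event shape is
// an assumption, simplified from the spec's RDF representation.

function applyChangeLog(baseSet, events) {
  const mirror = new Set(baseSet);
  for (const ev of events) {          // events assumed oldest-first
    if (ev.type === "Creation") mirror.add(ev.uri);
    else if (ev.type === "Deletion") mirror.delete(ev.uri);
    else if (ev.type === "Modification") {
      // resource state changed; a real client would re-fetch ev.uri here
      mirror.add(ev.uri);
    }
  }
  return mirror;
}

const base = ["https://example.org/r/1", "https://example.org/r/2"];
const events = [
  { type: "Creation", uri: "https://example.org/r/3" },
  { type: "Deletion", uri: "https://example.org/r/1" },
];
console.log([...applyChangeLog(base, events)].sort());
```

Because the client only replays events, it never has to re-enumerate the whole set, which is what makes the protocol suitable for very large or highly active resource sets.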

This Project Specification is an OASIS deliverable, completed and approved by the OP’s Project Governing Board and fully ready for testing and implementation. The applicable open source licenses can be found in the project’s administrative repository at https://github.com/oslc-op/oslc-admin/blob/master/LICENSE.md.

The specification and related files are available at:

OSLC Tracked Resource Set Version 3.0
Project Specification 02
24 November 2022

– OSLC Tracked Resource Set Version 3.0. Part 1: Specification
https://docs.oasis-open-projects.org/oslc-op/trs/v3.0/ps02/tracked-resource-set.html
https://docs.oasis-open-projects.org/oslc-op/trs/v3.0/ps02/tracked-resource-set.pdf

– OSLC Tracked Resource Set Version 3.0. Part 2: Vocabulary
https://docs.oasis-open-projects.org/oslc-op/trs/v3.0/ps02/tracked-resource-set-vocab.html
https://docs.oasis-open-projects.org/oslc-op/trs/v3.0/ps02/tracked-resource-set-vocab.pdf

– OSLC Tracked Resource Set Version 3.0. Part 3: Constraints
https://docs.oasis-open-projects.org/oslc-op/trs/v3.0/ps02/tracked-resource-set-shapes.html
https://docs.oasis-open-projects.org/oslc-op/trs/v3.0/ps02/tracked-resource-set-shapes.pdf

– OSLC Tracked Resource Set RDF Vocabulary definitions file:
https://docs.oasis-open-projects.org/oslc-op/trs/v3.0/ps02/trs-vocab.ttl

– OSLC Tracked Resource Set Resource Shape Constraints definitions file:
https://docs.oasis-open-projects.org/oslc-op/trs/v3.0/ps02/trs-shapes.ttl

Distribution ZIP file

For your convenience, OASIS provides a complete package of the specification and related files in a ZIP distribution file. You can download the ZIP file at:
https://docs.oasis-open-projects.org/oslc-op/trs/v3.0/ps02/trs-v3.0-ps02.zip

Members of the OSLC OP Project Governing Board approved this specification by Special Majority Votes [2] as required by the Open Project rules [3].

Our congratulations to the participants and contributors in the Open Services for Lifecycle Collaboration Open Project on their achieving this milestone.

Additional references:

[1] Open Services for Lifecycle Collaboration Open Project
https://open-services.net/

[2] Approval ballot:
https://lists.oasis-open-projects.org/g/oslc-op/message/1048

[3] https://www.oasis-open.org/policies-guidelines/open-projects-process/

The post OSLC Tracked Resource Set v3.0 Project Specification 02 approved by the OSLC Open Project appeared first on OASIS Open.


FIDO Alliance

CISO Series: Cyber Security Headlines: Bypassing patches, ChatGPT polymorphic malware, Bitwarden goes passwordless

Bitwarden acquires Passwordless.dev – This marks the first acquisition for the open-source password management platform, obtaining the Swedish startup Passwordless.dev. The company specializes in tools for developers to integrate passwordless […] The post CISO Series: Cyber Security Headlines: Bypassing patches, ChatGPT polymorphic malware, Bitwarden goes passwordless appeared first on FIDO Alliance.

Bitwarden acquires Passwordless.dev – This marks the first acquisition for the open-source password management platform, obtaining the Swedish startup Passwordless.dev. The company specializes in tools for developers to integrate passwordless authentication. Bitwarden supports some passwordless authentication already, including biometrics and the use of FIDO security keys.

The post CISO Series: <a href="https://cisoseries.com/cyber-security-headlines-bypassing-patches-chatgpt-polymorphic-malware-bitwarden-goes-passwordless/" target="_blank" rel="noreferrer noopener">Cyber Security Headlines: Bypassing patches, ChatGPT polymorphic malware, Bitwarden goes passwordless</a> appeared first on FIDO Alliance.


Forbes: Thousands Of PayPal Accounts Hacked—Is Yours One Of Them?

According to a PayPal notice of security incident dated January 18, attackers got unauthorized access to the accounts of thousands of users between December 6 and 8, 2022. The total […] The post Forbes: Thousands Of PayPal Accounts Hacked—Is Yours One Of Them? appeared first on FIDO Alliance.

According to a PayPal notice of security incident dated January 18, attackers got unauthorized access to the accounts of thousands of users between December 6 and 8, 2022. The total number of accounts that were accessed by threat actors using a credential stuffing attack is reported as being 34,942. While accepting that PayPal is seemingly doing the best it can for the customers involved in this security incident by recommending password changes, Jasson Casey, chief technology officer at Beyond Identity insists that “passwords – whether unique or complex – are fundamentally flawed.” Instead, Casey says, organizations should be moving to phishing-resistant credentials such as the FIDO Alliance standard blueprints.

The post Forbes: Thousands Of PayPal Accounts Hacked—Is Yours One Of Them? appeared first on FIDO Alliance.


B2B Cyber Security: Cybersecurity Trends for 2023

Perhaps the FIDO Alliance’s passkey solution is the first truly effective method to mitigate social engineering attacks. This is because the passkey for authentication on the respective website is based […] The post B2B Cyber Security: Cybersecurity Trends for 2023 appeared first on FIDO Alliance.

Perhaps the FIDO Alliance’s passkey solution is the first truly effective method to mitigate social engineering attacks. This is because the passkey for authentication on the respective website is based on the device unlocking method used by the user.

The post B2B Cyber Security: Cybersecurity Trends for 2023 appeared first on FIDO Alliance.


Oasis Open Projects

Call for Participation: OASIS Heimdall Data Format (OHDF) TC

Developing a standard vendor-agnostic data format to support cybersecurity product interoperability without the need for customized integrations. The post Call for Participation: OASIS Heimdall Data Format (OHDF) TC appeared first on OASIS Open.

New TC aims to develop a standard vendor-agnostic data format to support cybersecurity product interoperability without the need for customized integrations.

A new OASIS technical committee is being formed. The OASIS Heimdall Data Format (OHDF) Technical Committee (TC) has been proposed by the members of OASIS listed in the charter below. This is your invitation to join the TC and participate in the development of the specification if this is an area of interest to you. Note that contributions and technical discussions may not occur until the TC’s first meeting, but introductions are certainly welcome.

The eligibility requirements for becoming a participant in the TC at the first meeting are:

(a) you must be an employee or designee of an OASIS member organization or an individual member of OASIS, and

(b) you must join the Technical Committee, which members may do by using the Roster “join group” link on the TC’s web page at [a].

To be considered a voting member at the first meeting:

(a) you must join the Technical Committee at least 7 days prior to the first meeting (on or before 13 February 2023) and

(b) you must attend the first meeting of the TC, at the time and date fixed below (21 February 2023).

Participants may also join the TC at a later time. OASIS and the TC welcome all interested parties.

Non-OASIS members who wish to participate may contact us about joining OASIS [b]. In addition, the public may access the information resources maintained for each TC: a mail list archive, document repository and public comments facility, which will be linked from the TC’s public home page at [c].

Please feel free to forward this announcement to any other appropriate lists. OASIS is an open standards organization; we encourage your participation.


[a] https://www.oasis-open.org/apps/org/workgroup/ohdf/

[b] See http://www.oasis-open.org/join/

[c] http://www.oasis-open.org/committees/ohdf/

CALL FOR PARTICIPATION
OASIS Heimdall Data Format (OHDF) Technical Committee Charter

The charter for this TC is as follows.

Section 1: TC Charter

(1)(a) TC Name

The OASIS Heimdall Data Format (OHDF) Technical Committee (TC)

(1)(b) Statement of Purpose

The purpose of the TC is to develop a standard format for exchanging normalized security data between cybersecurity tools. This data exchange specification will be called the OASIS Heimdall Data Format (OHDF).

In this context:

- ‘Standardization’ is the process of defining data elements in a consistent and contextualized manner.
- ‘Normalization’ is the process for mapping a format’s data elements into another format’s data elements.

Security tools typically generate data in unique formats that require multiple dashboards and utilities to process. This leads to a time-consuming process for completing security assessments, data in disparate locations and inconsistent semantics of a data element between formats. Furthermore, many security tools do not provide context to relevant compliance standards for comparison across security tools.

OHDF will provide a common data exchange format that:

- Enables the consistent integration, aggregation, and analysis of security data from all available sources
- Preserves data integrity with original source data
- Maximizes interoperability and data sharing
- Facilitates the transformation and transport of data between security/management processes or technologies
- Allows for the mapping and enrichment of security data to relevant compliance standards (GDPR, NIST SP 800-53, PCI-DSS, etc.)

The TC will update OHDF as industry needs evolve.
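As a rough illustration of the normalization goal described above, the sketch below maps a hypothetical vendor finding into a simplified HDF-like control. All field names here are assumptions for illustration, not the actual OHDF schema; the passthrough field shows the “preserve original source data” principle.

```javascript
// Illustrative only: the input is a hypothetical vendor format and the
// output a simplified HDF-like shape, not the actual OHDF schema. It
// shows the normalization idea: disparate tool output mapped into one
// consistent structure, with the original source data preserved.

function normalizeFinding(vendorFinding) {
  return {
    id: vendorFinding.ruleId,
    title: vendorFinding.summary,
    results: [{
      status: vendorFinding.passed ? "passed" : "failed",
      code_desc: vendorFinding.detail,
    }],
    // keep the raw record so nothing is lost in translation
    passthrough: { raw: vendorFinding },
  };
}

const finding = {
  ruleId: "scanner-042",
  summary: "TLS 1.0 enabled",
  passed: false,
  detail: "Port 443 accepts TLSv1.0 handshakes",
};
const control = normalizeFinding(finding);
console.log(control.results[0].status); // failed
```

A consumer such as a dashboard then only needs to understand the one normalized shape, regardless of which tool produced the finding.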

Business Benefits

A standard vendor-agnostic data format supports cybersecurity product interoperability without the need for customized integrations.

Participating stakeholders and adopters should benefit from this TC:

- For Commercial and Vendor Cybersecurity Partners, OHDF defines a standardized, interoperable target format that vendor tools can consume across their customer base consistently and that is easily managed within the product lifecycle.
- For the Open Source Community, OHDF enables easy integration with commercial solutions without the need for direct partnerships.
- For Government Agencies, OHDF can streamline business processes by having a standard, open source, machine-readable format for all security data.
- For Academia, OHDF offers a structured way to communicate and enhance research findings throughout the security community.
- For Corporate and Federal CISOs/CIOs, OHDF can increase visibility across the enterprise by taking advantage of normalized security data in a standard format that supports risk information interoperability from a broad range of inputs to support security risk decision-making.
- For Security Engineers, OHDF can reduce resource requirements for multiple security data types by standardizing formatting across disparate security tools.
- For Risk Managers, OHDF can improve decision making by using a standardized format to facilitate automation, standardize communication requirements, and inform risk-based analysis.
- For DevSecOps/Software Engineers, OHDF can streamline CI/CD processes by leveraging a standardized format to collate/aggregate normalized security data to support automated and continuous security processes.

(1)(c) Scope

The scope of work of the TC is to produce a specification that defines the OHDF format, as well as supporting documentation and open source content. The TC will draft specifications, lexicons, or other documents to allow exchange of security data in a standardized manner. The TC will leverage pre-existing standards to the greatest extent practical.

The TC will base its initial efforts on HDF specifications generated by The MITRE Corporation as part of the MITRE Security Automation Framework (MITRE SAF ©). MITRE SAF © will contribute the open source specifications and related documentation developed for HDF to the OHDF TC.

Additionally, the TC will reference example implementations from MITRE SAF © tooling for accessing and visualizing the data. It is expected that other organizations and interested individuals in the larger community will also develop implementations and tooling.

(1)(d) Deliverables

- An OASIS specification that defines the OASIS Heimdall Data Format (OHDF). (~6 months from start date)
- Other materials as necessary to ease adoption of the specification, such as: educational materials, supporting documentation, and open source content.

The OASIS Heimdall Data Format will be an evolving standard, and consequently this TC will continue to make changes and produce materials as required to adapt the format to any new security data considerations.

(1)(e) IPR Mode

This TC will operate under the Non-Assertion IPR mode as defined in Section 10.3 of the OASIS IPR Policy document.

(1)(f) Audience

- Corporate and Federal CISOs/CSOs
- Security data vendors
- Federal contractors
- National standards agencies and institutes, e.g., US National Institute of Standards and Technology (NIST)

(1)(g) Language

English

(Optional References for Section 1)

https://saf.mitre.org (MITRE SAF© Home page)

https://github.com/mitre/heimdall2/tree/master/libs/inspecjs (example JavaScript implementation of the HDF standard)

Section 2: Additional Information

(2)(a) Identification of Similar Work

The TC will consider the relationship of OHDF to the following standards:

- Asset Reporting Format (ARF) and Extensible Configuration Checklist Description Format (XCCDF) focus on common configuration enumerations (CCE)
- Static Analysis Results Interchange Format (SARIF) describes common vulnerabilities and exposures (CVE) and common weakness enumerations (CWE)
- Open Command & Control (OpenC2) is a standardized language for the command and control of technologies that provide or support cyber defenses
- Posture Attribute Collection & Evaluation (PACE) is a project for understanding security posture which could benefit from OHDF as an input
- Structured Threat Information Expression (STIX) is a language and serialization format used to exchange cyber threat intelligence
- National Information Exchange Model (NIEM) is a standard for creating specific automated information exchanges within and across organizations and disciplines

Each of these specifications addresses a subset of the data exchange challenge. OHDF provides a way to preserve data from the aforementioned formats and allows for expanding their usability as described in this charter. The OHDF TC will consider how OHDF should interoperate with these standards, leveraging the strengths and specific use cases for each.
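As a rough illustration of the normalization OHDF aims to standardize, the sketch below maps a made-up scanner finding into an HDF-style JSON layout. The field names follow the general shape of HDF (profiles containing controls containing results) but are illustrative only, not the normative OHDF schema.

```python
import json

# Hypothetical raw finding in some scanner's native output (illustrative only).
raw_finding = {
    "rule": "ssh-disable-root-login",
    "severity": "high",
    "outcome": "fail",
    "message": "PermitRootLogin is set to 'yes'",
}

def to_hdf_like(finding, scanner_name):
    """Map a native finding into an HDF-style profile/control/result layout.

    The structure mirrors the general shape of the Heimdall Data Format
    (profiles -> controls -> results) but is a sketch, not the OHDF schema.
    """
    status = {"fail": "failed", "pass": "passed"}.get(finding["outcome"], "skipped")
    return {
        "profiles": [
            {
                "name": scanner_name,
                "controls": [
                    {
                        "id": finding["rule"],
                        # Illustrative severity-to-impact mapping.
                        "impact": {"high": 0.7, "medium": 0.5, "low": 0.3}[finding["severity"]],
                        "results": [
                            {"status": status, "code_desc": finding["message"]}
                        ],
                    }
                ],
            }
        ],
    }

doc = to_hdf_like(raw_finding, "example-scanner")
print(json.dumps(doc, indent=2))
```

Once a finding from any tool is in one shape like this, downstream visualization and reporting only need to understand a single format.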

(2)(b) First TC Meeting

The first meeting of the TC will be held on 21 February 2023 at noon US eastern time. MITRE will host the meeting.

(2)(c) Ongoing Meeting Schedule

Monthly

(2)(d) TC Proposers

Mike Fraser, Sophos, Mike.Fraser@Sophos.com
Andy Thomas, Sophos, Andy.Thomas@sophos.com
Aaron Lippold, MITRE, alippold@mitre.org
Brett Kreider, MITRE, kkreider@mitre.org
Eugene Aronne, MITRE, earonne@mitre.org

(2)(e) Primary Representatives’ Support

I, Joe Levy, as OASIS Primary Representative for Sophos, confirm our support for the OHDF TC proposed charter and approve participation by our participant named in the charter as a co-proposer.

As OASIS Primary Representative for MITRE, I, Raj Rajagopal, confirm our support for the OHDF proposed charter and endorse our participants listed above as named co-proposers.

(2)(f) TC Convener

Aaron Lippold (MITRE)

(2)(g) Anticipated Contributions

Finalizing the draft specification
Eliciting additional requirements
Proposing reference implementation tooling and utilities

(2)(i) FAQ Document

MITRE Security Automation Framework (https://saf.mitre.org/#/normalize)

(2)(j) Work Product Titles and Acronyms

OHDF: OASIS Heimdall Data Format

The post Call for Participation: OASIS Heimdall Data Format (OHDF) TC appeared first on OASIS Open.


FIDO Alliance

Sky News: ‘Passkeys’ mean you ‘never need to know a password’

Image Matrix Tech Editor Djuro Sen says “passkeys”, developed by FIDO, mean you “never need to know a password”. “It essentially removes the fact of using a password and you […] The post Sky News: ‘Passkeys’ mean you ‘never need to know a password’ appeared first on FIDO Alliance.

Image Matrix Tech Editor Djuro Sen says “passkeys”, developed by FIDO, mean you “never need to know a password”. “It essentially removes the fact of using a password and you just use your device, your phone or something else, mostly your phone, to log in to say, through this example, eBay,” Mr Sen told Sky News Australia. “This uses a type of cryptography that has a public key and a private key.”
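The public/private key mechanism Mr Sen describes can be illustrated with a toy challenge-response signature: the private key never leaves the device, and the server verifies a signed challenge using only the public key. The sketch below is a deliberately tiny Schnorr-style scheme for illustration only; real passkeys use standardized elliptic-curve cryptography via the FIDO2/WebAuthn protocols, and nothing here should be reused as-is.

```python
import hashlib

# Toy Schnorr-style signature over a small prime-order group.
# Parameters are deliberately tiny and INSECURE; real passkeys use
# standardized elliptic curves via WebAuthn, not this construction.
P = 2039   # safe prime: P = 2*Q + 1
Q = 1019   # prime order of the subgroup
G = 4      # generator of the order-Q subgroup (a square mod P)

def hash_to_int(*parts):
    h = hashlib.sha256("|".join(str(p) for p in parts).encode()).hexdigest()
    return int(h, 16) % Q

def keygen(x):
    """Private key x (1 <= x < Q) -> public key y = G^x mod P."""
    return pow(G, x, P)

def sign(x, challenge, k):
    """Sign a server-issued challenge with private key x and nonce k."""
    r = pow(G, k, P)               # commitment
    e = hash_to_int(r, challenge)  # Fiat-Shamir challenge hash
    s = (k + x * e) % Q            # response
    return (e, s)

def verify(y, challenge, sig):
    """Check the signature using only the public key y."""
    e, s = sig
    # Recover the commitment: G^s * y^(-e) = G^(k + x*e - x*e) = G^k
    r = (pow(G, s, P) * pow(y, -e, P)) % P
    return hash_to_int(r, challenge) == e

x = 873                                      # private key stays on the device
y = keygen(x)                                # public key is registered with the server
sig = sign(x, "login-challenge-42", k=517)
print(verify(y, "login-challenge-42", sig))  # True
```

The server never learns anything it could replay or phish: it only ever sees the public key and one-time signatures over its own challenges.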

The post Sky News: ‘Passkeys’ mean you ‘never need to know a password’ appeared first on FIDO Alliance.


Financial IT: 2023 Predictions: Authentication, Digital Identity and In-Car Payments

One major trend from 2022 is the continued evolution of the fraud industry. A combination of active and passive authentication can ensure that the payments flow is secure while limiting […] The post Financial IT: 2023 Predictions: Authentication, Digital Identity and In-Car Payments appeared first on FIDO Alliance.

One major trend from 2022 is the continued evolution of the fraud industry. A combination of active and passive authentication can ensure that the payments flow is secure while limiting the impact on the customer experience. Major global payment schemes are introducing new regulations that will see banks recognize the authentication work done on the merchant side. This means that merchants are able to leverage industry authentication standards like those from the FIDO Alliance to create their own checkout journey and reduce the friction between the customer and merchant services. This helps combat both fraud and cart abandonment, helping to deliver higher sales conversion rates and a better return on investment.

The post Financial IT: 2023 Predictions: Authentication, Digital Identity and In-Car Payments appeared first on FIDO Alliance.

Thursday, 19. January 2023

The Engine Room

Kicking off 2023 with a digital decluttering session

As an organisation we are committed to responsible data, which includes practices such as data minimisation, seeking consent, establishing a RAD plan (or a plan for Retention, Archival and Deletion of the data we use in our work) and more. Last week, in the spirit of starting the work year with some RAD energy, our […] The post Kicking off 2023 with a digital decluttering session first appeared on The Engine Room.

As an organisation we are committed to responsible data, which includes practices such as data minimisation, seeking consent, establishing a RAD plan (or a plan for Retention, Archival and Deletion of the data we use in our work) and more.

Last week, in the spirit of starting the work year with some RAD energy, our team got together to have a “digital declutter fest”. Equipped with a cleaning-themed playlist (including tunes like Outkast’s “So Fresh, So Clean” and Hilary Duff’s “Come Clean”), we went through folders from past projects and listed which subfolders and files needed to be “cleaned” or re-organised.

During the declutter session we didn’t actually archive or delete files from our shared Drive. We used the template below to identify what needed to be deleted or archived and what needed a closer review (by project managers or other team members who worked on the project).

We especially kept an eye out for files that could possibly include personal data and/or sensitive data, and also files that have not been used in a long time and that we wouldn’t need in the future (such as email drafts or notes from calls with partners from projects that are no longer active). In short, every time we saw something that required further action, we made a note of it on the template. 

At the end of the session, we reflected on the process, noting how this was a practical way of acting on our Responsible Data values. We talked about the importance of  taking care of the data and information we might hold on our partners and research participants (like email addresses and interview notes), the usefulness of reviewing our RAD plan frequently (especially when working on long projects in which many years worth of information can accumulate) and the ways we can make a habit out of digital decluttering our organisational and individual files. 

After the call, we divided up the work of actually reviewing the identified files, with plans to responsibly archive the things we want to keep and delete what we no longer need. 

Declutter your own digital space

If you’re interested in having a similar digital decluttering session with your team, feel free to use and adapt the template we used (reproduced below!).

spreadsheet template for decluttering

I would also recommend taking a look at our resource “Becoming RAD – How to Retain, Archive and Dispose of data responsibly”. It’s a short collection of tipsheets designed to support civil society organisations taking the first steps towards becoming RAD and developing responsible and streamlined processes for retaining, archiving and deleting data. It is currently available in English, Spanish and French.

We are also around to provide direct support to your organisation if you’re curious about responsible data. Get in touch!

Image by Steve Johnson via Unsplash.

The post Kicking off 2023 with a digital decluttering session first appeared on The Engine Room.

Origin Trail

Trace Labs, the core development company of OriginTrail, joins Sustainable Medicines Partnership to…

Trace Labs, the core development company of OriginTrail, joins Sustainable Medicines Partnership to help make medicines more accessible and sustainable Trace Labs, the core development company of OriginTrail, is excited to announce that we have joined the Sustainable Medicines Partnership (SMP), a not-for-profit, multi-stakeholder, action collaborative of 48 organizations. The SMP is committed to
Trace Labs, the core development company of OriginTrail, joins Sustainable Medicines Partnership to help make medicines more accessible and sustainable

Trace Labs, the core development company of OriginTrail, is excited to announce that we have joined the Sustainable Medicines Partnership (SMP), a not-for-profit, multi-stakeholder, action collaborative of 48 organizations. The SMP is committed to reducing waste of medicines and medicines packaging, making medicines more accessible and more sustainable.

The purpose of the partnership is to bring together leading organizations from across the healthcare and pharmaceutical industry to collaborate and share knowledge, in order to find sustainable solutions to the challenges facing the industry. Trace Labs will play an important role in this partnership by supporting the creation of industry relevant knowledge assets on the OriginTrail Decentralized Knowledge Graph (DKG).

These verifiable knowledge assets will be used to advance the mission of the SMP by providing data that can be used to identify and track the movement of medicines and packaging materials throughout the supply chain. This will enable the SMP to identify areas where waste is occurring and develop solutions to reduce it. Additionally, knowledge assets will be used to improve the accessibility and sustainability of medicines by providing insights into the distribution and delivery of medicines to patients.

The Sustainable Medicines Partnership (SMP) is a not-for-profit, multi-stakeholder, action collaborative of 48 organisations. The SMP is executing a four year programme to build science-based, scalable, sustainable solutions to reduce the waste of medicines and the waste from medicines.

Nazneen Rahman, Founder and CEO at YewMaker, executive lead of the Sustainable Medicines Partnership, said: “I’m delighted Trace Labs has joined the SMP. Their technological expertise in delivering accessible, transparent, verifiable data in complex supply chains is highly relevant for the SMP’s goals. We are hugely excited about the potential of the collaboration to improve the sustainability and accessibility of medicines.”

Jurij Skornik, Trace Labs’ General Manager, highlighted: “We are thrilled to be joining forces with other member organizations such as Pfizer, British Standards Institution (BSI), GS1, Walgreens, and AstraZeneca, to create a more sustainable and accessible healthcare system for everyone. As the healthcare industry continues to evolve, we believe that partnerships like the SMP are crucial for driving progress and finding innovative solutions. As part of this collaborative effort, we look forward to working with our partners to achieve our shared goal of reducing waste, improving access, and making medicines more sustainable.”

Trace Labs has already deployed OriginTrail-based solutions in the pharmaceutical industry that make use of verifiable knowledge assets. AidTrust, a product developed jointly with our partner BSI, increases transparency and trust in the distribution of medicines by bringing together BSI’s global presence and supply chain risk management expertise with the power of OriginTrail DKG. It enables visibility (product flows and utilization, inventory levels, etc.), risk alerts, and real-time decision-making at all stages of the supply chain while protecting the integrity, security, and privacy of data. Ultimately, the goal of AidTrust is to help donor organizations and NGOs demonstrate that donated medicines were handled properly and reached the intended patients, even in challenging environments.

By joining the Sustainable Medicines Partnership, Trace Labs will bring the power of OriginTrail knowledge assets to the mission of reducing waste of medicines and packaging and making medicines more accessible and sustainable. We are excited to work together on innovative and impactful solutions in the next months and years.

👇 More about OriginTrail 👇

OriginTrail is an ecosystem dedicated to making the global economy work sustainably by organizing humanity’s most important knowledge assets. It leverages the open source Decentralized Knowledge Graph that connects the physical world (art, healthcare, fashion, education, supply chains, …) and the digital world (blockchain, smart contracts, Metaverse & NFTs, …) in a single connected reality driving transparency and trust.

Advanced knowledge graph technology currently powers trillion-dollar companies like Google and Facebook. By reshaping it for Web3, the OriginTrail Decentralized Knowledge Graph provides a crucial fabric to link, verify, and value data on both physical and digital assets.

Web | Twitter | Facebook | Telegram | LinkedIn | GitHub | Discord

👇 More about Trace Labs👇

Trace Labs is the core developer of OriginTrail — the open source Decentralized Knowledge Graph. Based on blockchain, OriginTrail connects the physical world and the digital world in a single connected reality by making all different knowledge assets discoverable, verifiable and valuable. Trace Labs’ technology is used by global enterprises in multiple industries, such as the pharmaceutical industry, international trade, and decentralized applications (e.g., companies behind over 40% of US imports, including Walmart, Costco, and Home Depot, exchange security audits with the OriginTrail DKG).

Web | Twitter | Facebook | LinkedIn

Trace Labs, the core development company of OriginTrail, joins Sustainable Medicines Partnership to… was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Velocity Network

Georgios Markakis joins the Velocity board

The post Georgios Markakis joins the Velocity board appeared first on Velocity.

Wednesday, 18. January 2023

Content Authenticity Initiative

The CAI welcomes Canon!

Welcoming Canon as the newest member of the Content Authenticity Initiative.

by Santiago Lyon, Head of Advocacy and Education, CAI 

We are delighted to welcome Canon, one of the world’s leading camera manufacturers, as a member of the Content Authenticity Initiative (CAI) joining more than 890 media and tech companies, NGOs, and academics working together to fight misinformation by furthering the implementation of open-source digital provenance technology.  

Founded in 1937, Canon is a leading global manufacturer of both professional and consumer cameras and lenses and brings a long history of innovative and ground-breaking photojournalism projects to the CAI. Canon’s equipment is widely used by photojournalists and others to capture impactful, compelling and beautiful imagery from around the world. Their equipment has been used to capture award-winning images in recent years including Pulitzer Prizes and World Press Photo awards, among many others. 

“Canon enthusiastically supports efforts to fight misinformation by ensuring the authenticity and provenance of digital images that are created and enjoyed by society. Joining the efforts of the CAI is an important step in this endeavor,” said Go Tokura, Chief Executive of Imaging Communication Business Operations at Canon. “We are looking forward to working with other technology companies and media partners to develop technology solutions that achieve these goals." 

Provenance technology provides secure, tamper-evident metadata that accompanies images and other file types along their journey from capture through editing to publication, showing the viewer where an asset came from, and any changes made to it along the way. The CAI looks forward to working with Canon on prototyping and implementing provenance technology into their future products. 
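The tamper-evident quality of provenance metadata comes from cryptographic hashing: each step in an asset's history commits to the step before it, so altering any recorded step breaks the chain. A minimal sketch of that principle (not the actual C2PA manifest format, which uses signed, structured claims):

```python
import hashlib
import json

def entry_hash(entry):
    """Hash a provenance entry using a stable JSON encoding."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append_step(chain, action, asset_bytes):
    """Record an action; each entry commits to the previous entry's hash."""
    prev = entry_hash(chain[-1]) if chain else "genesis"
    chain.append({
        "action": action,
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "prev": prev,
    })

def chain_is_intact(chain):
    """Verify every link: editing any entry breaks all links after it."""
    prev = "genesis"
    for entry in chain:
        if entry["prev"] != prev:
            return False
        prev = entry_hash(entry)
    return True

history = []
append_step(history, "capture", b"raw image bytes")
append_step(history, "crop", b"cropped image bytes")
append_step(history, "publish", b"final image bytes")
print(chain_is_intact(history))   # True

history[1]["action"] = "replace"  # tamper with the record...
print(chain_is_intact(history))   # False: the chain no longer links up
```

In real provenance systems each claim is also cryptographically signed, so a forger cannot simply rebuild the chain after editing it.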

Today, Canon also joins the Coalition for Content Provenance and Authenticity (C2PA) as its latest member. The C2PA’s work will provide creators, consumers, and others with opt-in, flexible ways to understand authenticity and provenance across media types. 

We encourage diverse organizations and individuals to join the Content Authenticity Initiative to advance our efforts for digital provenance. You can find more information here.


Nyheder fra WAYF

WAYF introduces OIDC support

Since WAYF’s beginning, SAML has been the only technical protocol for transferring logins from user organisations to services in the federation. But we have now developed support for connecting a service to WAYF via the OpenID Connect (“OIDC”) protocol as an alternative.

Since WAYF’s beginning, SAML has been the only technical protocol for transferring logins from user organisations to services in the federation. But we have now developed support for connecting a service to WAYF via the OpenID Connect (“OIDC”) protocol as an alternative.


Velocity Network

Velocity CEO Dror Gurevich speaks at Talview’s Instahiring 2023 event

The post Velocity CEO Dror Gurevich speaks at Talview’s Instahiring 2023 event appeared first on Velocity.

Dror Gurevich speaks at Neeyamo’s Evolve Beyond Borders

The post Dror Gurevich speaks at Neeyamo’s Evolve Beyond Borders appeared first on Velocity.

New Velocity podcast introduces our India ecosystem launch event

The post New Velocity podcast introduces our India ecosystem launch event appeared first on Velocity.

Blockchain Commons

Blockchain Commons 2022 Overview

2022 might have been Blockchain Commons’ strongest year ever. Though we laid down much of our foundational architecture a few years ago, we returned this year with one of our most important architectural elements ever: The Gordian Envelope. Meanwhile, our community grew to become the Commons we’d always envisioned. Here’s an overview of the year:

2022 might have been Blockchain Commons’ strongest year ever. Though we laid down much of our foundational architecture a few years ago, we returned this year with one of our most important architectural elements ever: The Gordian Envelope. Meanwhile, our community grew to become the Commons we’d always envisioned.

Here’s an overview of the year:

Our Vision

We expanded our main vision statement last year, so that it now reads:

“Blockchain Commons advocates for the creation of open, interoperable, secure & compassionate digital infrastructure to enable people to control their own digital destiny and to maintain their human dignity online.”

The core is still there: Blockchain Commons advocates for human dignity online, so that we can control our digital destiny. But we’re now more explicit that we expect to reach this goal through the creation of “open, interoperable, secure & compassionate digital infrastructure”. That’s what Blockchain Commons has been working on since its inception, but we wanted to be more clear about why.

Our objectives for reaching this vision have remained the same: creating a commons, designing a self-sovereign architecture, inspiring demand for self-sovereignty through advocacy, and enabling our community of peers. Of those, our work on the Commons and the architecture were our main focus for the year.

Also see our updated Projects page for how everything fits together.

The Commons

We have always imagined a commons of different companies and individuals coming together to jointly create an interoperable architecture that supports the Gordian Principles of independence, privacy, resilience, and openness. Our Gordian Developer Community (formerly the Airgap Wallet Community) has enabled that from the beginning.

In 2022, that community truly matured through the appearance of a number of sponsors who are not only supporting Blockchain Commons but also actively working with us in weekly or biweekly calls and conferences. New faces in 2022 included Chia, CrossBar, and Proxy, who have joined extant sponsors such as Bitmark, Blockchainbird, Foundation, and Unchained Capital.

We’ve also begun reaching out more to the identity community, including at Rebooting the Web of Trust 11, where we collaborated on papers such as “Selective Correlation” (still in process). We hope for even more interaction with identity professionals in 2023.

Silicon Salons

The expansion of our commons has allowed us to bring various members of that community together to share their knowledge and express their requirements. One of the results of this is our Silicon Salons. We held two in 2022: Silicon Salon 1 in June and Silicon Salon 2 in September. At these Salons, hardware-wallet designers and semiconductor manufacturers came together to express their requirements and possibilities.

Silicon Salon 3 is occurring on January 18th. The plan is to continue them throughout 2023 as long as interest holds.

Gordian Envelope

We’ve also been able to work extensively with our community on Gordian Envelope, which we’ve detailed in a whole series of videos.

Gordian Envelope is our most notable new architectural design in a few years. It’s a Smart Document structure that allows for the storage and communication of sensitive data supported by powerful elision, encryption, and variable permit technology.

We introduced Gordian Envelope because we saw the need for a privacy-preserving data format. You can hold credentials and decide what to reveal. You can sign things anonymously, but later connect a real identity to those signatures. You can hide within herd privacy and choose whether your data is ever revealed. We wrote a whole set of use cases that reveal many of the other advantages of Gordian Envelope (such as metadata, selective encryption, and multilock permits).
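The elision idea can be sketched in a few lines: if a document's digest commits to each field only through that field's hash, a holder can remove ("elide") a sensitive value while the overall digest still verifies. This toy shows the principle only; the real Gordian Envelope is a CBOR-based tree of digests with much richer structure.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def leaf_digest(field):
    """A field is either a revealed value (bytes) or an already-elided digest."""
    return field["digest"] if "digest" in field else h(field["value"])

def root_digest(fields):
    """Commit to all fields by hashing the concatenated leaf digests."""
    return h(b"".join(leaf_digest(f) for f in fields))

def elide(field):
    """Replace a value with its digest; the root digest is unchanged."""
    return {"digest": leaf_digest(field)}

doc = [
    {"value": b"name: Alice"},
    {"value": b"ssn: 123-45-6789"},  # sensitive field
    {"value": b"member: yes"},
]
full_root = root_digest(doc)

# The holder reveals only non-sensitive fields; the SSN is elided.
redacted = [doc[0], elide(doc[1]), doc[2]]
print(root_digest(redacted) == full_root)  # True: same commitment, less data
```

A verifier who trusts a signature over `full_root` can accept the redacted document without ever seeing the elided value.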

As part of our Gordian Envelope development, we also released our first MVA cryptosuite, to detail the cryptographic algorithms and architectures we found most useful for Envelope. It’s not our final word on what the best algorithms are currently, or even what we’ll finalize in Envelope, but it’s a great introduction to our current thoughts on the state of crypto-technology.

CSR

Much of our Gordian-Envelope design in 2022 was in service to the work on Collaborative Seed Recovery, or CSR. This is a joint project that we’re working on with Bitmark, Foundation, and Proxy; it shows off another benefit of our growing community, which allows a project like this to receive more thoughtful attention from a variety of stakeholders.

CSR is ultimately an application-level program, building on our Envelope architecture. It uses Shamir’s Secret Sharing (and our SSKR implementation) to allow for the sharding of seeds and their automated storage at remote locations. The ultimate goal is to create an interoperable CSR ecosystem where different providers can offer different services, allowing users to independently store their private keys or seeds (or other data) in a resilient and secure way.
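The sharding at the heart of CSR can be illustrated with a bare-bones Shamir's Secret Sharing sketch. For clarity this version works over a large prime field; the actual SSKR specification instead operates over GF(256) and adds checksums and a two-level group/share scheme, so treat this as a sketch of the math, not of SSKR itself.

```python
import random

# Toy Shamir's Secret Sharing over a prime field. Illustrative only:
# SSKR itself works over GF(256) with checksums and two-level groups.
PRIME = 2**127 - 1  # a Mersenne prime comfortably larger than the secret

def split(secret, k, n):
    """Split `secret` into n shares, any k of which recover it."""
    # Random polynomial of degree k-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner's method
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

seed = 0xC0FFEE1234567890
shares = split(seed, k=3, n=5)                # 3-of-5 threshold
print(hex(recover(shares[:3])) == hex(seed))  # any 3 shares suffice
```

With a 3-of-5 split, each share can live with a different service provider, and no two providers together learn anything about the seed.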

We’re certain the result will be an improvement for users because existing procedures, such as the etching of BIP-39 words into metal plates, weren’t being followed by most users, leaving them vulnerable to loss; CSR will assure the resilience of those digital assets, a service that wallet providers will be eager to offer. We know that designers will be interested in interoperable specifications of this type not just because of the benefits they offer to their users, but also because interoperable specifications can reduce developer costs and liability risks and give them a leg up in the industry.

Our Adoption

While Gordian Envelope is still in a relatively early stage of development, we’ve been pleased to see continued adoption of some of our older specifications such as Uniform Resources (URs) and animated QR codes (which are largely a requirement for translating PSBTs into QR codes). It’s a sign of success that we’re no longer entirely sure exactly who’s adopting what, but it looks to us like Sparrow Wallet, Blue Wallet, Keystone Wallet, Casa, Nunchuk, and Blockstream’s Jade Wallet are now all using some part of our UR and/or QR work.
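The animated-QR idea can be sketched as a fragment-and-reassemble loop: a payload too large for a single QR code (like a PSBT) is cut into sequenced fragments, displayed in rotation, and rebuilt by the scanner in whatever order frames arrive. The real UR scheme additionally uses fountain codes and bytewords encoding, which this simplified sketch omits.

```python
def fragment(payload: bytes, size: int):
    """Cut a payload into (seq, total, chunk) fragments, like animated QR frames."""
    chunks = [payload[i:i + size] for i in range(0, len(payload), size)]
    total = len(chunks)
    return [(seq, total, chunk) for seq, chunk in enumerate(chunks)]

def reassemble(frames):
    """Rebuild the payload from frames received in any order (duplicates ok)."""
    seen = {}
    total = None
    for seq, tot, chunk in frames:
        total = tot
        seen[seq] = chunk
        if len(seen) == total:  # all fragments collected
            return b"".join(seen[i] for i in range(total))
    return None                 # still missing fragments

psbt = b"example-partially-signed-transaction-bytes"
frames = fragment(psbt, size=10)
# The scanner catches the frames out of order, with one repeat:
received = [frames[2], frames[0], frames[2], frames[4], frames[1], frames[3]]
print(reassemble(received) == psbt)  # True
```

Fountain codes improve on this by making every extra frame useful, so the scanner never has to wait for one specific missing fragment to come around again.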

We’re also thrilled to have partners and contributors continue to expand our mature specifications by converting them to new languages. Thanks to Bryan Bishop and Cramium for lifehash-python and to Craig Raw and Sparrow Wallet for toucan, which is lifehash in C/C++.

Our Reference Apps

Blockchain Commons has long seen reference apps as the next stage in our architectural development after specification design. When we’re not working directly with designers (as we are with CSR), our Reference Apps give us a way to demonstrate the usage of our specifications and the best practices of our Gordian Principles to those developers.

We don’t talk as much about our CLI apps as we do our MacOS and iOS apps, but they’re crucial tools for developers who want to see both the implementation and the usage of our specifications. Our most important new app is thus envelope-cli, which demonstrates the real-world usage of Gordian Envelope: all of the current Envelope specifications, including elision and encryption, are already entirely functional. envelope-cli works well with other apps such as bytewords-cli to allow for the testing and development of many of our new specs.

We also continued to update our flagship app, Gordian SeedTool, with the 1.4.0 and 1.5.0 releases. Most of that work focused on experiments with NFC Tags as we continue to explore the best practices for resiliently securing data. Meanwhile, Gordian Server and our related Standup Scripts saw updates to support Bitcoin 23 (and were successfully tested against Bitcoin 24).

That pretty much closes out our architectural work, which was the majority of our attention in 2022, but our objectives of creating demand and generating new peers did get some attention.

Law & Advocacy

Obviously, we and our community can work together to advocate our principles and our specifications to the larger community of cryptographic engineers and designers. But we’ve spent more time working with legislators to try and encapsulate some of the usage and protection of digital assets in law.

Much of that continues to be focused on Christopher’s personal relationship with the Wyoming legislature. This year, that included support for e-residency and advocacy for Wyoming Registered Digital Assets. However, our biggest push has been for the protection of private keys in judicial venues, a battle that continues to be crucial in 2023.

By working to legally support digital assets with programs like e-residency and WRDAs and to protect them, we can create an environment where a resilient, private, open, and independent digital ecosystem isn’t just a nice idea, but the expectation: that’s what we mean when we say that we’re creating demand for the principles that we advocate.

Our Peers Program

Finally, our program of bringing new peers into the industry has long focused on our two major education programs, Learning Bitcoin from the Command Line and #SmartCustody. We weren’t able to give them a lot of attention this year, primarily because we haven’t been able to find sponsors interested in expanding these crucial programs. (If that’s actually you, contact us at team@blockchaincommons.com!)

Nonetheless, #SmartCustody got a bit of expansion early in the year, the most important of which was a new multisig scenario. Just five years ago, when we started work on #SmartCustody, self-sovereign multisig was still a pipe dream, so it’s thrilling that five years later we can offer it to sophisticated users. (Unfortunately, it’s still somewhat complex, though we think specifications like our crypto-request could resolve that.) We’ve also started writing some case studies examining how second-generation digital wallets fulfill the Gordian principles.

The other major element of our peers objective has been our internship program, which ran for its third year this summer. We got good support from interns in some of our projects, particularly during the summer session in Wyoming, and hope we were able to provide them with experience that will be valuable in the future.

The Future of Gordian Envelope

For the moment we’re pausing in our foundational design work on Gordian Envelope, as we have the core specification, software, and most crucial documentation in place. That means that we’re now in a place where we need feedback on the current specification before we take the next step.

We hope to gain some of that from international standards organizations such as the IETF or W3C, or foundational specification organizations such as DIF or the Linux Foundation, which we hope will take up Envelope in the next year. These are organizations that can offer vital feedback on topics such as our MVA Algorithms. Our first presentation is to the W3C Credentials Community Group, on January 31st.

Ideally, standards organizations could even develop our specification as a standard, which would make the adoption of Envelope by companies much easier. Obviously, any standardization would be an elongated process. If we don’t see a full adoption of the specification by an international standards organization, we’d at least like to see standards organizations begin to mandate usage of hash-based elision, as we’re aware of other proposals for that privacy function.

Another way that we can receive feedback during this period is via an independent third-party security review. We hope to lead the community in funding such a review this year.

After this brief pause, there’s lots more we’d like to do with Gordian Envelope, such as introducing Self-certifying IDs (SCIDs) and better supporting a variety of structured formats. We just need to lock the foundation of Envelope down first, and then there’s a variety of interesting and useful things that we can do with it next.

The Future of Other Architectural Work

Blockchain Commons is not a standards organization ourselves: our foremost goal is to meet the goals of our vision, creating “open, interoperable, secure & compassionate digital infrastructure”. However, we recognize the advantages of standardization to businesses, and so we’d also like to see our older UR work picked up as a BIP and/or ERC.

To further support business, we’d also like to offer some conformance testing for our UR designs (and for Envelope, for that matter), so that companies can announce when they’re doing work that meets the best practices of our most current specifications.

Meanwhile, a lot of our Envelope work was in service to CSR, so our next goal is to see that finalized. That primarily means supporting our partners to enable their debut releases of CSR, both technologically and by expanding the options they’re considering for sharding scenarios. We hope to see the first releases of CSR in Q1 or Q2. We also plan to create a reference app of our own, called Gordian Companion, which can be used to store other peoples’ SSKR shares; besides producing a mobile app, we also hope to create a Tor-enabled web service.

Our goal with reference apps is always to show developers the possibilities of our specifications and some best practices for their deployment, and that speaks to our long-term goal for CSR: passing development on to a variety of developers. By the end of the year, we hope to have at least two interoperable versions of CSR, released by two different companies (other than Blockchain Commons). Moreover, we’d like to see a real diversity of companies working in this space, including not just traditional wallet companies, but also identity companies, protected communication companies, and more.

We’ve primarily focused on MacOS and iOS to date for our reference apps and their reference libraries, with some support from our partners in creating Android conversions. We hope to be expanding our own ability to support Android in the next year by porting all of our reference code to Rust, beginning with Gordian Envelope.

Beyond that, there’s definitely new cryptographic ground to cover. Taproot and Schnorr are now increasingly mature. We also want to do more investigation of zero-knowledge proofs, Multiparty Computation (MPC), verkle trees, and cryptographic circuits. Unfortunately, these technologies aren’t quite ready for widespread usage yet, and so it’s putting a hold on some initiatives.

That includes Collaborative Key Management, or CKM, which allows for not just the distributed storage of keys but their distributed usage as well. CKM is much more cutting-edge technology than CSR, since it depends on the new field of MPC, so it’s one of the topics awaiting continued expansion in the field.

The Future of Other Objectives

Beyond our continuing and new architectural work, we expect to continue supporting our other programs.

For Silicon Salon, that means quarterly salons, tentatively scheduled for January, April, July, and October, each with 3-5 presentations followed by facilitated conversation. The turnaround on silicon design is slow, so this is by necessity something that we're supporting over the long haul. Though we hope to see current chips integrating with new hardware to support specifications like LifeHash and UR in 2023, it'll be at least 2024 before we see tapeouts of chips influenced by the Salons, as semiconductor manufacturers react to the real requirements of hardware-wallet companies. Nonetheless, we're eager to see what that future brings!

Meanwhile, a lot of our continued advocacy comes courtesy of our early self-sovereign identity work. We expect to continue that in 2023, including more work in Taiwan, in Europe, and elsewhere. For Europe in particular, we're seeing problems with over-identification, where regulations are encouraging people to use the most intrusive forms of identification possible (which usually means biometrics) and to overcentralize. Simultaneously, the EU is going to be making huge decisions about identity wallets in the next year or two. We are supporting a variety of European parties who are concerned with all of these elements.

We are also, of course, continuing our work in Wyoming. We hope to see two notable bills this year, one on private-key protection and another on digital-asset registration, though we're not entirely sure whether they'll reach the agenda for discussion. If they do, and if they pass, we'll need to do additional work to support them afterward.

For Smart Custody, we're unfortunately blocked by the pace of technological development, including incomplete deployments of multisigs and, over on Ethereum, unfinished work on account abstraction. On the other hand, we're right on the verge of larger MPC deployments such as Schnorr-based FROST and ElGamal-based ECDSA threshold signatures. So, we foresee exciting futures for Smart Custody, but aren't quite sure when we'll be ready to dive in.

Overall, there are plenty of new technological possibilities for Blockchain Commons going into 2023, from the pragmatic completion of our tasks in progress to the imaginative innovation of the cryptographic future.

Meanwhile, we'd love to bring in more partners, though we're aware that can be difficult in a potentially recessionary year. But often the best investments can be made when the market is down, and so we encourage our potential partners to consider that for 2023. The larger our community, the more we can ensure that everyone's requirements are met, and the better we can release interoperable systems that will benefit everyone!

Our Thanks

Ultimately, it’s community that makes Blockchain Commons work: though we are advocating for personal control and independence, we can only get there together. So thank you for all the contributions to Blockchain Commons in time, money, and expertise from both individuals and companies; and thank you for your interest in self-sovereignty and our principles of privacy, independence, resilience, and openness. Together we can do this!

Christopher Allen
Principal Architect & Executive Director
Blockchain Commons

Tuesday, 17. January 2023

Digital Identity NZ

Will 2023 be remembered as ‘the melting pot year’ for digital law?


January Newsletter

Kia ora e te whānau

Happy New Year. I hope this finds you well and rested from the holidays. Thank you for reading DINZ’s first newsletter of 2023 on the beach, at the office, or in the tent or caravan out of the rain. 

Fraud issues tend to surface in the news much more over the holidays, as increased online purchases offer greater opportunities for attackers. Matthew Evetts, director of security at DINZ member Datacom, did a great job of explaining the issues in simple terms in this piece on RNZ.

So with consumers and their data in mind, it is notable that the Government has just released a Cabinet Paper on the proposed Consumer Data Right (CDR) with a framework similar to that of the DISTF. Russell McVeagh has summarised its insights in this helpful publication here. While DINZ could debate whether the banking sector is a better starting point for CDR compared to digital identification, we expect to analyse the paper and offer commentary in due course.

For DINZ itself, each new year brings the need to balance its limited resources across a range of activities. In 2022 greater focus was placed on documented, evidence-based thought leadership: its submissions, the research, and the landscape report that will be published in a few weeks' time. In 2023, expect to see more focus on events and interactive mahi, now that the grounding documented mahi has progressed.

Next month we'll begin with events to showcase the landscape report on digital identity in Aotearoa, which integrates the results of the research we highlighted late last year (we are always looking for member premises to host, so please get in touch). We'll also be kicking off a two-month Summer Series on perspectives of decentralised digital identity, where we expect to feature members (get in touch!) with a decentralised service or pilot leading discussion and debate with their perspectives. Planning is already underway for April, with an overseas expert visit intermingling with Privacy Awareness Week.

So there's plenty to look forward to before the Digital Trust Hui Taumata 2023, confirmed for 1 August 2023 at Te Papa Tongarewa, Wellington. Get in touch if you're interested in sponsoring or exhibiting.

What else is in store for us in the digital identity domain in 2023? Truckloads is the short answer. As I observed in my personal email to members before the break, in legislation alone we expect to be looking at the DISTF in its final run-up to enforcement on 1 January next year, MoJ's proposed amendments to the AML/CFT legislation that DINZ analysed back in 2020, OPC's proposed Code of Practice for Biometrics that DINZ responded to last year, the exposure draft of NZ's CDR legislation, and further developments in Open Banking.

The sheer scale of the compliance undertaking for any digital identity service provider looks quite daunting when you consider that most service providers will need to comply with most of the above, as well as with the Privacy Act 2020 and conceivably the EIV Act 2012, even if today that applies primarily to RealMe. Aotearoa is not alone in this regard. The EU is a great example: in the fullness of time, the upcoming Data Act will sit beside the Data Governance Act, the GDPR, the Free Flow of Non-Personal Data Regulation, the Open Data Directive, the Digital Markets Act and the Digital Services Act as part of the European strategy for data, not to mention eIDAS 2.0. Will 2023 be remembered as the melting pot year for digital law? We'll see.

But let’s hope that lawmakers everywhere are setting aside resources in the coming years for legislative harmonisation!  

Colin and the DINZ Executive Council of 2023

Read More: Will 2023 be remembered as ‘the melting pot year’ for digital law?

The post Will 2023 be remembered as ‘the melting pot year’ for digital law? appeared first on Digital Identity New Zealand.


MOBI

Traent


Traent has developed a Web3 business ecosystem supporting efficient, transparent, and sustainable activities of enterprises and their stakeholders, thanks to the creation of hybrid blockchains. The main characteristics are:

Complete and ready-to-use solutions for ESG reporting, supply chain tracking, and optimization. Process efficiency. Risk mitigation. Cost reduction. Privacy and confidentiality.

traent.com

The post Traent appeared first on MOBI | The New Economy of Movement.


Trust over IP

Why the digital identity juggernaut needs safety belts

How a new ToIP white paper provides a systemic view of how human harms function in digital identity ecosystems – and how to mitigate them
Beekeeper photo by Bianca Ackermann on Unsplash

74 Civil Society Organizations wrote to the World Bank with “grave concerns” about the Bank’s ID4D programme.

“For too long, the emphasis has been on the development promises of digital ID systems, but it is past time to reckon with their vast potential for abuse and exploitation”.
Letter from global CSOs to the World Bank | Privacy International, September 2022 

Trust Over IP's "Overcoming Human Harm Challenges in Digital Identity Ecosystems" paper (pdf) is our first attempt to describe the risks of harm that may arise through the use of digital identity, how those harms occur, and how to prevent or mitigate them. After one year of research and analysis, the paper is ready for your review. Can SSI harm people in the real world? Our paper says "yes" and explains why. In short: there are many examples of digital identity systems hurting people, and there's no reason to imagine SSI-based systems will be exempt.

We invite your feedback on this paper. If you are a ToIP member, you can comment directly in the Google doc. If you are not yet a member, please join, or you can comment in the GitHub discussion. To read the document without commenting, here is a pdf of the current version and the text on GitHub.

Imagine someone you know has a life-changing tragedy, like… 

- A voter wrongfully convicted of electoral fraud
- A gambler's identity used to fuel addiction
- An immigrant family unable to work, bank, or get basic services
- A teen suicide
- An indigenous people whose relationship with their forest is shattered
- A soldier whose biometrics marked them for execution

In this paper we explain how each of these real-life examples was the outcome of digital identity being abused, misused, or going wrong. In each case, SSI can make things better, or worse. So, can you do anything about these risks that affect others? And why should you bother?

Can you do anything? Anything that matters?

We listed some responses to the risk of human harms:

- Design for a balance of power. Well-executed decentralized or federated identity systems have a shot at assuring agency by empowering those at the edge of the ecosystem, and recognising the vulnerabilities of the least powerful people.
- Add human harms to existing risk management. Cybersecurity teams, corporate finance, strategy, and legal teams can add negative-externality risk assessments for identity ecosystems they support. Investing in detection, prevention, intervention, and recovery services enables identity ecosystem members to manage their exposure according to their respective risk appetites.
- Align objectives and incentives to minimize regret. Human harms arise when ecosystem actors' objectives conflict and when mistuned incentives drive harmful choices.
- Measure the cost of externalities. SSI's trustworthy provenance might improve accountability and discourage harmful behavior within an ecosystem. But harms are infectious. The collective resilience of an ecosystem, like herd immunity, reduces harm.
- Build a community of practice on human harm reduction. New disciplines and practices will emerge, as they have for other areas of risk reduction. Start inside and join forces with your business ecosystem.

You can imagine many more. Still, … 

Why should you act? 

Doing nothing is nearly always the easiest path. It can feel like a distraction from today’s work to be aware of negative externalities that show up downstream from your standards efforts, your product development, your engineering and operations. They are distant from you in time, in space, across regulatory jurisdictions, with many intermediaries between you and people who might be harmed. It’s hard to care about hypotheticals and abstract numbers. Should you care anyway? 

Let us offer you five reasons you should act. 

Economics. Prevention is cheaper. 

Nearly always. "Shift-left" is the practice of moving security concerns closer to the start of the product development lifecycle: it's easier and more complete to build security into services from the start than to bolt it onto an inherently insecure system later. Perhaps we can shift left when it comes to harming others, embedding harms work into identity product and governance checklists so those considerations help us steer clear of foreseeable harms.

But who pays for prevention and other harm reduction and remediation work? Who should invest in testing if your trust architectures come with few negative externalities built in? Should it be those closer to the utility side of things? Should governments pick up the slack? Will ecosystem operators that serve more vulnerable humans have a better shot at understanding their risks? 

We should have an “accountability” conversation sooner rather than later. 

Financial and Legal Risk. Action reduces liability. 

But can you avoid liability if laws and regulations are obsolete or enforcement is toothless or regulators are captured? How can we pool exposure so it’s safer to start digital identity ecosystems or to join them? 

Opportunity for value and advantage. 

Harm reduction practices improve your understanding of strategic context and unmet customer needs. Harm surveillance practices improve identity ecosystem situational awareness and security. Harm intervention and recovery practices improve customer service and speed crisis response. 

Harms are infectious and no-one is immune.

The impact of harms extends beyond the close families and friends of those directly harmed to other actors in the ecosystem and in adjacent ecosystems. The 2022 FTX cryptocurrency exchange bankruptcy amplified and spread the human harms of lost investments throughout DeFi and crypto markets.

Sleep better. It’s good karma. 

Being real about the potential for harm makes it easier to put it in perspective, to weigh your tradeoffs, and to act in good conscience. 

Start small, start together… 

- Enroll. Join ToIP's Human Experience Working Group to build our collective maturity and capability. If you'd like to lead Trust Over IP's design and implementation of harms work, join us.
- Awareness. Discuss SSI's potential for bad things with your crew. You know you have buy-in when…
- Educate. Review the paper with your team. Take a stab at identifying other harms or sorting the list. Comment on the paper.
- Show up. At the ToIP All Members meeting on Wednesday, January 18th, 2023 at 10 am Pacific Time there was a presentation and discussion about the paper and how to get involved. Listen to a recording of the meeting.

This report is a product of the ToIP Human Experience Working Group’s SSI Harms Task Force

Principal authors are Nicky Hickman (CEO, Come to the Edge; Industry Advisor Blockchain & Digital Identity Lab, JYU; Advisor at cheqd), Phil Wolff (Wider Team) and Pyrou Chung (East West Management Institute). Contributors were Aamir Abdullah (University of Colorado Law School), Christine Martin and Darrell O’Donnell (Continuum Loop), Jacques Bikoundou, Dr Jill Bamforth (Swinburne University of Technology), John Phillips and Jo Spencer (Sezoo), Kaliya Young (Identity Woman), Kim Hamilton (Centre Consortium), Oskar van Deventer and Rieks Joosten (TNO), Paul Knowles (Human Colossus Foundation), Sankarshan Mukhopadhyay (Dhiway), Scott Perry (Schellman).  Many more joined Task Force calls and contributed their time and expertise. 

The post Why the digital identity juggernaut needs safety belts appeared first on Trust Over IP.


Oasis Open Projects

Open Cybersecurity Alliance Adds Indicators of Behavior (IoB) Sub-Project


Security Practitioners to Create Standardized Approach for Representing Cyber Threat Actor Behaviors in a Sharable Format

Boston, MA, USA, 17 January 2023 — The Open Cybersecurity Alliance (OCA), a global, ​standards-based initiative to simplify ​​integration across the threat lifecycle, announced today that it has accepted the Indicators of Behavior (IoB) Working Group (WG) as a sub-project. The OCA IoB brings together like-minded stakeholders in the cyber threat intelligence community to collectively focus on patterns of behavior associated with malicious cyber activity. By understanding the behavior patterns, innovative solutions can be developed to enable shared behavior sets, leading to more proactive detection, effective mitigations, and, through timely and actionable sharing, more prevention.

OCA IoB will work to improve detection and response to cyber threats in a broader capacity than what is currently possible with the primary actionable shared data, typically Common Vulnerability Enumerations (CVEs) and Indicators of Compromise (IoCs). While it is critical to ensure CVEs are mitigated and active IoCs are blocked, these actions by their very nature force a reactive posture to an ever-increasing cyber threat. OCA IoB aims to create a standardized approach for representing cyber threat actor behaviors in a shareable format.

“The overarching theme with OCA IoB is to foster collaboration across and between organizations. Machine-readable IoB objects and reference implementation code that can easily integrate representations of adversary behaviors provide rapid detection and response capabilities that can be readily accessible to all organizations,” said Charles Frick, OCA IoB Chair, of the Johns Hopkins Applied Physics Laboratory. “The OCA IoB provides standardization amongst the vendor community who can, in turn, help provide this capability to smaller organizations that may not have the resources for advanced threat hunting teams.”

The OCA IoB will make use of the OASIS STIX Version 2.1 standard to represent IoB data in machine-readable format. It will also use the STIX 2.0 format for any consumption or federation of cyber threat intelligence for operational purposes. For objects that may help organizations respond to threat behaviors via automated workflows, the OCA IoB will ensure that shared workflows are compliant with the OASIS Collaborative Automated Course of Action Operations (CACAO) standard as it is developed.
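To make the data-format choice concrete, here is a sketch of a minimal STIX 2.1 object of the kind an IoB bundle might carry, built as plain JSON in Python. The field names and identifier format follow the STIX 2.1 specification, but the behavior described and its ATT&CK reference are invented for illustration, not official OCA IoB content; real integrations might use the stix2 Python library instead:

```python
import json
import uuid
from datetime import datetime, timezone

# A minimal STIX 2.1 attack-pattern object describing an attacker
# behavior chain. Required common properties: type, spec_version,
# id (type--UUID), created, modified; name is required for this type.
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")

attack_pattern = {
    "type": "attack-pattern",
    "spec_version": "2.1",
    "id": f"attack-pattern--{uuid.uuid4()}",
    "created": now,
    "modified": now,
    "name": "Credential dumping followed by lateral movement",
    "description": "Observed behavior chain: LSASS memory access, "
                   "then remote service creation on peer hosts.",
    "external_references": [
        {"source_name": "mitre-attack", "external_id": "T1003"}
    ],
}

# Serialize for sharing; any STIX 2.1 consumer can parse this JSON.
print(json.dumps(attack_pattern, indent=2))
```

Because the object is standard STIX, it can travel through existing threat-intelligence pipelines alongside IoCs while describing behavior rather than a single artifact.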

The OCA IoB aligns with the project’s mission of integrating tools and solutions across security teams. OCA IoB will directly enable vendors and end users to advance OCA’s mission of building an open ecosystem where cybersecurity products interoperate without the need for customized integrations. OCA IoB joins the growing body of OCA work including: the Kestrel threat hunting tool, the STIX Shifter patterning library, and the Posture Attribute Collection and Evaluation (PACE) for cybersecurity readiness.

The OCA is hosted by OASIS Open, one of the most respected, international bodies in the world for open source and standards. To learn more about the OCA or other OCA technologies that are available to help security teams connect their security tools and data, please visit: https://github.com/opencybersecurityalliance.

Support for OCA IoB

Canadian Institute for Cybersecurity
“Monitoring Indicators of Behaviors (IoB) presents the best opportunity for organizations to hunt for advanced threats and attacks at an early stage. Automated learning from unexpected and unauthorized modifications to normal operating baseline will empower and shift the focus of an organization from reactive to preventive cybersecurity. The Canadian Institute for Cybersecurity (CIC) is a leader in leveraging AI and contextual data to identify and detect IoB with a low false positive rate.”
–Haruna Isah, Research Associate and Talent/Partnership Development Manager, CIC

Cydarm Technologies
“Cybersecurity is a team sport, and collective defense is the best way to impose costs on attackers. Threat actors can easily change payloads and infrastructure to evade detection by Indicators of Compromise, but it is much harder for them to change their Tactics, Techniques, and Procedures. As an OCA member and a leader in Cyber Response Management, Cydarm supports the OCA’s IOB Sub-Project, toward sharing of tradecraft, to enable better collective defense.”
– Dr. Vaughan Shanks, Co-founder and CEO, Cydarm Technologies

Cyware
“As the threat landscape and attacker sophistication continue to evolve rapidly, collaboration around tracking, monitoring, and aggregating IOCs provides an attractive path forward for more capable, collective defense. Cyware is thrilled to join this OCA initiative designed to define a structure for exchanging IOBs that shorten the window of success for evolving attacker behaviors and methodologies.”
– Avkash Kathiriya, VP Research and Innovation, Cyware

IBM Security
“Identifying attackers based on their behavior patterns is one of the most effective ways to detect advanced threats – but defenders need an easier way to share this information with each other, as attackers are constantly evolving their techniques. By creating open standards for these behavior-based attack indicators, this project will allow more proactive and complete threat detection analytics to be shared in the community, shining a light on previously undiscovered threats.”
– Jason Keirstead, CTO, Threat Management, IBM Security

Prophecy International
“We continue to give enthusiastic support to the OCA as it goes hand-in-hand with our mission to improve our customers’ cyber posture worldwide and ensure that our products support the growth and evolution of the global cybersecurity community. The mission is to improve trust, interoperability and to create a network of industry leaders committed to working together to solve the hard problems in cyber security, and the OCA is a powerful vehicle to drive towards those goals.”
– Brad Thomas, CEO, Prophecy International & OCA Project Governing Board Member

sFractal Consulting
“OCA IoB, along with the OCA sub-projects Kestrel, PACE, and STIX-Shifter, help automate more sophisticated responses to today’s complex cyber attacks. Threat actors are increasingly using coordinated, automated attacks that are more frequent, more impactful, and more sophisticated. To successfully defend against these attacks, it is essential for security teams to cooperate and automate their defenses. IoB goes a step further than traditional cooperation because IoB includes information about the behavior of the attackers.”
– Duncan Sparrell, Principal, sFractal Consulting

About the Open Cybersecurity Alliance

The Open Cybersecurity Alliance brings together vendors and end-users to create an open cybersecurity ecosystem where products can freely exchange information, insights, analytics, and orchestrated response. OCA supports commonly developed code and tooling and the use of mutually agreed upon technologies, data standards, and procedures. The OCA is governed under the auspices of OASIS Open, which offers projects a path to standardization and de jure approval for reference in international policy and procurement.

The OCA is led by these organizations committed to solving the costly problem of siloed cyber tools and products: Canadian Institute for Cybersecurity, Center for Internet Security (CIS), Cydarm, Cyware, EclecticIQ, IBM Security, Prophecy International, Rapid7, SAIC, sFractal Consulting, and VMware.  

Media inquiries
communications@oasis-open.org

The post Open Cybersecurity Alliance Adds Indicators of Behavior (IoB) Sub-Project appeared first on OASIS Open.


FIDO Alliance

Ça m’intéresse: 2023, year of the end of passwords?


Often criticized and victim of security flaws, the password system could soon be replaced by a more reliable technology: passkey.

The post Ça m’intéresse: 2023, year of the end of passwords? appeared first on FIDO Alliance.


Velocity Network

RANDA Solutions CEO Marty Reed joins the Velocity board

The post RANDA Solutions CEO Marty Reed joins the Velocity board appeared first on Velocity.

foundit’s Meeta Karanth joins the expert panel at our India ecosystem launch event

The post foundit’s Meeta Karanth joins the expert panel at our India ecosystem launch event appeared first on Velocity.

Saturday, 14. January 2023

LionsGate Digital

Government of Canada launches National Quantum Strategy to create jobs and advance quantum technologies


Strategy will see major investments in quantum research, talent and commercialization

January 13, 2023 – Waterloo, Ontario, Quantum science and technologies are at the leading edge of research and innovation, with enormous potential for commercialization and game-changing advances, including more effective drug design, better climate forecasting, improved navigation and innovations in clean technologies. The Government of Canada is committed to supporting the continued growth of this emerging sector as it helps drive Canada’s economy and supports highly skilled, well-paying jobs.

Today, the Honourable François-Philippe Champagne, Minister of Innovation, Science and Industry, announced the launch of Canada’s National Quantum Strategy, which will shape the future of quantum technologies in Canada and help create thousands of jobs. Backed by an investment of $360 million committed in Budget 2021, the strategy will amplify Canada’s existing global leadership in quantum research and grow Canada’s quantum technologies, companies and talent.

Minister Champagne was joined at the launch by Dr. Raymond Laflamme, professor in the Department of Physics and Astronomy and Canada Research Chair in Quantum Information at the Institute for Quantum Computing at the University of Waterloo, and Dr. Stephanie Simmons, associate professor in the Department of Physics and Canada Research Chair in Silicon Quantum Technologies at Simon Fraser University and founder and Chief Quantum Officer of Photonic Inc. Drs. Laflamme and Simmons will serve as co-chairs of a new Quantum Advisory Council, which will provide independent expert advice on the implementation of the strategy.

The National Quantum Strategy is driven by three missions in key quantum technology areas:

- Computing hardware and software—to make Canada a world leader in the continued development, deployment and use of these technologies
- Communications—to equip Canada with a national secure quantum communications network and post-quantum cryptography capabilities
- Sensors—to support Canadian developers and early adopters of new quantum sensing technologies

The missions will be advanced through investments in three pillars:

- Research—$141 million to support basic and applied research to realize new solutions and new innovations
- Talent—$45 million to develop and retain quantum expertise and talent in Canada, as well as attract experts from within Canada and around the world, to build the quantum sector
- Commercialization—$169 million to translate research into scalable commercial products and services that will benefit Canadians, our industries and the world

Efforts under the strategy are already under way. To reinforce Canada’s research strengths in quantum science and help develop a talent pipeline to support the growth of a strong quantum community, the Natural Sciences and Engineering Research Council of Canada (NSERC) is delivering an investment of $137.9 million through its Alliance grants and Collaborative Research and Training Experience (CREATE) grants.

Mitacs will deliver $40 million to support the attraction, training, retention and deployment of highly qualified personnel in quantum science and technology through innovation internship experiences and professional skills development.

The Quantum Research and Development Initiative (QRDI), a new $9 million program coordinated and administered by the National Research Council of Canada (NRC), is being established to grow collaborative, federal quantum research and development. QRDI will bring government—offering expertise and infrastructure—and academic and industrial partners together to work on advancing quantum technologies under the three missions of the National Quantum Strategy.

To help translate quantum science and research into commercial innovations that generate economic benefits and support the adoption of made-in-Canada solutions by businesses, the NRC is receiving $50 million to expand the Internet of Things: Quantum Sensors Challenge program and roll out its Applied Quantum Computing Challenge program. As well, Canada’s Global Innovation Clusters are receiving $14 million to carry out activities as part of the Commercialization pillar.

In addition, the government’s flagship strategic procurement program, Innovative Solutions Canada, is receiving $35 million over seven years to help innovative Canadian small and medium-sized enterprises grow, scale up, develop intellectual property, export and create high-value jobs in the quantum sector.

The quantum sector is key to fuelling Canada’s economy, long-term resilience and growth, especially as technologies mature and more sectors harness quantum capabilities. Jobs will span research and science; hardware and software engineering and development, including data engineering; manufacturing; technical support; sales and marketing; and business operations. The government will continue working with Canada’s quantum community to ensure the success of not only the National Quantum Strategy but also the Canadian scientists and entrepreneurs who are well positioned to take advantage of these opportunities.

Quotes

“Quantum technologies will shape the course of the future and Canada is at the forefront, leading the way. The National Quantum Strategy will support a resilient economy by strengthening our research, businesses and talent, giving Canada a competitive advantage for decades to come. I look forward to collaborating with businesses, researchers and academia as we build our quantum future.”
– The Honourable François-Philippe Champagne, Minister of Innovation, Science and Industry

Quick facts

- According to a study commissioned by the NRC in 2020, it is estimated that, by 2045, once quantum technologies take hold, the Canadian quantum industry will be a $139 billion industry (including all economic effects) and account for 209,200 jobs.
- The Government of Canada has invested more than $1 billion in quantum since 2012. These foundational investments helped establish Canada as a global leader in quantum science.
- The National Quantum Strategy was developed following thorough public consultations through stakeholder roundtables and an online survey.
- A number of investments have already been announced through existing government programs, and additional partners are preparing their quantum programs for launch.
- Innovative Solutions Canada has awarded four contracts valued at a total of $2.1 million to test quantum sensing, communications and computing solutions developed by Xanadu Quantum Technologies Inc., CogniFrame Inc., Photon etc. Inc. and Zero Point Cryogenics Inc.
- To date, NSERC has awarded 17 grants valued at $1.5 million over three years and expects to announce further results of its quantum initiatives in the coming months, including the Alliance International Quantum grants, Alliance Quantum grants, Alliance Consortia Quantum grants and CREATE grants.
- The NRC has launched Challenge programs to support collaborations with academia and industry that help drive commercial innovation and build on Canada's position as a global leader in quantum technologies. The Internet of Things: Quantum Sensors Challenge program is focused on developing revolutionary sensors that use the extreme sensitivity of quantum systems to enhance measurement precision and sensitivity rates and even expand the kinds of phenomena that can be measured. The Applied Quantum Computing Challenge program is working to build capacity in quantum algorithms and software, as well as establish collaborative research and development projects with academia and industry to efficiently simulate complex physical systems, delivering new technologies for human health, climate change and advanced materials.
- The governments of Canada and the United Kingdom have jointly launched a call for proposals open to organizations from Canada and the UK that wish to form project consortia for collaborative projects focused on developing innovative products, processes or technology-based services in the area of quantum technologies. The call is delivered by the NRC's Industrial Research Assistance Program and Innovate UK.

The post Government of Canada launches National Quantum Strategy to create jobs and advance quantum technologies appeared first on Lions Gate Digital.

Friday, 13. January 2023

FIDO Alliance

IT-Business: Netwrix Password Secure: How to ensure secure password management in companies

The Yubico Security Key is based on the authentication method of the FIDO Alliance and the World Wide Web Consortium (W3C) for mobile devices and browsers. Using the FIDO2 algorithm […] The post IT-Business: Netwrix Password Secure: How to ensure secure password management in companies appeared first on FIDO Alliance.

The Yubico Security Key is based on the authentication method of the FIDO Alliance and the World Wide Web Consortium (W3C) for mobile devices and browsers. Using the FIDO2 algorithm (Fast Identity Online) it is possible to identify yourself to online services with verified hardware.

The post IT-Business: Netwrix Password Secure: How to ensure secure password management in companies appeared first on FIDO Alliance.


Information Age: Cybersecurity predictions for 2023

Andrew Shikiar, executive director of FIDO Alliance, thinks that we’ll see a lot more high-profile, sophisticated attacks which bypass legacy multi-factor authentication (MFA) in 2023. <…> Andrew Shikiar, executive director […] The post Information Age: Cybersecurity predictions for 2023 appeared first on FIDO Alliance.

Andrew Shikiar, executive director of FIDO Alliance, thinks that we’ll see a lot more high-profile, sophisticated attacks which bypass legacy multi-factor authentication (MFA) in 2023. <…> Andrew Shikiar, executive director of FIDO Alliance, predicts the metaverse will become a growing target for hackers, with MFA becoming a stronger imperative as attacks increase in volume and sophistication. <…> Being sent OTP text messages as part of multi-factor authentication will be seen as not fit for purpose once organisations understand how hackable they are, says Andrew Shikiar, executive director of FIDO Alliance.

The post Information Age: Cybersecurity predictions for 2023 appeared first on FIDO Alliance.


TEISS: Cyber-security in 2023

Andrew Shikiar at FIDO Alliance gives his predictions for what we can expect from the cyber-security industry in 2023. The post TEISS: Cyber-security in 2023 appeared first on FIDO Alliance.

Andrew Shikiar at FIDO Alliance gives his predictions for what we can expect from the cyber-security industry in 2023.

The post TEISS: Cyber-security in 2023 appeared first on FIDO Alliance.


InfoSecurity: Point/Counterpoint: Are We Moving to a Passwordless Future?

Yes: Andrew Shikiar. You might think you’ve heard it all before, right? A passwordless age is the cyber utopia we all yearn for but promises of totally wiping out passwords […] The post InfoSecurity: Point/Counterpoint: Are We Moving to a Passwordless Future? appeared first on FIDO Alliance.

Yes: Andrew Shikiar. You might think you’ve heard it all before, right? A passwordless age is the cyber utopia we all yearn for, but promises of totally wiping out passwords feel far-fetched and far off. After all, how do you go about unpicking a thread so tightly woven into your daily life? Eradicating passwords will take time, but recent industry news and progress show we can confidently say the future is passwordless – and getting nearer.

The post InfoSecurity: Point/Counterpoint: Are We Moving to a Passwordless Future? appeared first on FIDO Alliance.


The Green Sheet: Fierce authentication for an omnichannel threatscape

The ability to transact across all channels has opened opportunities and threats, expanding retail and hospitality playing fields and creating a broader attack surface for fraudsters. Now more than ever […] The post The Green Sheet: Fierce authentication for an omnichannel threatscape appeared first on FIDO Alliance.

The ability to transact across all channels has opened opportunities and threats, expanding retail and hospitality playing fields and creating a broader attack surface for fraudsters. Now more than ever it is important to understand how to properly protect your company. While at Authenticate, Andrew Shikiar explains the FIDO journey and its 2022 achievements including passkey, stating that he believes, “we have an opportunity with authentication to be a bridge of the digital divide and not another wedge.”

The post The Green Sheet: Fierce authentication for an omnichannel threatscape appeared first on FIDO Alliance.


Transaction Trends (US): ETA Expert Insights: FIDO Designs Faster Deployments 

FIDO was founded in 2012 by PayPal, Lenovo, and Nok Nok Labs. They are working to change authentication through open standards that are more secure than passwords, more straightforward for consumers, […] The post Transaction Trends (US): ETA Expert Insights: FIDO Designs Faster Deployments appeared first on FIDO Alliance.

FIDO was founded in 2012 by PayPal, Lenovo, and Nok Nok Labs. They are working to change authentication through open standards that are more secure than passwords, more straightforward for consumers, and more accessible for service providers to deploy and manage. FIDO’s latest innovation, passkey, provides fast, easy, secure sign-ins to websites and apps across multiple devices. Passkey users can sign in with a biometric, PIN or security key rather than a username and password, FIDO representatives noted, and without having to re-enroll devices or accounts each time they sign in.

The post Transaction Trends (US): ETA Expert Insights: FIDO Designs Faster Deployments appeared first on FIDO Alliance.


SDxCentral (US): FIDO Pushes Password Replacement as MFA Bypass Attacks to Surge in 2023

The FIDO Alliance expects to see more high-profile cyberattacks targeting cloud service providers that bypass traditional multi-factor authentication (MFA), which will push more major brands to adopt the passkey password […] The post SDxCentral (US): FIDO Pushes Password Replacement as MFA Bypass Attacks to Surge in 2023 appeared first on FIDO Alliance.

The FIDO Alliance expects to see more high-profile cyberattacks targeting cloud service providers that bypass traditional multi-factor authentication (MFA), which will push more major brands to adopt the passkey password replacement in 2023.

The post SDxCentral (US): FIDO Pushes Password Replacement as MFA Bypass Attacks to Surge in 2023 appeared first on FIDO Alliance.


WIRED: The Password Isn’t Dead Yet. You Need a Hardware Key

After years of work, the tech industry finally took major steps in 2022 toward a long-promised passwordless future. The move is riding on the back of a technology called “passkeys” […] The post WIRED: The Password Isn’t Dead Yet. You Need a Hardware Key appeared first on FIDO Alliance.

After years of work, the tech industry finally took major steps in 2022 toward a long-promised passwordless future. The move is riding on the back of a technology called “passkeys” that are built on FIDO standards. Operating systems from Apple, Google, and Microsoft now support the technology, and many other platforms, browsers, and services have adopted it or are in the process of doing so. As much as you might wish it, though, passwords aren’t going to disappear anytime soon, thanks to their sheer ubiquity. And amid all the buzz about passkeys, hardware tokens are still an important protection option.

The post WIRED: The Password Isn’t Dead Yet. You Need a Hardware Key appeared first on FIDO Alliance.

Thursday, 12. January 2023

Origin Trail

European Union supports sustainability of construction industry with the BUILDCHAIN project…

European Union supports sustainability of construction industry with the BUILDCHAIN project: Bringing trustworthy knowledge exchange using the Decentralized Knowledge Graph Trace Labs, a Web3 development company, joins EU’s efforts to create a smarter and more sustainable built environment with the BUILDCHAIN project. Together with eleven other partners from Slovenia, Italy, Spain, Greece, Hungar
European Union supports sustainability of construction industry with the BUILDCHAIN project: Bringing trustworthy knowledge exchange using the Decentralized Knowledge Graph

Trace Labs, a Web3 development company, joins EU’s efforts to create a smarter and more sustainable built environment with the BUILDCHAIN project. Together with eleven other partners from Slovenia, Italy, Spain, Greece, Hungary, and Serbia, Trace Labs will use the OriginTrail Decentralized Knowledge Graph (DKG) to build a trusted knowledge base aiming to improve efficiency, reduce errors, and increase transparency and trust, ultimately leading to more sustainable construction projects.

Construction industry lacks trustworthy data exchange

Efficient, transparent, and trusted data exchange is a powerful tool for driving sustainability, resilience, and energy efficiency in construction; however, there are several obstacles to trusted data exchange in this industry today:

Data silos: Construction projects involve multiple parties and stakeholders, each of which may have their own systems for storing and sharing information. This can lead to data silos and a lack of coordination, which can make it difficult to access and trust the data.

Lack of standardization: Different construction projects may use different formats for storing and sharing data, which can make it difficult to compare and combine information from different projects.

Data security: Construction projects often involve sensitive information, such as building plans, materials lists, and inspection results. Ensuring that this information is kept secure and protected from unauthorized access can be a significant challenge.

Lack of incentives: There are often few incentives for construction companies and other stakeholders to share data and collaborate on projects, which can make it difficult to establish trust and transparency.

BUILDCHAIN to create a complete overview of the building life-cycle

Having received financial support from the European Union, BUILDCHAIN aims to develop technological solutions that will enhance data exchange and transparency in the industry and overcome the above-mentioned obstacles. Over the three-year period, the key innovative players in construction engineering, architecture, and research will build a knowledge base that will be used by various actors to trace all activities related to the complete life-cycle of buildings.

Trace Labs’ main contribution will be the integration of OriginTrail Decentralized Knowledge Graph (DKG) with the existing EU Digital Building LogBook system, creating a new powerful knowledge base with blockchain and artificial intelligence capabilities. The DKG will provide a tamper-proof and decentralized database that can be accessed by all parties involved in a building’s life cycle, including architects, contractors, building owners and regulators. This helps increase transparency, trust, and collaboration among all stakeholders through:

Connected knowledge assets as digital twins: The DKG will provide an open, single source of truth for all building-related data, including plans, materials, inspection results, and more. This will make it easy for all parties to discover, access, and update the information in real time, improving communication and collaboration.

Data verification: Using blockchain technology, all data added to the DKG is fingerprinted and cannot be tampered with, ensuring the authenticity of the data of the building’s life cycle.

Predictive maintenance: AI and machine learning can be used to analyze data from the DKG to identify patterns and predict future outcomes, such as maintenance needs. This can help improve building maintenance and reduce downtime.

Compliance and regulations: The DKG can be used to ensure compliance with building codes and regulations by storing all relevant information in a standardized way and making it accessible to regulators in real time.

Driving value and technological advancement for the construction industry

The advancements and benefits of building information modeling, automation, and digitalization encourage the development of more effective and efficient building information management. Given that the global Building Information Modeling market is worth $14.7 billion and is predicted to grow to $53 billion by 2031, the BUILDCHAIN project opens the possibility of great value creation and a strong impact on the sustainability of the entire sector.
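The “data verification” mechanism described above — fingerprinting a record so that any later change is detectable — can be sketched in a few lines. This is an illustrative hash-based example only, not the actual OriginTrail DKG API; the record fields are made up for the sketch:

```python
import hashlib
import json

def fingerprint(record: dict) -> str:
    """Produce a deterministic SHA-256 fingerprint of a building record."""
    # Canonical JSON (sorted keys) so the same data always hashes identically.
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# A hypothetical inspection result as it might be anchored on-chain.
inspection = {"building": "EU-DBL-0042", "check": "fire-safety", "result": "pass"}
anchored = fingerprint(inspection)  # this digest is what gets stored immutably

# Later, any stakeholder can re-hash their copy and compare with the anchor.
tampered = dict(inspection, result="fail")
print(fingerprint(inspection) == anchored)  # True: data unchanged
print(fingerprint(tampered) == anchored)    # False: tampering detected
```

The key design point is canonicalization: hashing a sorted, whitespace-free JSON form means two honest parties always derive the same digest from the same data, so only genuine tampering breaks the comparison.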

Project information available here: BUILDCHAIN Project | Fact Sheet

This project has received funding from the European Union’s Horizon Europe Framework Programme, Climate Neutral, Circular and Digitised Production 2022, under grant agreement No 101092052.

👇 More about OriginTrail 👇

OriginTrail is an ecosystem dedicated to making the global economy work sustainably by organizing humanity’s most important knowledge assets. It leverages the open source Decentralized Knowledge Graph that connects the physical world (art, healthcare, fashion, education, supply chains, …) and the digital world (blockchain, smart contracts, Metaverse & NFTs, …) in a single connected reality driving transparency and trust.

Advanced knowledge graph technology currently powers trillion-dollar companies like Google and Facebook. By reshaping it for Web3, the OriginTrail Decentralized Knowledge Graph provides a crucial fabric to link, verify, and value data on both physical and digital assets.

Web | Twitter | Facebook | Telegram | LinkedIn | GitHub | Discord

👇 More about Trace Labs 👇

Trace Labs is the core developer of OriginTrail — the open source Decentralized Knowledge Graph. Based on blockchain, OriginTrail connects the physical world and the digital world in a single connected reality by making all different knowledge assets discoverable, verifiable and valuable. Trace Labs’ technology is being used by global enterprises (e.g. over 40% of US imports including Walmart, Costco, Home Depot are exchanging security audits with OriginTrail DKG) in multiple industries, such as pharmaceutical industry, international trade, decentralized applications and more.

Web | Twitter | Facebook | LinkedIn

European Union supports sustainability of construction industry with the BUILDCHAIN project… was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


We Are Open co-op

Open Recognition is for every type of workplace

Is 2023 the year of Communities of Practice at work? Communities of Practice (CoPs) are groups of people who share a concern or passion for things that they do. For example, they may be teachers, or parents, or paediatricians interested in helping young people. As they interact with one another on a regular basis, they learn to better do the things in which they are interested, which leads t
Is 2023 the year of Communities of Practice at work?

Communities of Practice (CoPs) are groups of people who share a concern or passion for things that they do. For example, they may be teachers, or parents, or paediatricians interested in helping young people. As they interact with one another on a regular basis, they learn to better do the things in which they are interested, which leads to improved outcomes for all.

CoPs can be seen as somewhat ‘messy’ from the outside, but they nevertheless integrate rituals and recognition as part of their practice. This post argues that now is the time to shift to a more holistic understanding of recognition in the workplace, one that is established by intentionally building CoPs across a landscape of practice in each sector.

a badge from our Community of Practice course on Participate.com

Credentials aren’t everything

Workplace recognition happens in various ways. For example, recognition in some guise is often a prerequisite for employment. This often takes the form of a list of credentials and experience that a worker is expected to be able to prove. This is for good reason, as employers have to ensure that a worker can perform the functions required of them.

However, we live in a world where not everyone has the same access to the types of credentials or experience that workplaces may require (or think they require!). For example, too many white collar employment opportunities require, at a minimum, ‘a degree’ of any kind, using this as a proxy for a bundle of knowledge, skills, and behaviours.

In addition, some of the reasons for the Great Resignation include lack of career advancement, work-related stress, and the desire for a better work-life balance. Workplaces that recognise employees and contractors in meaningful ways are likely to better retain staff and have a happier, more productive workforce.

Greater workplace recognition practices can also increase resilience: interpersonal relationships at work allow for greater and more fluid knowledge transfer, which leads to greater innovation and the ability to route around problems which may emerge.

What would recognition in the workplace look like if we started to think of our working world as made up of Communities of Practice?

Recognising diversity

Let’s start at the beginning of the worker journey. The talent pool for a particular job is often limited by the lack of inclusive recognition. Potential workers don’t always have the opportunity to improve their standing due to socio-economic factors, geographic exclusion, and discriminatory practices. Our existing credentialing systems can unwittingly exacerbate this.

While employers are good at recognising formal credentials, they often don’t attempt to understand the wider picture of recognition.

We’re seeing innovative systems, like Open Recognition and badge pathways, begin to push at the inequity of the current system of employment. These approaches make skills more visible. Instead of hiring someone who simply has ‘a degree’, companies such as IBM are beginning to be more specific about the kinds of knowledge, skills, and behaviours they’re looking for.

These attempts, while valuable, are in the realm of ‘microcredentialing’ — breaking down ‘chunky’ credentials into smaller, more granular parts. Open Recognition goes beyond credentialing to consider relational factors and the wider, more holistic story. After all, people need us to help tell their story, and we need them to help us tell ours. Every credential involves recognition, but recognition does not have to be in the form of a ‘credential’.

Connecting networks

No-one works in isolation. We all have current and former colleagues who have seen us in action and who can attest to our abilities. A ‘closed’ (i.e. non-portable) example of this kind of recognition is the LinkedIn recommendation, which can lead to surprising insights for the recipient into the knowledge, skills, and behaviours that are valued by those with whom they work.

Not only can a recommendation or testimonial serve as a form of ‘credential’ but in addition to this, it also establishes a relationship between the recommender and the person being recommended. This is powerful when instantiated in a badge, which provides the additional benefits of being tamper-proof and portable between online systems.
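The tamper-proof property of a badge comes from a cryptographic signature over its contents. As a rough illustration only: real Open Badges use public-key signatures (JWS) rather than the shared-secret HMAC shown here, and the field names below are simplified stand-ins:

```python
import hashlib
import hmac
import json

ISSUER_SECRET = b"demo-secret"  # stand-in for the issuer's real signing key

def sign_badge(assertion: dict) -> str:
    """Sign a badge assertion; any later edit invalidates the signature."""
    payload = json.dumps(assertion, sort_keys=True).encode("utf-8")
    return hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()

def verify_badge(assertion: dict, signature: str) -> bool:
    return hmac.compare_digest(sign_badge(assertion), signature)

badge = {"recipient": "alice@example.org", "skill": "facilitation",
         "recommender": "bob@example.org"}
sig = sign_badge(badge)

print(verify_badge(badge, sig))                            # True
print(verify_badge(dict(badge, skill="leadership"), sig))  # False: edited badge fails
```

Because the signature travels with the badge rather than living inside any one platform, the recipient can carry it between systems and any verifier can check it — which is what makes badge-based recognition portable in a way a LinkedIn recommendation is not.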

Demonstrating the connections between our story and someone else’s helps show underlying networks and relationships. It can make visible the Communities of Practice that may be invisible to corporate hierarchies. Social graphs allow us to see, over time, how communities develop and how expertise is established.

Conceptualising workplaces as sites for Communities of Practice to flourish leads to meaningful recognition between colleagues. Instead of performative Slack messages which may serve as extrinsic motivators for a limited period of time, true Open Workplace Recognition can help peers encourage one another towards ever-more learning and growing. HR departments can help colleagues co-create interest-based career pathways that make sense across sectors.

Open Recognition needs to be pervasive

In order to put the power of a story clearly in the hands of the person who owns that story, we have to allow that person to tell it. We need ways to ensure that recognition belongs to individuals and can be used regardless of the type of workplace in which they currently find themselves. Just because you have a good, bad, or indifferent boss should not determine the wider recognition you receive when performing well.

Giving people the ability to describe their career path and employment history in a non-linear, non-time bound way will help employers too. Allowing a person to say what they look for in a workplace, what they believe they are good at, and how they are looking to grow helps reduce turnover and to get people in the correct roles more quickly.

Individuals can help potential employers or collaborators understand how they:

keep showing up and learn new things
exhibit pro-social behaviours
support their community

These things will help hiring managers get to know someone more holistically than a single, potentially irrelevant credential ever could. Turning workplaces into communities of practice changes the way an organisation functions. It requires recognition systems where everyone can be “seen” and can “see” others — and makes all learning and knowledge creation visible.

Conclusion

Communities of Practice provide ways for forward-thinking organisations to engage staff in a virtuous circle of improvement, to ensure positive recognition practices, and to bundle up experiences in credentials that can be used to help both the individual and organisation.

To establish a CoP within organisations involves both a top-down and bottom-up approach. From the top, it requires time, funding, and recognition, and from below it requires will, engagement, and encouragement.

We firmly believe that both CoPs and Landscapes of Practice have become one of the most important tools for organisations looking to make a sustained impact in the world.

2023 is going to be a big year for communities, and we’re here for it. To discuss this further with us, get in touch!

This post was co-authored by Julie Keane and Doug Belshaw. Images CC BY-ND Bryan Mathers

Open Recognition is for every type of workplace was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Velocity Network

Ekstep Foundation’s Gaurav Gupta speaking at our India ecosystem launch event

The post Ekstep Foundation’s Gaurav Gupta speaking at our India ecosystem launch event appeared first on Velocity.

Wednesday, 11. January 2023

EdgeSecure

Improve Cybersecurity Performance While Meeting JIF Insurance Requirements

The post Improve Cybersecurity Performance While Meeting JIF Insurance Requirements appeared first on NJEdge Inc.

Webinar

Municipalities and K-12 school districts have more pressure to meet cybersecurity requirements than ever. Cyber threats to the public sector continue to accelerate, and finding affordable, comprehensive cybersecurity insurance is becoming more and more difficult. As public sector organizations turn to local and statewide Joint Insurance Funds (JIFs), it is vital to understand the funds’ requirements, to ensure that your organization has an effective cybersecurity posture and the ability to be covered in the event of a breach.

In this session, Edge’s lead Virtual Chief Information Security Officer, Dr. Dawn Dunkerley, will outline:

Steps every municipality and K-12 district can take to be more attractive to cyber insurance providers in general

Common cybersecurity requirements of JIFs, with a particular focus on the emerging requirements of NJ MEL

How your organization can uncover and prioritize its most pressing cybersecurity needs

How a nonprofit vCISO provider like Edge can align with your mission, and work with your team to maximize the impact and affordability of your cybersecurity efforts

The post Improve Cybersecurity Performance While Meeting JIF Insurance Requirements appeared first on NJEdge Inc.


Oasis Open Projects

Creating a Leading PCIE Based Ethernet Host Interface Standard: Infrastructure Data-Plane Function (IDPF)

Why OASIS? A few months ago, Intel NIC architects and members from Google, Red Hat, and Marvell decided to deeply explore the terrains of Peripheral Component Interconnect Express (PCIE) based Ethernet Host Interface standardization. We decided that a PCIE-centric standard was needed and thus we created IDPF. As an inventor and one of the founding architects […] The post Creating a Leading PCIE B

By Anjali Singhai Jain, Intel, co-chair of the IDPF TC

Why OASIS?
A few months ago, Intel NIC architects and members from Google, Red Hat, and Marvell decided to deeply explore the terrain of Peripheral Component Interconnect Express (PCIE) based Ethernet host interface standardization. We decided that a PCIE-centric standard was needed, and thus we created IDPF. As an inventor and one of the founding architects of IDPF, I know it is hard to justify the short-term benefits of standardizing IDPF to my team, but we do see a long-term benefit, and I am grateful to my team at Intel and the rest of the founding member companies for coming along and trusting what we created.

OASIS, a world class open standards organization, made it possible for us to launch a PCIE-centric standard alongside other technologists. OASIS gave us the platform we needed to solve some difficult technical problems. For example, OASIS has a bevy of technical standards leaders with whom we could consult if we needed help. OASIS is widely regarded as a leading standards organization which allows us to easily seek help and build IDPF; we could easily reach out to Intel leaders internally who have worked with OASIS on other projects. We were also able to network with a global community of technologists who are passionate about standardization, which made creating and launching the IDPF educational, interesting, and challenging. When it comes to open standards, OASIS proves to be an international leader and provides a clear and direct path to standardization.

The Road to IDPF
Upon launching, there were some necessary hurdles to deal with: why do we need a standard in this space, and if we do, why IDPF rather than something built from scratch? Answering these hard questions took a whole lot of perseverance, and the wonderful support from OASIS definitely helped us get started. Every OASIS member is dedicated to excellence and to making sure standards are fully equipped for de jure approval.

One reason a creator/inventor still carries forward is in the hope that either they will be able to create a thing of beauty used by many or the process will teach them some very important lessons about future courses of action.

In today’s network interface cards, whether smart or not, whether an IPU (Infrastructure Processing Unit) or a DPU (Data-plane Processing Unit), the Ethernet host interface exposed to the general-purpose cores comes in two flavors:

1) A highly optimized, performant host interface for the card, but often proprietary to the card vendor. Examples of Type 1 are mlx4/5 (Nvidia), ice/iavf (Intel), etc.

2) A vendor-neutral interface, required by the infrastructure provider, that can be used in a virtual machine, container, etc. More often than not it is designed not by the device vendor but by the infrastructure provider, or it evolves organically from the software-emulated device world to create a level playing field for all – and, most importantly, it has an ecosystem around it to facilitate live migration of virtual machines. Examples of Type 2 are Vmxnet3, ENA, virtio-net, etc.

The reason vendors are forced to support both is simple: when deployed without the need for virtualization, as in an enterprise setup or for small internal use by clouds or telcos, an optimized interface, even if vendor-specific, is the right interface, since getting the best performance is the key metric. In a cloud deployment, the key metrics are virtualization, the ability to migrate for maintenance reasons, and the ability to source the device from multiple vendors (given unpredictable supply-chain issues, relying on a single vendor is not the wisest business decision). The deployment is quite large, and the promise of cloud to its clients in an IaaS deployment is that they should never have to deal with infrastructure-related issues (that is what clients offload onto the infrastructure provider by paying a service fee) – hence the need for Type 2 vendor-neutral interfaces with a good ecosystem for live migration.

In a cloud model, there is one more key metric to keep in mind: datacenter nodes have evolved to provide the greater security, modularity and isolation required between the untrusted client and the services provider. There is also the element of zero trust (trust nothing and nobody) between the client renting the VMs and the infrastructure provider. This means the same network interface card (FNIC/SNIC/IPU/DPU) should provide different levels of exposure to the device through host interfaces exposed to the client and to the infrastructure provider’s control plane; often the network card’s data-plane processing pipeline must be split logically and securely to accommodate infrastructure processing offload needs as well as host-side accelerations/offloads.

Additionally, one thing that is very important in designing a client-facing Ethernet host interface is to have an almost balloon-like interface: one that starts small, with a very tiny footprint towards the device covering only what is most essential (think fast path), while the rest is negotiated so the interface can always provide connectivity with more or fewer host-side accelerations/offloads, depending on what any device can offer and what the service provider deems important for a certain customer’s usage. Start small, but leave a lot of room to grow – not just for current host-side accelerations/offloads but also for future ones as systems and use cases evolve. A lot of consideration has been given to performance (a PCIE interface has its greatest bottleneck/cost/delay at the PCIE boundary), abstraction from the underlying device so that it can evolve without changing the host interface, live migration aspects, etc. Basically, IDPF takes the goodness of the two types of interfaces mentioned above and makes one such interface that can be standardized.
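The balloon-like negotiation described above can be sketched abstractly: the device advertises what it can offload, the driver (or provider policy) requests what it wants, and the active interface is the intersection of the two on top of a guaranteed minimal fast path. The capability names below are illustrative, not taken from the IDPF specification:

```python
# Minimal fast path an IDPF-style device always provides, however small.
BASELINE = {"tx_queue", "rx_queue"}

def negotiate(device_caps: set[str], driver_wants: set[str]) -> set[str]:
    """Return the feature set the host interface will actually expose.

    The baseline is always present; everything else is the intersection of
    what the device offers and what the driver requests, so the interface
    can "grow" on capable devices without changing its definition.
    """
    return BASELINE | (device_caps & driver_wants)

device = {"checksum", "rss", "tso", "flow_steering", "vendor_oem_stats"}
driver = {"checksum", "rss", "ptp"}  # ptp requested but not offered here

active = negotiate(device, driver)
print(sorted(active))  # baseline queues plus checksum and rss
```

Connectivity works in every case because the baseline is unconditional; features a device cannot offer (like `ptp` above) are simply absent from the negotiated set rather than being a hard failure, which is also what leaves room for future and OEM-specific features.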

While designing IDPF, we kept all these aspects in mind. IDPF is a result of collaboration between an Ethernet device vendor with lots of experience in designing high performance and feature rich devices and host interfaces and a well-known Infrastructure Cloud provider with a very deep understanding of various usages and deployment models. We did keep in mind the aspect that each vendor will come with their uniqueness to offer from their device as well and left room for such OEM specific features to be negotiated and used when needed as part of the IDPF negotiation sideband to the device.

As part of this standardization effort, we identified some of the standard Ethernet acceleration and offload features that most vendors offer to customers, as they are time-tested and ubiquitous for a modern Ethernet interface. Some of these are stateless offloads, such as checksum and receive side scaling; others keep some temporary state on the device side, such as segmentation offload and receive side coalescing; others are more stateful, such as flow steering and crypto offload; and some are supplemental, such as Precision Time offload and Earliest Departure Time (pacing of flows). This is by no means an exhaustive list, and it will grow as the IDPF standards committee reviews and makes further additions. We hope this standard will stand the test of time, adding new features and deprecating some as we move forward. For example, we hope to standardize an RDMA host interface as part of this IDPF effort sometime in the future.

As part of this IDPF standardization, we (Intel) will contribute a seed spec detailing what is expected from the device, a Linux upstream kernel driver, an emulated userspace device, and a DPDK open-source driver. We will also contribute heavily toward the live-migration ecosystem for a VM using such an interface on a Linux hypervisor.
Creating an interface that is both performant and meets all other cloud needs is hard, but standardizing one is harder still. Needless to say, it’s for a good cause: many vendors can play together, and the customer benefits overall (example role models: USB, PCIe, NVMe). And we in the Ethernet world move the technology forward, focusing on real problems and leaving behind problems of non-compliance. Our goal is to seed the standard Ethernet host interface with something we have poured our hearts and minds into. We hope this helps everyone, and that we can all benefit from joining forces.

Participation in the Infrastructure Data-Plane Function (IDPF) TC is open to all interested parties. Contact join@oasis-open.org for more information.

The post Creating a Leading PCIE Based Ethernet Host Interface Standard: Infrastructure Data-Plane Function (IDPF) appeared first on OASIS Open.


Next Level Supply Chain Podcast with GS1

The E-Commerce Field of Dreams: Just Because You Build It, Doesn’t Mean They’ll Come

Just because you build a product catalog and a brand presence, doesn’t mean customers will be able to find or purchase from you. Successful brands are not built in a linear sense, but rather they have the necessary resources in place to support inventory demand and then deliver on time to boost satisfaction and visibility. Join us as we chat with Envision Horizons founder, Laura Meyer, to talk about the importance of having a logistics and inventory strategy as brands start up and scale.


Content Authenticity Initiative

A conversation with the Trusted Web podcast

How do we create a more trustworthy web? We joined the Trusted Web podcast to discuss digital content provenance as a solution for restoring trust and enabling authentic storytelling and verifiable media online.

How do we create a more trustworthy internet? In the year’s first episode of the Trusted Web podcast, we joined host and founder, Sebastiaan van der Lans, to discuss the Content Authenticity Initiative’s mission, our open-source tools, how to balance permanence and flexibility in technology and much more.

Andy Parsons, CAI Sr. Director, and Sebastiaan in conversation about:

A coming wave of content created with generative artificial intelligence and the importance of authentic storytelling enabled by digital content provenance technology to identify synthetic media and display attribution

How consumers, creators and industry may rebuild transparency online by engaging with provenance tools to advance verifiable media and digital literacy

The critical role of social media and search companies in adopting open-source tools and a standard for verifiable certificates and credentials  

A bold outlook, predicting the state of trust in the years ahead and more 

Listen to the podcast or watch it below. Share and tag @ContentAuth on Twitter

🎙Apple Podcasts

🎙Spotify

🎙Google Podcasts


Digital ID for Canadians

Schellman Joins the Voilà Verified Trustmark Program

Schellman Joins the Voilà Verified Trustmark Program
Voilà Verified Program Builds Real Trust in Digital Solutions by Spotlighting World-Class Vetted Digital Identity Solutions.

TAMPA, FL, January 11, 2023 – The Digital ID and Authentication Council of Canada (DIACC) is pleased to officially welcome Schellman to the Voilà Verified Trustmark Program – the first and only certification program to determine digital identity service compliance with the Pan-Canadian Trust Framework™ (PCTF).

A non-profit coalition of over 115 public and private members, the DIACC develops research and opportunities to enable Canada’s confident, safe, and full participation in a global digital economy.

“One size does not fit all when it comes to identity solutions – but ensuring a solution delivers upon a defined duty of care is critical,” says Joni Brennan, president of the DIACC. “With the PCTF, and now with Voilà Verified, there is an opportunity to adopt a framework rooted in trust – and to earn compliance recognition. Voilà Verified identifies those who are ‘walking the walk’ and delivering safe and secure access to the global digital economy.”

“Schellman’s participation in DIACC’s Voilà Verified Program demonstrates our excitement and commitment to the Crypto and Digital Trust space. This new step will widen our global footprint as we expand all our assessment services in Canada,” says Avani Desai, Chief Executive Officer of Schellman.

The DIACC’s PCTF is a publicly available framework for identity solutions that defines client, customer, and individual duty of care. The Voilà Verified program provides a vetting and assessment opportunity where PCTF-compliant solution vendors can earn a public-facing trustmark. The result? Spotlight visibility of trustworthy, safe, reliable, and efficient solutions.

“Schellman is excited to participate in this landmark accreditation program. We look forward to serving the DIACC community with distinction as we enable Canada’s world-leading efforts towards true digital trust,” said Scott Perry, Principal at Schellman and leader of the firm’s Crypto and Digital Trust Services.

Voilà Verified presents an opportunity to grow provincial-level investments in digital identity solutions. Provincial governments which have launched identity services can now earn a trustmark of their own, and provinces that are on the cusp of entering the digital solution market can do so with confidence by seeking vendors with a Voilà Verified trustmark.

Ruth Puente, Voilà Verified’s Trustmark Verification Program Manager, says a leading component of the program’s development was to ensure its procedures aligned with the International Organization for Standardization (ISO).

“Voilà Verified is inclusive yet diligent in verifying PCTF-compliant solutions. The program was developed in alignment with ISO standards – and empowers informed decision-making in a rapidly growing ecosystem of identity solutions,” said Puente. “Delivering high-quality service, customer protection, and increasing access to trustworthy solutions are our priorities. We have formed teams of international experts to perform assessments and to oversee the process through an impartial lens.”

Entities responsible for assessing PCTF compliance within the Voilà Verified program are Accredited Assessors, Readiness Advisors, and Testing Laboratories.

The Voilà Verified Trustmark Oversight Board (TOB) will make all final verification decisions based on reports from accredited entities. Made up of third-party volunteers with international expertise in identity management, auditing, compliance, cybersecurity, information security, and law, the TOB is the highest operating body of Voilà Verified. It is subject to impartiality, confidentiality and conflict of interest policies.

“Voilà Verified is a unique opportunity in which I am honoured to share my experience as an advisor and auditor within information security, compliance, and identity,” says Björn Sjöholm, Cybersecurity Entrepreneur of Seadot, and TOB Chair.

Vendors are turning to the Voilà Verified program for several reasons, but the leading value proposition is market differentiation. Trustmark holders stand out from competitors by unlocking global business opportunities through international recognition and credibility.

“Voilà Verified puts internationally reputable identity solutions on the map,” says Dave Nikolejsin, the DIACC’s Board Chair. “This is the way forward. With lateral growth of PCTF compliance across sectors – public and private – we establish a common value of trust. Voilà Verified is a monumental stride for Canada to influence a safe and secure global digital economy.”

To learn more about Voilà Verified and access your application package, visit the program overview on the DIACC website or contact voila@diacc.ca.

ABOUT DIACC

DIACC is a growing coalition of public and private sector organizations who are making a significant and sustained effort to ensure Canada’s full, secure, and beneficial participation in the global digital economy. By solving challenges and leveraging opportunities, Canada has the chance to secure at least three percent of unrealized GDP or $100 billion of potential growth by 2030. Seizing this opportunity is a must in a digital society as we work through the COVID pandemic challenges. Learn more about the DIACC mandate.

ABOUT SCHELLMAN

Schellman is a leading global provider of attestation, compliance, and certification services. Operating as an alternative practice structure as Schellman & Company, LLC, a top 100 CPA firm, and Schellman Compliance, LLC, a globally accredited compliance assessment firm, we have expanded beyond our original offering of SOC examinations to now offer clients services as a CPA firm, an ISO Certification Body, a PCI Qualified Security Assessor Company, a HITRUST assessor, a FedRAMP 3PAO, and as one of the first CMMC Authorized C3PAOs.

Renowned for expertise tempered by practical experience, Schellman’s professionals provide superior client service balanced by steadfast independence. Schellman’s approach builds successful, long-term relationships and allows our clients to achieve multiple compliance objectives using a single third-party assessor. For more information, please visit Schellman.com.

CONTACT:

Krista Pawley
Imperative Impact

krista@imperativeimpact.com


LionsGate Digital

Switch Reward Card Expands Beta Testing

LEHI, Utah, Jan. 10, 2023 /PRNewswire/ — Switch Reward Card, a blockchain-based financial services ecosystem that offers debit payment solutions for both traditional and cryptocurrencies, is excited to announce the following recent updates to its platform:

Switch’s Node Network Charter was passed and the Switch Blockchain went live in December.
The Switch Black Card moved from closed beta to open beta and currently has over 200 active users.
The Switch Trading Platform has over 120 users in closed beta. Beta users can buy, sell and send cryptocurrencies, and can top up the Switch Black Card with the converted currencies.
Switch plans to add additional users to the test group in January.

Switch Reward Card Tap To Pay

In a December letter to the community announcing the launch of the blockchain, Switch Reward Card CEO, and former President of Discover Bank, Kathy Roberts said, “I feel fortunate to be making this announcement and to have kept that wonder about this emerging tech that will continue to change the world as it is being adopted faster than all previous technologies before it.”

Switch Reward Card President, COO and fellow co-founder Bradley Willden is excited about the versatility of the product suite: “The Switch platform has both custodial and non-custodial wallets giving our community members true ownership of their digital assets.”

Apply for the Switch Black Card today and join the open beta: https://dashboard.switchus.io/register/start

CONTACT
Switch Reward Card
Scott Touchton
Vice President of Sales and Marketing
stouchton@switchrewardcard.com

ABOUT Switch

Switch is a blockchain-based financial services ecosystem. The blockchain is empowered by a global decentralized node network where node licensees will be rewarded, by the blockchain, with Switch Digital Rewards. Switch offers debit payment solutions for both traditional and cryptocurrencies around the world.

This Press Release may contain forward looking statements that involve substantial risks and uncertainties. Forward looking statements discuss plans, strategies, prospects, and expectations concerning the business, operations, markets, risks, and other similar matters. There may be events in the future that we cannot accurately predict or control. Any forward-looking statement in this press release speaks only as of the date on which it is made. Factors or events that could cause our actual results to differ may emerge from time to time, and it is not possible for us to predict all of them. We do not plan to update or revise publicly any forward-looking statements except as required by law.

Rewards are not available for purchase from Switch. They are digital rewards earned in exchange for work and action on the Switch network. The digital reward is designed to have utility on the Switch platform for the purchase of Switch’s products and services. The digital reward is not an investment product and may never have any value outside of the Switch platform. Switch node owners should not expect to recognize any value from the digital reward other than its utility with Switch. Switch does not anticipate correlation between the digital reward value and Switch’s business activities.

No cryptocurrency is loaded onto the Switch Visa Card. All assets are converted to local fiat currency prior to loading on the Visa network. Switch provides financial services and is not a bank.

SOURCE Switch Reward Card

Photo by Kaysha on Unsplash

The post Switch Reward Card Expands Beta Testing appeared first on Lions Gate Digital.

Tuesday, 10. January 2023

The Engine Room

Building digital resilience is an ongoing effort

In early December, we held a community call to discuss the digital resilience challenges we (and our partners) are working through. We were joined by organisers and activists from different countries and talked about what digital resilience meant for our movements in 2022. Here are some takeaways from our conversations:

Adapting workflows for uncertain times

During the call, participants shared that when the Covid-19 pandemic started, many organisations were forced to rapidly recalibrate their work and services in online spaces. A few years into the pandemic, attendees shared that adapting to new tech-related needs and infrastructure has been an on-going effort that they’ve taken on. Organisations and activists have been continuously working to adapt their workflows and methodologies to handle the tech and data concerns that emerged since the pandemic, such as digital security threats, restrictions on the ability to organise and increased surveillance.

Digital resilience needs to include environmental and climate activists 

We also talked about how conversations about digital resilience must include activists and grassroots organisations working on environmental and climate justice (and we’ve also seen this mentioned in other conversations too). One participant shared that, in their country, support for civil society’s digital security is often more readily available in metropolitan areas. This means that some local groups who work on environmental justice outside of the cities and who increasingly rely on digital technologies to communicate and collect data for advocacy purposes are exposed to threats such as surveillance from state parties and other actors.

Our recent research on the intersections of environmental and climate justice confirms this: we saw that while the persecution of those resisting land acquisition and forest encroachment by extractive industries has been going on for a long time, digital tools’ ability to follow, surveil and collect information without individuals’ knowledge has expanded governments’ and companies’ abilities to intimidate, harass and in some cases even to physically harm, dissenters.

A lot of the times, digital resilience is a balancing-act

Participants who advocate for digital security practices in their own organisations shared that when they start the process of implementing new tools they sometimes are met with: fatigue with learning new tools and reticence when it comes to using tools that are unfamiliar and sometimes, not-so-widely-known in their communities.

During the call, those attending shared their strategies for facilitating the process of becoming digitally resilient:

Combining the use of secure tools recommended by organisational security practitioners with learning how to use more familiar tools more safely: One person shared that, in 2023, they are focusing on developing best practices for working with tools that staff are familiar with. Another shared that it’s important to map what types of information shouldn’t be shared on certain platforms and to adopt data minimisation as a best practice.

Pacing out the adoption of new tools and starting small: Another participant shared that working with a small group and onboarding slowly are good ways of keeping people engaged in the process and preventing staff from being overwhelmed. It’s also important to have conversations about how organisational security does come with tradeoffs: for example, sometimes choosing a more secure tool means that we might not have all the same affordances of privately owned tools. Here at The Engine Room, we’ve been documenting our journey of adopting new video-conferencing tools and it hasn’t been without a couple of obstacles here and there.

Looking for support from peers and partners: As we adapt to new digital infrastructure, it can be so helpful to have contact with someone who knows the tools and who can provide support. Participants found that this can help build staff’s confidence and help with questions that will inevitably arise. You can also reach out to us for support in this process.

The work ahead: what’s next on our digital resilience journey

In the past year, we’ve been internally reflecting on digital resilience as an organisation and with our partners. In 2023, we plan on sharing more resources on how civil society can build digital resilience, including on topics such as digital security, adopting different tech tools, organising safely and adapting to context shifts. 

In the meantime, we’re curious to know what digital resilience has looked like for you and your organisations! Reach out to tell us or to share questions, research or projects you’d like us to know about.

Image by Mahdi Bafande via Unsplash.

The post Building digital resilience is an ongoing effort first appeared on The Engine Room.

EdgeSecure

Catapulting Telecom into the Modern Age at a Small Liberal Arts College

Webinar

Last summer, after months of discovery and deliberation, Davidson College migrated its telecom services from a traditional, on-premises VoIP solution to Zoom Phone. The story of that migration has all of the twists, turns, and lessons learned one might expect: from automating nomadic E911 and improving compliance, to shifting user device preferences, to supply chain challenges. During this webinar, we’ll explore how one small college managed to pull it off with an agile team and lots of elbow grease.

Presenters:
• JD Mills, Manager of Digital Campus & Digital Transformation, Davidson College
• Johann Zimmern, Global Education Strategy Lead, Zoom

Complete the Form Below to Access Webinar Recording

The post Catapulting Telecom into the Modern Age at a Small Liberal Arts College appeared first on NJEdge Inc.


Oasis Open Projects

OASIS Board Member Spotlight Series: Q&A with Gershon Janssen

Get to know Gershon and find out how he’s been making an impact at OASIS since 2007.

Each month, we’ll interview one OASIS Board Member to give you a better sense of who they are and why they serve the OASIS community. 

This month, we caught up with Gershon Janssen, OASIS Board Chair and Director of Solutions Architecture at Reideate, to talk about his path to the OASIS board, his extensive background in standards development, his goals for the organization, and more.

We’re happy to have you on the OASIS board. Can you tell us a little about your role at Reideate?
I’m the Director of Solutions Architecture at Reideate, responsible for the architecture and research and development activities.

We are a boutique consulting firm focused on solving complex business problems and driving digital transformation through simplification and innovation. Redoing the thinking or ideation behind those business problems is essential in our success. We’re specialized in information technology challenges, predominantly offering consulting services, but we also do research and development to keep ourselves on top of our game and lead by knowledge and expertise.

How long have you been involved with the board?
I’ve been involved with the board since 2012, serving as the chairperson since 2016.

What inspired you to join the OASIS Board of Directors?
A rare synergy evolved between my OASIS standards development activities and my day-to-day work, making each a natural extension of the other. The Executive Director of OASIS noticed my close and active involvement in OASIS Technical Committees as well as my liaison role with various European developments. He asked whether I would be interested in taking on a broader role as a director. After carefully considering the effort, the responsibilities, and how I could make a difference and contribute, as well as talking to a few peers, I decided to go for it.

How did you first get involved with OASIS?
I got involved with OASIS in 2007. I was involved in a project where business processes using web services were orchestrated making heavy use of the OASIS Business Process Execution Language (BPEL) standard. There was a strong need for human interactions within those processes. After exploring our options for this, we found the WS-BPEL Extension for People (BPEL4People) Technical Committee (TC) that exactly addressed our need for human interactions in BPEL. I joined OASIS, got more involved in the development of that standard, and started to broaden my activities in other TCs out of professional interest. I’ve been closely involved in OASIS since then.

What excites you about OASIS and why are you passionate about its mission?
OASIS, a leading and respected open standards development organization, develops needed first class standards that are well received and adopted by the industry, while fostering and preserving its key strengths as being member-driven, open and transparent.

OASIS also continues to stay relevant for its membership and new constituencies by having broadened its focus from strictly standards to open source and open standards which are the new confluence of what drives innovation and serves ever more agile communities.

Being part of and able to develop open standards and open source on a global stage with an active and respected community of experts is something that is really exciting to me.

What types of skills/expertise do you bring to the OASIS Board?
Through many years of active participation in different capacities within open standards development efforts, I carry a lot of knowledge, experience, and insights on what it takes to develop, promote, adopt, and use open products.

I bring relevant insights which help shape OASIS’ future strategy, voice the EU perspective on important themes, and strive for close collaboration with other organizations; knowing who’s doing what and effectively collaborating with each other results in better visible and well-adopted products by the industry.

Do you have a specific role(s) within the Board?
I’m the chair of the Board. I’m also on the executive, finance & audit, governance and process board sub committees.

How do you hope to make an impact as a board member during your term?
I like to help drive OASIS’ strategic agenda, making it a better, more effective, and valuable organization, pursuing modernization for the benefit of its members, with the good of the OASIS community at heart!

How do you see your background and experience complementing the current board?
As a result of my long involvement, I believe I gained a deep understanding of the OASIS organization and its transition over time. In addition, I’m able to voice the EU perspective on important themes.

Being experienced in working in large distributed organizations and providing leadership to multi-disciplinary teams, I have a strong sense for the various interests of different stakeholders and have the ability to achieve progress and results based on consensus of all involved.

What would you like to accomplish as a board member this year?
This year, I hope to further develop and speed up the implementation of OASIS’ strategic agenda to address the challenges ahead of us and warrant continued organizational relevance and improvement through better responsiveness, openness, and inclusiveness.

What are your thoughts about the OASIS community? What are some of the key aspects of OASIS that sets it apart from other organizations?
OASIS has a very broad community of experts with standards development activities in many areas. Cross collaboration is easy and the community is very welcoming.

Can you tell us about a role model or mentor that you have currently or have had in the past and what were the qualities that inspired you most about that person?
I had the honor to work with three people in particular who served as role models, helping me navigate and better understand the landscape of standards development: Jim Hughes (before Microsoft), from whom I learned the tricks of the trade in dealing with complex governance issues; Martin Chapman (before Oracle), who always kept me sharp so I would not lose sight of important details; and John Sabo (before CA Technologies), who was an example to me of how to reach agreement no matter how much opinions differed.

What are few words that come to mind when you think about the work being done at OASIS?
Relevant, widely applicable and trustworthy.

What’s a fun fact about you?
Outside of work I am a passionate ballroom dancer.

The post OASIS Board Member Spotlight Series: Q&A with Gershon Janssen appeared first on OASIS Open.

Monday, 09. January 2023

EdgeSecure

How a VMware TAM Can Maximize Value and Accelerate Digital Transformation

Webinar

February 15, 2023
3:00 PM ET

Virtualized technology infrastructure sits at the heart of modernization and digital transformation for education and public sector institutions. With the essential role virtualization plays for IT and organizational strategy, it’s vital for your VMware solutions to perform at their best. How can you be sure your VMware infrastructure is optimized and well-positioned to serve your organization’s current and future goals?

A VMware Technical Account Manager (TAM) serves as an advocate and advisor, to equip your organization with proven methodologies and exclusive tools to keep your VMware initiatives on track. In this session, you’ll learn how your organization can access a TAM at steeply discounted rates via the Edge agreement, to support initiatives including:

Optimization assessments to realize cost savings on current VMware implementations
Strategy development to take advantage of new VMware licensing models and programs
Improved cybersecurity, resiliency, and efficiency of virtualized environments
Acceleration of cloud initiatives based on current VMware investments
Roadmap development to achieve future technology goals

Additionally, you’ll hear from public sector VMware users who’ve taken advantage of the TAM program to optimize their deployments, enhance security, and plan for the future.

Presenters:

Jamal Encami, Milwaukee Police Department
Jason Weaver, State of Michigan
Kam Chan, Technical Account Manager, VMware
Bob Johnston, Technical Account Manager, VMware

Register Today

The post How a VMware TAM Can Maximize Value and Accelerate Digital Transformation appeared first on NJEdge Inc.


FIDO Alliance

ITPro: The IT Pro Podcast: Going passwordless

Passwords: they can be tricky at the best of times. Proper password hygiene is one of the most important factors in endpoint security, as it keeps sensitive data secure and prevents threat actors from getting into important systems. 

But despite the risks, the use of weak or recycled passwords continues to be a problem even amongst IT professionals. While systems such as two-factor authentication have been used as an extra layer of security, groups like the FIDO Alliance and World Wide Web Consortium have been working to make passwords a thing of the past, in favour of more secure methods.

This week, we spoke to Richard Meeus, EMEA director of security & technology strategy for Akamai Technologies, to explore the solutions driving secure sign-ons, and how the sector can adapt to this change.

The post ITPro: The IT Pro Podcast: Going passwordless appeared first on FIDO Alliance.


Biometric Update: PayEye dual biometrics win FIDO certification with zero PAD errors

Polish fintech provider PayEye has obtained its FIDO Biometric Component Certification and proven highly resistant to presentation attacks, according to a news release.

PayEye submitted the latest version of its eyePOS payment system, equipped with the company’s proprietary iris recognition algorithm, to trials run by private U.S. lab iBeta. EyePOS uses a fusion of face and iris biometrics to authenticate the identity of the person making a payment.

The FIDO Alliance certification requires evaluation of both biometric matching performance and spoof attack detection.

The post Biometric Update: PayEye dual biometrics win FIDO certification with zero PAD errors appeared first on FIDO Alliance.


Payments Journal: On-Demand Webinar: Go Beyond Fraud Prevention with Identity


When companies use identity data effectively, they deliver a highly personalized, frictionless journey that offers the right products to the right customers at the right time. They’re also able to optimize fraud prevention while minimizing any irritation for customers. 

During a webinar, Adam Gunther, Senior Vice President and Senior Technology Officer at Kount, and Tim Sloane, Vice President of Payments Innovation at Mercator Advisory Group, discussed how identity data can be used to smooth the payments process and drive business. They also discussed new trends, including biometric data and tokenization.

The post Payments Journal: On-Demand Webinar: Go Beyond Fraud Prevention with Identity appeared first on FIDO Alliance.


Biometric Update: FIDO offers guidance for government service use as countries deploy blockchain, biometrics


Governments around the world continue to develop digital identity capabilities to give residents easier access to public services, and the FIDO Alliance has published a white paper to help them do so in phishing-resistant ways. The paper comes as a pair of alternative approaches to digital identity for government services are implemented, in Turkey and Bangladesh.

The white paper is intended to guide policymakers and the heads of government agencies and departments as they consider FIDO authentication for government services.

The post Biometric Update: FIDO offers guidance for government service use as countries deploy blockchain, biometrics appeared first on FIDO Alliance.


CSO: Why it might be time to consider using FIDO-based authentication devices


Every business needs a secure way to collect, manage, and authenticate passwords. Unfortunately, no method is foolproof. Storing passwords in the browser and sending one-time access codes by SMS or authenticator apps can be bypassed by phishing. Password management products are more secure, but they have vulnerabilities as shown by the recent LastPass breach that exposed an encrypted backup of a database of saved passwords. For organizations with high security requirements, that leaves hardware-based login options such as FIDO devices.

The post CSO: Why it might be time to consider using FIDO-based authentication devices appeared first on FIDO Alliance.


OpenID

Notice of Vote for Proposed Second Implementer’s Drafts of Two FAPI 2.0 Specifications


Voting will begin on Monday, January 9, 2023 and end on Monday, January 23, 2023, now that the 45-day review of the specifications has been completed.

The Financial-grade API (FAPI) working group page is https://openid.net/wg/fapi/. If you’re not already a member, or if your membership has expired, please consider joining to participate in the approval vote. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration.

The vote will be conducted at https://openid.net/foundation/members/polls/306.

– Michael B. Jones, OpenID Foundation Secretary

The post Notice of Vote for Proposed Second Implementer’s Drafts of Two FAPI 2.0 Specifications first appeared on OpenID.

Sunday, 08. January 2023

Project VRM

Syndication and the Live Web Economy


This is from a December 2009 newsletter called Suitwatch, which I wrote for Linux Journal, and was 404’d long ago. (But I kept the original.) I’m re-posting it here because I think syndication may be the most potent power any of us have in the Internet age—and because the really simple kind, RSS, has been with us since before I wrote this piece. (I also think RSS has VRM implications as well, but I’ll leave those for another post.) My only edits here were to remove arcana and anachronisms that are pointless today. This graphic illustrates how entrenched and widespread RSS already is:

Until recently, the word “syndication” was something big publishers and agencies did. As a kid, I recognized “© King Features Syndicate” was the one unfunny thing about Blondie or Dennis the Menace. All it meant to me was that some kind of Business was going on here.

Now millions of individual writers syndicate their own work, usually through RSS (Really Simple Syndication). Publishers and other large organizations do too. This article is syndicated. So are updates to product manuals, changes to development wikis, updates on SourceForge, and searches of keywords. You name it: if there’s something that updates frequently on the Web, there’s a better chance every minute that the new stuff is syndicated if it isn’t already.

Far as I know, not many sources are making money with it. Lots, however, are making money because of it. The syndicated world may not look like an economy yet. But trust me, it is.

At this early stage in its long future history, syndication is primarily a feature of blogging, which is primarily the product of too many people to count. Blogging is not about large-scale things. It’s about human beings who have no scale other than themselves. Only you can be good at being you, and nobody else is the same as you. Syndication does more to expand individual human potential than anything since the invention of type. Or perhaps ever. The syndicated world economy is the one that grows around unleashed personal powers of expression, productivity, creation, distribution, instruction, influence, leadership, whatever.

In a loose sense, syndication is one side of the conversation. Think about conversation in the best sense of the word: as the way people teach and learn from each other, the way topics start and move along. Syndication makes that happen in huge ways.

The notion that “markets are conversation”, popularized by The Cluetrain Manifesto, was borrowed from this case I used to make for a form of marketing that was far more natural and powerful than the formal kind:

Markets are conversation, and Conversation is fire. Therefore, Marketing is arson.

If you want to set fires, start conversations that tend to keep going. Nothing does the latter better than syndication.

There are three reasons why we still don’t hear as much about syndication as we should (and will). First, it’s still new. Second, it didn’t come from The Big Guys. (It came from Dave Winer, father of RSS — Really Simple Syndication.) Third, it points toward a value system not grounded only in exchange — one especially suited for the Net, a deeply ironic worldwide environment where everybody is zero distance apart.

But let’s park the value system until later and talk about next week. That’s when I’ll be in San Francisco for Syndicate. It’s the second in a series of conferences by that name. The first was in New York last Spring.

Since I’m the conference chair (disclosure: it’s a paying gig), and since I’ll be giving both the introductory talk and the closing keynote, Syndication is on the front burner of my mind’s stove.

There are other subjects there as well, some of which will be visited in sessions at the show. RSS, for starters. And tagging—a practice so new it’s not even close to having standards of the sort we find at OASIS, the IETF, and the W3C. Instead, it has emerging standards, like the ones we find at microformats.org.

Like syndication, tagging is a long-tail activity. Something individuals do. Along with blogging and syndication, it helps outline a new branch of the Net we’re starting to call the Live Web — as opposed to the Static Web with “sites” that are “built” and tend not to change.

The World Live Web is the title of my December Linux For Suits column in Linux Journal. In it, I note that the directoryless nature of everything on the Web falls in the Unix file path east of the domain name. Every path to a document (or whatever) is a piece of straw in the static Web’s haystack. Google and Yahoo help us find needles in that haystack, but their amazing success at search also tends to confirm the haystack nature of the Static Web itself.

The Live Web is no less webby than the Static Web. They’re both parts of the same big thing. But the Live Web is new and very different. It cannot be understood in Static Web terms.

In that piece, I also observed that blogs, as continuing projects by human authors, leave chronological trails. These give the Live Web something of a structure: a chronological one that goes /year/month/day/date/post, even if that’s not the way each post’s URL is composed. There is an implicit organizational structure here, and it’s chronological.

Tagging, by which individuals can assign categorical tags of their own to everything from links to bookmarks to photos, has given the Live Web an ad hoc categorical structure as well.

So that’s what we’re starting to see emerge here: chronology and category. Rudimentary, sure, but real. And significant.

But not organized. New practices, and new ideas, are coming along too fast.

What matters, above all, is user-in-charge: a form of personal agency in the connected world. That’s a concept so key to everything else that’s happening on the Web, even on the Static one, that we may need a new word for it.

Or an old one, like independence, liberty, sovereignty, or autonomy. That’s my inner Libertarian, choosing those. If your sensibilities run a bit more to the social side, you may prefer words like actualization or fulfillment. Point is, the Big Boys aren’t in charge anymore. You are. I am. We are.

There’s an economy that will grow around us. I think free software and open-source practices (see various books and essays by Richard M. Stallman and Eric S. Raymond) put tracks in the snow that point in the direction we’re heading, but the phenomenon is bigger than that.

It’s also bigger than Google and Yahoo and Microsoft and IBM and Sun and Red Hat and Apple and the rest of the companies people (especially the media) look to for Leadership. For all the good those companies do in the world, the power shift is underway and is as certain as tomorrow’s dawn. The Big Boys will need to take advantage of it. We’ll need them to, as well.

This power shift is what I’d like to put in front of people’s attention when they come to Syndicate next week, or when they follow the proceedings in blogs and other reports.

Now more than ever, power is personal. Companies large and small will succeed by taking advantage of that fact. And by watching developments that aren’t just coming from The Usual Suspects. Including the Usual Economic Theories.

For example, not everything in an economy is about exchange, or the value chain, or about trade-offs of this for that. Many values come out of effort and care made without expectation of return. Consider your love for your parents, spouses, children, friends, and good work. Consider what you give and still get to keep. Consider debts erased by forgiveness. Consider how knowledge grows without its loss by anyone else.

Sayo Ajiboye, the Nigerian minister who so blew my mind in conversations we had on a plane nearly five years ago (Google them up if you like), taught me that markets are relationships, and not just conversations. Relationships, he said, are not just about exchange. They cannot be reduced to transactions. If you try, you demean the relationships themselves.

Also, in spite of the economic framings of our talk about morality and justice (owing favors, paying for crimes, just desserts), there is a deeper moral system that cannot be understood in terms of exchange. In fact, when you bring up exchange, you miss the whole thing. (Many great teachers have tried in futility to make this point, and I’m probably not doing any better.) Whatever it is, its results are positive. Growth in one place is not matched by shrinking in another. Value in both systems is created. But in the latter one, the purpose is not always, or exclusively, exchange, or profit. At least not from the activity itself. There are because effects at work. And we’re only beginning to understand them, much less practice them in new ways.

Toward that end, some questions…

Where did the Static Web, much less the Live Web, come from? What is it for? What are we doing with it? Whatever the answers, nothing was exchanged for them. (No, not even the record industry, the losses of which owe to their own unwillingness to take advantage of new opportunities opened by the Net.)

Nor was anything exchanged for Linux, which has grown enormously.

As Greg Kroah-Hartman said recently on the Linux-Elitists list,

Remember, Linux is a species, and we aren’t fighting anyone here, we are merely evolving around everyone else, until they aren’t left standing because the whole ecosystem changed without them realizing it.

Yes, we have living ends.

Friday, 06. January 2023

Ceramic Network

Geo Web Launches on Optimism With Ceramic

The Geo Web recently launched its mainnet on Optimism and is currently in a global Fair Launch Auction.
Introduction

The Geo Web is a protocol that creates consensus for browsing digital media anchored to real-world locations. It uses a partial common ownership system (aka Harberger taxes) to administer a global digital land market and fund public goods.

The Geo Web chose Ceramic to power mutable P2P content linking from its land parcel NFTs. Ceramic’s ecosystem of composable data, schemas, and applications allows Geo Web landholders to push the use case possibilities of the open metaverse.

The Geo Web recently launched its mainnet on Optimism and is currently in a global Fair Launch Auction that runs until January 14th, 2023.

The Fight for the Open Metaverse

Who will control how digital media is organized and experienced in the physical world?

The Geo Web was founded to provide a credibly neutral, open-source answer to that question.

The Geo Web creates a virtual layer covering the globe that allows publishers and users to create a shared augmented reality. It’s an alternative to app stores, algorithms, and ad brokers intermediating every experience and transaction in the metaverse era.

Each Geo Web parcel NFT is like a 3D website. Landholders have the exclusive right to publish arbitrary data, entertainment, and applications within the bounds of their parcel on the network. Users naturally discover and experience the content with a universal browser application that resolves content based on geolocation rather than a URL.
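The idea of resolving content by location rather than by URL can be pictured with a minimal, hypothetical sketch. Everything below is invented for illustration (the coordinates, the `ceramic://` content roots, and the rectangle-based bounds); real Geo Web parcels are defined on a global coordinate grid, not simple rectangles:

```python
# Hypothetical sketch of how a spatial browser might resolve content
# from coordinates instead of a URL. Parcels are simplified to
# axis-aligned (min_lat, min_lon, max_lat, max_lon) rectangles.

parcels = [
    # (min_lat, min_lon, max_lat, max_lon, content_root) -- all invented
    (40.77, -73.98, 40.80, -73.95, "ceramic://central-park-demo"),
    (51.50, -0.13, 51.52, -0.11, "ceramic://covent-garden-demo"),
]

def resolve(lat: float, lon: float):
    # Find the parcel containing the user's position and return the
    # content root its landholder published.
    for min_lat, min_lon, max_lat, max_lon, root in parcels:
        if min_lat <= lat <= max_lat and min_lon <= lon <= max_lon:
            return root
    return None  # unclaimed land: nothing to render

print(resolve(40.78, -73.97))  # ceramic://central-park-demo
print(resolve(0.0, 0.0))       # None
```

The point of the sketch is only that discovery is keyed by geolocation: the browser never needs a hostname, only the user's position.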

The Geo Web maintains its DNS-like digital land registry by enforcing the three basic rules of partial common ownership:

1. Landholders must maintain a public For Sale Price for their parcel(s).
2. Any market participant can trigger the transfer of a parcel by paying the holder their For Sale Price.
3. Landholders must pay an ongoing Network Fee proportional to their For Sale Price (currently 10% per year).
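In code, the three rules read roughly like the following minimal sketch. This is a plain Python illustration, not the Geo Web's actual smart contracts; the class, names, and the simple payment check are ours:

```python
# Illustrative sketch (not Geo Web contract code) of the three
# partial-common-ownership (Harberger) rules described above.

ANNUAL_NETWORK_FEE_RATE = 0.10  # 10% of the For Sale Price per year

class Parcel:
    def __init__(self, holder: str, for_sale_price: float):
        self.holder = holder
        self.for_sale_price = for_sale_price  # rule 1: always public

    def annual_network_fee(self) -> float:
        # Rule 3: ongoing fee proportional to the self-assessed price.
        return self.for_sale_price * ANNUAL_NETWORK_FEE_RATE

    def buy(self, buyer: str, payment: float, new_price: float) -> None:
        # Rule 2: anyone can force a transfer by paying the For Sale Price.
        if payment < self.for_sale_price:
            raise ValueError("payment below the posted For Sale Price")
        self.holder = buyer
        self.for_sale_price = new_price

parcel = Parcel(holder="alice", for_sale_price=100.0)
print(parcel.annual_network_fee())  # 10.0
parcel.buy(buyer="bob", payment=100.0, new_price=250.0)
print(parcel.holder, parcel.annual_network_fee())  # bob 25.0
```

The self-assessment tension is visible even in this toy: posting a higher price raises your fee, while posting a lower one invites a forced sale.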

All of the revenue from the resulting digital land market is reinvested into public goods and prosocial outcomes through participatory mechanisms. Through a flywheel of open-source investment and network effects, the Geo Web can help the internet of the next era return to its permissionless, decentralized roots.

Managing Dynamic, Decentralized Content

The Geo Web’s property rights system is a good use case for on-chain smart contracts. But when it comes to putting those property rights to use through content publishing, on-chain data is too slow, expensive, and unwieldy.

The Geo Web needed a dynamic, mutable content layer that could enable the mind-bending use cases of augmented and mixed reality. It needed to be user-friendly without sacrificing the principles of decentralization and user control.

The Geo Web team found Ceramic early in its architecture stage and knew it would be a fit. Geo Web was one of the first protocols to launch a testnet on Ceramic.

How Geo Web Built With Ceramic

Ceramic provides the mutable content root for each Geo Web land parcel NFT. A parcel’s active StreamID is deterministically derived based on the current licensor’s wallet address and the ParcelID.
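The shape of that derivation can be pictured with a toy sketch. The hashing scheme below is purely illustrative (our own stand-in, with invented addresses): Ceramic actually derives deterministic streams from the controller's DID and stream metadata, not a bare SHA-256:

```python
import hashlib

def derive_stream_key(licensor_address: str, parcel_id: int) -> str:
    # Stand-in for Ceramic's deterministic stream lookup: the same
    # (owner, parcel) pair always yields the same key, so a browser can
    # find a parcel's content root without consulting any registry.
    material = f"{licensor_address.lower()}:{parcel_id}".encode()
    return hashlib.sha256(material).hexdigest()

# The key follows the parcel's current licensor: after a transfer, the
# new owner's address yields a fresh key (and thus a fresh document).
key_alice = derive_stream_key("0xaaaa0000aaaa", 42)
key_bob = derive_stream_key("0xbbbb0000bbbb", 42)
print(key_alice != key_bob)  # True
```

The useful property is determinism: resolution needs no lookup table, only the on-chain facts (current licensor, ParcelID) that any client can read.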

Landholders frictionlessly manage their parcel’s Ceramic Tile Document through their authenticated DID.

Open schemas and data definitions shared between publishers and spatial browsers create an endless landscape for use cases.

The Geo Web team is excited to push new features, use cases, and integrations within the Ceramic ecosystem, now that the Geo Web mainnet is live.

Become a Geo Web Citizen

The Geo Web’s fair launch auction is live on Optimism (non-auction claims begin January 14th, 2023 at 17:00 UTC). You can claim land parcels on the Geo Web Cadastre corresponding to your home, work, or favorite park, then start adding content with Ceramic. All your land market fees go toward funding public goods.

If you’re a developer, creator, or organization looking to build on/with the Geo Web, get in touch with the team via Discord, @thegeoweb on Twitter, or through the project website.

Thursday, 05. January 2023

Trust over IP

The ToIP Trust Spanning Protocol

A deeper dive to explain the special role of the ToIP Trust Spanning Protocol, and why it is the keystone to achieving large-scale interoperability. The post The ToIP Trust Spanning Protocol appeared first on Trust Over IP.

On 14 November 2022, the ToIP Foundation announced the release of the first public review draft of the ToIP Technology Architecture Specification V1.0. This specification represents over a year’s work by the ToIP Technology Architecture Task Force to define the technical requirements for each of the four layers of the ToIP stack—the centerpiece of the ToIP Foundation’s work to develop an interoperable architecture for decentralized digital trust.

In this post, we will dive deeper into one particular layer — Layer 2 — to explain the special role of the ToIP Trust Spanning Protocol and why it is the keystone to achieving large-scale interoperability.

But First — Some Context

It may first be helpful to understand the overall context of this work. The ToIP Foundation recently published a new document for this very purpose, Evolution of the ToIP Stack, that contains the following diagram:

Figure 1: The four stages of development of the ToIP stack

Here is a summary of the purpose of each of the four development stages:

1. Design Principles. Before we could dive into the design of the ToIP stack itself, we needed to agree on the principles governing the design. A six-month effort produced Design Principles for the ToIP Stack V1.0, a document we highly recommend for a much deeper understanding of the problem space.

2. Technical Architecture. Based on the Design Principles, we next needed to come to consensus on the architectural requirements of the overall system and the requirements for each of the four layers of the stack at the endpoints. This year-long effort resulted in the ToIP Technology Architecture V1.0 Specification, just released for public review.

3. Component Specifications. The ToIP Technology Architecture Specification cannot be implemented directly by itself; it only specifies requirements that must be met by a set of component specifications. Each component specification is a building block that can then be assembled into a complete implementation of the ToIP technology stack.

4. Interoperability Testing. Once a sufficient set of component specifications are ready, stakeholders in digital trust ecosystems can develop ToIP interoperability profiles against which test suites can be written to provide objective measurements of functional interoperability.

As Figure 1 indicates, we are already 2.5 years into the process and just reaching the end of the first iteration of stage 2. Note that we expect iteration at each stage, i.e., as we gain experience at higher stages, we expect to make minor adjustments to the foundational documents.

Given that we expect the ToIP Technology Architecture Specification V1.0 to be finalized in the first quarter of 2023, we are now ready to move forward into the component specification stage. As the Evolution of the ToIP Stack explains, most of these will not be developed by the ToIP Foundation — they are specifications that either already exist or are in progress at other standards development organizations such as DIF, W3C, IETF, ISO, and so on. 

One exception is the ToIP Trust Spanning Protocol—the one and only protocol required at ToIP Layer 2. Since the requirements for this protocol have only just been published, no other standards development organization currently has it as a focus. The purpose of this article is to explain the unique role this protocol plays in the ToIP stack and how we propose to proceed with its development.

The Hourglass Model

Section 6.2 of the ToIP Technology Architecture Specification explains that the design of the ToIP stack is based on the same core design principles as the Internet’s TCP/IP stack, and in particular the principle known as the Hourglass Model. Here is the one sentence summary of this principle from section #3 of the Design Principles for the ToIP Stack V1.0:

In a layered protocol architecture, the most successful design takes an hourglass shape where a single “spanning layer” in the middle connects a family of higher-level application-facing protocols with a family of lower-level transport protocols. 

Figure 2, from an August 2001 presentation by Steve Deering of Cisco, illustrates how the TCP/IP stack implements the Hourglass Model.

Figure 2: The hourglass model as implemented by the TCP/IP stack

The spanning layer in the TCP/IP stack is the Internet Protocol (IP), which defines the addressing system required to identify and connect any two Internet-connected devices. The spanning layer in the ToIP stack has the same goal except the purpose is to establish a trust relationship, i.e., a communications relationship whose authenticity can be cryptographically verified by both sides. Figure 3 illustrates how the hourglass design applies to the four layers of the ToIP stack.

Figure 3: The Hourglass Model applied to the four layers of the ToIP stack

Appendix B of the ToIP Technology Architecture V1.0 Specification includes a functional view of the ToIP stack (Figure 4) that goes into greater detail about the kinds of functions required above and below the Trust Spanning layer. Layer 1 (Trust Support) provides the services for storing and processing the cryptographic key material and encrypted data that is required for secure communications over Layer 2. Layer 3 (Trust Tasks) is a growing family of protocols that can be developed to automate common functions required in digital trust relationships.

Figure 4: A more detailed diagram of how the hourglass model applies to the ToIP stack

Requirements for the ToIP Trust Spanning Protocol

Given the starring role of the ToIP Trust Spanning Protocol, it should be no surprise that it is the target of 60% of the requirements in the ToIP Technology Architecture V1.0 Specification. Of the 30 requirements in the specification (all aggregated in Appendix A), 18 are for Layer 2. They are included in Table 1 below for quick reference.

L2.1: A ToIP Endpoint System MUST communicate with another ToIP Endpoint System using the ToIP Trust Spanning Protocol.
L2.2: A ToIP identifier MUST be unique within the context in which it is used for identification.
L2.3: A ToIP identifier MUST be a verifiable identifier, i.e., verifiably bound to at least one set of cryptographic keys discoverable via an associated discovery protocol.
L2.4: A ToIP identifier SHOULD be a decentralized identifier, i.e., a verifiable identifier that does not require registration with a centralized authority.
L2.5: A ToIP identifier SHOULD be an autonomous identifier, i.e., a decentralized identifier that is self-certifying and fully portable.
L2.6: A ToIP identifier SHOULD support rotation of the associated cryptographic keys for the lifetime of the identifier.
L2.7: A ToIP identifier MAY also support rotation to an entirely different ToIP identifier that can be cryptographically verified to be a synonym of the original ToIP identifier.
L2.8: A ToIP identifier SHOULD support the ability to: a) associate the identifier with the network address of one or more ToIP Systems that can deliver to one or more Endpoint Systems under the locus of control of the ToIP identifier controller, and b) if desired by the controller, enable that association to be discoverable.
L2.9: The ToIP Trust Spanning Protocol specification MUST define how to construct and format messages that are cryptographically verifiable to have the following four properties: (1) Authenticity: the message was sent from a sender who has control over the ToIP identifier. (2) Integrity: the contents of the message transmitted by the sender are received by the recipient without modification. (3) Confidentiality: the contents of the message are only accessible by authorized parties. (4) Privacy: the contents of the message are bound to conditions of usage agreed to by the parties.
L2.10: In a ToIP Endpoint System, an implementation of the ToIP Trust Spanning Protocol MUST support authenticity and integrity.
L2.11: In a ToIP Endpoint System, an implementation of the ToIP Trust Spanning Protocol MAY support confidentiality and privacy.
L2.12: The ToIP Trust Spanning Protocol MUST enable the composition of higher-level Trust Task Protocols (such features as co-protocols).
L2.13: The ToIP Trust Spanning Protocol MUST support extensible message schema.
L2.14: The ToIP Trust Spanning Protocol MUST support resolution of ToIP identifiers to: a) the network addresses of receiving Endpoint Systems, and b) any required cryptographic keys.
L2.15: The ToIP Trust Spanning Protocol MUST support transport of messages via ToIP Layer 1 interfaces.
L2.16: The ToIP Trust Spanning Protocol MUST support delivery of messages to the Layer 2 interface of the Endpoint System of the ultimate receiver of the message.
L2.17: The ToIP Trust Spanning Protocol MUST support delivery of messages via Intermediary Systems.
L2.18: The ToIP Trust Spanning Protocol MUST support confidentiality with regard to the metadata required for message routing.

Table 1: The 18 requirements for ToIP Layer 2 — the Trust Spanning Protocol

ToIP Identifiers
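To make requirement L2.9's authenticity and integrity properties concrete, here is a toy sketch using an HMAC over the message body. This is only an illustration of the two properties: a real Layer 2 implementation would use public-key signatures bound to the sender's ToIP identifier, not a pre-shared key, and would add encryption for confidentiality:

```python
import hashlib
import hmac

# Toy model of L2.9's authenticity and integrity checks: the sender
# seals a message with a keyed MAC; the recipient verifies that the
# message came from a key holder (authenticity) and was not modified
# in transit (integrity).

def seal(key: bytes, body: bytes) -> tuple[bytes, bytes]:
    tag = hmac.new(key, body, hashlib.sha256).digest()
    return body, tag

def verify(key: bytes, body: bytes, tag: bytes) -> bool:
    expected = hmac.new(key, body, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

key = b"shared-demo-key"  # illustrative only; real L2 uses per-identifier keys
body, tag = seal(key, b"hello")
print(verify(key, body, tag))      # True: authentic and unmodified
print(verify(key, b"hell0", tag))  # False: integrity check fails
```

Confidentiality and privacy (properties 3 and 4) would layer encryption and usage conditions on top; they are deliberately out of scope for this sketch.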

The astute reader will notice that 7 of these 18 requirements apply to ToIP identifiers, the term the spec uses for the cryptographically verifiable identifiers needed to establish the ToIP trust spanning layer. The complete taxonomy of ToIP identifiers is shown in Figure 5:

Figure 5: The different subclasses of ToIP identifiers

From broadest to narrowest, the three subclasses of ToIP identifiers are:

Verifiable identifiers (VIDs) include any identifier bound to at least one cryptographic key pair so that control over the identifier can be cryptographically verified. HTTPS URLs backed by X.509 digital certificates are one example of a VID (typically one that uses a domain name based on a centralized DNS registry).

Decentralized identifiers (DIDs) are VIDs that: a) do not require a centralized registry, b) provide a standard mechanism for resolution, and c) permit dynamic discovery of associated network service endpoints (such as an endpoint that supports the ToIP Trust Spanning Protocol). See the W3C DID Spec Registries for an extensive list of DID methods that implement the W3C Decentralized Identifiers (DIDs) 1.0 specification.

Autonomous identifiers (AIDs) are DIDs generated algorithmically from a cryptographic key pair in such a way that they are self-certifying, i.e., the binding with the public key can be verified without the need to consult any external blockchain or third party. KERI is an example of a decentralized identity technology based entirely on AIDs.
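The self-certifying property of AIDs can be illustrated with a small sketch: if the identifier is derived from the public key itself, anyone holding the key can check the binding locally. Real AID schemes such as KERI use richer derivations and key event logs; the `aid:` prefix and truncated hash below are our own invention for the example:

```python
import hashlib

# Minimal illustration of self-certification: the identifier is a
# function of the public key, so verifying the binding needs no
# registry, blockchain, or third party -- just recomputation.

def derive_aid(public_key: bytes) -> str:
    return "aid:" + hashlib.sha256(public_key).hexdigest()[:32]

def verify_binding(aid: str, public_key: bytes) -> bool:
    # Anyone holding the claimed public key can recompute and compare.
    return derive_aid(public_key) == aid

pk = b"demo-public-key-bytes"
aid = derive_aid(pk)
print(verify_binding(aid, pk))            # True
print(verify_binding(aid, b"other-key"))  # False
```

Contrast this with an HTTPS URL as a VID, where verifying the key binding means trusting a certificate authority: here the check is entirely local.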

ToIP identifiers are essential to the ToIP Trust Spanning Protocol for the same reason that IP addresses are essential to the IP protocol: they provide the new form of address required to achieve a universal spanning layer. In the case of the IP protocol, the spanning layer enables data to flow between any two endpoints regardless of their local network domain. In the case of the ToIP Trust Spanning Protocol, the spanning layer enables cryptographically verifiable data to flow between any two endpoints regardless of their local trust domain.

Forerunners for the ToIP Trust Spanning Protocol

As DIDs and AIDs emerged, they quickly catalyzed interest in new protocols that used the unique new properties of these identifiers to establish secure communications channels. Table 2 lists the three best-known of these protocols:

Protocol | Host | Primary Specification
DIDComm V2 | DIF | https://identity.foundation/didcomm-messaging/spec/
KERI / CESR | ToIP / DIF / IETF | https://github.com/WebOfTrust/ietf-keri
DWN (Decentralized Web Node) | DIF | https://identity.foundation/decentralized-web-node/spec/

Table 2: Protocols designed explicitly to work with verifiable identifiers

DIDComm takes its name directly from the concept of general-purpose “DID-to-DID communications.” DIDComm V1 originated at Hyperledger Aries; then, as interest grew, a DIDComm working group was established at the Decentralized Identity Foundation. DIDComm V2 was released in July 2022.

KERI (Key Event Receipt Infrastructure), which uses CESR (Composable Event Streaming Representation), is currently being developed by a GitHub community that is preparing Internet Drafts for submission to the IETF. KERI focuses on providing the robust key management necessary to secure authentic connections between any two AIDs.

DWN (Decentralized Web Node) is, to quote the spec, “a data storage and message relay mechanism entities can use to locate public or private permissioned data related to a given DID.” DWN is a work item of the DIF Secure Data Storage Working Group. Although DWN is more data-oriented than either DIDComm or KERI, it too specifies a message-based protocol for communicating with a DID-identified endpoint.

In addition to these three general-purpose secure messaging protocols that are clearly relevant to ToIP Layer 2, it is worth pointing out another set of protocols that have been developed specifically for issuing and presenting digitally verifiable credentials. Although from a ToIP layering standpoint, these protocols are all primarily Layer 3 (Trust Task) protocols, some of them include some Layer 2 functions for the simple reason there has not been any clearly defined Layer 2 protocol yet. Table 3 lists several of the most popular:

Protocol                                   | Host              | Primary Specification
OIDC4VC                                    | OpenID Foundation | https://openid.net/specs/openid-4-verifiable-credential-issuance-1_0.html and https://openid.net/specs/openid-4-verifiable-presentations-1_0.html
Hyperledger Aries (Aries Interop Profiles) | Hyperledger       | https://github.com/hyperledger/aries-rfcs/tree/main/concepts/0302-aries-interop-profile#aries-interop-profile-version-20
VC API                                     | W3C CCG           | https://github.com/w3c-ccg/vc-api/
CHAPI                                      | W3C CCG           | https://w3c-ccg.github.io/credential-handler-api/
ISO mDL (currently local exchange only)    | ISO               | https://www.iso.org/obp/ui/#iso:std:iso-iec:18013:-5:ed-1:v1:en

Table 3: Protocols designed for issuance or presentation of digital credentials

We mention all of these protocols (and undoubtedly there are others) for two reasons:

1. All of these protocols may have something to contribute to the design of the ToIP Trust Spanning Protocol. All of them enable the exchange of secure and privacy-respecting content between endpoints.
2. None of these protocols, as currently specified, meet all the requirements of the ToIP Trust Spanning Protocol. While some may be very close, none were explicitly designed to function as a spanning layer protocol—a protocol that by definition must be, to paraphrase Einstein, “as simple as possible but no simpler”.

The ToIP Trust Spanning Protocol Task Force

The fact that such strong candidate protocols already exist has three key implications for development of the ToIP Trust Spanning Protocol:

1. This work does not need to “start from scratch”—or anywhere near it. Rather it can focus immediately on what can be learned/inherited from these existing protocols—and especially on what can be removed from them (and put into a Layer 3 trust task protocol).
2. This work can benefit enormously from the participation of the architects of these other protocols. These experts have already poured several years of their lives into designing protocols that implement a superset of the functionality needed by the ToIP Trust Spanning Protocol. Our hope is that with their help, we can achieve consensus relatively quickly about the subset of features that are absolutely essential.
3. The simpler the protocol, the faster we can have multiple implementations. By definition, a spanning protocol needs to be adopted very widely. The faster we can show multiple interoperable implementations, the faster we can kick-start adoption.

For all these reasons, the ToIP Technology Stack Working Group launched the Trust Spanning Protocol Task Force (TSPTF) on Wednesday, January 18. You can listen to a Zoom recording of the first meeting.

The TSPTF will now begin holding pairs of meetings on Wednesdays — one for NA/EU time zones at 08:00-09:00 PT / 16:00-17:00 UTC and one for APAC time zones at 18:00-19:00 PT / 02:00-03:00 UTC. For exact dates and meeting logistics of all ToIP meetings, see the ToIP Calendar.

A Collaborative, Multi-Organizational Effort

The most important principle of this work is that it should be as open and inclusive as possible. The whole point of a trust spanning protocol is to enable a new trust ecosystem consisting of interoperable trust applications, where those trust applications remain independent of the underlying trust infrastructure that the trust spanning protocol connects. So for this work in particular we want to involve all the other non-profit orgs and SDOs in our space: DIF, Hyperledger, W3C, IETF, ISO, IEEE, and any other stakeholders in decentralized digital trust infrastructure.

To this end, Daniel Hardman, co-chair of the DIF DIDComm Users Group (DUG) and now co-lead of the TSPTF, held a special meeting of the DUG on 09 January to discuss the topic of multi-stakeholder collaboration. Over 40 members of the decentralized identity and trust ecosystem attended. We recommend listening to the Zoom recording of the DUG special meeting and/or reviewing Daniel’s slides to understand his specific recommendations for how we can all collaborate more successfully in 2023.

The post The ToIP Trust Spanning Protocol appeared first on Trust Over IP.

Wednesday, 04. January 2023

DIF Blog

DIF Member Spotlight with Tarun Gaur of Qixfox

In the second installment of our DIF Member Spotlight series we sat down with Tarun Gaur, the CEO and founder of qixfox, a cybersecurity system that includes a decentralized browser. He shares more about his journey into the decentralized identity space and about his product, which also supports quantum-resistant cryptography.

In the second installment of our DIF Member Spotlight series we sat down with Tarun Gaur, the CEO and founder of qixfox, a cybersecurity system that includes a decentralized browser. He shares more about his journey into the decentralized identity space and about his product, which also supports quantum-resistant cryptography. Watch the entire interview on YouTube.

Limari: First of all, I want to welcome you, Tarun. Thank you so much for joining me today and having this discussion. I must say, when I spoke to you a few months ago and you went through your short demo of qixfox, I was actually really excited about what you guys are doing. I'm really glad we get to have this discussion and go a little bit more in depth. What I always like to do when I start out is hear people's story. Everyone in this community is so interesting, and decentralized identity is a space which is relatively new. It would be great to just get a sense of your journey, how you came to work on qixfox and also decentralized identity generally. So maybe you can just give us a little bit of an introduction on that.

Tarun Gaur: Absolutely. First of all, thanks for having us here; we're very excited to be an associate member at DIF. As far as my story is concerned, I worked for companies like Deloitte, AOL and Microsoft, and I've been all over the place. I'm an engineer by profession, never been a business guy, but somehow by default got into entrepreneurship. I left Microsoft to start my own software consulting firm, growing it to around five hundred people, and then I had a successful exit from it. qixfox had been my favorite project since my college days. I always had this feeling that something is drastically wrong with the Internet. Early on, web 1.0 was very information-driven, and obviously the technologies were not there.

Since my college days I wanted to build an internet browser as a horizontal platform, and that's ultimately how man and machine came together. You know, I had the resources to do it this time around, the hardware and software were now up to par, I could build very immersive experiences, and a lot of technology was open source. I said, this is the right time to do it. We are kind of sitting at the beginning of the end of the first era of the Internet. So I think this is the perfect time to do it.

Limari:  Okay, wonderful. If you can give us maybe just a bit of an overview of what qixfox is that would be great.

Tarun Gaur: Absolutely. We love to call ourselves the trustworthy internet company, which practically means that we are building the tool shed for the trustworthy Internet. As part of that we are solving two problems: first, fixing what is broken in the web, which is safety, security, and privacy; and second, designing the future, which practically means building a more democratic, equitable, and decentralized Internet.

In the past, peer-to-peer computing has been tried before; the decentralized web has been tried before. It did not fail because we could not connect computers together; it practically failed because we could not contain spam. For example, you have BitTorrent, and you download content from BitTorrent, and you can't put a finger on who you are downloading from or what the content is all about. It could be malware, it could be a key logger, it could be anything.

We have this set of problems in web 2.0 around safety, security and privacy, and we are dragging all these problems into web 3.0. You know, with all the hoopla around privacy and trust, we thought you cannot solve privacy without solving safety and security, and we thought you need a horizontal platform to do that. Honestly, the more we researched other browsers, the more we realized there were some basic problems that other browsers were not solving, problems we had an opportunity to solve at qixfox. That's how qixfox transpired. A number of our team members have worked with me in the past. We thought it might be a great idea to get together and fix this once and for all.

The icing on the cake was that my mother got scammed for $600 some time back, and her running joke was: hey, you're good for nothing, you can't fix this problem for me and all of the mails. I'm like, okay mom, let's fix this problem for everybody. Let's make the Internet more safe, trustworthy and reliable, and that's what we are doing at qixfox, practically.

Limari: Yeah, that's great, and I know there are a number of browsers that may claim, you know, to be security browsers. It would be great if you can also give us a bit of an idea of what makes qixfox different from other security browsers that are out there that people can choose from.

Tarun Gaur: Absolutely. You know, tongue in cheek, I generally tend to say that we don't consider other browsers to be our competitors. We consider every other horizontal platform to be a competitor. For example, I think Google is a competitor, we think Microsoft is a competitor, Apple is a competitor. We don't consider Opera or Brave to be a competitor per se. Our intention was to build a platform that is designed for the trustworthy Internet. It doesn't matter whether you are centralized or decentralized, whether you're using web2 technologies, web2.5 technologies or web3.0 technologies.

We had a feeling that the consumer is not getting a consumer-friendly reliable experience.

So that's where we kind of delineate ourselves from the likes of Brave and what have you. Their claim to fame is practically that they take care of privacy. As I mentioned before, we think that you cannot solve privacy if you don't solve safety and security. I'll give you a simple example. You get an email from a phishing website. You click on that website, and you willingly provide all your information. There goes your privacy out of the window. If the browser does not have the technology to identify that the entity you are dealing with is not a legitimate business, or that you are not interacting with a reliable second or third party, then your privacy is not secured. So we jokingly say that privacy is the new Kool-Aid these days. Everybody talks about privacy but very few know how to fix it.

Then safety, security, privacy: I always had this opinion that, like honesty and integrity, they are not a specialty; they should be there by default. Safety, security, and privacy are not a specialty that you design the platform around. They are a prerequisite, and then come the things that you want to do. For example, we want to design the future; we want a consumer to be able to create an online shop with a click of a button. We want them to be able to find each other very easily. We want them to be able to verify each other's identity easily. We want them to feel private and be anonymous if they want to, but at the same time share information at the point of use. That's where decentralized identity was a critical building block for us. We couldn't have done this without decentralized identity, because re-writing the Internet practically starts and ends at creating these building blocks in the horizontal platform.

So, in short, you know, it's quite a mouthful. I love to say it in a simple manner: we are solving the fundamental problems that exist in the internet today and then building pleasant, consumer-friendly, centralized as well as decentralized experiences for consumers for the future.

That's how I would put it. So that's where I think we deviate from all of the browsers like Firefox, Opera, Brave, etc.

Limari: When I spoke to you months ago, you mentioned that this was a cybersecurity system, and a feature of it is the browser that has decentralized identity baked into it. You describe it as, once you have access to it, you can access all your various accounts. Can you maybe give a little bit of a summary for some of our audience of kind of how that might work?

Tarun Gaur: Very early on in my career, in one of my incarnations at Microsoft, I used to work with the telecom providers. The telecom providers were latching on to data networks, and it was almost always about the applications that you can bring to a platform. At the end of the day a platform is as successful as the applications built on it, and decentralized identity is one such paradigm. If you don't have a working, active application for it, it is very difficult for the consumer to understand the difference between a password manager and identity.

You see that in decentralized apps, they are in a dismal state today. You know there are more than 4,000 decentralized apps and their average active users are 699. You look at 350 identity wallets, and practically when you open the application it's completely empty.

So we thought, having an identity wallet singularly and separately, in an isolated fashion, does not make sense unless you were Apple or Google. So where do you put your identity wallet? The identity wallet has to be at the point of use, and you have to make the transition for the consumer seamless, from using a password manager to being able to use decentralized identity, or their verifiable credentials, as part of that workflow. So we have around four different reference applications that we have built. One of them is a collaboration engine like Zoom, another one is a built-in anti-virus, and the third one is a smart stack application, which is our answer to IPFS. All these technologies put together, they all use decentralized identity. For example, if you log in you have a profile created in our browser, and when you want to log into any of these applications you don't have to do anything. One simple question is asked: this service provider is asking for so-and-so information.

You click yes, and then you are logged in automatically, because that exchange between the identity wallet and the service provider happens automatically. For the consumer it’s as simple as just clicking on the password manager and selecting the password. This was the experience that we wanted to kind of get to.

We had a simple philosophy: if we can't make it simple for the consumers, throw it in the dust bin. At the end of the day the browser needs to be used by consumers and small businesses alike. Another reasoning was: how do you bring it to the masses? How do you make sure that identity systems become an integral part of next-generation experiences? While we were working on reducing the entry barrier for consumers to create a website, to build a shop, you know, all those experiences, identity becomes an integral, fundamental part of it. We wanted to make sure that it is built into the browser.

So today all you need to do is go to the browser; there is an application called anti-virus, which is built into the browser. You click on that anti-virus, your identity is automatically shared, your subscription is validated, and if your subscription is still valid (everything happens with decentralized identity), then you get access to the application. We've taken care of those advanced use cases like revocation lists and ensuring privacy for the consumer, not going back to the issuer, and so on and so forth. But all of it is completely transparent to the consumer. The consumer thinks it's just another password manager.

Limari: I know you mentioned that your mother was one of the individuals who fell prey to fraud, and that the boomer generation was really your primary market. These are the primary people who are coming to you to use qixfox, and that makes sense. My parents are boomers; a lot of us have experienced strange links that get sent from people who, you know, have been hacked. I'm curious to hear more about how you see this moving into other markets, or what's your vision of where you see qixfox going, and how it may change the way people do things and think about things.

Tarun Gaur: First thing, yes, my mother. You know, I had an earful from her after she got scammed, and it was all my fault, because I was supposedly an IT guy and I had to fix it! How can she get scammed from her computer? That's how it started. As far as baby boomers are concerned, they were our beachhead market, because we realized that there are 73 million baby boomers in the United States. Around 46 million are on the Internet every single day, and out of them 21 million purchase an antivirus subscription. We realized that if we can enter this market and safeguard baby boomers from getting scammed, that'll give us a lot of credibility to branch out to SMBs, then go towards mega corps, and then ultimately open up the other verticals and other customer segments.

So that was the plan, but if you kind of go to our website today and you look at the message it's a very simple message. It says the browser that keeps you safe. It brings you peace of mind, and this is the message for the consumers who have not yet latched on to the crypto bus, or who have not latched onto the web3, marketing buzz words. These are the consumers who use the internet every single day, and they are concerned that they may get scammed or their credit card information will be stolen.

These are the consumers that we thought are our primary market, all across the planet. What was surprising, when we started doing our market research, was that it is not the baby boomers who are buying the product; it's actually their kids who are buying it for their parents. The kids are like, oh, this seems very interesting. I mean, this browser is able to identify whether a domain name you browse to is a legitimate business in the United States or not.

That in itself, this one simple tweak to the platform, actually safeguards consumers from 90% of the threats. As far as our future is concerned, I think our future is all about web3. We've identified the value proposition when it comes to regular consumers: that's the browser that keeps you safe.

Now the browser of the future is where we are working in 2023 and 2024, in two different segments. One, we want to make sure that we redefine what a browser means for small and medium businesses; second, we want to enable this decentralized, trustworthy web3. That's where I can say that blockchain is generally a small toolchain. Blockchain is practically a data structure; blockchain is not the decentralized internet. The decentralized internet means you and I can connect with each other, we can share information, we can share conversations, we can perform transactions, and finally we can share experiences.

If we can do all these four things, then we are actually laying the foundation of the next generation of the decentralized, fragmented internet. I think we are getting there one step at a time, and all this churn that we are seeing right now is an essential part of it; through this churn will come out the winners and the losers.

Limari: Thank you for sharing all that. The other question I do have is specifically about DIF. What brought you to DIF? I'm always curious kind of how you came across us, if there are any work items that were of interest to your company, or groups that you or members of your team enjoy attending. It would just be great to hear your thoughts on that.

Tarun: Well, I think we've always been curious. We were kind of following two different tracks when it came to decentralized identity and creating decentralized profiles. There is another school of thought that thinks that rather than decentralized identity you can just have NFPs, non-fungible profiles, and just post them on a blockchain and use them as a springboard to define identity.

But throughout my career, I've been a standards junkie. Very early on you understand, right, that if you make a lock for yourself, it should work for everybody; otherwise you're the only one using the lock. So the key has to work for everybody.

I think that's where the standards bodies come in. I think the biggest help that comes from DIF is the establishing of standards, the writing of specs, and the creating of use cases. We have practically used them as our PRDs. Our focus was building the trustworthy Internet. If we had to build this entire set of architectural paradigms, the security and the specifications and the use cases around decentralized identity ourselves, it would have taken us another two years to do it. So, thanks to the entire community. We keep following almost all the specs, you know, we kind of passively follow. So thanks to the entire community for the amazing work that everybody has done to bring us to this point.

It's amusing to sometimes see our implementation and then ultimately see the specifications that happen at DIF and all the debates that go around them. One of the sticking points we still have is revocation lists: how to deal with the revocation lists. So that's where we are very curious. We are also very curious to see how Google and Apple react to it. What is their roadmap in terms of implementing it in their browsers? We are not complaining, because we already have it today. As I was mentioning to you last time around, people say that Jack Dorsey has a spec for web5; we have a product that does all of that today. We are not complaining. I mean, Google and Microsoft and Apple can take their own sweet time debating whether they are going to go the decentralized identity route or not.

I think the speed with which we've been able to implement decentralized identity is all thanks to DIF. If you're checked into the future, you have to know what's happening in the identity space. Identity is fundamental to any horizontal platform, so I can't really recall how I first learned about DIF; let's just say I've known it from day one.

Limari: Right, if you're in the space you come across it eventually. That's great to share, because it is a great place for you to just come in. You can get your hands on the code, or you can just watch; you can get into the GitHub repos. If they're members, they can join our Slack channel, and we have great discussions happening there all the time.

Tarun Gaur: There is one other thing I wanted to mention very quickly. Decentralized identity, when combined with quantum-resistant cryptography, is amazing. I guess we are the only browser that actually ensures that both these technologies are in. So today, if your server can support quantum-resistant cryptography, our browser supports it. This is what we are doing for the enterprises, to make them more safe and secure.

Limari: Great, thanks for sharing. I'm sure a lot of people in our audience will be very interested in that. That really brings us to the end of things. It was really great to speak with you today, Tarun. Lastly, can you give people a sense of how to follow up with you? If they want to learn more about qixfox, what's the best way to do that?

Tarun Gaur: qixfox.com. My name is Tarun Gaur and I am available on LinkedIn, Facebook, Twitter, till we build our own social network as part of the super app. I'm reachable, and I’m busy to nothing. If you’ve got an amazing use case, I would love to hear from you. If I can be of any help to anybody in the community I would love to do that.

Limari: Awesome.  Thank you so much for joining me today. This was a great interview, and I look forward to being in touch.

Tarun Gaur: Absolutely. Thank you. Thanks again.

Tuesday, 03. January 2023

OpenID

Announcing the 2023 OpenID Foundation Individual Community Board Members Election

This is to announce the OpenID Foundation individual community board members 2023 election schedule. Those elected will help determine the role the Foundation plays in facilitating the development and adoption of important open identity standards enabling global interoperability, as well as the strategic directions of the Foundation. Per the Foundation’s bylaws, three individual community board members are elected.

This is to announce the OpenID Foundation individual community board members 2023 election schedule. Those elected will help determine the role the Foundation plays in facilitating the development and adoption of important open identity standards enabling global interoperability, as well as the strategic directions of the Foundation.

Per the Foundation’s bylaws, three individual community board members are elected.  Nat Sakimura’s and John Bradley’s two-year terms are expiring this year. (George Fletcher has an additional year remaining on his two-year term.) I want to thank Nat and John for their continued service to the Foundation and the community at large. Nat and John are eligible to be re-elected, should they choose to run again.

The individual community board member election will be conducted on the following schedule:

Nominations open: Wednesday, January 4, 2023
Nominations close: Wednesday, January 18, 2023
Election begins: Thursday, January 19, 2023
Election ends: Thursday, February 2, 2023
Results announced by: Friday, February 3, 2023
New board term starts: Thursday, February 16, 2023

All members of the OpenID Foundation are eligible to nominate themselves, second the nominations of others (including those who self-nominated), and vote for candidates. If you’re not already a member of the OpenID Foundation, we encourage you to join now at https://openid.net/foundation/benefits-members/.

Voting and nominations are conducted on the OpenID Foundation web site: https://openid.net/foundation/members/elections/56. You will need to log in at https://openid.net/foundation/members/ to participate in nominations and voting. If you experience problems participating in the election or joining the Foundation, please send an email to help@oidf.org.

Board participation requires a substantial investment of time and energy. It is a volunteer effort that should not be undertaken lightly. Should you be elected, expect to be called upon to serve both on the board and on its committees. If you’re committed to open identity standards and collaborate well with others, we encourage your candidacy.

You are encouraged to publicly address these questions in your candidate statement:

What are the key opportunities you see for the OpenID Foundation in 2023?
How will you demonstrate your commitment in terms of resources, focus and leadership?
What would you like to see the Foundation accomplish in 2023, and how do you personally plan to contribute?
What other resources do you bring to the Foundation to help the Foundation attain its goals?
What current or past experiences, skills, or interests will inform your contributions and views?

Please forward questions, comments and suggestions to me at director@oidf.org.

Best,

Gail Hodges
Executive Director
The OpenID Foundation

The post Announcing the 2023 OpenID Foundation Individual Community Board Members Election first appeared on OpenID.

Thursday, 29. December 2022

OpenID

Implementer’s Draft of OpenID Connect Native SSO for Mobile Apps Approved

The OpenID Foundation membership has approved the following specification as an OpenID Implementer’s Draft: OpenID Connect Native SSO for Mobile Apps 1.0. This is the first Implementer’s Draft of this specification. An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This specification is a product of the OpenID Connect Working Group.

The OpenID Foundation membership has approved the following specification as an OpenID Implementer’s Draft:

OpenID Connect Native SSO for Mobile Apps 1.0

This is the first Implementer’s Draft of this specification.

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This specification is a product of the OpenID Connect Working group.

The Implementer’s Draft is available at:

https://openid.net/specs/openid-connect-native-sso-1_0-ID1.html

The voting results were:

Approve – 54 votes
Object – 0 votes
Abstain – 10 votes

Total votes: 64 (out of 275 members = 23.3% > 20% quorum requirement)
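The quorum figure above is simple arithmetic; as a quick sanity check (plain Python, using only the vote counts stated in this announcement):

```python
# Vote counts from the announcement
approve, object_, abstain = 54, 0, 10
total_votes = approve + object_ + abstain  # 64
members = 275

turnout = total_votes / members
print(f"{turnout:.1%}")        # 23.3%
assert turnout > 0.20          # meets the 20% quorum requirement
```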

— Michael B. Jones – OpenID Foundation Board Secretary

The post Implementer’s Draft of OpenID Connect Native SSO for Mobile Apps Approved first appeared on OpenID.

Tuesday, 27. December 2022

Energy Web

Energy Web 2022 Lookback: Navigating the Storm Before the Calm

Let’s dive into the past 12 months and reflect on how Energy Web’s technology helped companies navigate the energy transition! It borders on cliche at this time of year to look back on the previous 12 months and reflect on what an eventful year it has been. But within the orbit that Energy Web occupies — namely the intersection of the global energy transition and Web3 technology — even hyperbole
Let’s dive into the past 12 months and reflect on how Energy Web’s technology helped companies navigate the energy transition!

It borders on cliche at this time of year to look back on the previous 12 months and reflect on what an eventful year it has been. But within the orbit that Energy Web occupies — namely the intersection of the global energy transition and Web3 technology — even hyperbole seems inadequate to describe what we witnessed in 2022.

From an energy and climate perspective this year was record-setting in ways both horrifying and inspiring. While the energy transition is more urgent than ever, there are encouraging signs that 2022 may well be a critical tipping point in the trajectory towards net-zero.

Meanwhile, in the Web3 and crypto world, well… you already know what an exhausting year it’s been, with headlines ranging from merely disappointing to downright infuriating.

So how did Energy Web navigate such turbulent and tumultuous times?

As anyone who’s weathered a storm well knows, outcomes are largely determined by a mix of advance preparation and sound decision making when the going gets rough. Nobody could have predicted what a mess 2022 turned out to be, but in hindsight Energy Web benefitted from both strategies.

In terms of laying a solid foundation, we’re unusually well-positioned amongst Web3 organizations to withstand the current moment thanks to our laser-focus on building solutions for a specific industry. Having collectively invested tens of thousands of hours into designing, deploying, and refining solutions with our members over the past six years, we’re crystal clear about what real-world problems we can solve and where we can (and can’t) create value in the energy transition. Thus instead of relying on speculative crypto nonsense to fuel our growth, we generate steady revenues by delivering tangible business advantages to our partners in the here and now.

When it came to in-the-moment actions, this year we made a subtle but important strategic change that will help us ride the tailwinds of accelerating clean energy investment while avoiding the headwinds that befell much of the tech/crypto industries. In short, we transitioned from offering the Energy Web Decentralized Operating System (EW-DOS) as a stack of standalone components to bundling those components into three comprehensive solutions: Green Proofs, Data Exchange, and Asset Management. With this shift not only have we found strong product-market fit, we’ve also enabled a more flexible and scalable commercial model that shares elements with other popular open-source projects.

As we take stock of the last year and look ahead to 2023, we’re in an extremely strong position to deliver against our mission even as the world around us remains chaotic. We’re not going to change course, or sit around and wait for conditions to improve. Instead, we’re going to continue on our journey, with three themes as our guide:

Theme 1: Solve Real Problems

Within the Web3 world there’s been a historical tendency to focus on attributes of technology rather than their practical implications. This was always a dubious practice, but with the recent fiascos in the crypto industry it’s now entirely unviable. Advertising a solution as “decentralized” or “self-sovereign” is meaningless unless you can articulate the ends that justify those means.

This year we took this lesson to heart. Recognizing both the capabilities as well as limitations of EW-DOS, we honed in on a few narrow use cases where our technology provides material business benefits relative to alternative solutions. Our commitment to problem-solving is reflected in both our communications and commercial strategy. Highlights from 2022 include:

- Helping distribution utilities strengthen cybersecurity while improving visibility into remote grid assets.
- Helping the aviation industry scale markets for sustainable fuel.
- Helping grid operators and aggregators coordinate the operation of virtual power plants.
- Helping electric vehicle owners simplify their charging experiences.

Theme 2: Productize Solutions

Back in 2019 we first envisioned EW-DOS as a box of Legos: interoperable building blocks that companies could mix and match to build bespoke creations. This approach has proven reasonably successful in developing minimum viable products, but it’s not scalable in production. To maximize our impact we need to make the adoption of EW solutions as transactional and repeatable as possible, while minimizing the need for us as an organization to assist with customization and implementation. In short, we need to get closer to offering curated (or even pre-assembled) Lego sets. For Energy Web this means eventually offering solutions under multiple commercial models, with fully self-hosted and managed services options available. This year we took a few small steps in this direction with the release of Energy Web RPC nodes in the AWS Marketplace, and next year we’ll be announcing much more in this space.

Theme 3: Emphasize Execution, Not Just Innovation

Innovation is, and always will be, a core part of our organizational DNA. But 2022 is the first end-of-year cycle where instead of thinking primarily about what prototypes we’ll deliver, or what technologies we want to experiment with in the new year, our focus is on executing on our roadmap to deliver three core solutions in production, at scale. Whereas in the past we measured success by our engineering output, or our partnerships, as we mature as an organization and enter a new stage in our lifecycle we will prioritize impact and adoption above all else. Looking ahead to the end-of-year report for 2023, we want headlines not to focus simply on the release of a green EV charging solution, but on how many drivers are using it; not just on partners experimenting with our technology, but operating markets with it; and not just plans to decarbonize crypto mining, but how many miners have been certified.

In sum, we’re excited about the future and can’t wait to carry our mission forward next year!

About Energy Web
Energy Web is a global non-profit accelerating the clean energy transition by developing open-source technology solutions for energy systems. Our enterprise-grade solutions improve coordination across complex energy markets, unlocking the full potential of clean, distributed energy resources for businesses, grid operators, and customers. Our solutions for enterprise asset management, data exchange, and Green Proofs, our tool for registering and tracking low-carbon products, are underpinned by the Energy Web Chain, the world’s first public blockchain tailored to the energy sector. The Energy Web ecosystem comprises leading utilities, renewable energy developers, grid operators, corporate energy buyers, automotive, IoT, and telecommunications leaders, and more. More information on Energy Web can be found at www.energyweb.org, or follow us on Twitter @EnergyWebX.

Energy Web 2022 Lookback: Navigating the Storm Before the Calm was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


Blockchain Commons

Gordian Envelope Use Cases Overview

The Gordian Envelope Smart Document is a powerful new method of data storage and transmission that focuses on protecting the privacy of its contents through cryptographic functions such as signatures, elision, and inclusion proofs.

But what does that mean? Why would you use it? To answer these questions we’ve published a set of 24 use cases that not only demonstrate many innovative uses for Gordian Envelopes, but also show precisely how those Envelopes would be structured — because these use cases aren’t theoretical, but instead real possibilities with the current iteration of the Gordian Envelope specification.

Most of the following use cases are offered progressively: additional use cases build on earlier ones, expanding the fundamental ideas with new functionality in each example. (That functionality is listed as part of each use case’s name.)
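To give a concrete feel for the core mechanism behind these use cases, here is a minimal Python sketch of digest-based elision: every assertion is hashed, the digests are combined into a single envelope digest an issuer can sign, and a holder can later withhold an assertion while still letting a verifier recompute the same signed digest. The hash choice, the flat combining scheme, and the field values are illustrative assumptions; the real Gordian Envelope specification defines its own CBOR-based digest tree.

```python
import hashlib

def digest(data: bytes) -> bytes:
    # Leaf digest for one assertion (illustrative; the actual Envelope
    # spec defines a specific CBOR encoding for each digest).
    return hashlib.sha256(data).digest()

def root_digest(assertion_digests: list[bytes]) -> bytes:
    # Combine assertion digests into a single envelope digest.
    # Sorting makes the combination order-independent in this sketch.
    return hashlib.sha256(b"".join(sorted(assertion_digests))).digest()

# Issuer builds an envelope of assertions and signs its root digest.
assertions = [b"name=Danika", b"degree=MSc", b"gpa=3.9"]
root = root_digest([digest(a) for a in assertions])

# Holder elides the GPA: reveals some assertions plus only the digests
# of the elided ones. The verifier recomputes the same root, so the
# issuer's signature over `root` still verifies.
revealed = [b"name=Danika", b"degree=MSc"]
elided_digests = [digest(b"gpa=3.9")]  # digest only, content withheld
recomputed = root_digest([digest(a) for a in revealed] + elided_digests)

assert recomputed == root
```

The same idea underlies inclusion proofs: revealing one value alongside its digest proves that the value was part of the signed envelope without exposing anything else.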

Read More

Educational & Credential Industry Use Cases

Educational use cases demonstrate how Gordian Envelope can transmit sensitive student information, including educational credentials.

Part One: Official Credentials

The first set of use cases demonstrates how recognized issuers can create and use credentials.

- Danika Proves Her Worth (Credentials, Signature) — Issuing authenticated credentials with Gordian Envelope.
- Danika Restricts Her Revelations (Elision) — Using elision to allow a holder to selectively hide Envelope contents.
- Thunder & Lightning Spotlights Danika (Third-Party Repackaging) — Adding content to an existing Envelope & republishing it.

Part Two: Web of Trust Credentials

Individuals may instead want to create peer-to-peer credentials.

- Omar Offers an Open Badge (Web of Trust Credentials) — Creating a credential based on personal authentication.

Part Three: Herd Privacy Credentials

Another possibility for credential release is through large data dumps that allow the user to stay in control over whether they’re ever revealed.

- Paul Privately Proves Proficiency (Herd Privacy) — Creating highly private credentials.
- Paul Proves Proficiency with Improved Privacy (Herd Privacy with Non-Correlation) — Using design formats to improve herd privacy.
- Burton Bank Avoids Toxicity (Herd Privacy with Selective Correlation) — Avoiding toxic data by selectively correlating unrevealed information.

Software Industry Use Cases

Software use cases demonstrate how the structure of Gordian Envelope can innovate procedures requiring signing, such as software releases.

Part One: Chained Signing

Gordian Envelopes can automate releases of data over time by creating and updating a root of trust within the Envelope.

- Casey Codifies Software Releases (Multiple Signatures, Structured Data) — Structuring release data and authenticating it with multiple signatures.
- Blockchain Everyday Confirms Casey (Repackaging Data, Third-Party Verification) — Adding additional levels of data verification by repackaging Envelopes.
- Casey Chains His Software Releases (Chained Data) — Using Envelope structure to automate the release of future data.
- Casey Checks Compliance (Attestation, Metadata) — Adding signed metadata to a structured data set.
- Casey Changes Up His Software Releases (Chained Changes) — Using structured data to announce changes in trust over time.

Part Two: Anonymous Signing

Authentication can be combined with elision to allow for signing that is pseudonymous yet validated.

- Amira Signs Anonymously (Anonymous Signature, Web of Trust) — Using a Web of Trust to verify a signature made pseudonymous through elision.
- Amira Reveals Her Identity (Progressive Trust) — Removing elision over time to gain reputation from previously published works.

Data Distribution Use Cases

Data distribution is crucial for a variety of use cases, from the supply chain to the medical industry. The use cases in this section take as an example the distribution of user data based on a WebFinger-like protocol, highlighting the advantages of building a privacy-first data structure.

Part One: Public CryptFinger

The most fundamental usage of Gordian Envelope is to publish verifiable data that is entirely public.

- Carmen Makes Basic Information Available (Structured Data) — Using Gordian Envelope to release structured data.
- Carmen Makes CryptFinger Verifiable (Signatures) — Adding authentication to allow for data portability.
- Carmen Adds Chronology to CryptFinger (Timestamp) — Easily expanding verifiable data with important metadata.

Part Two: Private CryptFinger

Building on the elision capabilities of Gordian Envelope can produce data that is not (initially) viewable by everyone, but which remains provable and ultimately releasable.
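The idea of data that is "provable but not viewable" can be sketched with salted digests: the published record contains only a digest, and the subject can later prove inclusion by revealing the value and its salt to a party of their choosing. The salt prevents third parties from brute-forcing low-entropy values (such as an email address) out of the digest. This is an illustrative sketch, not the actual CryptFinger or Envelope encoding:

```python
import hashlib
import os

def leaf(value: bytes, salt: bytes) -> bytes:
    # Salted digest: without the salt, a verifier cannot guess-and-check
    # likely values against the published digest.
    return hashlib.sha256(salt + value).digest()

salt = os.urandom(16)
# Only this digest appears in the published (elided) record.
published = leaf(b"carmen@example.com", salt)

def verify_inclusion(value: bytes, salt: bytes, digest: bytes) -> bool:
    # Inclusion proof: the revealed (value, salt) pair must reproduce
    # the published digest -- selective correlation, on Carmen's terms.
    return leaf(value, salt) == digest

assert verify_inclusion(b"carmen@example.com", salt, published)
assert not verify_inclusion(b"mallory@example.com", salt, published)
```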

- Carmen Protects CryptFinger (Elision) — Eliding data for some viewers and not others.
- Carmen Makes CryptFinger Provable (Inclusion Proof) — Using inclusion proofs to purposefully allow selective correlation.
- Carmen Makes CryptFinger Progressive (Progressive Trust) — Building a progressive trust algorithm using selective correlation.

Part Three: Herd Private CryptFinger

A herd-privacy variant of CryptFinger’s design can allow users to maintain their privacy as much as they wish, as discussed in this overview.

Part Four: Data Distribution Advancements

There are many other options for cryptographic data distribution, building on signatures, provability, repackaging, and encryption permits, as is discussed in this further overview.

Financial Industry Use Cases

Although the financial industry can make many uses of Gordian Envelopes to preserve assets, these use cases concentrate on self-sovereign control of assets: how an individual can use Gordian Envelopes to make sure he doesn’t lose them.

Part One: Self-Sovereign Storage of Secrets

Envelopes can simply and securely store digital assets.

- Sam Stores a Secret (Secure Storage with Metadata) — Using metadata in an Envelope to increase the resilience of stored assets.
- Sam is Salty about Compliance (Non-Correlation) — Salting data to eliminate correlation dangers.
- Sam Gets Paranoid about Privacy (Wrapped Encryption) — Improving privacy at the cost of resilience.

Part Two: Raising Resilience of Restricted Results

However, resilience can be further improved with Gordian Envelope permits.
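Gordian's SSKR permit is based on Shamir's Secret Sharing, which supports arbitrary m-of-n thresholds. The resilience idea can be illustrated with the simplest possible scheme, a 2-of-2 XOR split: each share alone is indistinguishable from random bytes, but both together recover the secret. This is a toy stand-in for illustration, not SSKR itself:

```python
import os

def split_2_of_2(secret: bytes) -> tuple[bytes, bytes]:
    # One share is pure randomness; the other is the secret XORed with it.
    # Either share alone reveals nothing about the secret.
    share_a = os.urandom(len(secret))
    share_b = bytes(x ^ y for x, y in zip(secret, share_a))
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    # XOR is its own inverse, so combining both shares recovers the secret.
    return bytes(x ^ y for x, y in zip(share_a, share_b))

seed = b"\x01\x02\x03\x04"  # stand-in for a wallet seed
a, b = split_2_of_2(seed)
assert combine(a, b) == seed
```

Real SSKR generalizes this so that, say, any 2 of 3 shares suffice, which is what lets Sam recover his assets even after losing a share.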

- Sam Gets Rigorous about Resilience (SSKR Permit) — Sharding keys to improve resilience.
- Sam Promotes a Partner (Multi-Permit) — Improving resilience through multiple permits.

The Common Thread of Use Cases

The first common thread among the use cases for Gordian Envelopes is the need for secure and privacy-enhancing solutions for transmitting and storing complex, sensitive data. Gordian Envelopes can help to prevent tampering or other security breaches and ensure that only verified, trustworthy data is transmitted. This can be particularly important in industries such as healthcare, finance, and government, where the verifiability of personal data is of utmost concern.

The second common thread is a focus on protecting human rights. The core value proposition of the privacy features of Gordian Envelopes cannot be overstated: through its support for selective disclosure and progressive trust, Gordian Envelopes allow for disclosure of only specific parts of the data, and only to authorized parties, ensuring that the privacy rights of individuals are respected.

The final common thread is diversity. Data stored in Gordian Envelope may be related to a wide range of industries, have cross-industry contexts, or cross international borders. It often involves diverse parties with different business models, risk models, and trust boundaries. When you have a diversity of users, requirements, and models, Gordian Envelopes can offer a flexible solution that allows different uses for different entities.


Why CBOR?


Blockchain Commons is dedicated to developing open-source technical specifications, reference implementations, and tooling that help developers solve common problems with hardware and software that need to be decentralized, secure, preserve privacy, and enhance human independence.

As part of this mission, we have decided to use the IETF CBOR (Concise Binary Object Representation) standard in our specifications, including Gordian Envelope.

We chose CBOR as our serialization format choice for several key reasons:

Read More

- Structured Binary. We required a structured, binary format. Many of the solutions we are concerned with involve cryptographic keys, signatures, and other forms of data best represented as binary. Text formats like JSON require additional encoding layers like Base-64, adding bulk and complexity, especially when you want to continue parsing down inside that data.
- Conciseness. We wanted the serialized structured data to be as concise as possible. Small structures should result in messages of no more bytes than reasonably necessary. This makes CBOR much better for us than formats such as BSON, for example, which has a surprisingly large serialization footprint, as it trades off conciseness for the ability to easily update it in place in a database.
- Self-Describing. We prefer a self-describing format. This means that the serialized data contains the associated metadata that describes its semantics. Self-describing formats can be schemaless, which makes sense in a world where both ends of a communication relationship are evolving rapidly. Like JSON, which is fundamentally schemaless, we also wanted the option to support formal schemas as needed while avoiding tying developers to specific schema processors or toolchains.
- Fully Extensible. We needed a system that is entirely extensible, with any data types desired, which ensures self-description even when you’re working with a variety of data types, such as our own listing of UR tags.
- Constraint Friendly. We needed a format that works well in constrained environments, like special-purpose embedded systems and the Internet of Things. This means the codec implementations should be straightforward and efficiently implementable in a minimum number of lines of code.
- Streaming Friendly. We were pleased to also have a system that works well with streaming, without requirements for extra memory; CBOR’s tagging system allows for data to be skipped if it is irrelevant or unknown.
- Independent. We required a format that is not closely tied to any particular hardware, software platform, or programming language.
- Standardized. We wanted a format that has had many experienced eyes on it, which means a format that has been through the standards process. The CBOR standard offers us a precise, exemplary specification and multiple reference implementations with test vectors. The fact that CBOR is an international standard through IETF also reduces resistance to adoption, increasing the likelihood of an active community of developers and projects relying on the code and tools that support the standard. CBOR is further standardized by the registration of common data types through IANA, which multiplies these benefits.
- Easy Adoption. We wanted a format that is easy to implement and is platform- and language-agnostic. CBOR offers a mature tool chain so developers can quickly adopt it regardless of their hardware, software platform, or programming language. Implementations are available in a variety of languages.
- Deterministic. Of particular importance for our Gordian Envelope specification is that the CBOR standard defines a “deterministic encoding”: a set of rules that ensure the encoded CBOR output is unequivocally unique. These rules mean that for a given semantic input, exactly one coding will always be output. This is essential for cryptographic techniques that rely on encoding the same data repeatably. A deterministic encoding avoids the need for post-processing to “canonicalize” data before it can be used in cryptographic constructs.
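The deterministic-encoding point is worth making concrete. The following minimal sketch of a CBOR encoder for a few basic types (an illustration, not a full RFC 8949 implementation) shows the two rules that matter most: integers and lengths use the shortest possible head, and map keys are sorted by their encoded bytes, so semantically equal maps always serialize to identical bytes.

```python
def enc_head(major: int, n: int) -> bytes:
    # CBOR head: 3-bit major type plus shortest-possible length encoding,
    # as required by RFC 8949 "preferred" (deterministic) serialization.
    if n < 24:
        return bytes([(major << 5) | n])
    for ai, size in ((24, 1), (25, 2), (26, 4), (27, 8)):
        if n < (1 << (8 * size)):
            return bytes([(major << 5) | ai]) + n.to_bytes(size, "big")
    raise ValueError("integer too large")

def encode(obj) -> bytes:
    if isinstance(obj, int) and obj >= 0:
        return enc_head(0, obj)                      # major 0: unsigned int
    if isinstance(obj, bytes):
        return enc_head(2, len(obj)) + obj           # major 2: byte string
    if isinstance(obj, str):
        data = obj.encode("utf-8")
        return enc_head(3, len(data)) + data         # major 3: text string
    if isinstance(obj, list):
        return enc_head(4, len(obj)) + b"".join(encode(x) for x in obj)
    if isinstance(obj, dict):
        # Deterministic CBOR: map entries sorted by encoded key bytes,
        # so the same map always serializes identically.
        items = sorted((encode(k), encode(v)) for k, v in obj.items())
        return enc_head(5, len(items)) + b"".join(k + v for k, v in items)
    raise TypeError(f"unsupported type: {type(obj)}")

# Two maps with different insertion orders produce identical bytes.
a = encode({"b": 2, "a": 1})
b = encode({"a": 1, "b": 2})
assert a == b == bytes.fromhex("a2616101616202")
```

Because the output is byte-for-byte reproducible, a hash or signature computed over it can be independently recomputed by any conforming encoder, which is exactly the property Gordian Envelope relies on.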

In summary, the choice of CBOR as the serialization format for Blockchain Commons allows for the development of technical specifications and tooling that are easily adopted by developers and can help solve common problems with decentralized, secure, private, and independent hardware and software. We’ve found that CBOR is the best choice for our needs, and we are confident that it will continue to serve our community of developers and supporters well into the future.

Also see a comparison to Protocol Buffers, a comparison to Flatbuffers, and a comparison to other binary formats and of course our video on this topic.

Friday, 23. December 2022

FIDO Alliance

Teiss: Cyber-security in 2023 


Andrew Shikiar at FIDO Alliance gives his predictions for what we can expect from the cyber-security industry in 2023. The last twelve months online haven’t been short of action, good and bad: from high-profile breaches and widespread smishing attacks to metaverse speculation and Twitter drama. Not forgetting, of course, the major news in March that saw public backing of new passwordless log-in technology (FIDO passkeys) from some of the world’s largest platform vendors. Admittedly, we were extra excited about this one… 

The post Teiss: Cyber-security in 2023  appeared first on FIDO Alliance.

Thursday, 22. December 2022

Content Authenticity Initiative

The CAI’s 2022 Year in Review

by Coleen Jose, Sr. Marketing & Engagement Manager, Content Authenticity Initiative  

As we close out this third year at the Content Authenticity Initiative (CAI), we want to spotlight the fast-growing CAI community that is leading responsible innovation and implementing the technical standard for restoring trust and transparency through digital content provenance. The urgency to address mis- and disinformation, facilitate transparency and creator attribution online has become even more critical with the speed of developments in generative artificial intelligence.  

Into 2023, we’re furthering our commitment by incorporating CAI technology into Adobe creative tools and collaborating closely with the CAI members deploying offerings in generative AI. More to come on this. 

Launching a standard and ecosystem for transparency online 

We started 2022 with the launch of technical standards and supporting guidance from the Coalition for Content Provenance and Authenticity (C2PA), an alliance that Adobe leads alongside Arm, Intel, BBC, Microsoft, Sony and Truepic. These standards are the foundation for certifying and displaying the provenance or source and history of media content. 

In June, the CAI team released a suite of open-source tools, enabling a broad developer community to integrate the C2PA technical standards across web, desktop, hardware, mobile projects and more, regardless of their level of technical depth. We’re so thrilled to see the community adoption, experimentation and feedback that are critical to progress and wide adoption. 

Since the release, the content provenance ecosystem has seen new and exciting implementations like that of Pixelstream, a provenance-based platform and system for sharing and delivering authentic media. At Adobe MAX, we announced partnerships with Leica and Nikon. The industry-leading camera manufacturers exhibited cameras—Leica's iconic M11 Rangefinder and Nikon’s flagship mirrorless Z9—with provenance technology built in. The milestone brings authenticity to digital images at the point of capture, equipping photographers and creators alike with attribution tools. 

We also announced improvements to Content Credentials in Adobe Photoshop, which we launched in 2021 to allow users to add their attribution details to their exported images. Enhanced support for a range of actions, including working with smart objects, and new global settings to keep Content Credentials on by default or at the document level bring flexibility to any creative workflow.   

At Adobe MAX this year, Scott Belsky, CPO and EVP for Creative Cloud, announced CAI milestones with Leica and Nikon.

Content provenance for audiences everywhere 

When false narratives or misleading information go viral, we often “Google” to compare coverage and cross-check available information. This common behavior is what drives Verify, the CAI website where anyone can upload an image to trace the history and edits made to a piece of digital content. This year, we updated Verify to support matching and recovery of provenance and attribution history in files exported with Content Credentials, ensuring CAI metadata remains permanently associated with your content. 

We welcomed many new CAI members this year—leading publishers and visual content providers including the Associated Press, Agencia EFE, El Tiempo, EPA Images, Reuters and the Wall Street Journal to name a few. Our growing global community of more than 860 members includes creative professionals, civil society, academics, media and technology companies implementing and promoting adoption of content authenticity standards online. Consider joining us. 

Meeting online consumers and digital creators where they are is fundamental to the CAI’s mission—realized through open-source tools and cross-industry collaboration. 

A commitment to provenance and the C2PA standard   

This year also included significant policy collaboration and developments in Europe and the United States.

In June, we reached a milestone with the European Commission’s 2022 Code of Practice on Disinformation, the first international code to specifically include commitments on provenance and the C2PA standard. The code aims to encourage adoption by signatories, which include Adobe, Google, Meta, Microsoft, TikTok and others. 

An Adobe-sponsored bill in the State of California will establish a Deepfake Working Group to study the risks and impact of digitally altered media while exploring the adoption of content provenance as a solution for identifying deepfakes. 

The CAI team also spoke at a number of policy events internationally, including a policy workshop at the Royal Society in London in September. This event was a deep dive into the provenance-technology recommendation from the Society’s report on disinformation (p. 16), where Andy Parsons, Sr. Director at the CAI, addressed an audience of roughly 50 stakeholders from academia, technology and policy alongside CAI and C2PA member the BBC. We’ll share the learnings from this workshop when they’re published in early 2023.  

"Addressing the issue of content authenticity at scale is a long-term, interdisciplinary, collaborative mission," Andy said. "And it is more essential than ever before with the arrival of mainstream generative AI. This coming year is set to bring deeper collaboration, wider adoption and new innovation in the provenance community. 2022 was a year of critical foundation-building for us and I see 2023 as the year of utility and adoption, built upon that foundation." 

We’re so excited for the year ahead, continuing to co-create an ecosystem built on trust and grounded in open technologies that enable creator attribution and digital transparency. 

Sign up for our newsletter and consider joining the CAI community.   


eSSIF-Lab

Have a chat with Ivan Basart ☕️ CTO at Validated ID (VCL)

Validated ID was born in 2012 with the purpose of offering a digital signature service with high legal robustness that is very easy to use. Drawing on their experience in different areas of work, they developed VIDsigner Bio, a handwritten electronic signature service for tablets that is very easy to integrate and, above all, legally secure.

Today, we have the pleasure of talking with Ivan Basart, Validated ID’s CTO. His professional career specializes in cryptography, digital identity and digital signature services. He is currently in charge of the technical evolution of the company’s digital signature service, VIDsigner, and the construction of its digital identity (SSI) service, VIDchain. He also participates in forums related to self-sovereign identity, such as DIF and RWOT.

1. Introduce yourself and your company, and explain what makes you different.

My name is Iván Basart and I have more than 20 years of professional experience in the field of cryptography, electronic signatures and digital identity. I develop my professional activity as CTO of Validated ID and as one of the founders of the company, my role is to handle the technical evolution of VIDsigner, our digital signature service, as well as the construction of our digital identity service (SSI), VIDchain.

We have been pioneers in the field of SSI. We started working on it in 2017, when it was at its very beginning. We have participated in various international forums and initiatives such as the DIF (we are founding members), RWOT, IIW, Sovrin, Alastria, EBSI, LACChain… I have been a regular speaker at events related to the topic, such as the European Blockchain Convention and the Global Legal Tech, and I have been a professor in the three-point blockchain master for four editions.

At Validated ID we offer security trust services for electronic signature and digital identity verification processes. We are a Qualified Trust Service Provider and issuer of digital certificates and qualified seals, in compliance with the eIDAS Regulation. Our solutions, certified under ENS, ISO and HDS, help improve processes and offer the security and confidence that both companies and individuals need. We are one of the few companies working on SSI that have in-depth experience with eIDAS and that operate as a QTSP (Qualified Trust Service Provider).

2. What services or products do you offer?

We offer solutions that help achieve a more digital and sustainable world, improve our quality of life and guarantee the privacy, rights and freedom of people.

With our portfolio of solutions: VIDsigner (the multi-channel electronic signature service); VIDchain (the digital identity service); and SP4i (the electronic invoicing service), our goal is to consolidate ourselves as technological leaders in these services, both nationally and internationally.

With VIDchain we want to solve the fundamental problem of the Internet: digital identity. It provides a service with which you can own, control and manage your own digital identity from your phone. VIDchain is based on a new universal digital identity paradigm called Self-Sovereign Identity, or SSI, and it’s built on blockchain technology. The new Web3 will not exist until we have a reliable and universal digital identity scheme that guarantees our privacy.

With the VIDsigner multi-channel electronic signature service, we seek to offer a safe and easy way to sign documents online. Our different types of electronic signatures are adapted to our customers’ needs: whether it is a handwritten signature on a tablet, a remote signature with a smartphone, or a signature with digital certificates. Our technology is based on collecting the maximum possible evidence in each signature process: from biometric information during the electronic signature process on a tablet (speed, inclination, pressure, etc.) to two-factor authentication, unique IPs, certificates and more, all from any device.

On the other hand, we have SP4i, an electronic billing system that has been developed to send and receive billing documents between companies and/or public administrations. With SP4i, companies and freelancers can create and send invoices to customers and public administrations within the European Union.

3. What milestones have you achieved so far since your project launch?

We have been working with national and international partners, such as Sovrin, DIF, Alastria, ToIP, the COVID Credentials Initiative and Good Health Pass, to establish an open and decentralized identity ecosystem accessible to everyone.

Our digital identity and attribute verification solution has been awarded first place in the Cuatrecasas and Telefónica accelerator and first place in the Alastria Open Call Cataluña project competition, run by the leading blockchain consortium in Spain.

Since we started developing VIDchain, we have participated in a large number of projects and won several awards and recognitions for our contributions:

Here is a list of our recent projects and achievements:

- Santander X Global Challenge — Winner of the Scale Up category of the Santander X Global Challenge.
- EBSI Wallet Conformance — VIDwallet is the first ID wallet to become EBSI compliant.
- Early Adopters EBSI program — We successfully designed a scenario where students in Europe can get quick and direct access to all kinds of discounts with their student cards.
- Gavius Project — As experts in SSI and EBSI, we help create solutions to facilitate communication and social benefit requests between local administrations and citizens.
- Star4big Project — We created a solution to showcase how users can request, hold and present VCs supported on different blockchains using VIDwallet.
- SportChain — We have created a decentralized ecosystem for the sports industry to elevate trust in sports data.
- eIDAS Bridge — We help ensure the legal validity of electronic documents and cross-border trust services, such as electronic signatures and seals. To make eIDAS available as a trust framework in the SSI ecosystem, the European Commission developed the eIDAS bridge.

4. What have you achieved with your idea thanks to the eSSIF-Lab project?

In the eSSIF-Lab we worked on a project called eIDAS Bridge. The idea behind this project is that although there are many credential wallets under development, and several companies like us are looking forward to this prominent paradigm, the reality is that the legal framework is still not fully mature. Currently, we have the eIDAS regulation, mostly focused on traditional PKIs and certificates. In June 2021, the EC approved a new draft of this regulation that states that the new identities of European citizens will be based on the SSI principles and backed by identity wallets. However, this regulation still needs to be formally approved and developed. In a nutshell, there is still not a clear trust framework. Therefore, the eIDAS bridge has been raised as an in-between step.

We started working on the eIDAS Bridge project as an initiative by the European Commission (EC) within the ISA2 program, where Validated ID participated as a subject-matter expert in PKI and SSI. The EC developed the eIDAS Bridge to promote eIDAS as a trust framework for the SSI ecosystem. In a nutshell, this project aims to provide a solution to one of the most urgent challenges SSI faces: having a trust framework to rely on. The results of this project, i.e. the technical specifications, integration guidelines and legal reports produced, can be found here.

Sometime later, thanks to the eSSIF Lab, we had the opportunity to evolve the original idea and project developed in the ISA2 program. The main achievement during the eSSIF Lab has been to provide an implementation of eIDAS bridge and to prove the interoperability between different provider implementations. The results of this project are available as open source.

5. What are your goals for the middle/long-time future?

As a Trust Service Provider, we are building the services that will allow us to position ourselves as one of the relevant players in the new eIDAS 2.0 scenario. We are participating in the ETSI ESI groups in order to get ready for the upcoming standards in this space.

We try to deliver an easy-to-consume service that packages the complexity on the technology behind SSI, that supports the evolving standards in this space and that fulfils the requirements of the upcoming regulation.

We already have an ID Wallet (EBSI compliant). Most of the existing ID wallets have been made by and for tech users, but eventually everybody, including older people and those without a technical background, should be able to use a wallet. UX is often forgotten but very important. We are working on a full redesign in order to deliver a UX that makes ID wallets easy to use and understand for any user.

6. Any piece of advice for those who are looking for public funding?

It is very important to properly select the program to which you are applying. Beyond the money, the program needs to be aligned with the objectives of your company; otherwise you risk losing focus on your company's core activities. Think strategically rather than tactically.

Be resilient, getting public funding is not easy and many people are competing for it. Take advantage of the feedback you get from the rejections in order to improve the proposals for resubmission.

The post Have a chat with Ivan Basart ☕️ CTO at Validated ID (VCL) first appeared on eSSIF-Lab.

Monday, 19. December 2022

OpenID

FAPI 2.0 – Announcing New Drafts and Security Analysis

The OpenID Foundation’s FAPI working group is pleased to announce the public review period has started for new Implementer’s Drafts of the FAPI 2.0 Security Profile and the FAPI 2.0 Attacker Model. These drafts coincide with the recently completed formal security analysis of the FAPI 2.0 specifications, the result of a first-of-its-kind collaboration between security […] The post FAPI 2.0 – Announcing New Drafts and Security Analysis first appeared on OpenID.

The OpenID Foundation’s FAPI working group is pleased to announce the public review period has started for new Implementer’s Drafts of the FAPI 2.0 Security Profile and the FAPI 2.0 Attacker Model. These drafts coincide with the recently completed formal security analysis of the FAPI 2.0 specifications, the result of a first-of-its-kind collaboration between security researchers at the University of Stuttgart and the OpenID Foundation in the area of web protocols, work co-funded by the Australian government.


Why FAPI 2.0?

The FAPI 1.0 standards have been widely implemented and the working group has gained valuable insight from ecosystems, vendors and developers. The FAPI 2.0 suite of standards builds on this insight and wider learnings from the OAuth ecosystem including the latest OAuth Security Best Current Practice.

FAPI 2.0 aims to meet and exceed the security characteristics of FAPI 1.0 while reducing the overall complexity and optionality of the core security profile. This will make FAPI 2.0 easier and more cost-effective to implement and will ensure interoperability across ecosystems. Add-on specs such as Grant Management, Message Signing and CIBA provide ecosystems with additional features where required. The attacker model of FAPI 2.0 makes the standard more amenable to formal security analysis and helps to delineate security boundaries, enabling implementers to better understand the security FAPI 2.0 provides.


Why a Formal Security Analysis?

The standardization process in the OpenID Foundation ensures a comprehensive review of standards under development from experts both at the OpenID Foundation and external organizations. Nonetheless, complex attacks and subtle problems can evade scrutiny; therefore, additional safeguards are required to ensure that protocols are secure even under adverse conditions.

Formal methods allow for a rigorous and systematic in-depth analysis of standards and have proven to be a useful tool to ensure the security of protocols, famously demonstrated during the development of TLS 1.3. While the current methods for formal security analysis of web protocols require highly specialized knowledge, they are the best tool for uncovering vulnerabilities rooted in the logic of the protocols and can even discover previously unknown types of attacks. Conversely, formal proofs of security can exclude large classes of attacks. In OAuth 2.0, the main building block for OpenID Connect and FAPI, new attacks were found and fixed using formal analysis although its security had been studied extensively before, demonstrating the power of formal security analysis.

The OpenID Foundation’s FAPI 1.0 underwent formal security analysis by a team of researchers at the University of Stuttgart. This analysis uncovered several potential attack vectors that the FAPI working group were able to either mitigate or document.

With co-funding from the Australian Government and the OpenID Foundation, the FAPI working group was able to commission the formal analysis of FAPI 2.0 by the same team at the University of Stuttgart.

The analysis of FAPI 2.0 took place over the summer of 2022 and has now been published. This marks the first time that a detailed formal security analysis has directly accompanied the development of a new web authentication/authorization standard from the very beginning.


Results of the Security Analysis

The researchers at the University of Stuttgart, Institute of Information Security led by Prof. Ralf Küsters, Pedram Hosseyni, and Tim Würtele were able to prove the security properties of the FAPI 2.0 Security Profile (formerly known as FAPI 2.0 Baseline). This is a great result and should give implementers of FAPI 2.0 further confidence in the security benefits of implementing the specifications.

As part of the analysis, the FAPI working group worked with the research team to further refine the FAPI 2.0 Attacker Model and the FAPI 2.0 Security Profile.

There is a good summary of these changes in the formal security analysis.


Attacks and Mitigations

The security analysis uncovered a few potential attacks that are now dealt with in the FAPI 2.0 Security Profile.

Some of these attacks are rooted in the foundations of how the web works and are impossible to fully prevent with existing technology; they are applicable to all redirect-based authentication and authorization protocols. Since the attacks were already known from FAPI 1.0 and other protocols, it was expected that they would come up during the detailed analysis. Nonetheless, to provide adopters of FAPI 2.0 with all information required to make the best security decisions, the attacks are now described in the security considerations section of the FAPI 2.0 Security Profile. Here are a few examples:

Cuckoo’s Token Attack (Injection of stolen access tokens)

This is a theoretical attack where an attacker has managed to steal a valid access token and gain control of an authorization server trusted by both a client and the target resource server. This is a very high bar, but it is a theoretical possibility. FAPI 2.0 requires sender-constrained access tokens, which is a huge improvement over most OAuth 2.0 based deployments that are currently live. If tokens are not sender-constrained, this attack only requires a stolen access token and is much simpler. Essentially, the formal model has shown that even with sender-constrained access tokens there are some scenarios where a sender-constrained token could be used by an attacker, if the attacker is able to control an authorization server trusted by the client. In many FAPI ecosystems the preconditions for this attack are such that it is all but impossible. The FAPI 2.0 Security Profile details three possible mitigations for this attack if an ecosystem decides it is necessary to defend against it.
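To illustrate what sender-constraining buys in practice, here is a minimal sketch, not taken from the FAPI specifications themselves, of a resource server checking that a DPoP-style access token is bound to the presenting client's public key via an RFC 7638 JWK thumbprint. The function names and claim layout are illustrative; a real deployment would also verify the token's signature and the DPoP proof itself.

```python
import base64
import hashlib
import hmac
import json

def jwk_thumbprint(jwk: dict) -> str:
    """RFC 7638 thumbprint: SHA-256 of the canonical JSON containing only the
    key type's required members, with names in lexicographic order."""
    required = {"EC": ("crv", "kty", "x", "y"),
                "RSA": ("e", "kty", "n"),
                "OKP": ("crv", "kty", "x")}
    members = {name: jwk[name] for name in required[jwk["kty"]]}
    canonical = json.dumps(members, separators=(",", ":"), sort_keys=True).encode()
    digest = hashlib.sha256(canonical).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

def token_is_bound_to_client(token_claims: dict, client_jwk: dict) -> bool:
    """Accept the access token only if its cnf.jkt confirmation claim matches
    the thumbprint of the key the client proved possession of."""
    expected = token_claims.get("cnf", {}).get("jkt")
    return expected is not None and hmac.compare_digest(
        expected, jwk_thumbprint(client_jwk)
    )
```

A stolen token replayed by an attacker without the client's private key fails this check, which is why the Cuckoo's Token Attack additionally requires the attacker to control a trusted authorization server.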

Authorization Request Leaks that lead to CSRF

This is an attack where, via a CSRF vulnerability, an attacker can break session integrity and engineer a situation where an honest user is tricked into believing they have accessed their own account, while in reality they have accessed an attacker’s account. In some circumstances this is dangerous; for example, a user could end up uploading sensitive data to the attacker’s account. All redirect-based flows are vulnerable to this type of attack, and the FAPI 2.0 Security Profile details three possible mitigations to it.
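One classic session-integrity defense in redirect-based flows is binding a one-time `state` value to the browser session before the redirect and checking it on the callback. The sketch below is illustrative only (the session store and function names are assumptions, not from the FAPI profile), but it shows the shape of the check:

```python
import secrets

# Stand-in for server-side session storage; illustrative only.
sessions: dict[str, dict] = {}

def start_authorization(session_id: str) -> str:
    """Before redirecting to the authorization server, bind a fresh one-time
    state value to this browser session and return it for use as the
    `state` parameter of the authorization request."""
    state = secrets.token_urlsafe(32)
    sessions.setdefault(session_id, {})["oauth_state"] = state
    return state

def handle_callback(session_id: str, returned_state: str) -> bool:
    """On the redirect back, accept the authorization response only if the
    returned state matches the one bound to this session; consume it so it
    cannot be replayed."""
    expected = sessions.get(session_id, {}).pop("oauth_state", None)
    return expected is not None and secrets.compare_digest(expected, returned_state)
```

An attacker-initiated callback carries either no state or a state bound to the attacker's own session, so the honest user's session rejects it.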

Browser-Swapping Attacks

All redirect-based flows are vulnerable to this particular attack. Again, it has unlikely preconditions: the attacker has to trick a user into following a link, and then be able to capture the authorization code issued to that user following successful authentication. FAPI 2.0 details possible mitigation strategies for this attack.

 

Conclusion

The FAPI working group is committed to helping international ecosystems deliver secure APIs. The FAPI 2.0 Security Profile is an important resource and we encourage implementers to consider adopting it.

The first-of-its-kind collaboration with researchers and the formal security analysis help to make sure FAPI 2.0 is highly secure and its properties well-understood and documented.

To ensure that implementations are secure and interoperable, the FAPI 2.0 specifications will soon have a comprehensive set of open-source conformance tests and a low-cost, flexible certification program. The OpenID Foundation and FAPI WG strongly encourage all implementers of FAPI 2.0 to pursue certification to ensure their implementations and communities benefit fully from the security and interoperability inherent to the FAPI 2.0 protocols, and whenever possible, to mandate ongoing conformance to ensure the ongoing benefits. The OpenID Foundation supports direct self-certification or will license third-party entities to perform certification, as a service to all entities that select FAPI for their Open Banking, Open Finance and Open Data implementations.

The FAPI working group is free to attend, and membership in the Foundation is not required but encouraged. Working group contributors are required to accept the OpenID IPR Policy by signing a Contribution Agreement.

The post FAPI 2.0 – Announcing New Drafts and Security Analysis first appeared on OpenID.

FIDO Alliance

Webinar: Making FIDO Deployments Accessible to Users with Disabilities

In achieving FIDO Alliance’s mission of more secure and password-free authentication, we must ensure the needs and preferences of people with disabilities – an estimated 15% of the world’s population […] The post Webinar: Making FIDO Deployments Accessible to Users with Disabilities appeared first on FIDO Alliance.

In achieving FIDO Alliance’s mission of more secure and password-free authentication, we must ensure the needs and preferences of people with disabilities – an estimated 15% of the world’s population – are taken into account. 

During this webinar accessibility experts from FIDO Alliance board member companies Meta and VMware discussed how to make your FIDO deployment accessible to users with a wide range of disabilities. 

View the presentation.

Speakers:

Yao Ding, Accessibility Research Lead, Meta
Joyce Oshita, Accessibility Test Engineer, VMware

The post Webinar: Making FIDO Deployments Accessible to Users with Disabilities appeared first on FIDO Alliance.


Webinar: Making FIDO Deployments Accessible to Users with Disabilities

In achieving FIDO Alliance’s mission of more secure and password-free authentication, we must ensure the needs and preferences of people with disabilities – an estimated 15% of the world’s population […] The post Webinar: Making FIDO Deployments Accessible to Users with Disabilities appeared first on FIDO Alliance.

In achieving FIDO Alliance’s mission of more secure and password-free authentication, we must ensure the needs and preferences of people with disabilities – an estimated 15% of the world’s population – are taken into account. 

During this webinar accessibility experts from FIDO Alliance board member companies Meta and VMware discussed how to make your FIDO deployment accessible to users with a wide range of disabilities. 

Watch the video.

Speakers:

Yao Ding, Accessibility Research Lead, Meta
Joyce Oshita, Accessibility Test Engineer, VMware

The post Webinar: Making FIDO Deployments Accessible to Users with Disabilities appeared first on FIDO Alliance.


Videos: FIDO Alliance Public Seminar in Korea

On December 6, 2022, the FIDO Alliance Public Seminar in Korea was held at the SK Telecom Pangyo Office. This seminar provided global updates, in-depth training on passkeys, and the […] The post Videos: FIDO Alliance Public Seminar in Korea appeared first on FIDO Alliance.

On December 6, 2022, the FIDO Alliance Public Seminar in Korea was held at the SK Telecom Pangyo Office. This seminar provided global updates, in-depth training on passkeys, and the latest local FIDO deployment case studies.

You can access the recorded sessions by visiting our YouTube channel.

The post Videos: FIDO Alliance Public Seminar in Korea appeared first on FIDO Alliance.


MyData

Fair data use enables better smart city services for residents

Digitalisation is changing the way citizens connect and engage with service providers. The goal is to deliver smoother services for everyday life, but implementing such services requires new ways of data management. Read more from the MyData post.
Digitalisation is changing the way citizens connect and engage with service providers. The goal is to deliver smoother services for everyday life, but implementing such services requires new ways of data management. Read more from the MyData post.

Wednesday, 14. December 2022

Energy Web

Energy Web joins Hyperledger Foundation to Accelerate Web 3 Interoperability

By joining Hyperledger Foundation, Energy Web aims to expand its existing open source community and directly contribute to ongoing open source projects hosted by the Linux Foundation. Zug, Switzerland, 14 December 2022 — Energy Web, a non-profit building open-source technology solutions for energy systems, has today joined Hyperledger Foundation, a project of the Linux Foundation. […]
By joining Hyperledger Foundation, Energy Web aims to expand its existing open source community and directly contribute to ongoing open source projects hosted by the Linux Foundation.

Zug, Switzerland, 14 December 2022 — Energy Web, a non-profit building open-source technology solutions for energy systems, has today joined Hyperledger Foundation, a project of the Linux Foundation. By joining Hyperledger Foundation, Energy Web aims to expand its existing open source community and directly contribute to ongoing open source projects hosted by the Linux Foundation.

Specifically, Energy Web will contribute towards enterprise interoperability of Web 3 solutions via Hyperledger FireFly. Interoperability between different blockchains and identity solutions has emerged as a top priority for Energy Web as the organisation begins constructing solutions that leverage multiple technologies from different enterprise and Web 3 ecosystems. FireFly, and more generally collaboration with the Hyperledger community, represents a unique opportunity for Energy Web’s technology to be more tightly coupled with other innovations taking place in the Web 3 space.

“Our team has been monitoring the growth of Hyperledger Foundation and, in particular, Hyperledger FireFly for some time,” said Mani Hagh Sefat, Chief Technology Officer of Energy Web Foundation. “We couldn’t be more excited to finally jump into the initiative and contribute directly to it as our stack evolves to better integrate with multiple legacy and Web 3 technologies.”
“Interoperability is critical to maximizing the value of Web3 and other blockchain and related applications,” said Daniela Barbosa, Executive Director, Hyperledger Foundation, and General Manager Blockchain and Identity at the Linux Foundation. “With its open source tech stack, Energy Web is already helping to advance industry-wide collaboration and support a diverse and decentralized ecosystem. We welcome Energy Web as a member and contributor and look forward to the innovation and effort they will bring to our community, technologies and the next wave of Hyperledger-powered applications.”

Energy Web recently announced its intention to launch a new blockchain, Energy Web X, also aimed at increasing interoperability of Web 3 identity solutions.

About Energy Web

Energy Web is a global non-profit accelerating the clean energy transition by developing open-source technology solutions for energy systems. Our enterprise-grade solutions improve coordination across complex energy markets, unlocking the full potential of clean, distributed energy resources for businesses, grid operators, and customers. Our solutions for enterprise asset management, data exchange, and Green Proofs, our tool for registering and tracking low-carbon products, are underpinned by the Energy Web Chain, the world’s first public blockchain tailored to the energy sector. The Energy Web ecosystem comprises leading utilities, renewable energy developers, grid operators, corporate energy buyers, automotive, IoT, telecommunications leaders, and more. More information on Energy Web can be found at www.energyweb.org or follow us on Twitter @EnergyWebX

Energy Web joins Hyperledger Foundation to Accelerate Web 3 Interoperability was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 13. December 2022

FIDO Alliance

White Paper: FIDO for e-Government Services

The global COVID-19 pandemic closed offices and forced governments to rapidly move services online, if they weren’t already, to serve its citizens. Although usernames and passwords are easy to deploy […] The post White Paper: FIDO for e-Government Services appeared first on FIDO Alliance.

The global COVID-19 pandemic closed offices and forced governments to rapidly move services online, if they weren’t already, to serve their citizens. Although usernames and passwords are easy to deploy and easy for citizens to use, they leave systems and users vulnerable to cyberattacks. They are especially vulnerable to phishing attacks designed to steal login credentials and compromise legacy multi-factor authentication (MFA) tools like those using one-time passwords (OTP) and push notifications. With phishing attacks on the rise, it is imperative for governments to support “phishing-resistant” MFA technology that is also accessible, efficient, and cost-effective.

Enterprises and governments around the globe are turning to modern online authentication solutions featuring FIDO specifications based on public key cryptography. Governments and industries have embraced FIDO as the preferred way to deliver high-assurance MFA to consumers. Notably, the Cybersecurity & Infrastructure Security Agency (CISA), a component of the U.S. Department of Homeland Security (DHS), refers to FIDO security keys as the gold standard of MFA.

Several governments globally have deployed and/or supported FIDO authentication for citizens to securely conduct government transactions, including making tax payments and applying for and accessing government benefits. Governments leveraging FIDO authentication solutions have realized reduced operational costs and increased consumer satisfaction.

This white paper provides guidance for policymakers and department/agency heads seeking to learn about FIDO authentication to support or deploy FIDO for e-government services.

The post White Paper: FIDO for e-Government Services appeared first on FIDO Alliance.


Blockchain Commons

Musings of a Trust Architect: Progressive Trust

A New Approach to Building Trust in Decentralized Systems by Christopher Allen Musings of a Trust Architect is a series of articles by Life with Alacrity author and Blockchain Commons founder Christopher Allen that lays out some of the foundational ideas and philosophies behind the technology of Blockchain Commons. ABSTRACT: Progressive trust is the concept of gradually building trust over time. […]
A New Approach to Building Trust in Decentralized Systems by Christopher Allen

Musings of a Trust Architect is a series of articles by Life with Alacrity author and Blockchain Commons founder Christopher Allen that lays out some of the foundational ideas and philosophies behind the technology of Blockchain Commons.

ABSTRACT: Progressive trust is the concept of gradually building trust over time. It differs from classical models of trust, which rely on authentication mechanisms and centralization; and zero-trust models, which assume that trust should never be relied upon or which mandate the use of “trust frameworks” or “trust registries”. Progressive trust is based on the idea that trust is not a binary state but instead a dynamic and evolving process involving gradually learning about your partners through successful interactions. It’s how trust works in the real world, between people and groups. This architecture is critical for protecting human rights and dignity, as it allows individuals to defend against coercion and violations of their privacy, autonomy, agency, and control. To support a progressive trust architecture, capabilities such as data minimization, elision/redaction, escrowed encryption, and cryptographic selective disclosure must be used.

Read More

I first wrote about progressive trust back in 2004¹. There, I suggested that software should “better serve our needs and the needs of the groups that we are involved in” and that to do so, “we need to figure out how to apply an understanding of how human groups behave and work.” Recently, that concept has finally been gaining traction: in the world of decentralized systems, it can be used as a new approach for modeling and building trust.

The basic idea behind progressive trust is to model how trust works in the real world, between real people, groups, and businesses, rather than solely relying on mathematical or cryptographic trust. To explain this concept, I often use the example of meeting someone at a conference.

When we meet at a conference, we spend time listening to and understanding each other. We do this because being at the same conference has allowed us to create a simple credential for each other: that we are both interested in the same things. We continue to talk and exchange information, such as people we know in common, shared interests, and meaningful ideas. As our conversation progresses, we may unconsciously check to see if others are listening and adjust our conversation or location accordingly.

If we decide to meet again to continue our discussion, we may authenticate some of the credentials we have been given to ensure that we are comfortable with the level of trust we have in each other. As our collaboration grows, we may seek out more credentials and endorsements and engage in more tests of our mutual trust. Eventually, we may bring in third parties to witness and enforce our mutual obligations.

This is the way human trust works. It is also similar to how groups function and how businesses manage risk. It allows us to build trust and collaboration in a more natural and effective way. A progressive trust architecture thus permits us to support this gradual building of trust over time algorithmically.

The traditional algorithmic mechanism for building trust is to verify every interaction or transaction as “trusted.” This is often done through authentication mechanisms such as passwords or digital certificates and/or by identifying interactions as being inside a trusted firewall or VPN. However, these mechanisms can be easily compromised and do not adequately capture the dynamic and evolving nature of trust between people and groups.

More modern zero-trust architectures instead assume that trust should never be relied upon by any system or network. They mandate the use of a centralized third party, or more recently a “trust framework”² or “trust registry”³, to manage and enforce the trust relationships. All parties must consult that registry to determine who is a trusted issuer.

While this approach has the advantage of removing the need for individual parties to trust each other, it has numerous shortcomings:

- Trust registries create new risks of centralization and vulnerability to coercion.
- Trust registries may not be able to capture the dynamics of trust-building over time, which can be vital to building trust in complex or evolving systems.
- Trust registries can become outdated or irrelevant as requirements and details change for each party, resulting in gaps that make it difficult to determine the authenticity and reliability of new data with a privacy-breaking “phone home.”
- Trust registries must rely on a third party to hold and update the registry. This highlights some of the flaws of centralization, such as the trust registry not treating the risks of all parties equally, focusing on mitigating the risks of those parties with more power to influence the registry, or creating a dependence that is likely to be expensive and only benefits the few.

The problems with trust registries highlight the importance of using architectures that support the autonomy and agency of all parties. Progressive trust offers this alternative through its model of how trust is built and maintained in the real world. It is based on the idea that trust is not a binary state but rather a dynamic and evolving process. As a result, trust is built gradually over time through a series of interactions and transactions that allow parties to test and verify each other’s credentials and capabilities.

This progressive trust architecture protects human rights and dignity in a way that traditional models do not. Unlike traditional models of trust, progressive trust instead focuses on the choices of each party, allowing individuals to defend against coercion, financial and data loss, and violations of their privacy and authority. With progressive trust, individuals can do all of this, as the architecture is designed to support the autonomy and agency of each party.

Specific technical capabilities must be in place to support a progressive trust architecture. These include data minimization, elision/redaction, escrowed encryption, and various cryptographic selective disclosure techniques. Data minimization, for example, involves limiting the amount of shared data to the minimum necessary to protect privacy and reduce the risk of data loss or harm. Elision/redaction allows parties to decide what information to share, removing or masking portions to give them greater control over managing their own risks. Cryptographic selective disclosure enables parties to prevent future data correlation, and escrowed encryption allows information and promises to be enforced in the future. Fundamentally, these are all techniques that we use in real-life when progressively increasing trust with someone else; they just need to be modeled in digital space.
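To make the elision and selective-disclosure capabilities concrete, here is a minimal salted-hash sketch in the style of SD-JWT-like schemes. This is an illustration of the general technique, not Blockchain Commons' actual implementation, and all names are hypothetical: an issuer commits to each claim with a per-claim salt, the holder later reveals only a chosen subset, and a verifier checks the revealed claims against the commitments without learning the elided ones.

```python
import hashlib
import json
import secrets

def commit(claims: dict) -> tuple[dict, dict]:
    """Issuer side: commit to each claim under a fresh random salt, so the
    holder can later disclose any subset without revealing the rest."""
    salts = {name: secrets.token_hex(16) for name in claims}
    digests = {
        name: hashlib.sha256(
            f"{salts[name]}|{name}|{json.dumps(value)}".encode()
        ).hexdigest()
        for name, value in claims.items()
    }
    return digests, salts  # in practice the digests would be signed by the issuer

def disclose(claims: dict, salts: dict, reveal: list) -> dict:
    """Holder side: share only the chosen claims, plus their salts."""
    return {name: (claims[name], salts[name]) for name in reveal}

def verify(digests: dict, disclosed: dict) -> bool:
    """Verifier side: recompute each revealed claim's digest and check it
    against the issuer's commitment; elided claims stay hidden."""
    return all(
        hashlib.sha256(f"{salt}|{name}|{json.dumps(value)}".encode()).hexdigest()
        == digests.get(name)
        for name, (value, salt) in disclosed.items()
    )
```

The salts prevent a verifier from brute-forcing elided claims from their digests, which is the data-minimization property the essay describes.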

In terms of design principles, progressive trust architectures must be flexible and scalable to adapt to changing trust requirements over time. This means leveraging modular and atomic credentials, claims, and proofs that can be combined and updated as needed. It also requires using various cryptographic tools such as cryptographic inclusion proofs and zero-knowledge protocols and leveraging data models that express how the various sub-credentials are connected, allow for gaps, and minimize undue correlation.

One of the key challenges in building a progressive trust architecture is that trust is not binary; instead, it includes more gray areas. Trust is also not universal: each party will have a different view about it. For example, in a decentralized system, not all parties may be willing or able to share sensitive information, leading to gaps in the data. Additionally, as new information is added and existing data gaps are updated, the level of trust may change, making it difficult to evaluate the authenticity of a piece of information with certainty. These challenges require verifiers to understand their own business risks and choices and apply them instead of relying on third parties to make these decisions.

Ultimately, mechanisms must be put in place to ensure that the progressive trust architecture is secure and resilient. Robust software security development techniques and practices must be used, including requiring proper cryptographic security reviews and developing decentralized governance models that can support the decision-making processes necessary for building and maintaining trust in the system.

Progressive trust offers an important new alternative to traditional and zero-trust models for online trust. By understanding and applying the principles of how human trust and collaboration work in real life, we can build trust and collaboration more naturally and effectively. Progressive trust supports each party’s autonomy and agency and allows trust to evolve and adapt to changing requirements and information. As more organizations and individuals begin to explore decentralized systems, progressive trust architecture usage will only continue to grow.

1. Allen, Christopher (2004). Progressive Trust. [online] Life With Alacrity (blog). Available at: http://www.lifewithalacrity.com/2004/08/progressive_tru.html [Accessed 8 December 2022]

2. MATTR (2021). Trust Frameworks. MATTR Inc. (website). Available at: https://learn.mattr.global/docs/concepts/trust-frameworks [Accessed 8 December 2022]

3. Johnson, Anna (2022). Trinsic Basics: What Is a Trust Registry? Trinsic Inc. (website). Available at: https://trinsic.id/trinsic-basics-what-is-a-trust-registry/ [Accessed 8 December 2022]

Monday, 12. December 2022

eSSIF-Lab

eSSIF-Lab Final Event: leveraging self-sovereign identities around Europe

eSSIF-Lab, an EU-funded Research and Innovation project about self-sovereign identities, celebrated its final event last Thursday, December 1st, at La Tricoterie in Brussels, bringing together over 80 SSI and NGI experts and many more via an online livestream.

eSSIF-Lab, an EU-funded Research and Innovation project about self-sovereign identities, celebrated its final event last Thursday, December 1st, at La Tricoterie in Brussels, bringing together over 80 SSI and NGI experts and many more via an online livestream.

The project was designed to advance the broad uptake of Self-Sovereign Identities (SSI) as a next-generation, open and trusted digital identity solution for faster and safer electronic transactions via the Internet and in real life.

Digital initiatives are becoming critical to ensure the safety and security of citizens online, as the Internet is an essential tool in our daily lives – Jorge Gasos (eSSIF-Lab Programme Officer)

 

The main mission of the event was to highlight and showcase eSSIF-Lab’s beneficiaries and the project’s main achievements since its launch in 2019. It was the perfect opportunity to empower, leverage and consolidate the SSI ecosystem in Europe.

The conference started with an introduction to the Next Generation Internet (NGI) concept by Jorge Gasos, eSSIF-Lab Programme Officer, who explained what the initiative has achieved so far, its values, the technologies within its scope, the types of funding subgrantees can benefit from in each NGI project, and the role and profile of the innovators shaping the Next Generation Internet ecosystem.

After Jorge Gasos’ presentation, Oskar van Deventer, eSSIF-Lab Scientific Coordinator, talked about the actions and milestones through which the eSSIF-Lab project helped consolidate and advance the SSI ecosystem in Europe. He also presented some of the best SSI solutions born out of the project and congratulated their teams on their achievements.

We don’t want to create unicorns, instead, we want to create a European ecosystem of startups working together to create a standard beneficial for all – Oskar van Deventer (eSSIF-Lab Scientific Coordinator)

The attendees had the chance to listen to the event’s keynote speaker Anil John, Technical Director of the Silicon Valley Innovation Program (USA), who spoke about how Blockchain Research & Development (R&D) efforts can transform into decentralized identity deployments.

We share the same approaches to ensure interoperability and solutions, and we also want to make sure that the public funding is used wisely and professionally – Anil John

After these fruitful conversations, the audience saw nine top beneficiaries of the eSSIF-Lab project pitching their solutions and sharing how their technologies can improve the lives of European citizens. They were: Andrea Danielli from Amlet, Paul Knowles from Human Colossus Foundation, Minh Triet Le from Zenlife, Nick Meyne from Resonate, Marie-Laëtitia Brunet from BCdiploma, Irene Hernández from Gataca, Yanis Kyriakides from eOrigin, and Caspar Roelofs from Gimly.

A reflection on how the SSI shapes the Internet of Humans

During the round table, Joao Rodrigues and Maxime Lemm from the EU Commission Directorate-General for Informatics (DIGIT), Daniel Du Seuil, Convenor of the European Self Sovereign Identity Framework (EBSI), and Maya Madrid, eIDAS representative, discussed the pillars of European Digital Identity and how the European Commission will promote and encourage the use of self-sovereign identities in Europe.

“We want to be a part of #web3 that is fair, privacy-preserving and decentralised” – Joao Rodrigues (@EU_Commission)

As a final touch, Kaliya Young, known as Identity Woman, led the award ceremony, praising the programme’s achievements and the European Commission’s cascade funding modality. Gataca and Systems Integration Solutions were the winners of the Best SSI Infrastructure Extension eSSIF-Lab and Best SSI Business Solution eSSIF-Lab awards, a recognition of their exceptional performance during the project.

The event was the perfect occasion to present eSSIF-Lab’s beneficiaries and results to the world and to leverage the European SSI ecosystem, and attendees had the chance to listen to and engage with some of the sector’s top European and American experts and thought leaders.

The post eSSIF-Lab Final Event: leveraging self-sovereign identities around Europe first appeared on eSSIF-Lab.

Friday, 09. December 2022

FIDO Alliance

heise: Passkeys: What makes them tick and how they work

Passwords: previous attempts to replace them have consistently failed. The Fast Identity Online Alliance (FIDO) and the World Wide Web Consortium W3C made a promising push in 2018 with the […] The post heise: Passkeys: What makes them tick and how they work appeared first on FIDO Alliance.

Passwords: previous attempts to replace them have consistently failed. The Fast Identity Online Alliance (FIDO) and the World Wide Web Consortium W3C made a promising push in 2018 with the FIDO2 standard.

The post heise: Passkeys: What makes them tick and how they work appeared first on FIDO Alliance.


Tech Target: 4 key elements of successful IoT device onboarding

FIDO: New help on the way: In 2021, the FIDO Alliance developed an open standard called the FIDO Device Onboard (FDO) protocol. FDO will make it easier to connect IoT […] The post Tech Target: 4 key elements of successful IoT device onboarding appeared first on FIDO Alliance.

FIDO: New help on the way:

In 2021, the FIDO Alliance developed an open standard called the FIDO Device Onboard (FDO) protocol. FDO will make it easier to connect IoT devices to cloud-based and on-premises device management platforms. The FDO protocol can autonomously onboard an IoT device without an admin knowing how to configure or access the underlying network and Internet infrastructure the device runs on. The updated protocol enables IoT device onboarding without the need for a password while maintaining the network’s security and governance requirements.

The post Tech Target: 4 key elements of successful IoT device onboarding appeared first on FIDO Alliance.


ZDNet France: What if we replaced the “security by obscurity”, dependent on passwords, with “security by community”?

Security by obscurity is an outdated approach, not suited to today’s cyberattacks, nor to today’s digital world, according to Andrew Shikiar, Executive Director and CMO of the FIDO Alliance. The post ZDNet France: What if we replaced the “security by obscurity”, dependent on passwords, with “security by community”? appeared first on FIDO Alliance.

Security by obscurity is an outdated approach, not suited to today’s cyberattacks, nor to today’s digital world, according to Andrew Shikiar, Executive Director and CMO of the FIDO Alliance.

The post ZDNet France: What if we replaced the “security by obscurity”, dependent on passwords, with “security by community”? appeared first on FIDO Alliance.


Origin Trail

Unfolding supply chains with interoperability and decentralisation

Thanks to authors: Ken Lyon, Global expert on logistics and transportation and Trace Labs advisor, Jurij Skornik, Trace Labs’ General Manager Well-oiled supply chains have always been the key enabler of successful international trade. We have seen a notable rise in their prominence and importance in the last few years, driven by a significant increase in consumer demand and events that expos

Thanks to authors:
Ken Lyon, Global expert on logistics and transportation and Trace Labs advisor,
Jurij Skornik, Trace Labs’ General Manager

Well-oiled supply chains have always been the key enabler of successful international trade. We have seen a notable rise in their prominence and importance in the last few years, driven by a significant increase in consumer demand and by events that exposed how fragile these business networks are, such as the Ever Given’s blockage of the Suez Canal. Understanding the ins and outs of supply chains is critical to tackling such challenges through successful deployment of innovative technology, improving not only business outcomes but also the sustainability of supply chains in the broadest sense. In this article we deep dive into the complexities of global supply chains, look at how state-of-the-art technology such as OriginTrail can be a driver of change, and examine an industry example of a tech-driven initiative that recently closed down, TradeLens.

Supply chains as networks

Supply chains are, by definition, interconnected groups of organisations that have agreed to perform a series of tasks which usually results in the delivery of a product (or service) to a customer.

Over time supply chains have become extended, often span the globe, operate to a much faster tempo and are also significantly more fragile. The principal causes have been the growing consumer demand in industrialised nations and advances in technologies such as computers, communications and manufacturing. In response, many organisations, particularly logistics service providers, have transformed the scope and scale of their operations, such that the largest of them have established physical global networks operating continuously to collect and deliver “almost everything to pretty much anywhere”.

Supply chains are really networks rather than linear ‘chains’. As these complex webs of connectivity extend, some of the links become very fragile and prone to unexpected failures or disconnects. Because they are usually physical channels, the ‘unexpected’ will include weather events, political or social upheaval, or just basic equipment failures.

Network of supply chains

Establishing many of these networks has in some cases taken years of effort to build the necessary levels of trust. But this approach conflicts with the demand for agile and adaptable supply chains that can respond to unexpected events or challenges. It is true that the Internet has made it much easier to exchange data and collaborate with partners, but the necessary levels of trust are still reinforced through either personal relationships or third-party services that struggle to operate at scale.

Global networks, global complexities

As an example, a small jewellery manufacturer will probably buy materials from companies they have known for a while. They will sell to customers either in a physical store, where the customer will pay for the products directly, or through an online platform where a third party payment system supported by a bank will confirm the transaction. This arrangement will continue to work well, providing there is not a significant step change in demand. In short, if the jeweller usually sells 50 items a week, and demand for a popular item suddenly jumps to 4,000 a week, it becomes impossible to fulfil within a reasonable timeframe without drastic changes in the process.

In contrast, a major automotive manufacturer has several supply chains that will operate across multiple geographies, with thousands of suppliers, including a handful of global component manufacturers, who each have their own constellations of suppliers and subcontractors, with some common to both. Like the previous example of the jeweller, the major relationships have been established over long periods of time and the value exchanges of information, money and material, are usually significant. There will also be a degree of churn and change at the edges as small suppliers fail, or stop providing the products required. These changes are usually handled within the existing (established) processes that are defined by the need to ‘trust’ any new party entering the network.

These processes are often cumbersome, inflexible and may not reveal any inherent weakness that could compromise the entire network in the future. They rely on a third party somewhere in the chain of command, to validate any new entrants before they can perform the services they are required to deliver. In a high velocity supply chain this can introduce problems further along the chain. In short, it is inefficient.

This requirement for validation by third parties can be a significant weak spot in diverse and extended supply chains. In 2011, extensive rainfall across south-east Asia led to massive floods in Thailand, halting operations at a significant number of the world’s contract manufacturing plants specialising in disc-drive production, which were clustered in that area. Computer manufacturers scrambled to find alternative suppliers, but due to the requirement to validate new suppliers, it was easier to work with existing suppliers and have them boost output in other locations. This proved challenging, as it required the suppliers with functioning plants elsewhere to prioritise customers. Some 14,000 manufacturers were affected and it took months to resolve, disrupting delivery schedules and forcing some out of business.

Furthermore, switching production to other locations required the establishment of new logistics pathways and working with new subcontractors, all of whom had to be validated and approved.

The issue wasn’t that the organisations involved did not have the capabilities to perform what was required. It was that they needed to be ‘validated’ so that they could be trusted by their partners in the network. This is where decentralised trust mechanisms, such as those illustrated by blockchains, are very valuable to extended supply chains with huge numbers of participant organisations. These decentralised networks use a consensus network of thousands of independent systems to ensure immutability and verifiable integrity of identities, transactions and computation. They can support massive scale and are robust and extremely difficult to compromise.

OriginTrail and knowledge assets

The OriginTrail Decentralised Knowledge Graph (DKG) was designed to exploit and advance these technologies to support the necessary flexibility and adaptability of critical global supply chains. It achieves that by bringing the complex, fragmented processes in global supply chains to a common denominator using a simple yet powerful building block — a knowledge asset. With the endless variety of possible scenarios in global supply chains, a knowledge asset is there to represent any single (small or large) part of it, whether that’s raw material, finished product, professional service, factory building, compliance certificate, quality claim or anything else you can think of. But even though they can be so incredibly diverse, they all share the same core characteristics:

- unique identity, which can be pointed to using a Unique Asset Locator (a UAL is derived much the same way URLs are created on the world wide web);
- verifiable ownership, ensuring the chain of custody can be followed throughout the asset lifecycle;
- a knowledge asset graph of data combining all relevant claims about an asset, with precise data permissioning capabilities ranging from fully private claims and limited access to fully public claims.

An example of the OriginTrail Decentralised Knowledge Graph (DKG) in the pharmaceutical supply chain.

Each of these characteristics makes for a powerful building block which we can use, almost like Lego blocks, to recreate the fragmented reality of modern supply chains. The most important part, however, is that these assets aren’t created in data silos but rather issued on a global DKG, where they are organised so that they can be discovered, verified and used. Because the DKG is not owned, controlled or run by any single entity (it is a permissionless network), it provides the ideal, neutral common ground where each participant can own and control their assets and important business information while still being able to share it with anyone they wish. It is perhaps one of the most critical innovations for supply chains, because for once interoperability of systems based on global standards is the fundamental starting point rather than an afterthought, or even something undesirable for service providers pursuing vendor lock-ins.
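As an illustrative sketch only (this is not the actual OriginTrail API; every name and value below is hypothetical), the three characteristics of a knowledge asset can be modelled in a few lines of Python: a UAL assembled like a URL, an owner field preserving chain of custody, and a small graph of claims with per-claim visibility.

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeAsset:
    """Hypothetical model of a DKG knowledge asset (illustration only)."""
    blockchain: str        # chain the asset is anchored on, e.g. "gnosis"
    contract: str          # smart-contract address holding the asset record
    token_id: int          # unique id within that contract
    owner: str             # current owner; transfers preserve chain of custody
    claims: dict = field(default_factory=dict)  # claim -> "public" | "private"

    @property
    def ual(self) -> str:
        # A UAL is built like a URL: a scheme plus hierarchical locator parts.
        return f"did:dkg:{self.blockchain}/{self.contract}/{self.token_id}"

batch = KnowledgeAsset(
    blockchain="gnosis",
    contract="0xabc",      # placeholder address
    token_id=42,
    owner="0xmanufacturer",
    claims={"gtin": "public", "unit_cost": "private"},
)
print(batch.ual)  # did:dkg:gnosis/0xabc/42
```

The per-claim visibility map is what lets one asset carry both fully public claims (a product code) and fully private ones (a unit cost) under a single, globally resolvable identity.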

De-siloing data for frictionless international trade

The enormous potential of OriginTrail’s DKG is that it can also address many of the challenges embedded within the regulatory structures that underpin the global trading system. Much of the inherent inertia that permeates global supply chains is due to the bureaucracy required to validate and authorise parties and contracts.

Governments and trade organisations are constantly trying to harmonise regulations and agree on common standards, but technology may now be able to reframe the essence of the debate with the advent of the DKG. With the collaboration of the BSI and others, OriginTrail has been able to demonstrate how the DKG can serve as a universal and immutable system of record for licences and credentials. These digital ‘assets’ of the organisation can now be referenced from anywhere by any relevant authority, thus avoiding unnecessary duplications of certifications, authorizations, inspection and audit reports.

This has huge implications as there is an enormous amount of cost and operational overheads in the duplication, sharing and managing the documentation necessary for legislative compliance across global supply chain operations. The Supplier Compliance Audit Network (SCAN) is a clear example of how this can work in practice. There are currently 52 SCAN members that have audited some 22,000 factories across 80 countries for security and social responsibility compliance. This collaboration of SCAN members and a small number of audit firms now enables supplier audit reports to be shared with SCAN members removing the necessity for them to conduct their own audits.

The key element of this is that the approach ensures anonymity and confidentiality, so members do not see proprietary sourcing information, only the information confirming that the audit report confirmed compliance with the agreed standards. This avoids duplication of effort and cost and saves huge amounts of time when establishing new trading partner networks.

This initiative is growing and is helping US CBP’s CTPAT and other Authorised Economic Operator (AEO) programmes establish common security standards that can easily be referenced and checked in a secure manner. As more organisations are accredited with AEO status, the ability to record this accreditation on a secure, immutable platform such as the DKG will be immensely beneficial. It is this evolution of a recognised credential into a digital asset, immutably owned by the accredited party and capable of being instantly referenced from anywhere, that is transformational.

As the SCAN example has shown, being able to reference existing accreditations in a fast, trusted and secure manner saves time and also reduces the operational costs when setting up new trading relationships.

But the real advantages of the DKG will be realised within organisations that are able to record their physical inventories as immutable digital assets on the DKG. Being able to trust inventory data across the supply chain is essential. Inaccurate and unverified inventory results in duplications, excess cost and often disappointed customers due to stock shortages.

As an example, when inventory items are moved, they may be recorded in a number of different systems. These systems frequently assign different identifiers to the inventory items, without corresponding links back to the original reference. This disconnect makes it harder to identify missing items or duplications, which can be expensive if the supply chains involve pharmaceuticals or semiconductors.

For decades there has been a demand for a ‘Single Version of the Truth’ in visibility systems. The track-and-trace systems of the global integrated carriers were the first to provide some kind of end-to-end view of a shipment’s progress. But as soon as a shipment left their information domain (e.g. passed to another carrier), the information flow either broke down or was delayed. Across a global supply chain comprising multiple partners and a myriad of systems, these data-update delays and disconnects illustrate how difficult it is to ensure referential integrity.

The advent of the DKG provides the opportunity to rethink supply chain visibility. If inventory items are represented as digital assets on the DKG there should be no need to keep recording them in different systems, merely refer back to the original digital asset and its attributes. As it physically moves through the supply chain as an order or shipment, each stage of the journey should reflect the ownership, status and chain of custody as a cluster of digital assets, immutably recorded on the DKG.
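The "refer, don't re-record" pattern described above can be sketched minimally (all identifiers and records here are hypothetical, and a plain dictionary stands in for the DKG): each system stores only the asset's UAL and resolves attributes from the single shared record, instead of minting its own disconnected identifier.

```python
# One authoritative asset record, keyed by its UAL (hypothetical data).
dkg = {
    "did:dkg:gnosis/0xabc/7": {
        "sku": "VAX-100", "owner": "carrier-B", "status": "in transit"
    }
}

# Legacy pattern: each system assigns its own id, and the link gets lost.
wms_record = {"wms_id": "W-991",  "sku": "VAX-100"}
tms_record = {"tms_id": "T-4410", "sku": "VAX-100"}  # no link back to W-991

# DKG pattern: systems keep only a reference to the shared asset.
wms_ref = {"asset": "did:dkg:gnosis/0xabc/7"}
tms_ref = {"asset": "did:dkg:gnosis/0xabc/7"}

def resolve(ref: dict) -> dict:
    """Look up the single source of truth instead of a local copy."""
    return dkg[ref["asset"]]

# Both systems see the same ownership and status, each in its own context.
assert resolve(wms_ref) is resolve(tms_ref)
print(resolve(wms_ref)["status"])  # in transit
```

When the carrier updates the status on the shared record, every referencing system observes the change at once; under the legacy pattern each local copy would have to be reconciled separately.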

This is a very seductive and powerful proposition, as it implies a huge improvement in efficiency and reduction in errors and exceptions. This is because every participant will have access to the same information, certain in the knowledge that all of the participants are looking at the same data, albeit in their specific context.

Clearly, it will take time to achieve this as it is not a trivial exercise to transform existing stores of inventory data into digital assets. The performance of the underlying technology will also have to improve, but initiatives such as Version 6 of the DKG, the Network Operating System (nOS) enabler for enterprises, and Project Magnify are accelerating things.

Trace Labs’ Network Operating System (nOS)

Innovative companies and curious staff members in the horizon scanning and scenario planning functions of major corporations should explore what the OriginTrail and the Trace Alliance have been doing. As the world evolves from Web2 to a Web3 landscape, those organisations that understand and appreciate the power of Decentralised Knowledge Graphs will have the advantage.

TradeLens takeaways: Not less, but more decentralisation and interoperability

The importance of having a common infrastructure rooted in decentralisation, neutrality, and interoperability can be reinforced by examining one of the high-profile industry initiatives that has recently announced it is closing down, TradeLens. TradeLens was a joint effort of Maersk and IBM established in 2018 and was considered by many as one of the most promising ventures deploying blockchain technology in the shipping industry. But if that was the case, why did it fail, and how does OriginTrail DKG address the shortcomings that were the root causes of this failure?

There are a plethora of reasons that led to TradeLens’ demise, but below are some of the ones we consider the most important.

Low level of digitalization in the shipping industry. In a report from October this year, McKinsey estimated that only around 1% of all Bills of Lading (BLs) are digital today. Such low levels of digitalization present a significant barrier to industry-wide initiatives such as TradeLens. To help overcome that barrier, the Network Operating System (nOS) offers multiple avenues for organisations to create assets on the OriginTrail DKG, ranging from inputs as simple as e-mail to more automated API data ingestion capabilities. Interoperability among multiple inputs is achieved by supporting global data standards such as GS1 EPCIS and the DCSA’s electronic Bill of Lading (eBL), ensuring all created assets are interoperable. And there is a lot of potential in the digitalization of the shipping industry: in the same report, McKinsey estimates that 100% adoption of the eBL would have an annual impact of around USD 50bn in the ocean trade ecosystem alone.

Lack of willingness to share data. It’s no surprise that industry players are reluctant to share sensitive business data, as questions around data privacy and ownership inevitably pop up. This is especially true when the underlying infrastructure is provided by a competitor, as was the case with TradeLens — Maersk being the largest shipping company globally. OriginTrail provides industry players with a way to transform their existing data into assets that are discoverable and verifiable on the DKG, but remain fully within their control and can be shared as needed, for example with customs authorities or other government agencies. This is important as assets (and data) need to be discoverable to be useful, but that should never happen at the expense of data privacy.

Level of decentralisation and neutrality. Closely tied to the previous point, the level of decentralisation and neutrality of TradeLens’ infrastructure was not sufficient to facilitate an industry-wide adoption. If any industry-wide initiative is to succeed, the infrastructure needs to ensure that all participants are on an equal footing and minimise the potential for vendor lock-ins. OriginTrail DKG provides this neutral, common ground as a permissionless, open-source infrastructure that is not owned, controlled, or run by any single entity.

Interoperability of data and infrastructures. Interoperability is crucial to successful industry-wide initiatives, and it has two facets: interoperability of data and interoperability of infrastructure. The TradeLens experiment made it clear that more focus needs to be put on both facets, and that industry-wide permissioned blockchains don’t deliver the expected business value. OriginTrail tackles this interoperability challenge from both angles. Data interoperability is achieved by assets on OriginTrail utilising global data standards such as GS1 EPCIS and the DCSA’s electronic Bill of Lading (eBL). From an infrastructure point of view, OriginTrail is multichain by design, meaning a global graph backed by multiple permissionless blockchains (currently Polkadot, Ethereum, Gnosis, and Polygon), giving organisations full flexibility in terms of what infrastructure they want to use.

While we still have a long way to go, the digitalisation of global supply chains continues, and TradeLens provided valuable insights, as well as highlighted some of the core issues that need to be addressed moving forward. With the paradigm-changing shift from data to assets, OriginTrail can serve as one of the components that will help address those issues and facilitate the successful digital transformation of international trade.

👇 More about OriginTrail 👇

OriginTrail is an ecosystem dedicated to making the global economy work sustainably by organizing humanity’s most important knowledge assets. It leverages the open source Decentralized Knowledge Graph that connects the physical world (art, healthcare, fashion, education, supply chains, …) and the digital world (blockchain, smart contracts, Metaverse & NFTs, …) in a single connected reality driving transparency and trust.

Advanced knowledge graph technology currently powers trillion-dollar companies like Google and Facebook. By reshaping it for Web3, the OriginTrail Decentralized Knowledge Graph provides a crucial fabric to link, verify, and value data on both physical and digital assets.

Web | Twitter | Facebook | Telegram | LinkedIn | GitHub | Discord

👇 More about Trace Labs👇

Trace Labs is the core developer of OriginTrail — the open source Decentralized Knowledge Graph. Based on blockchain, OriginTrail connects the physical world and the digital world in a single connected reality by making all different knowledge assets discoverable, verifiable and valuable. Trace Labs’ technology is used by global enterprises in multiple industries, such as the pharmaceutical industry, international trade, decentralized applications and more; for example, companies behind over 40% of US imports, including Walmart, Costco and Home Depot, exchange security audits via the OriginTrail DKG.

Web | Twitter | Facebook | LinkedIn

Unfolding supply chains with interoperability and decentralisation was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


FIDO Alliance

Momentum for FIDO in Japan Grows as Major Companies Commit to Passwordless Sign-ins with Passkeys

Yahoo! JAPAN, KDDI and NTT DOCOMO have adopted or committed to passkeys TOKYO, December 9, 2022 – Global, industry-wide commitment is bringing the passwordless future closer to reality, FIDO Alliance […] The post Momentum for FIDO in Japan Grows as Major Companies Commit to Passwordless Sign-ins with Passkeys appeared first on FIDO Alliance.

Yahoo! JAPAN, KDDI and NTT DOCOMO have adopted or committed to passkeys

TOKYO, December 9, 2022 – Global, industry-wide commitment is bringing the passwordless future closer to reality, FIDO Alliance members shared today at the first in-person FIDO seminar in Japan since December 2019. During the seminar, leading organizations shared major updates that will further the Alliance’s mission to replace passwords with simpler and stronger authentication. 

A significant milestone came last May when Apple, Google and Microsoft announced plans to expand support for FIDO with passkeys, a phishing-resistant replacement for passwords that provide faster, easier, and more secure sign-ins to websites and apps across a user’s devices. Passkeys can be leveraged across devices and platforms to offer an end-to-end passwordless sign-in option, or bound to a particular device such as a FIDO security key for high-assurance use cases. Passkeys are supported today in iOS 16, macOS Ventura, Android and ChromeOS, with Windows coming soon.
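The phishing resistance mentioned above comes from passkey credentials being bound to the site they were registered for. The following is a conceptual sketch only, not the WebAuthn API: real passkeys use asymmetric signatures, and an HMAC stands in here purely so the example stays standard-library-only. All names are hypothetical.

```python
import hashlib
import hmac
import secrets

device_secret = secrets.token_bytes(32)  # never leaves the authenticator

def credential_for(rp_id: str) -> bytes:
    # Credentials are derived per relying-party ID, so example.com and
    # examp1e.com (a lookalike phishing domain) get unrelated credentials.
    return hmac.new(device_secret, rp_id.encode(), hashlib.sha256).digest()

def sign(rp_id: str, challenge: bytes) -> bytes:
    # The authenticator answers a challenge using the origin-bound credential.
    return hmac.new(credential_for(rp_id), challenge, hashlib.sha256).digest()

# Server side: verify the response against the credential registered
# for the genuine origin.
registered = credential_for("example.com")
challenge = secrets.token_bytes(16)

good = sign("example.com", challenge)
phish = sign("examp1e.com", challenge)  # lookalike site gets a useless answer

expected = hmac.new(registered, challenge, hashlib.sha256).digest()
assert hmac.compare_digest(good, expected)       # genuine origin verifies
assert not hmac.compare_digest(phish, expected)  # phished response fails
print("origin-bound credential verified")
```

Because the browser, not the user, supplies the relying-party ID, a user who is tricked into visiting a lookalike domain cannot hand over a usable credential, which is the property that passwords and OTP codes lack.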

Notably, global service providers such as PayPal have expanded their FIDO support and are offering passkey sign-ins, while early FIDO adopters in Japan have announced passkey commitments or adoption as their next steps towards passwordless:

- Yahoo! JAPAN has been working on passwordless initiatives with FIDO since 2015, and more than 38 million active users in 2022 are signing in without passwords. Yahoo! JAPAN now supports passkeys on iOS, iPadOS and macOS.
- KDDI first launched FIDO in 2020 for its au ID platform with more than 30 million customers. Now au ID is accessible with passkeys on iOS and FIDO2 on Android.
- NTT DOCOMO has been a leader both within and outside the FIDO Alliance, beginning with its Board appointment in 2015, and is the first mobile operator to deploy FIDO authentication at scale. DOCOMO has announced its intention to support passkeys for its more than 50 million d ACCOUNT users beginning in early 2023.

“From the very beginning of the FIDO Alliance, Japan has been a global hub of innovation, support and deployments of FIDO authentication. It is not a surprise that several leading organizations in the region will be some of the first globally to offer their customers FIDO sign-ins with passkeys,” said Andrew Shikiar, executive director and CMO of the FIDO Alliance. “This is illustrative of our global membership’s commitment to the passwordless future, and their collaboration to maximize the reach, usability and security of FIDO authentication.” 

Within the FIDO Alliance’s 250+ members, 58 actively take part in the FIDO Japan Working Group, now beginning its 7th year working together to spread awareness and adoption of FIDO in the region. 

About the FIDO Alliance 

The FIDO (Fast IDentity Online) Alliance, www.fidoalliance.org, was formed in July 2012 to address the lack of interoperability among strong authentication technologies, and remedy the problems users face with creating and remembering multiple usernames and passwords. The FIDO Alliance is changing the nature of authentication with standards for simpler, stronger authentication that define an open, scalable, interoperable set of mechanisms that reduce reliance on passwords. FIDO Authentication is stronger, private, and easier to use when authenticating to online services.

The post Momentum for FIDO in Japan Grows as Major Companies Commit to Passwordless Sign-ins with Passkeys appeared first on FIDO Alliance.

Thursday, 08. December 2022

FIDO Alliance

Authenticate Summit Recap: The FIDO Fit in IoT

By: FIDO Staff The Internet of Things (IoT) is an increasingly critical and difficult area for IT devices that need to be secured. At the Authenticate Virtual Summit: The FIDO […] The post Authenticate Summit Recap: The FIDO Fit in IoT appeared first on FIDO Alliance.

By: FIDO Staff

The Internet of Things (IoT) is an increasingly critical and difficult area for IT devices that need to be secured.

At the Authenticate Virtual Summit: The FIDO Fit in IoT held on Dec. 7, a series of experts outlined FIDO Alliance efforts to help device manufacturers and developers better secure IoT. A key theme of the event was all about understanding how the FIDO Device Onboarding (FDO) specifications can help improve IoT security.

David Turner, director of standards development at the FIDO Alliance, kicked off the event by noting that passwords remain a large problem across the IT industry, a challenge compounded with IoT devices, which scale into the millions and potentially billions. Password re-use is a particular problem for IoT: if a system ships with a default password, it can be trivially easy for attackers to exploit.

“Hackers don’t break into IoT, they log into it,” Turner said.

One way to help secure IoT is with the FIDO Alliance’s FDO standard. Turner explained that FDO is an open standard that allows organizations to quickly and securely onboard IoT devices.

Small things, big impact: The path to FDO

Rolf Lindemann, director of product at Nok Nok and one of the leaders of the FIDO Alliance IoT Technical Working Group, explained that FIDO standards are applicable to device authentication as well as user authentication.

Lindemann said that there is a clear need for a strong foundation to help secure IoT. The first step is to have hardened hardware elements at the CPU level, including things like TPMs, TrustZone and SGX, which are provided by the silicon vendors. The next critical step is to add device-level attestation to help with supply chain integrity, which also reduces the complexity of device onboarding. The third step is strong authentication, which ensures only legitimate entities get access.

“To make the IoT ecosystem more secure, you need strong authentication that’s the front door, providing phishing resistance while still being practical for daily large-scale use,” Lindemann said.

How FDO tackles the onboarding challenge

The challenge of onboarding is where the FDO specifications come into play.

Richard Kerslake, general manager of industrial controls and robotics, IoT business unit at Intel, explained that onboarding is the process by which a device can establish a trusted connection with a service or platform.

“We have an IoT device, it’s going to connect to a platform or service and we just need to be sure that everyone in that equation is who they say they are,” Kerslake explained. “Is the device talking to the platform that it thinks it is talking to, and is the platform talking to the device that it thinks it is talking to. So we really need to make sure that both sides of that equation are true.”

Onboarding today is often a very manual process. The promise of FDO is an automated approach that benefits from strong authentication. Kerslake explained that in December 2019 the decision was made to base the FDO specification on Intel’s Secure Device Onboard technology. The FDO 1.0 specification was released in March 2021 and updated to version 1.1 in April 2022.

Going a step further beyond just the specifications, the FIDO Alliance has worked with the Linux Foundation’s LF Edge project, which provides an open source implementation of FDO.

Going for a deep dive with FDO

There is a fair amount of nuance and detail that goes into the FDO specification.

In a deep dive session, Geoffrey Cooper, principal engineer, IoTG at Intel, explained the workflow, technical specification and procedures that enable FDO implementations.

Cooper explained that if, for example, a device is drop-shipped to a location, powered up and connected to the network, the goal with FDO is for that device to figure out who it’s supposed to connect to, authenticate properly, set everything up, and then go right into service.

“The idea is we’re taking something that was a very heavy touch kind of operation that we’re turning it into a zero touch operation,” Cooper said.

Enabling that zero-touch approach with FDO involves a series of protocols that are part of the specification. The protocols include device initialization and onboarding components. There is also a concept known as the FDO Service Info Module (FSIM) that provides an extension mechanism to help support devices.
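To make the zero-touch flow concrete, here is a deliberately simplified Python simulation of the sequence: the GUID, HMAC-based “proof,” rendezvous dictionary, and FSIM module names below are illustrative stand-ins, not the real FDO message formats or cryptography.

```python
import hashlib
import hmac

def sign(key: bytes, msg: bytes) -> bytes:
    # Stand-in for the attestation signatures real FDO uses.
    return hmac.new(key, msg, hashlib.sha256).digest()

# Device initialization: the manufacturer provisions a GUID and attestation
# secret, and creates an ownership credential tied to the device.
device_secret = b"device-attestation-key"
guid = "123e4567-e89b-12d3-a456-426614174000"
voucher = {"guid": guid, "proof": sign(device_secret, guid.encode())}

# The owner registers the voucher with a rendezvous service so the device
# can later discover who it belongs to.
rendezvous = {guid: {"owner_addr": "https://owner.example", "voucher": voucher}}

# At first boot the device asks the rendezvous service for its owner...
entry = rendezvous[guid]
owner_addr = entry["owner_addr"]

# ...and the owner verifies the device against the voucher before pushing
# configuration via ServiceInfo (FSIM) messages.
assert hmac.compare_digest(entry["voucher"]["proof"],
                           sign(device_secret, guid.encode()))
service_info = {"fsim:wifi": {"ssid": "plant-floor"}, "fsim:agent": "install"}
print("onboarded to", owner_addr, "with modules", sorted(service_info))
```

The point of the sketch is the shape of the handshake, not the mechanics: the device never needed a pre-shared password, only its factory-provisioned credential.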

During a robust Q&A session during the Authenticate virtual event, attendees asked a wide variety of questions.

Among the questions was one about what’s needed to help spur adoption of FDO. Kerslake said there are companies today in different industry verticals, including the energy sector, where operators are saying they will not proceed with bringing in new devices without an automated secure onboarding solution.

There are also a growing number of industry solutions that support FDO. Megan Shamas, senior director of marketing at the FIDO Alliance, said that by developing FDO in an industry standards body there are lots of opportunities for collaboration and promotion as well.

“We are in the midst of creating an implementer showcase, which should be live on the website soon,” Shamas said.

The path toward FDO certification

Looking beyond the FDO specification itself, there is also a need for certification, which the FIDO Alliance is now working on.

Paul Heim, director of certification at FIDO Alliance, said that product certification ensures standardization and interoperability of products within an industry. He added that one of the most important factors about certification is that it helps to ensure consumer, enterprise, and industrial protection. The lifecycle for FDO certification includes both functional and security certification.

“The FIDO Device Onboard certification program is intended to certify IoT devices and onboarding services, with certification available for both FIDO members and non-members,” Heim said.

The certification effort is still in development with a program launch set for the first quarter of 2023.

The post Authenticate Summit Recap: The FIDO Fit in IoT appeared first on FIDO Alliance.


Energy Web

Energy Web: Now Available on Amazon

Energy Web RPC nodes on AWS Marketplace is the first step towards Energy Web as-a-service Zug, Switzerland — December 8, 2022 — This year, Energy Web’s technology roadmap focused on achieving one objective: making it easier for enterprises to use Web 3 technology to create real business value in support of the global energy transition. A key pillar of this strategy is to dramatically streamline the way companies access EW solutions […]
Energy Web RPC nodes on AWS Marketplace is the first step towards Energy Web as-a-service

Zug, Switzerland — December 8, 2022 — This year, Energy Web’s technology roadmap focused on achieving one objective: making it easier for enterprises to use Web 3 technology to create real business value in support of the global energy transition. A key pillar of this strategy is to dramatically streamline the way companies access EW solutions by offering them as-a-service.

Today we’re excited to announce the initial step in this journey with the release of Energy Web RPC nodes in the AWS Marketplace. For the first time, companies can now configure and deploy a component of the Energy Web Decentralized Operating System with the push of a button. Over the coming months we will expand these offerings to include additional components and public cloud marketplaces.

Why we’re offering EWC technology in cloud marketplaces

There’s an adage in the business world that to be successful you need to meet your customers where they are. There are lots of ways to interpret this, but ultimately it boils down to delivering a product or service in a way that aligns with customers’ existing practices and habits. In the context of enterprise software, historically this has been a weakness of Web3 solutions, including Energy Web.

Today nearly every category of enterprise software — from marketing to resource planning, HR management to project management, and teleconferencing to file sharing — is dominated by “Software-as-a-Service” (SaaS) solutions offered via subscription models. With few exceptions, gone are the days when rolling out complex business applications required specialized skills or dedicated technical resources; now, with just a couple of clicks, companies can launch and customize a solution, and it just works.

In contrast, up until recently implementing an Energy Web solution required a team of developers to provision a dedicated hosting environment prior to creating a bespoke deployment pipeline to configure, install, and integrate multiple discrete components downloaded from public Github repositories. Even with robust supporting documentation and tutorials, that’s a tall ask for companies with mature engineering departments. For companies with more limited IT resources, this approach is simply a nonstarter.

Public cloud marketplaces can significantly reduce this friction by automating and abstracting all of the deployment steps, and delivering a clean user interface to configure and administer EW solutions. Beyond the technical benefits, the marketplace approach also simplifies procurement by consolidating costs under an existing cloud account. Just as popular mobile app stores radically simplified distribution of consumer applications, public cloud marketplaces can act as a springboard for driving widespread adoption of enterprise software.

At Energy Web, we view cloud marketplaces as a critical channel for scaling our impact in 2023 and beyond.

“The business value of Energy Web solutions is clear, but when it comes to implementation we need to deliver the user experience that enterprises have come to expect” said Energy Web CTO, Mani Hagh Sefat. “With SaaS, things that used to take lots of time and effort can now be done with the click of a button. By offering EW solutions in cloud marketplaces, we can enable any business user to start using our technology and getting value from it right away.”
What’s currently available?

RPC nodes, with support for the high-performance Nethermind client (EWC main net, and Volta test net) and legacy OpenEthereum client (EWC main net, and Volta test net), are the first EW service being offered in the AWS Marketplace. RPC nodes act as a gateway to connect to a blockchain, and are required for any application that reads data from or initiates transactions on the EWC (and Volta test network). Prior to the AWS Marketplace offering, the only options for connecting to the EWC were to subscribe to a public RPC node (which is rate limited and not suited for enterprise workloads) or to use the command line to run a dedicated private node (which is technically complex).

With RPC nodes now available in AWS Marketplace, developers and enterprises can easily deploy their own dedicated nodes to enable programmatic and high-volume interactions with the Energy Web Chain.
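As a rough sketch of what “programmatic interaction” looks like, the snippet below builds a standard Ethereum JSON-RPC request (the API EWC nodes expose) and decodes the hex-encoded result. The node URL in the comment is a placeholder for your own deployed node; the public endpoint mentioned is illustrative.

```python
import json

def rpc_payload(method: str, params=None, req_id: int = 1) -> str:
    # Standard Ethereum JSON-RPC request body; EWC nodes expose the same API.
    return json.dumps({"jsonrpc": "2.0", "method": method,
                       "params": params or [], "id": req_id})

def parse_quantity(result_hex: str) -> int:
    # JSON-RPC returns quantities as 0x-prefixed hex strings.
    return int(result_hex, 16)

payload = rpc_payload("eth_blockNumber")
# POST the payload to your node, e.g. with the requests library:
#   requests.post(node_url, data=payload,
#                 headers={"Content-Type": "application/json"})
print(parse_quantity("0x10d4f"))  # decodes a sample "result" field to 68943
```

A dedicated node accepts exactly this traffic without the rate limits of a shared public endpoint.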

How it started: “quickstart” instructions for EW RPC
How it’s going: EW RPC in AWS Marketplace
The Nethermind team has long collaborated with Energy Web to provide smooth, reliable, performant solutions for running EWC nodes. We look forward to seeing enterprises enjoy this improvement to quick node deployment and welcome any opportunities to build solutions for clean and distributed energy resources together. — Tomasz Stańczak, CEO
What’s next?

Long-term, our vision is to deliver all EW Solutions (and individual EW-DOS components) via a fully self-sovereign solution-as-a-service offering that delivers the simplified user experience of public cloud marketplaces with the flexibility and customization features of existing self-hosted models. In this offering, customers will have the ability to deploy EW solutions in multiple public clouds and/or their own environments.

In the near term, we will expand public cloud marketplace offerings (both increasing the number of EW services available, such as Data Exchange and Green Proofs solutions, and adding more cloud providers, such as Azure and Google Cloud) in Q1 — Q2 2023.

About Energy Web
Energy Web is a global non-profit accelerating the clean energy transition by developing open-source technology solutions for energy systems. Our enterprise-grade solutions improve coordination across complex energy markets, unlocking the full potential of clean, distributed energy resources for businesses, grid operators, and customers. Our solutions for enterprise asset management, data exchange, and Green Proofs, our tool for registering and tracking low-carbon products, are underpinned by the Energy Web Chain, the world’s first public blockchain tailored to the energy sector. The Energy Web ecosystem comprises leading utilities, renewable energy developers, grid operators, corporate energy buyers, automotive, IoT, telecommunications leaders, and more. More information on Energy Web can be found at www.energyweb.org or follow us on Twitter @EnergyWebX

Energy Web: Now Available on Amazon was originally published in Energy Web on Medium, where people are continuing the conversation by highlighting and responding to this story.


FIDO Alliance

FIDO Alliance Provides Guidance on Making FIDO Deployments Accessible to People with Disabilities

By Christina Hulka, executive director and COO of the FIDO Alliance FIDO Authentication has reached broad support across the web – all major operating systems, browsers and billions of devices […] The post FIDO Alliance Provides Guidance on Making FIDO Deployments Accessible to People with Disabilities appeared first on FIDO Alliance.

By Christina Hulka, executive director and COO of the FIDO Alliance

FIDO Authentication has reached broad support across the web – all major operating systems, browsers and billions of devices support FIDO Authentication today. Having reached such a milestone and the resulting FIDO roll outs from a broad array of service providers, the FIDO Alliance is increasingly focused on ways to make FIDO Authentication more usable and accessible for all. 

In achieving FIDO Alliance’s mission of more secure and password-free authentication, we must ensure that we meet the needs and preferences of people with disabilities. Today, we are pleased to announce the publication of “Guidance for Making FIDO Deployments Accessible to Users with Disabilities,” to provide guidance on planning FIDO deployments that are accessible to users with a wide range of disabilities. It also aims to help hardware manufacturers identify opportunities to deliver more accessible external authenticators.

An estimated 15% of the world’s population lives with some sort of disability today, and in many countries, laws prohibit discrimination to help ensure that these people can fully and equally participate in every aspect of society. Authentication is an important component of the ability to participate, as it provides digital access to many aspects of society including (but not limited to) education, employment, and entertainment. While legacy forms of multi-factor authentication (MFA) like SMS or email codes are technically “accessible,” they often require advanced skill, knowledge and/or assistive technology to enter the codes. FIDO, with its stronger and simpler authentication model, is well positioned to provide accessible authentication, as it supports a wide range of options that accommodate vastly diverse needs. The paper released today details why, and considerations for, deploying FIDO with the needs of people with disabilities in mind. We strongly encourage service providers to reference these guidelines in planning their FIDO deployments.

Much work and collaboration went into this paper. We would like to thank Yao Ming of Meta for his extensive work as lead author on this paper. We’d also like to thank Joyce Oshita of VMware for her contributions, including providing her own experiences leveraging various authentication methods, including FIDO, as a person who has lost her eyesight. 

In addition to the white paper, Yao and Joyce will be joining us on December 15, 2022 at 2pm ET for a webinar to discuss their perspectives on this topic. To attend the webinar, register here.

The paper is available here; feedback is always appreciated – please drop a line at info@fidoalliance.org.  

The post FIDO Alliance Provides Guidance on Making FIDO Deployments Accessible to People with Disabilities appeared first on FIDO Alliance.


We Are Open co-op

How to Gather Data on a Community of Practice

Answering the question: “Are we maturing?” We’ve done a lot of thinking about how a Community of Practice (CoP) matures and have written a variety of posts on tools we use to grow healthy, sustainable communities. This post looks at how you might use data to prove that your CoP is maturing. […]
Answering the question: “Are we maturing?”

cc-by-nd Bryan Mathers for WAO

We’ve done a lot of thinking about how a Community of Practice (CoP) matures and have written a variety of posts on tools we use to grow healthy, sustainable communities. This post looks at how you might use data to prove that your CoP is maturing.

A year after the launch of the KBW community, we wanted to know if the good vibes of the community were all in our minds or indeed backed up by data. We looked at how community members engaged with each other and with the community of practice as a whole. See the full output in slide form.

We analysed this data to see how the community is progressing in terms of growth, activity, and sense of community. Are we maturing? Let’s find out!

For a more complete walkthrough of the data we’ve collected, here is a video (transcript included!).

Why you want to look at data

It’s great that you might feel your community is healthy, but the community’s ambitions likely extend beyond being a friendly and inclusive place. As a community of practice evolves, you likely have specific goals, outcomes and activities you’re promoting.

To judge whether or not you are achieving these goals, you will want to look at how the community changes over time. We do this with data! Start by setting some baselines to measure against. With a data-driven approach, we can more accurately observe the impact a community is having.

Methodology and types of metrics

Our three buckets of data

You’ll want to set a few different kinds of metrics, both qualitative and quantitative: how people engaged, how people were recognised, and what insights the community members themselves can offer. Your first step is to think about what you want to know.

1. Individual Engagement

Individual engagement is a way to look at how community members are interacting with one another and the community more broadly. Typically, you will want to look at things like:

Did people introduce themselves?
Did people make a contribution to or start a discussion?
Did they join an event, either online or in real life?
Did they sign up to courses or seminars on offer?
What other signup or attendance rates can you collect?

46% of our sample was actively engaged

Depending on the size of your community and the data capabilities of the platform(s) you’re using, you may need to come up with a sampling technique. Sampling allows you to select a fair and impartial subset of your community members to set engagement baselines.

The sampling technique we used for the KBW community was to select every 5th member who joined the community (listed by date). This technique allowed us to look at approximately 20% of the entire community, a decent sized sample.
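That systematic sampling step can be sketched in a few lines of Python; the member records and field names below are invented for illustration.

```python
def systematic_sample(members, step=5):
    """Take every `step`-th member, ordered by join date."""
    ordered = sorted(members, key=lambda m: m["joined"])
    return ordered[::step]

# Invented member records standing in for a platform export.
members = [{"id": i, "joined": f"2022-{(i % 12) + 1:02d}-{(i % 28) + 1:02d}"}
           for i in range(100)]

sample = systematic_sample(members)
print(len(sample), "members sampled")  # 20 of 100, i.e. ~20%
```

Ordering by join date before slicing is what keeps the subset fair: you get members from across the community's whole lifetime, not just the earliest or latest joiners.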

2. Community Metrics

Many platforms have data capabilities of some kind. You might be able to look at:

the total number of posts per month
the total number of active members
the total number of new discussions

Setting baselines for posting activities can help you determine if the community is becoming more engaged.
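As a minimal sketch of setting such a baseline, monthly post counts can be computed from exported timestamps; the dates below are invented.

```python
from collections import Counter

# ISO dates of posts exported from the community platform (invented here).
posts = ["2022-01-14", "2022-01-30", "2022-02-02", "2022-02-17",
         "2022-02-25", "2022-03-09"]

per_month = Counter(date[:7] for date in posts)  # "YYYY-MM" buckets
baseline = sum(per_month.values()) / len(per_month)
print(dict(per_month), "mean posts/month:", baseline)
```

Comparing future months against that mean is a simple way to spot whether engagement is trending up or down.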

41% of the KBW community earned at least one badge

The KBW community issues a variety of badges to recognise engagement and to encourage pro-social behaviours. With our community metrics, we were able to see how many badges were earned and what the most popular badges were. We found that 41% of the community earned at least one badge! Community members can earn badges that are discussed and displayed in the community as well as stealth badges. There are badges related to specific activities as well, such as Badge Wiki badges for people who contribute to the community’s knowledge base.

Looking at these metrics over the course of a year can help you determine ebbs and flows of activity. They can also help you see what kinds of content led to the most engagement.

3. Surveys

The third thing you should look at is what community members think and feel. Create an end of year survey focused on the key factors you’d like to know about.

How do you want community members to feel?
What kind of ambitions for the community do you hope people have?
How do community members want to improve the community’s offerings?

We asked people how they felt about the KBW community

For the KBW Birthday Survey, we used a Likert Scale and a subset of questions from McMillan and Chavis’s 1986 paper Psychological Sense of Community. We wanted to know if people felt like they belonged, if they identified with the values of the community and if their needs were being met. We also wanted to see if they thought the community had influence and trust.
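Scoring Likert responses typically reduces to a mean score plus a share of agreement per question. The sketch below shows one way to do that; the question wording is paraphrased for illustration, not the actual McMillan and Chavis items.

```python
# Likert responses per question (1 = strongly disagree, 5 = strongly agree).
# Question wording is paraphrased for illustration.
responses = {
    "I feel like I belong in this community": [5, 4, 4, 3, 5],
    "My needs are being met by this community": [4, 3, 4, 4, 2],
}

for question, scores in responses.items():
    mean = sum(scores) / len(scores)
    agree = sum(s >= 4 for s in scores) / len(scores)  # share answering 4 or 5
    print(f"{question}: mean={mean:.1f}, agree={agree:.0%}")
```

Tracking these per-question numbers year over year gives you a comparable signal of belonging and trust alongside the raw engagement counts.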

The best part of surveying actual community members is the things they write in, so be sure to provide open text fields. People offer kudos as well as constructive criticism, so you’re sure to learn a lot about your community.

What’s next

Once you’ve collected and analysed your data, share it with your community! Other members of the community will have their own insights and ideas to make your community of practice even better.

We recommend putting on some music and diving into your community data. It’s easy to get excited about the things you learn because your data-driven insights can propel you, collectively, forward towards your goals.

Join us in the KBW community and learn more about communities of practice and all things badges

How to Gather Data on a Community of Practice was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 07. December 2022

EdgeSecure

Dartmouth College Teams with Edge to Bring IEEE DataPort to Their Researchers for Data Management and Storage

The post Dartmouth College Teams with Edge to Bring IEEE DataPort to Their Researchers for Data Management and Storage appeared first on NJEdge Inc.

NEWARK, NJ, December 9, 2022 – Edge has partnered with IEEE, the world’s largest technical professional organization dedicated to advancing technology for humanity, to announce their first collective milestone with the sale of an institutional subscription of IEEE DataPort to Dartmouth College. Researchers at Dartmouth College selected IEEE DataPort as their unified data and collaboration platform which researchers can leverage to efficiently store, share, access, and manage research data, accelerating institutional research efforts for data management and storage.

The initiative is a collaborative effort to offer increased awareness of institutional subscriptions to IEEE DataPort — a web-based, cloud services platform supporting the data-related needs of the global technical community — making it available to academic, government, and not-for-profit institutions across the United States.

“Edge is pleased to have the opportunity to work with IEEE in bringing IEEE DataPort to researchers at Dartmouth College. As research in nearly all domains becomes more data intensive, providing institutions with the ability to store, share, access, and manage high quality data is critical. The partnership between Edge and IEEE facilitates that opportunity and advances the mission of EdgeDiscovery, a research and discovery framework providing access to leading-edge technology to support collaborative research and educational opportunities,” said Dr. Forough Ghahramani, Assistant Vice President for Research, Innovation, and Sponsored Programs at Edge.

With IEEE DataPort, researchers at subscribing institutions gain access to the more than 3,800 research datasets available on the platform and the ability to collaborate with more than three million global IEEE users. The platform also enables institutions to meet funding agency requirements for the use and sharing of data.

“IEEE welcomes the opportunity to offer all members of Dartmouth College the ability to leverage the many benefits of IEEE DataPort. Dartmouth College conducts extremely valuable research and by sharing their research data on IEEE DataPort, Dartmouth will gain global exposure for their research efforts and help researchers around the globe,” said Melissa Handa, Senior Program Manager, IEEE DataPort.

Existing Edge members and other North American academic and government institutions interested in learning more about IEEE DataPort can contact Edge at ieeedataport@njedge.net.

Corporations and institutions outside North America can subscribe to IEEE DataPort directly through IEEE. Learn more at https://innovate.ieee.org/ieee-dataport/

About Edge:
Founded in 2000, Edge, a 501(c)(3) nonprofit corporation, serves as a purpose-built research and education wide area network and technology solutions partner. Edge connects members with affordable, easy access to high-performance optical networking, commodity Internet and Internet2 services, and a variety of technology-powered products, solutions, and services. The Edge member consortium consists of colleges and universities, K-12 schools and districts, government entities, healthcare networks, and businesses spread throughout the continental US. The group is governed by the New Jersey Presidents’ Council with offices in Newark, Princeton, and Wall Township, NJ. For more information, please visit: www.njedge.net.

Media Contact:
Adam Scarzafava
AVP for Marketing and Communications, Edge
855-832-3343
adam.scarzafava@njedge.net

About IEEE

IEEE is the world’s largest technical professional organization dedicated to advancing technology for the benefit of humanity. Through its highly cited publications, conferences, technology standards, professional and educational activities, IEEE is the trusted voice in a wide variety of areas ranging from electrical engineering, aerospace systems, telecommunications and computer science to biomedical engineering, artificial intelligence, and consumer electronics. IEEE has expanded its open access program and launched many new fully open access titles in fields such as computing, telecommunications, biomedical engineering, nanotechnology and more. Learn more.

The post Dartmouth College Teams with Edge to Bring IEEE DataPort to Their Researchers for Data Management and Storage appeared first on NJEdge Inc.


The Engine Room

Weaving tech into human rights work: case studies with Amnesty Tech

We were honoured to work with Amnesty Tech on a pair of case studies looking back at how Amnesty International has woven a critical understanding of technology into their human rights defence work. The post Weaving tech into human rights work: case studies with Amnesty Tech first appeared on The Engine Room.

When we look at the state of defending human rights today, it is clear that digital technologies play a role – both as a creator of risk and harm and, potentially, as a tool to protect rights and strengthen the work of rights defenders. However, this was not always as clear, and even today, it can be difficult to understand how digital technologies magnify existing harms, introduce new dynamics, and can be used by advocates.  

Learning from what others have tried – what has worked and what hasn’t – is important in this space. As such, we were honoured to work with Amnesty Tech on a pair of case studies looking back at how Amnesty International has woven a critical understanding of technology into their human rights defence work. We conducted desk research and interviews with Amnesty Tech in 2020, and developed the case studies we’re sharing today. Despite the time that has passed, the learnings continue to be relevant for exploring questions like:

How can we weave a deeper understanding of tech into human rights programme work?
What impact will that have on our work?
What does it take to shift internal ways of working?
What are ways to approach digital threats and risk management?

In the first case study, The journey of Amnesty Tech: How a large organisation integrated technology into human rights work, we look at how Amnesty Tech came to be – the advocates, resources and successes that led to Amnesty Tech becoming an integral part of the broader organisation’s human rights work.

In the second, Amnesty Tech Empowerment: Transforming digital security support for human rights defenders, we see how Amnesty’s deep knowledge of human rights defenders and their needs around risk management and protection shaped their approach to rising digital threats.

Download both case studies, below:

The journey of Amnesty Tech [pdf]
Amnesty Tech Empowerment [pdf]

Image by Omid Armin via Unsplash.

The post Weaving tech into human rights work: case studies with Amnesty Tech first appeared on The Engine Room.

Next Level Supply Chain Podcast with GS1

What’s Trust Got To Do With Digitized Supply Chains

Hundreds, even thousands, of interactions happen along the supply chain for each product before it reaches the consumer. As a result, human touch points along this journey are becoming less feasible, so automation and computers have to find ways to “build trust” in a supply chain data exchange. Join us for a fascinating discussion with digital identity expert, Paul Dietrich, as he walks us through […]

Hundreds, even thousands, of interactions happen along the supply chain for each product before it reaches the consumer. As a result, human touch points along this journey are becoming less feasible, so automation and computers have to find ways to “build trust” in a supply chain data exchange. Join us for a fascinating discussion with digital identity expert, Paul Dietrich, as he walks us through how we are innovating and building greater trust along our supply web…with less human interaction.

Tuesday, 06. December 2022

MOBI

Integration between Mobility and Energy

All Smart City services are delivered through technology platforms. From a technology perspective, a global trend is clearly underway — namely, the progressive integration of all services into technology platforms. Of the wide range of platforms in the global digital marketplace, the mobility platform boasts the most complex technology. The mobility platform provides the […] The post Integration between Mobility and Energy appeared first on MOBI.

All Smart City services are delivered through technology platforms. From a technology perspective, a global trend is clearly underway — namely, the progressive integration of all services into technology platforms. Of the wide range of platforms in the global digital marketplace, the mobility platform boasts the most complex technology. The mobility platform provides the foundational technology layer for integrating all user-centered services, thus becoming the hub of service delivery.

Challenges of the Smart City Digital Ecosystem

Today’s great challenge for Smart Cities is developing a “meta-platform” capable of integrating IoT devices, connectivity, data, and apps providing core and value-added services. Around the world, integration is emerging as a new paradigm not only in technology but in business and governance, and interoperability must become a standardized practice in the public and private sectors.

Achieving full interoperability is therefore critical to overcoming fragmentation among several functional areas (Smart Governance, Smart Education, Smart Energy, Smart Infrastructure, Smart Mobility, Smart Healthcare, Smart Buildings, Smart Technology, and others) in the global Smart City market. Interoperability requires open and shared standards.

Energy Insurance

The insurance sector plays a key role in climate adaptation and building climate resilience through promoting climate mitigation and incentivizing green behaviors. This can range from asset management and underwriting to supporting clean technologies and climate-friendly operational changes.

Current global climate and environmental emergencies are pushing regulatory frameworks, business models, and marketing toward standardizing units of measurement. In this global framework, the carbon footprint is emerging as the standard global unit of measurement for a net-zero society.

Carbon footprint measurement is beginning to appear in companies’ reports on their environmental, social, and governance (ESG) activities. This data may also be required for B Corp, ISO, and GHG certifications, eco-labeling, participation in carbon markets, and carbon tax calculations.

MOVENS and wefox

One global leader bringing the insurance industry into the digital age is German-based insurtech firm wefox, which developed a new approach to revolutionizing insurance, focused on consumer empowerment and prioritizing solutions for secure data-driven experiences. wefox has developed an insurance platform that leverages data from energy consumption to determine what appliances are in use, enabling innovative and precise coverage for users.

wefox adopted MOVENS technology developed by Henshin Group. MOVENS is a cloud-agnostic IoT platform designed to be the Integration Hub for the Smart City, connecting all layers and related entities involved in the Smart City ecosystem.

The microservice architecture of the wefox platform can cover not only households but also small and medium-sized enterprises and offices by processing the continuous streams of data from MOVENS.

In fact, by reading the energy curves of machinery and household appliances, wefox can design policies that cover these risks and any resulting damages. This ultimately helps companies avoid breakdowns and business interruptions, and helps families protect their disposable income. To encourage environmentally friendly practices, wefox also offers coverage that rewards sustainable behavior: by collecting data that evidences sustainable driving, wefox can offer rewards and incentives in collaboration with other partners, promoting risk-averse behavior and fostering customer loyalty. This, in turn, can translate into lower premiums and higher cost savings.
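The breakdown-risk idea described above can be illustrated with a minimal sketch: flag an appliance whose power draw suddenly deviates from its recent baseline. The thresholds, readings, and function are invented for the example and are not wefox's or MOVENS's actual pipeline.

```python
# Illustrative only: flag readings that deviate sharply from the
# recent baseline of an appliance's power draw. A real platform
# would ingest live IoT streams rather than a fixed list.

from statistics import mean, pstdev

def flag_anomalies(readings, window=5, z_threshold=3.0):
    """Return indices where a reading deviates more than z_threshold
    standard deviations from the mean of the preceding `window` readings."""
    anomalies = []
    for i in range(window, len(readings)):
        base = readings[i - window:i]
        mu, sigma = mean(base), pstdev(base)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# A compressor drawing ~500 W suddenly spikes to 1500 W: worth an alert.
power_watts = [500, 510, 495, 505, 500, 1500, 505]
print(flag_anomalies(power_watts))  # -> [5]
```

An alert at that point could trigger both a maintenance notification and, as the article suggests, an adjustment of the insured risk.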

Henshin and MOBI

The MOVENS platform is evolving according to the global standards set by MOBI. Its high-level microservice abstraction architecture enables the integration of all layers involved in the Smart City ecosystem and encourages the implementation of cutting-edge technology for each layer. The platform thus provides the foundational technology layer for integrating all user-centered services.

MOVENS technology currently includes verticals in the Mobility, Energy, and Insurance sectors. The platform speeds up value creation through an open innovation model, supporting the energy transition.

Henshin joined MOBI for a simple reason: to be part of the global innovation ecosystem and to have the chance to collaborate with other valuable members and their breakthrough technologies. Collaboration is key to driving persistent innovation and accelerating the evolution of the MOVENS platform as the technological infrastructure for the implementation of Smart City initiatives, starting with the integration between mobility and energy sectors.

Henshin is a member of the DRIVES Program and has contributed to the release of two recent standards: the Battery SOH (State of Health) Standard and Rechargeable Battery Identification Number (BIN).

Watch the Recorded Lecture

The post Integration between Mobility and Energy appeared first on MOBI | The New Economy of Movement.


GS1

Maintenance release 3.1.23

Maintenance release 3.1.23 daniela.duarte… Tue, 12/06/2022 - 09:48 Maintenance release 3.1.23

GS1 GDSN accepted the recommendation by the Operations and Technology Advisory Group (OTAG) to implement the 3.1.23 standard into the network in May 2023.

Key Milestones:

See GS1 GDSN Release Schedule

As content for this release is developed, it will be posted to this webpage, followed by an announcement to the community to ensure visibility.

Data Pools should contact the GS1 GDSN Data Pool Helpdesk to understand the plan for the update. Trading Partners should work with their Data Pools on understanding the release and any impacts to business processes.

Business Message Standards including Message Schemas Updated For Maintenance Release 3.1.23

Trade Item Modules Library 3.1.23 (Jan 2023)

GS1 GDSN Code List Document (Jan 2023)

Delta for release 3.1.23 (Jan 2023)

Delta ECL for release 3.1.23 (Dec 2022)

Validation Rules (Dec 2022)

Delta for Validation Rules (Dec 2022)

Approved Fast Track Attributes (Jan 2023)

Unchanged for 3.1.23

BMS Shared Common Library (Dec 2021)

BMS Documents Carried Over From Previous Release

BMS Catalogue Item Synchronisation

BMS Basic Party Synchronisation

BMS Price Synchronisation 

BMS Trade Item Authorisation


Schemas

Catalogue Item Synchronisation Schema including modules 3.1.23 (Dec 2022)

Changed Schemas for 3.1.23 (Jan 2022)

Party Synchronisation Schema

Price Synchronisation Schema

Trade Item Authorisation Schema

Release Guidance

GS1 GDSN Attributes with BMS ID and xPath (Jan 2023) 

GPC to Context Mapping 3.1.23 (Dec 2021)

Delta GPC to Context Mapping 3.1.23 (Dec 2021)

Migration Document (Jan 2023)

Approved WRs for release

Unchanged for 3.1.23

Packaging Label Guide (Nov 2022)

Deployed LCLs (Nov 2022)

GS1 GDSN Module by context (May 2021)

GS1 GDSN Unit of Measure per Category (Apr 2022)

Flex Extension for Price commentary (Dec 2018)

Any questions?

We can help you get started using GS1 standards.

Contact your local office


Belgian association mobile app uses GDSN to give consumers information about the food they buy

Belgian association mobile app uses GDSN to give consumers information about the food they buy daniela.duarte… Tue, 12/06/2022 - 08:54 Belgian association mobile app uses GDSN to give consumers information about the food they buy 06 December 2022
Belgian consumers who want to know more information about food items beyond what fits on the label can now turn to an app that uses product data provided by brand owners.

Non-profit organisation Aktina offers a free web and smartphone app designed to help Belgian consumers understand the information on food labels.
The app—called BATRA for BArcode TRAnslator—allows shoppers to decipher a food product label in the blink of an eye, simply by scanning the barcode on the product. It’s part of Aktina’s work to promote healthy behaviours and sustainable mindsets.

Thanks to GS1 GDSN, shoppers access data straight from the source

The information provided by BATRA comes straight from the brand owners themselves: the app draws upon the product data in My Product Manager, the GS1 GDSN-Certified Data Pool of GS1 Belgilux.

The Aktina BATRA app helps consumers better understand the terms, ratings and graphics on food labels, so they can shop thoughtfully and with confidence.

Data enables helpful notifications for consumers

BATRA also allows users to create personalised settings and notifications. For example, consumers can request to be alerted if a product contains a certain allergen. They can also ask to be informed if the AFSCA, Belgium’s Federal Agency for Food Chain Safety, has issued a recall on a product being scanned – or even a product the user has scanned at some point in the past.
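The notification logic described above can be sketched in a few lines: alert the shopper when a scanned product matches a personal allergen setting, or when a recall covers the current scan or any past scan. The product record, field names, and GTINs are invented for illustration; a real app would draw them from the GDSN data pool.

```python
# Hypothetical sketch of BATRA-style alerts, not the app's actual code.

def notifications(gtin, products, user_allergens, recalled_gtins, scan_history):
    """Return alert strings for a scanned GTIN, given the user's allergen
    settings, the current recall list, and previously scanned GTINs."""
    alerts = []
    product = products.get(gtin, {})
    hits = user_allergens & set(product.get("allergens", []))
    if hits:
        alerts.append(f"Contains allergen(s): {', '.join(sorted(hits))}")
    # Check both the product being scanned now and anything scanned before.
    for g in {gtin} | scan_history:
        if g in recalled_gtins:
            alerts.append(f"Recall issued for product {g}")
    return alerts

# Invented example data: a spread containing hazelnut and milk.
products = {"5410000000017": {"name": "Choco Spread",
                              "allergens": ["hazelnut", "milk"]}}
print(notifications("5410000000017", products,
                    user_allergens={"hazelnut"},
                    recalled_gtins={"5410000000024"},
                    scan_history={"5410000000024"}))
```

The recall check over past scans is what enables the retroactive alerts mentioned in the article.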

GDSN helps create a direct link between consumers and suppliers

BATRA is an excellent example of how the GDSN can be part of B2B2C* strategies, serving the needs of business, brands and manufacturers all while they, in turn, serve the needs of end-consumers.

*Business To Business To Consumer

Read the case study

LionsGate Digital

OCEG releases Integrated Data Privacy Capability Model and Professional Certification


The open-source Data Privacy Capability Model is a step-by-step guide to designing, running, and evaluating a strong data privacy program for any organization.

PHOENIX, ARIZONA, UNITED STATES, December 5, 2022 /EINPresswire.com/ — OCEG is pleased to announce the release of the new Integrated Data Privacy Capability Model (IDPM) and associated certification for Integrated Data Privacy Professionals. Together with a global review committee of privacy experts and the input of Singapore-based OCEG training partner, Straits Interactive, we have developed an open-source capability model that offers a detailed step-by-step guide to designing, running, and evaluating a strong data privacy program for any organization.

Carole Switzer, OCEG Co-Founder and President, notes: “Unlike other available resources and certifications, which generally outline the requirements of various data privacy regulations and tell you what you need to comply with, through this Capability Model we are seeking to help you understand how to meet those needs. Following the structure of our respected GRC Capability Model, we walk you through every stage of identifying relevant requirements for your organization, keeping track of where and how you are collecting and processing personal information, and ensuring that your data privacy program is transparent and auditable.”

The Capability Model is a free, open-source resource that anyone may download directly from the OCEG site.

“We are pleased to offer this important certification as an addition to our GRC Professional and GRC Audit certifications,” says OCEG Founder and Chair, Scott Mitchell, “and to include it at no additional cost for anyone who holds an OCEG All Access Pass. The AAP provides access to our premium resources and all our certification exams at one low cost, which ensures that every organization can have skilled and credentialed teams to address these critical needs.”

Kevin Shepherdson, CEO of Straits Interactive, said: “This Model, which we are honored to have authored with OCEG, offers something that has been missing in the information market for data privacy, that is, looking at privacy from an integrated governance, risk management and compliance perspective. The certification demonstrates an individual’s knowledge of how to build, run and assess an effective and agile data privacy program. As data privacy management becomes more challenging, it would be in every organization’s interest to ensure that its data privacy team and every manager of units with data privacy responsibilities, follow the model and obtain this valuable certification.”

Integrated Data Privacy Capability Model vs Other Privacy Certifications
The Integrated Data Privacy Capability Model provides a roadmap for both GRC and privacy professionals. Those with specialized privacy certifications can now broaden their expertise into GRC as data privacy involves governing personal data, managing risks relating to personal data as well as complying with data privacy laws and regulations.

As for those who have the GRCP certification, this provides a roadmap for GRC professionals to cover data privacy which has become a big focus due to the General Data Protection Regulation and the introduction of many new data privacy laws around the world. By achieving the certification in the Integrated Data Privacy Capability Model, they have the option to then specialize in the area of data privacy and take on additional privacy certification qualifications.

Testimonials from Testers
Testimonials from those who recently took the training have been encouraging and are shared here:
“The course has a definitive guide for Data Protection Officers who are looking towards being operationally ready. What I learned the most would be the specific steps in preparing a robust data protection management program.”

“Relevant to my consulting practice going forward [the Model] provides a more detailed framework to advise clients on how to set up their privacy management plan.”

“The ‘learn and align’[component structure] provides a good way to frame the settings for our consulting with the management to align with their business objectives and enroll support.”

“The training provides in detail the steps required to set up a data privacy program (right from the start).”

“The training is very useful, how we combine data privacy knowledge and GRC perspective.”

“Found it useful to have understood the privacy framework in the larger context of GRC.”

To obtain the free, open-source Integrated Data Privacy Capability Model and learn more about the certification and available training, visit https://go.oceg.org/integrated-data-privacy-capability-model

About
OCEG
OCEG is a global nonprofit organization and community. We inform, empower, and help advance the careers of our 120,000+ members who work in governance, strategy, risk, compliance, security and audit. We created GRC to help every organization and every person achieve objectives, address uncertainty and act with integrity. This approach to business, and to life, is what we call Principled Performance®. For over 20 years, we’ve set the standards for GRC and the associated critical disciplines that comprise GRC. The foundational standards for GRC form the basis of OCEG’s GRC Capability Model (the “Red Book”) and additional OCEG capability models.

Straits Interactive
Straits Interactive delivers sustainable data governance solutions that help organizations create trust in today’s data-driven world. As trusted advisors to SMEs, MNCs and data protection authorities in the region, we provide comprehensive competency, consulting and capability roadmaps in data protection and governance. We enable these competencies by partnering top universities in the region and international certification bodies to provide advanced diplomas, degrees and certification courses. Our hands-on advisory services, combined with our software-as-a-service solutions, help reduce risk and create value from data to help businesses achieve their digitalization and innovation objectives.

Carole Switzer
OCEG
+1 236-464-6254
email us here

The post OCEG releases Integrated Data Privacy Capability Model and Professional Certification appeared first on Lions Gate Digital.

Monday, 05. December 2022

Ceramic Network

Use 'Create Ceramic App' to Launch Your Project


Create Ceramic App is a new way to create an app using Ceramic, much like Create React App or Create Next App. We built Create Ceramic App to ensure that you know all the parts and packages that go into using our decentralized network—from interacting with data to using a Web3 wallet for user authentication.

Before you get started with Create Ceramic App we recommend that you read some of our documentation—it clarifies the terms that we’ll use throughout this post. Create Ceramic App will handle the implementation, but reading about Composites, DIDs, and other important concepts will help you understand how Ceramic works.

Getting Started

Get started with Create Ceramic App by running npx @ceramicnetwork/create-ceramic-app clone. After running the command, you will encounter a series of prompts, which we explain in the sections below.

Provided Templates DID-DataStore

This template is a simple implementation of our DID-DataStore package using NextJS and DID Sessions (a successor to 3ID Connect). This template will interact with the published BasicProfile DataModel on the Clay Testnet.

ComposeDB [Developer Preview]

Using a locally created Composite, which is similar to the BasicProfile DataModel, you can create data for any DIDs that you control. The template provides access to this data through either the provided ComposeClient or through GraphiQL, and includes scripts to handle deploying and updating your Composite; it will dynamically run all dependencies that the application requires.

Creating App

After running npx @ceramicnetwork/create-ceramic-app clone you’ll encounter a simple CLI where you can fill in the project name and choose your template. We’ll just call it ‘sample’ for now.

We provide templates for DID-DataStore and for the Developer Preview of ComposeDB. Both templates share a similar structure, so you can focus on the specifics of the two libraries. We’ll go into the differences in the next section.

Below you’ll find the output of selecting the DID-DataStore template. This template does not cover any DataModel creation and does not create a configuration file, so there’s nothing to note.

Our ComposeDB template does include creating a Composite (if you’re familiar with DID-DataStore, this is roughly equivalent to a DataModel). Because of this, we need to generate a configuration file that contains an Admin DID (here we use a Key DID); this Admin DID is critical for interacting with Ceramic’s restricted admin APIs. The image below shows the three major steps we take to set up your project.

1. We clone the GitHub repository that hosts our projects.
2. We generate our Admin DID and its seed.
3. We generate our configuration file with the Admin DID included, so your new project can have indexing enabled from the first time you start coding.

When we selected the ComposeDB template we also created a new Key DID to authenticate our Composite, as well as a configuration file to allow indexing of our newly-created Composite. The configuration file is saved as composedb.config.json and the seed for the newly generated Key DID is saved as admin_seed.txt. Neither of these files should be included in your version control and both have been explicitly mentioned in the provided .gitignore.
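The seed and configuration steps can be sketched in a few lines. This is a hypothetical illustration of what the scaffolding does, not the template's actual (JavaScript) code: the JSON fields are invented, and only the two filenames come from the text.

```python
# Illustrative sketch: generate a random 32-byte seed for a Key DID
# and write it, plus a minimal config file, to disk. Neither file
# should be committed to version control.

import json
import secrets

seed = secrets.token_hex(32)  # 32 random bytes, hex-encoded (64 chars)

with open("admin_seed.txt", "w") as f:
    f.write(seed)

config = {
    # Field names below are invented for the example.
    "admin-did": "did:key:...",          # would be derived from the seed
    "indexing": {"enabled": True},
}
with open("composedb.config.json", "w") as f:
    json.dump(config, f, indent=2)

print(len(seed))  # 64
```

Adding both filenames to .gitignore, as the template does, keeps the seed out of version control.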

Starting the Application

Regardless of the template that you use, starting the application is the same. Just use the command npm run dev in your directory like you would with a NextJS application.

Idea for a feature? Need more documentation?

Please post any feedback on Create Ceramic App to our Forum; we'd love to hear from you and discover more about what you're building. If you want to learn and develop more with ComposeDB, we recommend checking out the latest docs.

Saturday, 03. December 2022

LionsGate Digital

Macleod: The Upside-Down World Of Currency


Authored by Alasdair Macleod via GoldMoney.com,

The gap between fiat currency values and that of legal money, which is gold, has widened so that dollars retain only 2% of their pre-1970s value, and for sterling it is as little as 1%. Yet it is commonly averred that currency is money, and gold is irrelevant.

As the product of statist propaganda, this is incorrect. Originally established in Roman law, legally gold is still money and the states’ debauched currencies are not — only a form of credit. As I demonstrate in this article, the major western central banks will be forced to embark on a new round of currency debasement, likely to put an end to the matter.

Central to my thesis is that commercial bank credit will contract sharply in response to rising interest rates and bond yields. This retrenchment is already ending the everything bubble in financial asset values, is beginning to undermine GDP, and given record levels of balance sheet leverage makes a major banking crisis virtually impossible to avoid. Central banks which are already in a parlous state of their own will be tasked with underwriting the entire credit system.

In discharging their responsibilities to the status quo, central banks will end up destroying their own currencies.

So, why do we persist in pricing everything in failing currencies, when that will almost certainly change?

When the difference between legal money and declining currencies is finally realised, the public will discard currencies entirely reverting to legal money. That time is being brought forward rapidly by current events. 

Why do we impart value to currency and not money?

A question that is not satisfactorily answered today is why an unbacked fiat currency has value as a medium of exchange. Some say that it reflects faith in, and the credit standing of, the issuer. Others say that requiring a nation’s subjects to pay taxes, and to account for them, in the currency guarantees its demand. But these replies ignore the consequences of its massive expansion while the state pretends it to be real money. Sometimes the consequences can seem benign, at others catastrophic. As explanations for the public’s tolerance of repeated currency failures, these answers are insufficient.

Let us do a thought experiment to highlight the depth of the problem. We know that over millennia, metals, particularly gold, silver, and copper, came to be used as media of exchange. And we also know that the use of their value was broadened through credit in the form of banknotes and bank deposits. The relationships between legal money, that is, gold, silver, or copper, and credit in its various forms were defined in Roman law in the sixth century. And we also know that this system of money and credit, with the value of credit tied to that of money, has served humanity well ever since, despite some ups and downs.

Now let us assume that in the absence of metallic money, in the dawn of economic time a ruler instructed his subjects to use a new currency which he and only he will issue for the public’s use. This would surely be seen as a benefit to everyone, compared with the pre-existing condition of barter. But the question in our minds must be about the durability of the ruler’s new currency. With no precedent, how is the currency to be valued in the context of the ratios between goods and services bought and sold? And how certain can one be about tomorrow’s value in that context? And what happens if the king loses his power, or dies?

Clearly, without a reference to something else, the king’s new currency is a highly risky proposition and sooner or later will simply fail. And even when a new currency has been introduced and linked to an existing form of money, if the tie is then cut the currency will struggle to survive. Without going into the good reasons why this is so, the empirical evidence confirms it. Chinese merchants no longer use Kublai Khan’s paper money made from mulberry bark, and German citizens no longer use the paper marks of the early 1920s. But they still refer to metallic money.

Yet today, we impart values to paper currencies issued by our governments in defiance of these outcomes. An explanation was provided by the great Austrian economist, Ludwig von Mises in his regression theorem. He reasonably argued that we refer the value of a medium of exchange today to its value to us yesterday. In other words, we know as producers what we will receive today for our product, based on our experience in the immediate past, and in the same way we refer to our currency values as consumers. Similarly, at a previous time, we referred our experience of currency values to our prior experience. In other words, the credibility and value of currencies are based on a regression into the past.

Mises’s regression theorem was broadly confirmed by an earlier writer, Jean-Baptiste Say, who in his Treatise on Political Economy observed:

“Custom, therefore, and not the mandate of authority, designates the specific product that shall pass exclusively as money, whether crown pieces or any other commodity whatever.”

Custom is why we still think of currencies as money, even though for the last fifty-one years their link with money was abandoned. The day after President Nixon cut the umbilical cord between gold and the dollar, we all continued using dollars and all the other currencies as if nothing had happened. But this was the last step in a long process of freeing the paper dollar from being backed by gold. The habit of the public in valuing currency by regression had served the US Government well and has continued to do so.

The role of a medium of exchange

Being backed by no more than government fiat, to properly understand the role that currencies have assumed for themselves, we need to make some comments about why a medium of exchange is needed and its characteristics. The basis was laid out by Jean-Baptiste Say, who described the division of labour and the role of a medium of exchange.

Say observed that human productivity depended on specialisation, with producers obtaining their broader consumption through the medium of exchange. The role of money (and associated credit) is to act as a commodity valued on the basis of its use in exchange. Therefore, money is simply the right, or title, to acquire some consumer satisfaction from someone else. Following on from Say’s law, when any economic quantity is exchanged for any other economic quantity, each is termed the value of the other. But when one of the quantities is money, the other quantities are given a price. Price, therefore, is always value expressed in money. For this reason, money has no price, which is confined entirely to the goods and services in an exchange.

So long as currency and associated forms of credit are firmly attached to money such that there are minimal differences between their values, there should be no price for them either, other than a value difference arising from counterparty risk. A further distinction between money and currencies can arise if their users suspect that the link might break down. It was the breakdown in this relationship between gold and the dollar that led to the failure of the Bretton Woods agreement in 1971.

Therefore, in all logic it is legal money that has no price. But does that mean that when its value differs from that of money, does currency have a price? Not necessarily. So long as currency operates as a medium of exchange, it has a value and not a price. We can say that a dollar is valued at 0.0005682 ounces of gold, or gold is valued at 1760 dollars. As a legacy of the dollar’s regression from the days when it was on a gold standard, we still attribute no price to the dollar, but now we attribute a price to gold. To do so is technically incorrect.

Perhaps an argument for this state of affairs is that gold is subject to Gresham’s law, being hoarded rather than spent. It is the medium of exchange of last resort so rarely circulates. Nevertheless, fiat currencies have consistently lost value relative to legal money, which is gold, so much so that the dollar has lost 98% since the suspension of Bretton Woods, and sterling has lost 99%. Over fifty-one years, the process has been so gradual that users of unanchored currencies as their media of exchange have failed to notice it. 
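The arithmetic behind these figures is simple, using the $1,760 gold valuation quoted earlier and the official $35/oz Bretton Woods parity (the pre-1971 price is a historical fact not stated in the text):

```python
# Checking the article's figures: at $1,760/oz, one dollar buys
# 1/1760 oz of gold; against the $35/oz pre-1971 parity, the dollar
# retains 35/1760 of its former gold value, i.e. about 2%.

gold_usd = 1760.0       # dollars per ounce, as quoted in the text
bretton_woods = 35.0    # official pre-1971 dollar price of gold

dollar_in_gold = 1 / gold_usd
retained = bretton_woods / gold_usd

print(f"{dollar_in_gold:.7f} oz per dollar")   # 0.0005682, as in the text
print(f"retained value: {retained:.1%}")       # about 2%
```

The sterling figure of 1% follows from the same calculation with sterling's pre-1971 gold price and today's sterling gold valuation.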

This gradual loss of purchasing power relative to gold can continue indefinitely, so long as the conditions that have permitted it to happen remain without causing undue alarm. Furthermore, for lack of a replacement it is highly inconvenient for currency users to consider that their currency might be valueless. They will hang on to the myth of its use value until its debasement can no longer be ignored.

What is the purpose of interest rates?

Despite the accumulating evidence that central bank management of interest rates fails to achieve their desired outcomes, monetary policy committees persist in using interest rates as their primary means of economic intervention. It was the central bankers’ economic guru himself who pointed out that interest rates correlated with the general level of prices and not the rate of price inflation. And Keynes even named it Gibson’s paradox after Arthur Gibson, who wrote about it in Banker’s Magazine in 1923 (it had actually been noted by Thomas Tooke a century before). But because he couldn’t understand why these correlations were the opposite of what he expected, Keynes ignored it and so have his epigonic central bankers ever since.

As was often the case, Keynes was looking through the wrong end of the telescope. The reason interest rates rose and fell with the general price level was that price levels were not driven by interest rates, but interest rates reacted to changes in the general level of prices. Interest rates reflect the loss of purchasing power for money when the quantity of credit increases. With their interests firmly attached to time preference, savers required compensation for the debasement of credit, while borrowers — mainly businesses in production — needed to bid up for credit to pay for higher input costs. Essentially, interest rates changed as a lagging indicator, not a leading one as Keynes and his acolytes to this day still assume.

In a nutshell, that is why Gibson’s paradox is not a paradox but a natural consequence of fluctuations in credit and the foreign exchanges and the public’s valuation of it relative to goods. And the way to smooth out the cyclical consequences for prices is to stop discouraging savers from saving and make them personally responsible for their future security. As demonstrated today by Japan’s relatively low CPI inflation rate, a savings driven economy sees credit stimulation fuelling savings rather than consumption, providing capital for manufacturing improvements instead of raising consumer prices. Keynes’s savings paradox — another fatal error — actually points towards the opposite of economic and price stability. 

It is over interest rate management that central banks prove their worthlessness. Even if they had a Damascene conversion, bureaucrats in a government department can never impose decisions that can only be efficiently determined by market forces. It is the same fault exhibited in communist regimes, where the state tries to manage the supply of goods— and we know, unless we have forgotten, the futility of state direction of production. It is exactly the same with monetary policy. Just as the conditions that led the communists to build an iron curtain to prevent their reluctant subjects escaping from authoritarianism, there should be no monetary policy.

Instead, when things don’t go their way, like the communists, bureaucrats double down on their misguided policies suppressing the evidence of their failures. It is something of a miracle that the economic consequences have not been worse. It is testament to the robustness of human action that when officialdom places mountainous hurdles in its path ordinary folk manage to find a way to get on with their lives despite the intervention.

Eventually, the piper must be paid. Misguided interest rate policies led to their suppression to the zero bound, and for the euro, Japanese yen, and Swiss franc, even unnaturally negative deposit rates. Predictably, the distortions of these policies together with central bank credit inflation through quantitative easing are leading to pay-back time. 

Rapidly rising commodity, producer and consumer prices, the consequences of these policy mistakes, are in turn leading to higher time preference discounts. Finally, markets have wrested currency and credit valuations out of central banks’ control, as it slowly dawns on market participants that the whole interest rate game has been an economic fallacy. Foreign creditors are no longer prepared to sit there and accept deposit rates and bond yields which do not compensate them for loss of purchasing power. Time preference is now mauling central bankers and their cherished delusions. They have lost their suppressive control over markets and now we must all face the consequences. Like the fate of the Berlin Wall that had kept Germany’s Ossies penned in, monetary policy control is being demolished.

With purchasing powers for the major currencies now sinking at a more rapid rate than current levels of interest rate and bond yield compensation, the underlying trend for interest rates is rising and has further to go. Official forecasts that inflation at the CPI level will return to the targeted 2% in a year or two are pie in the sky.

While, Nero-like, central bankers fiddle, commercial banks are being burned. A consequence of zero and negative rates has been that commercial bank balance sheet leverage increased stratospherically to compensate for suppressed lending margins. Commercial bankers now have an overriding imperative to claw back their credit expansion in the knowledge that, in a rising interest rate environment, their unfettered involvement in non-banking financial activities comes at a cost. Losses on financial collateral are mounting, and the provision of liquidity into mainline non-financial sectors faces losses as well. And when a bank has a balance sheet leverage ratio of assets to equity of over twenty times (as is the case for the large Japanese and Eurozone banks), balance sheet equity is almost certain to be wiped out.

The imperative for action is immediate. Any banker who does not act with the utmost urgency faces the prospect of being overwhelmed by the new interest rate trend. The chart below shows that the broadest measure of US money supply, which is substantially the counterpart of bank credit, is already contracting, having declined by $236bn since March.

Contracting bank credit forces up interest rates due to lower credit supply. This is a trend that cannot be bucked, a factor that has little directly to do with prices. By way of confirmation of the new trend, the following quotation is extracted from the Fed’s monthly Senior Loan Officers’ Opinion Survey for October:

“Over the third quarter, significant net shares of banks reported having tightened standards on C&I [commercial and industrial] loans to firms of all sizes. Banks also reported having tightened most queried terms on C&I loans to firms of all sizes over the third quarter. Tightening was most widely reported for premiums charged on riskier loans, costs of credit lines, and spreads of loan rates over the cost of funds. In addition, significant net shares of banks reported having tightened loan covenants to large and middle-market firms, while moderate net shares of banks reported having tightened covenants to small firms. Similarly, a moderate net share of foreign banks reported having tightened standards for C&I loans.

“Major net shares of banks that reported having tightened standards or terms cited a less favourable or more uncertain economic outlook, a reduced tolerance for risk, and the worsening of industry-specific problems as important reasons for doing so. Significant net shares of banks also cited decreased liquidity in the secondary market for C&I loans and less aggressive competition from other banks or nonbank lenders as important reasons for tightening lending standards and terms.”

Similarly, credit is being withdrawn from financial activities. The following chart reflects collapsing credit levels being provided to speculators.

In the same way that the withdrawal of bank credit undermines nominal GDP (because nearly all GDP transactions are settled in bank credit) the withdrawal of bank credit also undermines financial asset values. And just as it is a mistake to think that a contraction of GDP is driven by a decline in economic activity rather than the availability of bank credit, it is a mistake to ignore the role of bank credit in driving financial market valuations.

The statistics are yet to reflect credit contraction in the Eurozone and Japan, which have the most highly leveraged of the major banking systems. This may be partly due to the rapidity with which credit conditions are deteriorating. And we should note that the advanced socialisation of credit in these two regions probably makes senior managements more beholden to their banking authorities, and less entrepreneurial in their big-picture awareness, than their American counterparts. Furthermore, the continued monetary expansion principally reflects the euro system’s and the Bank of Japan’s continuing balance sheet expansion, which feeds directly into the commercial banking network, bolstering bank balance sheets. It is likely to be state-demanded credit which dominates the Eurozone and Japan’s statistics, masking deteriorating changes in credit supply for commercial demand.

The ECB and BOJ’s monetary policies have compromised their respective currencies through continuing credit expansion, which is why those currencies have lost significant ground against the dollar while US interest rates have been rising. Adding to the tension, the Fed has been talking up its attack on price inflation, but the recent fall in the dollar on the foreign exchanges strongly suggests a pivot in this policy is in sight.

The dilemma facing central banks is one of their own making. Having suppressed interest rates to the zero bound and below, the reversal of this trend is now out of their control. Commercial banks will surely react in the face of this new interest rate trend and seek to contract their balance sheets as rapidly as possible. Students of Austrian business cycle theory will not be surprised at the suddenness of this development. But all GDP transactions, with very limited cash exceptions at the retail end of gross output, are settled in bank credit. Inevitably, the withdrawal of credit will cause nominal GDP to contract significantly, a collapse made more severe in real terms when the decline in a currency’s purchasing power is taken into consideration.

The choice now facing bureaucratic officialdom is simple: does it prioritise rescuing financial markets and the non-financial economy from deflation, or does it ignore the economic consequences of protecting the currency instead? The ECB, BOJ and the Bank of England have decided their duty lies with supporting the economy and financial markets. Perhaps driven in part by central banking consensus, the Fed now appears to be choosing to protect the US economy and its financial markets as well. 

The principal policy in the new pivot will be the same: suppress interest rates below their time preference. It is the policy mistake that the bureaucrats always make, and they will double down on their earlier failures. The extent to which they suppress interest rates will be reflected in the loss of purchasing power of their currencies, not in terms of their values against each other, but in their values with respect to energy, commodities, raw materials, foodstuffs, and precious metals. In other words, a new round of higher producer and consumer prices and therefore irresistible pressure for yet higher interest rates will emerge.

The collapse of the everything bubble

The flip side of interest rate trends is the value imparted to assets, both financial and non-financial. It is no accident that the biggest and most widespread global bull market in history has coincided with interest rate suppression to zero and even lower over the last four decades. Equally, a trend of rising interest rates will have the opposite effect.

Unlike bull markets, bear markets are often sudden and shocking, especially where undue speculation has been previously involved. There is no better example than that of the cryptocurrency phenomenon, which has already seen bitcoin fall from a high of $68,000 to $16,000 in twelve months. And in recent days, the collapse of one of the largest crypto-exchanges, FTX, has exposed both hubris and alleged fraud, handmaidens to extreme public speculation, on an unimaginable scale. For any student of the madness of crowds, it would be surprising if the phenomenon of cryptocurrencies actually survives.

Driving this volte-face into bear markets is the decline in bond values. On 20 March 2020, when the Fed reduced its funds rate to zero, the 30-year US Treasury bond yielded 1.18%. Earlier this week the yield stood at 4.06%. That’s a fall in price of roughly half. And time preference suggests that short-term rates, for example over one year, should currently discount a loss of the currency’s purchasing power at double current rates, or even more.
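The scale of that price fall can be checked with standard bond arithmetic. The sketch below is a stylized model, assuming a 30-year bond issued at par with an annual coupon equal to its 1.18% issue yield and repriced at a flat 4.06% yield, so it lands near rather than exactly on the quoted loss:

```python
def bond_price(coupon_rate, yield_rate, years, face=100.0):
    """Price a bond paying annual coupons, discounted at a single flat yield."""
    coupon = coupon_rate * face
    # Present value of the coupon stream (annuity factor) plus the discounted face value
    annuity = (1 - (1 + yield_rate) ** -years) / yield_rate
    return coupon * annuity + face / (1 + yield_rate) ** years

# A 30-year Treasury issued at par when yields were 1.18%...
issue_price = bond_price(0.0118, 0.0118, 30)   # = 100 by construction
# ...repriced when the yield reached 4.06%:
new_price = bond_price(0.0118, 0.0406, 30)     # ≈ 50.6, a fall of roughly 49%
fall = (issue_price - new_price) / issue_price
print(f"new price: {new_price:.1f}, fall: {fall:.1%}")
```

A semi-annual coupon convention, or a bond already trading above par, would shift the result by a point or two, but the order of magnitude, a halving of capital value, is robust.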

For the planners who meddle with interest rates, increases in rates and bond yields on that scale are unimaginable. Monetary policy committees, being government agencies, will think primarily about the effect on government finances. In their nightmares they can envisage tax revenues collapsing, welfare commitments soaring, and borrowing costs mounting. The increased deficit, additional to current shortfalls, would require central banks to accelerate quantitative easing without limitation. To the policy planners, the reasons to bring interest rates both lower and back firmly under control are compelling.

Furthermore, officials believe that a rising stock market is necessary to maintain economic confidence. That also requires the enforcement of a new declining interest rate trend. The argument in favour of a new round of interest rate suppression becomes undeniable. But the effect on fiat currencies will accelerate their loss of purchasing power, undermining confidence in them and leading to yet higher interest rates in the future.

Either way, officialdom loses. And the public will pay the price for meekly going along with these errors.

Managing counterparty risk

Any recovery in financial asset values, such as that currently in play, is bound to be little more than a rally in an ongoing bear market. We must not forget that commercial bankers have to reduce their balance sheets ruthlessly if they are to protect their shareholders. Consequently, as over-leveraged international banks are at a heightened risk of failing in the new interest rate environment, their counterparties face sharply increasing systemic risks. To reduce exposure to these risks, all bankers are duty bound to their shareholders to shrink their obligations to other banks, which means that the estimated $600 trillion of notional over-the-counter (OTC) derivatives and, on the back of it, the additional $50 trillion of regulated futures exchange derivatives will enter their own secular bear markets. OTC and regulated derivatives are the children of falling interest rates, and with a new trend of rising interest rates their parentage is bound to be tested.

We can now see a further reason why central banks will wish to suppress interest rates and support financial markets. Unless they do so, the risk of widespread market failures between derivative counterparties will threaten to collapse the entire global banking network. And that is in addition to existential risks from customer loan defaults and collapsing collateral values. Central banks will have to stand ready to rescue failing banks and underwrite the entire commercial system. 

To avert this risk, they will wish to stabilise markets and prevent further increases in interest rates. And all central banks which have indulged in QE already have mark-to-market losses that have wiped out their own balance sheet equity. We now face the prospect of central banks that by any commercial measure are themselves financially broken, tasked with saving entire commercial banking networks.

When the trend for interest rates was for them to fall under the influence of increasing supplies of credit, the deployment of that credit was substantially directed into financial assets and increasing speculation. For this reason, markets soared while the increase in the general level of producer and consumer prices was considerably less than the expansion of credit suggested should be the case. That is no longer so, with manufacturers facing substantial increases in their input costs. And now, when they need it most, bank credit is being withdrawn. 

It is not generally realised yet, but the financial world is in transition between economies being driven by asset inflation and suppressed commodity prices, and a new environment of asset deflation while commodity prices increase. And it is in the valuations of unanchored fiat currencies where this transition will be reflected most.

Physical commodities are set to replace paper equivalents

The expansion of derivatives when credit was expanding served to soak up demand for commodities which would otherwise have gone into physical metals and energy. In the case of precious metals, this is admitted by those involved in the expansion of London’s bullion market from the 1980s onwards to have been a deliberate policy to suppress gold as a rival to the dollar. 

According to the Bank for International Settlements, at the end of last year gold OTC outstanding swaps and forwards (essentially, the London Bullion Market) stood at the equivalent of 8,968 tonnes of bullion, to which must be added the 1,594 tonnes of paper futures on Comex, giving an identified 10,562 tonnes. This is considerably more than the official reserves of the US Treasury, and even its partial replacement with physical bullion will have a major impact on gold values. Silver, an extremely tight market, accounts for most of the BIS’s other precious metals category and faces bullion replacement of OTC paper in the order of three billion ounces, to which we must add Comex futures equivalent to a further 700 million ounces.
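The paper overhang is simple to tally. A minimal sketch of the arithmetic follows; note that the US Treasury reserve figure of roughly 8,133 tonnes is an assumption added for comparison, not a number stated above:

```python
bis_otc_gold_t = 8_968   # BIS: OTC gold swaps and forwards, tonnes
comex_gold_t = 1_594     # Comex gold futures, tonnes
paper_gold_t = bis_otc_gold_t + comex_gold_t
print(f"identified paper gold: {paper_gold_t:,} tonnes")  # 10,562 tonnes

# Assumed figure for comparison (not given in the text):
us_official_reserves_t = 8_133
print(f"ratio to US official reserves: {paper_gold_t / us_official_reserves_t:.2f}x")  # ≈ 1.3x

# Silver paper positions, quoted directly in ounces:
otc_silver_oz = 3_000_000_000    # ~3bn oz of OTC paper
comex_silver_oz = 700_000_000    # ~700m oz of Comex futures
print(f"silver paper claims: {(otc_silver_oz + comex_silver_oz) / 1e9:.1f}bn oz")
```

Even partial substitution of physical bullion for these paper claims implies demand for thousands of tonnes of gold and billions of ounces of silver.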

On the winding down of derivative markets alone, the impact on precious metal values is bound to be substantial. Furthermore, the common mistake made by almost all derivative traders is to not understand that legal money is physical gold and silver — despite what their regulating governments force them to believe. What they call prices for gold and silver are not prices, but values imparted to legal money from depreciating currencies and associated credit. 

While it may be hard to grasp this seemingly upside-down concept, it is vital to understand that so-called rising prices for gold and silver are in fact falling values for currencies. Some central banks, predominantly in Asia, are taking advantage of this ignorance, which is displayed chiefly in western, Keynesian-driven derivative markets.

Perhaps after a currency hiatus and when market misconceptions are ironed out, we can expect legal money values to behave as they should. If a development which is clearly inflationary emerges, it should drive currency values lower relative to gold. But instead, in today’s markets we see them rise because speculators take the view that currencies relative to gold will benefit from higher interest rates. A pause for thought should expose the fallacy of this approach, where the true relationship between money and currencies is assumed away.

In the wake of the suspension of the Bretton Woods agreement, when the purchasing power of currencies subsequently declined, interest rates and the value of gold rose together. In February 1972, gold was valued at $85, while the Fed funds rate was 3.3%. On 21 January 1980 gold was fixed that morning at $850, and the Fed funds rate was 13.82%. While gold’s value increased ten-fold, the Fed’s funds rate more than quadrupled. And it required Paul Volcker to raise the funds rate to over 19% twice subsequently to slay the inflation dragon.
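The multiples can be verified directly from the figures quoted above, a trivial check:

```python
gold_1972, gold_1980 = 85, 850                 # US$ per ounce
fed_funds_1972, fed_funds_1980 = 3.3, 13.82    # per cent

print(f"gold rose {gold_1980 / gold_1972:.1f}x")                      # 10.0x
print(f"Fed funds rate rose {fed_funds_1980 / fed_funds_1972:.1f}x")  # 4.2x, i.e. more than quadrupled
```

The point of the comparison: in the 1970s, gold's revaluation ran well ahead of the interest rate response, until rates near 20% finally halted it.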

In the seventies, the excessive credit-driven speculation that we now witness was absent, along with the accompanying debt leverage in the financial sectors of western economies and in their banking systems. A Volcker-style rise in interest rates today would cause widespread bankruptcies and without doubt crash the entire global banking system. While markets might take us there anyway, as a deliberate act of official policy it can be safely ruled out. 

We must therefore conclude that there is another round of currency destruction in the offing. Potentially, it will be far more extensive than anything seen to date. Not only will central-bank currency and QE expansion fund government deficits and attempt to compensate for the contraction of bank credit while supporting financial markets by firmly suppressing interest rates and bond yields, but insolvent central banks will be tasked with underwriting insolvent commercial banks.

At some stage, the inversion of monetary reality, where legal money is priced in fiat, will change. Instead of legal money being priced in fiat, fiat currencies will be priced in legal money. But that will be the death of the fiat swindle.

Source: Zerohedge

World Of Currency image by Silicon Forest

The post Macleod: The Upside-Down World Of Currency appeared first on Lions Gate Digital.


OpenID

Initiating User Registration via OpenID Connect is now a Final Specification


The OpenID Foundation membership has approved the following OpenID Connect specification as an OpenID Final Specification:

Initiating User Registration via OpenID Connect

A Final Specification provides intellectual property protections to implementers of the specification and is not subject to further revision.

The Final Specification is available at:

https://openid.net/specs/openid-connect-prompt-create-1_0-final.html

The voting results were:

Approve – 48 votes
Object – 0 votes
Abstain – 11 votes

Total votes: 59 (out of 270 members = 21.9% > 20% quorum requirement)
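The quorum arithmetic behind that result can be checked in a couple of lines:

```python
approve, object_, abstain = 48, 0, 11
total_votes = approve + object_ + abstain   # 59
members = 270
participation = total_votes / members       # ≈ 21.9%
print(f"{total_votes} votes = {participation:.1%} of {members} members")
# Exceeds the 20% quorum requirement:
assert participation > 0.20
```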

— Michael B. Jones – OpenID Foundation Board Secretary

The post Initiating User Registration via OpenID Connect is now a Final Specification first appeared on OpenID.

Friday, 02. December 2022

FIDO Alliance

Infosecurity Magazine: November’s M&A News Roundup


On November 3, 1Password announced the acquisition of passwordless authentication company Passage. The move will enable 1Password to accelerate the adoption of passkeys for developers, businesses and their customers. The deal follows 1Password joining the open industry association FIDO Alliance earlier in 2022.

The post Infosecurity Magazine: November’s M&A News Roundup appeared first on FIDO Alliance.


Project VRM

ESC


VRM Day had an extraordinary outcome this time: a movement to end surveillance capitalism.

The movement began with a talk by Roger McNamee titled Saving us from Big Tech: the Gen Z Solution. It was the latest in the Ostrom Workshop‘s Beyond the Web salon series, which on this occasion took place live and in person simultaneously in the Computer History Museum‘s Boole room and on the Web via Owl and Zoom, through the Workshop at Indiana University, where people also participated in a room and virtually. You can see the first hour of the talk here.

The conversation with Roger was super-energized, continued well past the scheduled hour, and carried onward through breakout sessions on each of the three days that followed at the Museum during IIW, and since then on Signal and Zoom. The conversation informally called itself “Roger and We,” and it vectored toward what it says on the t-shirt design above, drawn on a whiteboard during the third of the IIW sessions: End Surveillance Capitalism, or ESC (also implying ESCape). One of us at the session created a graphic from it and used it to make a t-shirt at Zazzle.com.

He’s bought a number of them so far, because when he wore the first to Thanksgiving dinner, other people there wanted one too. In the spirit of freedom and openness, please feel free to use the same graphic (which, if you drag it off, is quite large), or something like it, to make one or more of your own. Or run with it any way you please. Movements work that way.

This is where I pause and thank Shoshana Zuboff for making surveillance capitalism a full-sized Thing. Also to Brett Frischmann and Evan Selinger for explaining what it does to all of us, personally.

Where this goes is up to the group, which is small but growing, gathering weekly in virtual space while corresponding asynchronously as well.

To succeed, its fire needs to be so large and hot that profiting by tracking people will fail because neither people nor regulators will put up with it. It is also sobering to know that similar efforts to end surveillance capitalism have faltered in the past (which is still now), in spite of the simple fact that spying on people without their clear invitation (not mere “consent”) or a court order is wrong on its face, regardless of the purposes to which that spying is put.

We talked about lots of other stuff during VRM Day, of course. For example, Don Marti led a session on the W3C’s Private Advertising Technology Community Group, which he encouraged everyone in the room to join. (Please do.)

But the main outcome was ESC.

Now, some background for those not familiar with ProjectVRM.

From its start at the Berkman Klein Center in 2006, ProjectVRM has had (says here) “the immodest ambition of turning business on its head — for its own good, and for everyone else’s as well.” Perhaps ESC will be the thing to do that, after sixteen years of encouraging countless other efforts, some of which are listed here. (There is no easy way to keep up with all of them.)

If you’re interested in joining this cabal, write to me (the email is doc @ my last name dot com). You can also follow along on the ProjectVRM mailing list.



FIDO Alliance

Silicon: Cybersecurity: automation, to the delight of attackers?


Automation stands out as one of the major trends in the predictions that cybersecurity market players are issuing for 2023.

The post Silicon: Cybersecurity: automation, to the delight of attackers? appeared first on FIDO Alliance.


Global Security Mag: FIDO Alliance Predictions 2023


Andrew Shikiar, Executive Director and CMO of the FIDO Alliance delivers his authentication and cybersecurity predictions for the year 2023.

The post Global Security Mag: FIDO Alliance Predictions 2023 appeared first on FIDO Alliance.


ChannelPartner: How Apple’s password alternative works in practice


Passkeys are not an Apple invention: The new standard, also called WebAuthn, is backed by the so-called FIDO Alliance and the W3C. However, it is supported by the three platforms Android, Windows and iOS/macOS, and Apple, Google and Microsoft are working together on this. Passkeys are intended to completely replace passwords and enable simple and secure login to websites and applications.

The post ChannelPartner: How Apple’s password alternative works in practice appeared first on FIDO Alliance.

Thursday, 01. December 2022

Digital Scotland

How Zoom Put the Boom in Partner Engagement and Demand Generation with Personalized Video

A webinar from StructuredWeb on how organizations like Zoom are leveraging personalized video marketing to drive 8x higher conversion and 5x higher engagement rates.

In this video Steven Kellam walks the audience through how Zoom put the boom in partner engagement and demand generation with personalized video.

At 00:10, he introduces himself as the Chief Revenue Officer at StructuredWeb, and from 00:42 introduces the guest speakers for this webinar: Gerard Suppa, Director of Product Management at StructuredWeb, and Joan Morales, Head of Partner Marketing at Zoom.

At 01:25, Steven outlines the 3 takeaways of the webinar. The first one is ‘what is a Personalized video and why is it an imperative?’

The second one is ‘How can channel marketers easily enable Personalized video for their partners?’ and the third is ‘The real demonstration of Personalized video from both partner’s and vendor’s perspectives’.

Overview of StructuredWeb

From 02:28, Steven gives a brief overview of StructuredWeb: They provide a powerful and easy-to-use channel marketing automation platform, and he outlines their three key differentiators:

The first is Partner Experience, the second is that StructuredWeb is entirely focused on channel marketing, and the third differentiator is that StructuredWeb is a data-driven organization.

From 04:01 Steven describes a snapshot of the tools and capabilities that StructuredWeb enables for channel marketers and partners.

There are 20 tools grouped into five categories, including marketing tools, digital advertising, content syndication, and customizable assets. Among the customizable assets is Personalized Video, the tool this webinar explains.

At 05:08 Steven shows their hybrid engagement model and the three ways StructuredWeb allows vendors to interact with their partners.

The first and the simplest way is a self-serve portal, the second is a marketing concierge where they execute and build campaigns for the partners, and the third is an agency marketplace which is for high-potential strategic partners.

At 05:58, he shows the brands and technology partners that they work with. These include IBM, Zoom, Google, etc.

Personalized Video

From 06:41 Steven invites Gerard to explain what is a Personalized video from a partner’s perspective.

Gerard explains that the main concept of personalized video is to infuse your company branding into a video, and, as you would expect, the key element is personalizing the message the video delivers. This personal touch helps in connecting with customers, increases email conversions, and improves customer satisfaction.

At 07:53 Steven invites Joan from Zoom to talk about why personalized video is so critical to them. Joan states that being a part of Zoom, video is naturally very critical for everything they do from a communication perspective.

Secondly, it provides partners a way to reach out to their customers, and thirdly in many cases partners need to sell complex solutions and a video can best explain the details, such as whether it is a subscription service or a product.

From 09:24 Steven adds to Joan’s perspective on the importance of video for the faster transferring of complex messages to customers. Joan also states that video can now be inserted in many places on the web including all social media and other platforms.

Vendor Demonstration

From 11:05 Steven hands the discussion over to Joan, who will walk the audience through personalized video from a vendor’s perspective.

11:28 – Joan takes the stage and starts with the partner demand center that they have developed for their partners, sharing his screen to show the environment that they have built. At Zoom, they didn’t have a co-marketing platform for scale and so they built it with StructuredWeb.

On this platform, they have a section for campaigns that partners can leverage. Multiple campaigns are available here and more are created regularly. Partners can also filter that content easily or can activate any of the campaigns and it includes the facility for translating any of the campaigns into any language. In these campaigns, you have landing pages, emails, social media, banners, and everything that will be part of the campaign.

You can also edit and personalize your campaigns, and download them. All of this is available in campaigns and the asset library section. In the asset library, the partners can come and find an asset that they may need, where they can search for anything and the results will be filtered. Partners can also filter these results by language, solution, type of asset, and campaign focus. The important thing is that you can also share these results with one of your partners.

In this partner demand center, they also have social syndication, web syndication, event syndication, and video marketing. Finally, if partners need any sort of help, everything is present in the knowledge hub. Partners can receive help via the marketing concierge service and support is provided in their region from a local member of their team.

How does Video Personalization work?

At 18:52 Steven turns to Gerard to walk the audience through how simple it is to make video personalization work from both the vendor and the partner side.

From 19:02 Gerard shares his partner’s demand center and shows the video section in the navigation bar. Here, not only is the video library present but you can also go back to any previous videos that they have customized previously.

In the video library, we can see a plethora of videos that have already been produced. There are professional marketing videos with some of them having language support. We can preview any of the videos and can also perform certain actions with videos. We can share them via direct link, embed videos on a website, and also download a particular video.

At 21:10 Gerard clicks on the customize video option, which takes a couple of seconds to produce a customized version. Gerard also mentions the key aspect of the interface: simplicity and ease of use. In the customization options panel, there are a few more options we can use to personalize this video. We can edit the intro scene message, which will appear at the start of the video.

If we want to further personalize that video, we can add the person’s name whom we are going to address. There is also an option for the outro where we can put another message. We can also select the music for the intro and outro. In the last option, we can select the background color intro and outro according to our branding.

At 24:40 he applies changes to the setting made on the video. During this time, on the right-hand side of the video preview, we can see the changes being processed and applied to the customized video. After the video is ready to be used, we can then preview it. Once the video completes, we can see that the outro also contains our personalized information pulled in automatically. All of this information is pulled from our partner profile within the platform.

From 27:10 Steven asks Gerard whether one can embed a Calendly link in the outro, for a clear call to action. Gerard replies that it can be displayed in the video but would not be an active link; it can, however, be included as a visual prompt for customers.

At 27:54, Gerard explains the next step once we have done customizing our video. Once we are done with our video editing, we can then go to the share tab. On the share tab, there are three different options on how we can use this video. The first is a direct share link, the second is embedding the video on a website, or we can also download the video.

Wrap Up

At 29:10 Steven concludes the webinar by pulling up some numbers for the audience. He asks Christopher to share some facts and figures: almost 87% of businesses use video for marketing purposes, 84% of them say that video helps them generate more leads, and 78% say it has increased sales.

These figures are even stronger for personalized video, demonstrating the power of StructuredWeb: it generated 8x higher conversion rates and 5x higher engagement rates.

At 30:00 Steven outlines the five key takeaways for partners and vendors.

The first is ease of use, from both the vendor's and the partner's perspective. The second is brand integrity; the third is personalized messaging, sending messages tailored to each user. The fourth is co-branding, applying our brand colors and company logo. Joan explains the fifth, amplification reach, through which partners and vendors alike can experience increased business success.

The post How Zoom Put the Boom in Partner Engagement and Demand Generation with Personalized Video appeared first on digitalscot.net.


DIF Blog

DIF Member Spotlight: PassiveBolt's Kabir Maiga

In the first of our series of upcoming member spotlights we interviewed PassiveBolt’s co-founder Kabir Maiga. His company is an Associate Member of DIF, and he shares some of PassiveBolt’s work in advancing decentralized access as a Tier 1 supplier.

In the first of our series of upcoming member spotlights we interviewed PassiveBolt’s co-founder Kabir Maiga. His company is an Associate Member of DIF, and he shares some of PassiveBolt’s work in advancing decentralized access as a Tier 1 supplier.

TRANSCRIPT:

Limari: Welcome

Kabir: Thanks for having me.

Limari: What I always like to ask people just in the decentralized space is really what's your journey, because decentralized identity, it is relatively new and people have all sorts of interesting stories about how they came into this space. So I just wanted to maybe get your background and how you came to work on what you are developing at this moment?

Kabir: Yeah. Oh, absolutely. So I started out as an engineer, gosh I guess I'm getting old, I graduated almost 14 years ago now. Wow. And yeah, electrical engineer, worked in the automotive industry, worked in investment banking as a software engineer, went and got a master's degree in business from Ross at the University of Michigan, and really ended up going back into the automotive industry, where I got into access control. That's basically keyless entry systems for cars. You're probably familiar with some of the tech that I used to work on, which is touching the vehicle door handle to unlock, or using your phone to control access to your car, things like that. It was a great experience, great team. We worked on tech that has eventually been deployed in over 120 million vehicles globally. So a really exciting project. But it was really working on that project that I had an epiphany: the automotive industry at the time, this was the 2016-18 timeframe, seemed to be ahead of the rest in access control in terms of its technology stack. I think one of the reasons why that was the case is that it had an entire Tier 1 class of companies that basically innovated and supplied OEMs (Original Equipment Manufacturers). So OEMs never had to worry about designing the latest closure systems; they would just source them from innovative suppliers. So bringing that type of enabling class of company to the rest of access control was what I set out to do. I worked to spin off with my core team out of Continental Automotive, where I used to work, which is a 50 billion euro automotive Tier 1, one of the leading Tier 1s. I was able to get the blessing to spin off and start PassiveBolt as a separate company with the express intent of really looking at how we enable companies that make products and solutions, security solutions, for controlling how people get in and out of secure spaces.

One opportunity that we saw is really providing that Tier 1 base to security industry companies that design access control solutions across the board. In doing that, we quickly realized how fragmented it all is. A lot of it had to do with systems being siloed. Each system is behind its own login. I don't care if you bought a smart safe that needs an app to unlock, or a smart lock you added on some door. It doesn't matter what you do. If I sat down and asked what it would all look like five, ten, or twenty years down the line, I can imagine owning 100 apps just to access physical spaces. Because as hotels, car rentals, you know, all physical spaces became smart, I would just be onboarding and creating digital identities in several different silos, with my data living on all of those systems. That's really what got us into decentralized identity, because that clearly was not a future that I wanted to live in. So we transformed PassiveBolt into a web3-focused company that's really pioneering access control by leveraging an identity metasystem. We're creating and establishing the missing identity layer in physical-space network communications, for the security industry more specifically. In other words, we basically want to enable identity wallets, SSI or decentralized-identity-based wallets, to unlock any and all physical spaces. One obvious application of digital identity is in machine-to-machine interactions, you know, passwordless login or supply chain or provenance, etc. IoT is a great opportunity as well, in the sense that we do live in a physical world, and access to physical spaces means that if I'm leveraging my digital identity to do so, I need protocols in place to allow me or my wallet to talk to physical spaces, right, to be able to convey a verifiable credential that shows that I have access to a specific space.

Limari: Yeah, that's very interesting. I think for a lot of people, the issue of identity is kind of front and center; the general populace is confronted with these problems. It's interesting just imagining a world where you need 100 different apps to access everything you own, all the spaces that you use, and just how unsustainable that is. Some people may be wondering, “How is this different from the keyless entry I already have?” or “Is what you're doing better than what I have?” You highlight a very important point in that, as you keep growing this centralized model, it becomes very unsustainable. But I would be interested, I mean, a lot of our audience is involved with deep tech. If you want to go a little more in depth on the decentralized aspect of PassiveBolt, feel free to fill some of that in, and then maybe, what are some of the interests you have in the decentralized identity community and how does it intersect with that?

Kabir: Yeah, so we've basically created a technology stack from the root, meaning the blockchain layer, all the way up to physical devices for physical spaces. So you know the bottom layer is obviously the identity metasystem itself, the root of trust. We created a blockchain that is purpose-built for digital identity. So really it's effectively a registry, a verifiable data registry for digital identifiers, but one that specifically addresses the needs of the security industry. We could have gone with an existing platform, or an existing blockchain, I should say. But we wanted a blockchain that's built for and by the security industry, that all manufacturers of physical security systems can leverage. So consider that a common public registry for the security industry. And I should say that the bottom layer was put in the public domain; it's a public utility for the security industry. PassiveBolt obviously conceived of it and built it, but again, it is intended to be governed and maintained by the security industry itself as a whole. To clarify maybe for some of our viewers, by security industry I'm referring specifically to manufacturers of access control hardware like locks, etc., or software solutions like physical access control software for issuing key cards or mobile keys to people, things like that. The second layer: obviously once you have a registry, you have to have the software stack to go with it. So there's the identity wallet, and you have the attestation infrastructure, or to use the W3C language, the verifiable credential layer. So we have those two layers above, where we basically created protocols that allow verifiable credentials, and the digital agents and wallets containing them, to communicate with physical devices. So in essence, any lock that can do cryptography can be made into a web3 device.

One thing that was important to us is not imagining every manager of a physical space having to rip out their hardware. That would be counterproductive, and the investment required to deploy decentralized identity for physical access would be significant. So our solution is really to create an offline way of utilizing a verifiable credential to unlock physical spaces, and that's what we did. The protocol layer that we've created at PassiveBolt really allows any and all digital agents to achieve that. The identity wallet that we created is just one first implementation on top of that blockchain I mentioned. It allows you to set up your PII, PII being personally identifiable information, to be able to receive verifiable credentials and to share them. But what's great about this is that there was a time when, if you wanted access to physical spaces, your only option was to surrender PII somewhere in exchange for a keycard. So: here's who I am, put it in a database that you get to store, and I get a key card and can badge into that hotel room or office or whatever. Then we switched to doing it over mobile using mobile keys, but we never really fixed the identity piece of it. Now I'm registering, I'm surrendering PII to create an account, which is effectively a digital identity in that siloed system, and then I can receive a digital key. In this paradigm, leveraging decentralized identity, you don't have to log into anything; with your identity wallet you're self-sovereign. Using just your identifier, we have a protocol where you can basically be assigned access to a space without sharing any PII, without the access provider needing to store any personal information about you. It makes it interoperable. So with your identity wallet, you can get into any and all web3-capable physical spaces.
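The offline unlock flow Kabir describes, a wallet presenting a verifiable credential to a lock via challenge-response, can be sketched in miniature. The Python below is purely a hypothetical illustration, not PassiveBolt's protocol: it uses HMAC as a stand-in for real asymmetric signatures (a production lock would verify against public keys, with the holder's public key carried inside the credential), and every name and field is invented.

```python
import hashlib
import hmac
import json
import os
import time

# Toy "sign"/"verify" built on HMAC-SHA256. A real system would use
# asymmetric signatures (e.g. Ed25519) so the lock never holds keys
# that could forge credentials.
def sign(key: bytes, message: bytes) -> bytes:
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(key, message), signature)

# 1. Issuance: a hotel issues a credential granting access to room 402.
issuer_key = os.urandom(32)   # pinned in the lock at install time
holder_key = os.urandom(32)   # held by the guest's identity wallet

credential = {
    "holder": "did:example:guest123",   # an identifier only -- no PII
    "resource": "room-402",
    "expires": int(time.time()) + 86400,
}
cred_bytes = json.dumps(credential, sort_keys=True).encode()
issuer_sig = sign(issuer_key, cred_bytes)

# 2. At the door: the lock issues a fresh nonce; the wallet answers with
#    the credential plus a proof that it controls the credential's key.
nonce = os.urandom(16)
holder_proof = sign(holder_key, nonce)

# 3. The lock decides, entirely offline.
def lock_decides(cred_bytes, issuer_sig, nonce, holder_proof, holder_key):
    cred = json.loads(cred_bytes)
    return (
        verify(issuer_key, cred_bytes, issuer_sig)   # credential is genuine
        and verify(holder_key, nonce, holder_proof)  # presenter controls key
        and cred["resource"] == "room-402"           # right door
        and cred["expires"] > time.time()            # not expired
    )

print(lock_decides(cred_bytes, issuer_sig, nonce, holder_proof, holder_key))  # True
```

The property that matters is the one Kabir emphasizes: verification needs only a pinned issuer key and a fresh nonce, so the door requires no network connection and stores no personal data about the guest.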

The team at PassiveBolt

Limari: That's amazing. Maybe you can give us just kind of a brief idea, maybe  a vision for people someday using this technology, this is how you're going to be able to go through spaces, maybe kind of like a real life use case.

Kabir:  Here's what I want you to be able to do. I want you a couple years down the road, to be able to grab your identity wallet, and book a vacation somewhere using your identity wallet. You publish your need. So looking for a hotel, let's say you're going to New York City, looking for a hotel and a car rental in New York City for these days. You get offers, private offers from hotels and  travel service providers, generally speaking, so that they are now able to give you a direct offer via DID to DID communication, which means they're not giving you a rate that's public. So they can be more aggressive to get your business to improve their occupancy rate. You're able to book seamlessly and deal with a direct relationship with a service provider. Then you show up at the airport, I want you to be able to just tap your device and prove your identity. You are who you say you are, and it's verified. Obviously, for government purposes, let's throw in two factor authentication or facial recognition. You get to the plane, you've got your plane ticket that was delivered in that booking that you did with one click and you're able to just tap again, to just prove that you are who you are, and you walk onto the plane with your seat. You land in New York City. I want you to just be able to walk onto the Hertz parking lot or company alpha and just go straight to spot seven B where they told you your vehicle is parked and you tap and it unlocks. You go in, you push a button, it starts, no key exchange no need to interact with anyone perhaps. You drive to hotel beta, at that hotel beta, you walk in, you go to room 402, no stops needed, unless you want to say hello to somebody. If you tap your identity wallet again, the room unlocks and you go in.  
All of this is possible because from when you did your booking, be it through the wallet or through an aggregator, doesn't have to be through the wallet, you were delivered verifiable credentials through that device, through that wallet. Those verifiable credentials have allowed you to unlock all of those physical spaces along the way. I really emphasize access to physical spaces, because there's a tendency to always focus on virtual spaces like apps, websites and the like. I think there's significant value as we build decentralized identity ecosystems to really capture physical spaces, because that's where we live. This whole journey that I described, your trip from California to New York is you needing to get into all the physical spaces in between to get to your hotel room, and then to maybe that concert later that evening. It's all physical spaces. That's really what we want to help shape and build.

Limari: That's really really amazing and I know that for a lot of people not working within this space, on this technology, some of this is very hard to grasp because we are so shaped by just the centralized system that we are living in today. The fact that we are able to do that without giving away all of this personally identifiable information, it's just amazing that we can have that capability. So one last thing I'm curious to hear your thoughts about is just in terms of the community at DIF, if there's been any work items, like maybe you've had your eye on that you find are interesting here in this space that you think might contribute to what you're doing at PassiveBolt?

Kabir: I think many of the working groups are phenomenal; messaging is one that comes to mind. The group that I've been particularly interested in, and really participate in every week, is the Travel SIG within DIF. The Travel SIG is currently working on traveler profiles, which is really a way to simplify how you communicate data to travel providers. So that experience I described between California and New York, your ability to do it with a single click: just give providers the information they need, authorize access to the information they need to provide you a service, but in a way that preserves PII and preserves privacy. That's key. The work that DIF is doing, I think, is to come up with a standard. We have a way we're doing it at PassiveBolt, but working through DIF we’re hoping to have a standardized way of doing it. You have all these great minds there, people with significant industry and travel experience that we're able to work together with. As technologists, there's nothing better than to hear direct input from people who've been in the industry for 30 or 40 years, to make sure that the systems we're designing take into account all the different dynamics in the space. So I think that SIG is particularly interesting, and the traveler profile schema that will come out of it will be very beneficial, not just to PassiveBolt, but to all companies and entities looking to help design the traveler experience of the future.

Limari: Just to clarify for our audience who may not be familiar with our open groups at DIF, Kabir is referring to the Hospitality and Travel Special Interest group. It's one of the groups that we have at DIF that anybody can just hop on, and you can join in the discussion there. So awesome. Well, thank you so much. It's been such an interesting conversation. I truly enjoyed learning more about PassiveBolt, what you're doing and how this technology will be applied to travel and to going between spaces, accessing our homes, hotels, etc. So,  truly enlightening. Also, maybe if you like to let people know how they can learn more about PassiveBolt.

Kabir: Go to PassiveBolt.com or you can reach out to me if you want to connect Kabir [at] passivebolt.com

Limari: Thank you so much. Once again, I appreciate you taking the time. And thank you everyone who decided to join us today and watch and we'll have many more interviews to come very soon. So thanks.

Kabir: Thank you

You can learn more at https://passivebolt.com/

If you would like to become a member of the Decentralized Identity Foundation join us at https://identity.foundation/

Wednesday, 30. November 2022

DIF Blog

Executive Director: DIF is hiring!

DIF is hiring a full-time Executive Director to advance DIF’s objectives and grow membership. Interviewing immediately for a Jan. 2023 start!

The end of the year gives us a chance to reflect, but also invites us to consider where we go next. Are you contemplating a new challenge for 2023? Looking to explore emerging and rapidly-growing technology fields?

DIF is looking for an engaged, experienced professional as the new Executive Director of the Decentralized Identity Foundation. This remote, full-time role will involve representing DIF, growing our membership and delivering on our core objectives. We are interviewing immediately, for a January 2023 start!

Click through for a detailed job description here.

Apply directly via LinkedIn here or by sending a CV and cover letter to jobs@identity.foundation.

Feel free to share this opportunity with your network, and don't hesitate to contact us with any questions!


Content Authenticity Initiative

Introducing Eyes on Provenance, a spotlight on the people building trust and transparency online

This month, we spoke to the founders at Pixelstream, a platform to help make verifying and sharing authentic media easier and more accessible.

This month, we spoke to the founders at Pixelstream, a platform to help make verifying and sharing authentic media easier and more accessible. Learn about their journey in provenance implementation and how other developers may get started and contribute. Read more.


FIDO Alliance

Raconteur 2022 Report: Authentication & Digital Identity

Insight: Sharing cybersecurity successes and failures leads to improvement – Andrew Shikiar, executive director and CMO at the FIDO Alliance, explains why a culture of secrecy surrounding cybersecurity is holding […] The post Raconteur 2022 Report: Authentication & Digital Identity appeared first on FIDO Alliance.

Insight: Sharing cybersecurity successes and failures leads to improvement – Andrew Shikiar, executive director and CMO at the FIDO Alliance, explains why a culture of secrecy surrounding cybersecurity is holding back progress

If your organisation were hit by a cyber attack, would you tell anyone?

Historically, the answer would be an unequivocal no. Many believe that sharing that you were a target exposes your company’s (or your personal) vulnerabilities, making you more susceptible to further attack or ridicule. But this ‘security by obscurity’ mindset is not only outdated, it hinders the industry’s ability to harden our collective defences, most notably by eliminating our dependence on passwords and other knowledge-based credentials. 

While this year saw a 5%-7% drop globally in password use, passwords remain by far the most popular online authentication method, which is a big problem. Passwords are not only highly insecure; they also cause major consumer headaches and cost businesses money: in the past month, 59% of consumers gave up on accessing an online service and 43% abandoned a purchase when asked for a password. More than 82% of data breaches are caused by weak or stolen login credentials.

The benefits of multi-factor authentication (MFA) are widely reported but many firms have been sheepish about sharing their adoption figures. 

This may be because the figures weren’t great. Twitter published its two-factor authentication adoption figures last summer, revealing that just 2.3% of accounts had it enabled. Of those, 80% relied on SMS-based backup, the least secure mode. Communicating this doesn’t make Twitter any less secure. Instead, it sets a powerful benchmark for improvement and gives the industry a reality check that considerable work remains to get more customers using MFA.

Other organisations to be applauded are Cloudflare and Twilio. The two cloud computing giants recently reported that they were targeted by near-identical phishing attacks. Employees received a text message from a supposed IT department directing them to a fake website that requested a password change. Neither Twilio’s nor Cloudflare’s monitoring systems detected the attack and, as you’d expect, some employees were caught off guard and shared credentials.

While Twilio fell victim to the attack (along with dozens of other companies), Cloudflare’s employees were protected because they use Fast ID Online (FIDO) security keys, which are tied to users; origin binding also prevented any credentials from being shared. Since the incident, Twilio has followed Cloudflare’s lead, as it shared in its updated incident report. This is a great example of how sharing successes and failures alike leads to improvement on the whole.

At the FIDO Alliance, we’re working with the world’s leading tech companies and consumer service providers to solve this challenge. Together, we’ve created technology that’s increasingly cited as a ‘gold standard’ by governments, including the US’s cybersecurity body, CISA, and the UK’s National Cyber Security Centre. 

To best defend against cyber attacks, organisations should take inspiration from the Twilio and Cloudflare story and build in security protocols that are phishing-resistant. These protocols are often implemented with USB keys or built-in biometric authentication on devices, and can be added as a critical layer of security to both an organisation’s own network and information, and for customers accessing its services. 
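The origin binding mentioned above is central to why Cloudflare’s FIDO keys resisted the attack: the browser, not the user, writes the requesting site’s origin into the signed client data, so a response generated on a lookalike domain fails verification at the genuine service. The Python sketch below is a simplified, hypothetical illustration of that one check (a real WebAuthn verifier also validates the signature, authenticator data, and more); the origin and challenge values are invented.

```python
import base64
import json

# Simplified sketch of WebAuthn-style origin binding, not a full verifier:
# the relying party rejects any assertion whose clientDataJSON reports an
# unexpected origin, even if the user was fooled by a lookalike domain.
EXPECTED_ORIGIN = "https://dashboard.example.com"   # hypothetical relying party

def check_client_data(client_data_b64: str, expected_challenge: str) -> bool:
    client_data = json.loads(base64.urlsafe_b64decode(client_data_b64))
    return (
        client_data.get("type") == "webauthn.get"
        and client_data.get("origin") == EXPECTED_ORIGIN
        and client_data.get("challenge") == expected_challenge
    )

def encode(data: dict) -> str:
    # Helper that mimics how a browser packages clientDataJSON.
    return base64.urlsafe_b64encode(json.dumps(data).encode()).decode()

legit = encode({"type": "webauthn.get",
                "origin": "https://dashboard.example.com",
                "challenge": "abc123"})
phish = encode({"type": "webauthn.get",
                "origin": "https://dashb0ard-example.com",  # lookalike domain
                "challenge": "abc123"})

print(check_client_data(legit, "abc123"))   # True
print(check_client_data(phish, "abc123"))   # False: origin mismatch
```

Because the origin field is supplied by the browser itself, no amount of convincing text-message phishing can make a credential issued for one origin work at another.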

Of course, the work we do at the FIDO Alliance, creating and implementing new technology, is an important part of moving the world away from passwords and other weak forms of legacy authentication – but it isn’t the most critical piece. Industry-wide commitment to creating intuitive and common user journeys, underpinned by architectural best practices, will enable the kind of cultural shift and mass adoption of this technology that will be required if we want to remove passwords from our daily lives. 

Collaboration and transparency are key ingredients that raise the bar for all involved – including for hackers, who need to have a far harder time executing remote attacks.

Download the Full Report

The post Raconteur 2022 Report: Authentication & Digital Identity appeared first on FIDO Alliance.


Blockchain Commons

Silicon Salon 3 Call for Contributions

The Silicon Salon is Back for the New Year! Sign Up Now at Eventbrite. Blockchain Commons will be facilitating Silicon Salon 3 in mid-January, tentatively on January 18th. This series of virtual Silicon Salons is intended to bring together digital wallet developers, semiconductor manufacturers, and academics. Their objective: to ensure that the next generation of cryptographic semiconductors meets everyone’s needs.

The Silicon Salon is Back for the New Year!

Sign Up Now at Eventbrite

Blockchain Commons will be facilitating Silicon Salon 3 in mid-January, tentatively on January 18th. This series of virtual Silicon Salons is intended to bring together digital wallet developers, semiconductor manufacturers, and academics. Their objective: to ensure that the next generation of cryptographic semiconductors meets everyone’s needs, advancing the entire cryptography industry. There is a gap between wallet requirements and semiconductor development, between academic research and real-world practice; we want to bridge it.

We will bring together semiconductor developers, cryptographers, and other experts, with presentations focused on new silicon-logic-based cryptographic functionality and opportunities for semiconductor acceleration, such as Multi-Party Computation (MPC) and ZK-proofs.

The field of MPC (Multi-Party Computation) for security applications has been an area of energetic academic research since it was first introduced by Andrew Yao in 1986. More recently, increases in computation capability and improvements in the efficiency of algorithms have enabled MPC to move from the theoretical to the practical. Various MPC-TSS (MPC-Threshold Signature System) approaches offer significant benefits in security, robustness, and recovery versus traditional “single private key” systems. However, practical MPC is still in its infancy and is rapidly evolving, which introduces challenges in providing silicon support for the future of cryptography.
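To illustrate the threshold idea underlying MPC-TSS, here is a toy Shamir secret-sharing sketch in Python: any t of n shares reconstruct a secret, while fewer reveal nothing. It is only a didactic sketch; real MPC-TSS schemes produce signatures without ever reconstructing the private key in one place, which this example does not attempt.

```python
import random

P = 2**127 - 1  # a Mersenne prime, used as the field modulus

def make_shares(secret: int, t: int, n: int):
    # Random polynomial of degree t-1 with constant term = secret;
    # each share is a point (x, f(x)) on that polynomial.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, -1, P)) % P
    return total

secret = 0xC0FFEE
shares = make_shares(secret, t=3, n=5)
print(reconstruct(shares[:3]) == secret)   # True: any 3 of 5 shares suffice
```

With a degree-2 polynomial, any two shares are consistent with every possible secret, which is the information-theoretic guarantee threshold systems build on.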

Read More

Some presentations lined up so far are:

- Toward a More Open Secure Element Chip (bunnie studios). What are the elements that make a semiconductor more or less “open”? How do you maintain openness in a proprietary ecosystem, and is there a purpose to secrecy in security?
- Silicon & MPC (Cramium). In this talk Cramium will overview silicon architecture approaches to addressing concerns of security, performance and efficiency, economic concerns, and flexibility to accommodate future improvements. We will overview some facets of MPC-based distributed key management that receive little academic attention, but are important in a practical context.
- A Fast Large-Integer Extended GCD Algorithm and Hardware Design for Verifiable Delay Functions and Modular Inversion (Kavya Sreedhar). This presentation will cover work on developing a large-integer extended GCD (XGCD) algorithm and hardware design, published in CHES 2022. It uses carry-save arithmetic and conducts a design space exploration over XGCD algorithms and application requirements to design an accelerator that supports fast average and constant-time evaluation and is easily extensible for polynomial XGCD.
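For readers unfamiliar with the extended GCD at the heart of Sreedhar's talk: XGCD computes, along with gcd(a, b), Bézout coefficients x and y with a·x + b·y = gcd(a, b), which is exactly what modular inversion needs. A plain software version (the hardware design in the talk accelerates large-integer variants of this) looks like:

```python
def xgcd(a: int, b: int):
    # Iterative extended Euclidean algorithm.
    # Returns (g, x, y) such that a*x + b*y == g == gcd(a, b).
    old_r, r = a, b
    old_s, s = 1, 0
    old_t, t = 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_s, s = s, old_s - q * s
        old_t, t = t, old_t - q * t
    return old_r, old_s, old_t

def mod_inverse(a: int, m: int) -> int:
    # The Bezout coefficient of a, reduced mod m, is a's inverse mod m.
    g, x, _ = xgcd(a % m, m)
    if g != 1:
        raise ValueError("a is not invertible modulo m")
    return x % m

g, x, y = xgcd(240, 46)
print(g, 240 * x + 46 * y)    # both equal gcd(240, 46) == 2
print(mod_inverse(3, 17))     # 6, since 3*6 % 17 == 1
```

Constant-time and carry-save variants of this loop are what make the hardware problem interesting: the naive version's running time and branching leak information about the operands.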

If you are interested in making a presentation at Silicon Salon 3, please contact us with a proposal. Include the following:

- The title.
- A summary of what your presentation will be about.
- A summary of how that relates to silicon-logic-based cryptographic acceleration and new functionality, and/or the general topic of integrating cryptography into new semiconductors. Note that this can be a discussion of capabilities from the point of view of a semiconductor manufacturer, of needs from a wallet manufacturer, or other discussions from someone in the broader decentralized community.
- The name of the presenter(s).
- A description of who they are and how they or their company have the expertise, capability, or reach to benefit the Silicon Salon conversation.

Final presentations should be about five minutes long, supported by a slide deck of some sort, which you will present in our Zoom salon on the date of the salon.

Please note the following deadlines for Silicon Salon 3 proposals and contributions:

- December 23 - Final date for submission of proposals.
- January 4 - Blockchain Commons selection of proposals.
- January 11 - Submission of draft slide decks to Blockchain Commons for any comments.
- January 16 - Submission of final slide decks to Blockchain Commons for inclusion in post-event web pages.
- January 18 - Presentation at Silicon Salon 3.
- January 25 - Blockchain Commons finalization of web site release.

Also please note that the January 18th date is tentative. We are checking with likely participants for conflicts, and ensuring that the holidays won’t cause too much conflict. If it moves, then the January deadlines will move accordingly.

Thank you for your interest in Silicon Salon 3 and the future of semiconductor integration with cryptography! If you have any questions or want more information, please email us at team@blockchaincommons.com.

Even if you do not want to present, please Save the Date of January 18th, 2023, so that you can participate in the conversation. We’ll make an announcement as soon as we’ve finalized the date and have an Eventbrite page available for signups.

See https://www.siliconsalon.info/ for our salons to date. Silicon Salon 3 will continue this discussion.

Thank you to our sustaining sponsors who make Silicon Salon possible: Bitmark, Chia, Cramium Labs (a subsidiary of CrossBar), Foundation, Proxy, and Unchained Capital. We are also seeking additional sponsors. Mail us at team@blockchaincommons.com or become a sponsor on GitHub and let us know it’s to support the Silicon Salons!

Christopher Allen, Blockchain Commons

Monday, 28. November 2022

Trust over IP

What — Exactly — Is a Digital Trust Ecosystem?

Drawing on the characteristics of natural ecosystems and human economies to propose a concrete definition. The post What — Exactly — Is a Digital Trust Ecosystem? appeared first on Trust Over IP.

by Trinh Nguyen-Phan

The concept of an “ecosystem” has been increasingly adopted in data management, innovation, and business strategy. Yet it is more often used metaphorically, without specific reference to the literature. As Socrates (470 – 399 B.C.) put it, “The beginning of wisdom is the definition of terms.” This is especially relevant to the ToIP Ecosystems Foundry Working Group (EFWG), because our very name is based on this term. So the EFWG has been working for over a year to deliver a white paper called Defining Digital Trust Ecosystems (PDF). The first version of this white paper has now been approved by the ToIP Steering Committee, and we are pleased to announce its official release.

The white paper draws on the characteristics of natural ecosystems and human economies to propose a concrete definition of digital trust ecosystems. It also discusses the characteristics of digital trust ecosystems, the challenges of sustaining digital trust ecosystems, and the role of the ToIP Foundation.

Given that multiple disciplines use the term “ecosystem” with an implied reference to its biological root, the EFWG asked a biodiversity scientist, Dr. Autumn Watkinson, to introduce us to ecology as a discipline. Based on her talk, we saw many parallels to the work we do in the EFWG. Surveying the literature in biology, psychology, business strategy, and data management, we discovered interesting convergences and divergences in the concept of “ecosystems” across biology and other disciplines. Interconnectedness and competition are the two most-referenced characteristics of natural ecosystems in other disciplines, whereas resilience, diversity, and the sustainability mechanisms of natural ecosystems are by and large dismissed. This may be caused by a lack of understanding of what natural ecosystems entail, as the renowned forester and writer Peter Wohlleben said in ‘The Hidden Life of Trees: What They Feel, How They Communicate’:

“The forest ecosystem is held in a delicate balance. Every being has its niche and its function, which contribute to the well-being of all. Nature is often described like that, or something along those lines; however, that is, unfortunately, false.”

– (Wohlleben, 2015, p. 113)

We suggested three lessons from nature:

First, ecosystems must optimize the diversity of their membership.

Second, ecosystems survive and thrive without clear-cut boundaries, and no ecosystem is completely independent of others.

Third, and most importantly, Digital Trust Ecosystems must have a built-in mechanism to counteract overexploitation.

As the Trust Over IP Foundation aspires to foster decentralized digital trust, these three lessons are particularly pertinent to support, facilitate, and sustain digital trust ecosystems. This applies not just within one ecosystem, but between different digital trust ecosystems. We believe that this work is fundamental to further work on designing ecosystems governance frameworks. We invite you to read the white paper, join the discussion of digital trust ecosystems, and examine the applications of these concepts to your related work.

Photo: Robynne Hu on Unsplash

The post What — Exactly — Is a Digital Trust Ecosystem? appeared first on Trust Over IP.


OwnYourData

NGI ONTOCHAIN Funding for Babelfish

Babelfish: Service Integration in Heterogeneous Environments has been selected for funding as one of the brightest projects for building the Next Generation Internet in Europe

OwnYourData and the Kybernos ESG Data Services GmbH are taking part in ONTOCHAIN, to co-develop a new software ecosystem for trusted, traceable & transparent ontological knowledge

WHAT IS ONTOCHAIN?

ONTOCHAIN is a new software ecosystem for trusted, traceable & transparent ontological knowledge management funded by the European Commission as part of the Next Generation Internet initiative (NGI).

ONTOCHAIN empowers internet innovators to develop Blockchain-based knowledge management solutions that address the challenge of secure and transparent knowledge management, as well as service interoperability on the Internet.

The ONTOCHAIN software ecosystem aims to demonstrate its potential in high impact domains, such as eHealth, eGovernment, eEducation, eCommerce, decentralised infrastructures and similar, in order to achieve trustworthy information exchange and trustworthy and transactional content handling.

Babelfish is taking part in ONTOCHAIN to provide service integration in heterogeneous environments. Our project proposes to describe services on a technical, semantic, and governance layer and will implement a component that uses such descriptions to translate interfaces (APIs), data, and data agreements from a foreign (and possibly proprietary) format into an interoperable format understood by the recipient. A registry maintains a list of all services and thus spans an interoperable data space.
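The registry-plus-translation idea above can be illustrated with a small sketch. Everything in it is a hypothetical assumption: the class names, the three-layer descriptor fields, and the field mappings are invented for illustration and are not the Babelfish implementation.

```python
from dataclasses import dataclass

@dataclass
class ServiceDescription:
    """Describes a service on three layers (all fields illustrative)."""
    name: str
    technical: dict   # e.g. API endpoint and payload format
    semantic: dict    # e.g. mapping of proprietary fields to shared terms
    governance: dict  # e.g. the applicable data agreement

class ServiceRegistry:
    """Maintains the list of registered services, spanning a data space."""
    def __init__(self):
        self._services = {}

    def register(self, desc: ServiceDescription):
        self._services[desc.name] = desc

    def translate(self, source: str, record: dict) -> dict:
        """Rewrite a record from a service's proprietary field names
        into a shared, interoperable vocabulary."""
        mapping = self._services[source].semantic["field_map"]
        return {mapping.get(key, key): value for key, value in record.items()}

registry = ServiceRegistry()
registry.register(ServiceDescription(
    name="legacy-crm",
    technical={"endpoint": "https://example.org/api", "format": "json"},
    semantic={"field_map": {"cust_nm": "schema:name", "eml": "schema:email"}},
    governance={"agreement": "data-agreement-example"},
))

print(registry.translate("legacy-crm", {"cust_nm": "Ada", "eml": "ada@example.org"}))
# {'schema:name': 'Ada', 'schema:email': 'ada@example.org'}
```

A real translation layer would of course also convert payload formats and honour the governance layer; the sketch only shows the semantic (field-mapping) step.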

HOW WILL IT WORK?

ONTOCHAIN Open Call 3 was looking for interoperable and sustainable applications that employ Semantic Web and Blockchain concepts to enhance data quality, as well as the trustworthiness of data communication and handling processes.

Web3 innovators were invited to propose applications covering real needs of end users in vital sectors of the European economy, built on top of the software services of the ONTOCHAIN ecosystem. Applicants could also submit proposals around missing blocks of the ONTOCHAIN infrastructure.

A total of 105 projects applied for the call and the evaluation process resulted in the selection of 14 proposals addressing the following topics:

ONTOCHAIN INFRASTRUCTURE

- Service Integration (Gateways APIs) for ONTOCHAIN applications
- Energy-efficient and sustainable hosting infrastructure for the ONTOCHAIN software ecosystem and services

ONTOCHAIN APPLICATIONS

- Semantic Digital Logbooks for Companies, Buildings, Cars or similar
- Decentralised Fact Checking and Data Credibility for Social Content
- Semantic energy data management
- Automotive, e.g., electric vehicle charging, road side management, car insurance, communication interoperability
- Distribution Logistics / Supply Chains Using Trustworthy Semantic Data
- Data/Digital content/Multimedia marketplace, including social media
- Decentralised Public Services & Common Goods
- Other applications aligned with ONTOCHAIN objectives

ONTOCHAIN will support Babelfish through a 10-month programme and provide funding of up to €119,500. As part of the action, experts in diverse fields will also provide technology development guidance and working methodology, as well as access to top infrastructure, coaching, visibility, and community building support.

 

FOLLOW OUR JOURNEY THROUGH ONTOCHAIN!

Take a look at the ONTOCHAIN innovators portfolio to see more information about the projects selected. Detailed information about the Babelfish project is also available on the OwnYourData and Kybernos websites.

To read more about ONTOCHAIN please visit the website: ontochain.ngi.eu.

The post NGI ONTOCHAIN Funding for Babelfish appeared first on www.ownyourdata.eu.


Digital Scotland

Whisky on the Blockchain – EXPONENTIAL Growth Strategies for Scotland’s Economy


A major accelerator for Scotland’s economy is the intersection of powerful new technologies with existing high-growth markets. This intersection yields potential for exponential growth through a multiplier effect: it further boosts sales of already booming industries while also advancing local technology capabilities in IT sectors that themselves present hyper-growth opportunities.

A keynote example of this is ‘Whisky on the Blockchain’, how Scottish whisky suppliers are improving the appeal and sales of their products through tapping into mega trends like NFTs.

As the Scotch Whisky Association reports, the value of Scotch Whisky exports in 2021 was up 19% to £4.51bn, and as Markets and Markets reports, the global blockchain market was valued at $4.9 billion in 2021 and is projected to reach $67.4 billion by 2026.

Building Scotland’s innovation capacity in a market set to grow to over $60 billion, by growing one of its largest export sectors, offers a concentrated, exponential strategy for economic growth.

The Whisky Barrel – Scottish Innovators Pioneer World’s First Single Cask Scotch with a Digital Provenance Certificate

Pioneers of this field include The Whisky Barrel, who have launched the world’s first single cask Scotch with a ‘Digital Provenance Certificate’.

The Whisky Barrel project demonstrates i) how the technology can be integrated into and enhance an e-commerce strategy, and ii) that the field is still so young that it is possible to achieve world firsts, and with them significant competitive advantage and associated PR potential.

As Insider reported they have released one of the world’s first single-cask Scotch whiskies with a digital provenance certificate, using Non-Fungible Tokens (NFTs) to digitally certify its whisky.

This securely records essential product information on a public ledger, helping collectors and connoisseurs mitigate the risk of investing in counterfeit whisky.

Each of the 152 individually numbered bottles features a unique QR code that links to its corresponding digital certificate. This token provides digital proof of ownership, as well as the provenance of each bottle. Each NFT is minted on the Solana blockchain platform, which was chosen for its carbon-neutral and low-energy consumption qualities.

Featured Digital Scot: CD Corp.

The Whisky Barrel has pioneered this project with the support of CD Corp, who offer a range of marketing, branding & design, web development services as well as commanding experience in creating value-driven NFT projects.

Industry Innovators: Whisky and Digital Art

The intersection is headlined by initiatives from whisky suppliers to combine their product offerings with NFT digital art.

Other industry innovators include Glenrothes offering a micro (168 bottle) batch of 36-year-old Single Cask Scotch Whisky for $3,600 a bottle with an interactive NFT artwork, and Benriach launching a luxury twinset with two bottles exclusively as an NFT.

Johnnie Walker is offering seven whisky connoisseurs the chance to own an extremely rare bottle of 48-year-old Johnnie Walker Masters of Flavour, alongside their very own digital art NFT. The tokens unlock access to a unique piece of graphic design by artist Kode Abdo, aka BossLogic.

As Bernard Marr writes and explains in the featured videos, distillers William Grant & Sons have recently sold 15 bottles of 46-year-old Glenfiddich whisky for $18,000 apiece, each accompanied by its own NFT, a revolving artistic impression of the bottle, which not only allows owners to show off their purchase but also acts as a counterfeit-proof certificate of ownership.

The post Whisky on the Blockchain – EXPONENTIAL Growth Strategies for Scotland’s Economy appeared first on digitalscot.net.

Thursday, 24. November 2022

Origin Trail

From Data to Assets: Transforming Global Supply Chains with OriginTrail Decentralized Knowledge Graph

From Data to Assets: Transforming Global Supply Chains with OriginTrail Decentralized Knowledge Graph

Ongoing digital transformation of global supply chains

Organizations have been on a journey of digitally transforming their supply chains for a number of years. This journey was accelerated by the COVID pandemic, which highlighted the fragility of global supply chain networks, coupled with a rapid increase in consumer demand and e-commerce volumes. Transformation efforts have been taking place on multiple fronts, such as transparency, sustainability, purchasing, warehouse operations, and trade documentation, to name a few. While progress has been made in each of those areas, we’ve seen few leapfrog advances. There are several reasons for that, ranging from a complex landscape of involved players to siloed and poorly interoperable data that prevents organizations from having a holistic understanding of what’s going on in their supply chain networks.

Even with the slow progression, however, advanced technologies that will enable supply chains to enter and benefit from the metaverse — the fusion of physical and digital worlds — are steadily being adopted.

In one of its reports, Accenture found that about two-thirds of supply chain management executives think the metaverse will have a positive impact on their organization. They see metaverse technologies benefiting organizations in multiple ways, from enabling better alignment of supply and demand through enhanced customer engagement, to facilitating more seamless navigation between physical and digital worlds, and providing real-time insights that reduce the gap between supply chain planning and execution. But there is still a lot of work to do to get there.

OriginTrail — introducing discoverable and verifiable Web3 assets to supply chains

Achieving the vision of fully metaverse-enabled global supply chains requires a strong and paradigm-shifting foundational infrastructure. OriginTrail Decentralized Knowledge Graph (DKG) brings this required radical change in the way supply chains operate by enabling organizations to transform their data into Web3 assets. To make it easier to understand, Web3 assets in the context of supply chains can be a variety of things:

- Products or groups of products that are flowing along the supply chain.
- Logistics equipment such as shipping containers, trailers, pallets, and other equipment used to handle, transport, and store products.
- Business locations where products are manufactured, transported, stored, and sold.
- Supply chain events describing what happened to products and equipment — we use the GS1 EPCIS 2.0 standard to ensure interoperability.
- Trade documents like Bills of Lading (BL), Airway Bills (AWB), CMRs, Credit notes, Proof of Delivery (POD), Purchase Orders, Invoices, and others.
- Credentials of trade partners such as facility audits and certificates.
- Other relevant supply chain components.
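To make the supply chain event item concrete, here is a minimal sketch of what a GS1 EPCIS 2.0-style ObjectEvent can look like in JSON. The field values (the EPC, business step, disposition, and read point) are made-up examples, and this is only an illustrative subset of the event model; the normative schema is defined in the EPCIS 2.0 specification.

```python
import json
from datetime import datetime, timezone

# Illustrative EPCIS 2.0-style ObjectEvent: "this product instance was
# observed being shipped at this location at this time".
event = {
    "type": "ObjectEvent",
    "eventTime": datetime(2022, 11, 24, 10, 0, tzinfo=timezone.utc).isoformat(),
    "eventTimeZoneOffset": "+00:00",
    "epcList": ["urn:epc:id:sgtin:0614141.107346.2018"],  # product instance (example EPC)
    "action": "OBSERVE",
    "bizStep": "shipping",        # what business step happened
    "disposition": "in_transit",  # state of the object after the event
    "readPoint": {"id": "urn:epc:id:sgln:0614141.00777.0"},  # where it happened
}

print(json.dumps(event, indent=2))
```

Because events like this follow a shared standard, any trading partner's system that understands EPCIS can interpret them without bespoke integration work.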

Assets from various supply chain partners on OriginTrail DKG are discoverable, verifiable, and linked, providing a holistic insight into the supply chain network and making it easy to use them in a variety of business applications — anything from track and trace and security monitoring to precise recall management and insurance claims. Additionally, as OriginTrail DKG assets use global data standards such as GS1 EPCIS, W3C Verifiable Credentials, and e-Bill of Lading (eBL), they are interoperable and can be used in any existing IT system that “understands” the standards.

One concern that usually pops up is data privacy and ownership — most of the supply chain data is of a sensitive nature after all. No worries there, as OriginTrail DKG allows a full spectrum of privacy configurations, ranging from assets that are fully private to assets that are fully public, and everything in between. Assets on OriginTrail DKG also incorporate the concept of ownership, as asset ownership can be verifiably transferred from one supply chain partner to another as required. This opens up a world of exciting opportunities for process improvements and new ways to generate value. For example, no more paper handovers of trade documents (such as Bills of Lading) and fuzzy to non-existent audit trails. Or imagine organizations being able to engage with and understand their customers better by transferring ownership of a product in the form of a Web3 asset to them after purchase. The possibilities are endless.

What I described above is not theoretical — OriginTrail DKG is already positively impacting supply chains in different industries. For example, SCAN Trusted Factory uses OriginTrail to safeguard security audits of the largest US retailers such as Disney, Walmart, and Target. AidTrust provides trust and transparency in pharmaceutical supply chains. Trusted Bytes looks at enabling UK Customs to achieve comprehensive shipment risk assessments. And many more are on the way.

Network Operating System (nOS) — where the magic happens

But how do we get from the siloed, fragmented, and poorly interoperable data landscape in supply chains today to discoverable, verifiable, and linked Web3 assets on OriginTrail DKG? Enter Trace Labs’ Network Operating System (nOS), designed to streamline the connection between legacy IT systems (ERP, WMS, LMS, and others) — where data that can be transformed into assets is stored — and OriginTrail DKG.

Network Operating System (nOS)

nOS allows organizations to create, update, and utilize their supply chain assets on OriginTrail DKG from existing data in their IT systems in a simplified way, without the need to deal with complexities usually associated with decentralized infrastructure (e.g. crypto token management is automated with nOS). This is facilitated with three sets of tools within nOS:

- Data ingestion tools, enabling organizations to easily connect nOS to their IT systems and set up automated (API-based) or manual data ingestion, depending on their business requirements. For example, an organization can set up an automated connection to its Transport Management System (TMS), ensuring shipment data is ingested as it becomes available.
- Asset creation tools, allowing organizations to transform their data into discoverable, verifiable, and linked assets on OriginTrail DKG. This set of tools also allows the application of different sets of global data standards (e.g. GS1 EPCIS, W3C Verifiable Credentials, eBill of Lading, eCMR, etc.), ensuring created assets are fully interoperable. For example, an organization that has extracted shipment event data from its TMS can transform that data into GS1 EPCIS 2.0 structured assets.
- Asset querying tools, providing organizations with an easy way to query for assets on OriginTrail DKG for use in their business applications such as supply chain traceability, credential verification, and others. For example, an organization (or multiple organizations in a supply chain network) that has created GS1 EPCIS 2.0 structured assets can set up the necessary queries to utilize those assets in its track and trace application.
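The three tool sets can be pictured as stages of a single pipeline: ingest raw data, wrap it as standards-based assets, then query the assets from an application. The sketch below is a loose analogy under that framing; the function names and data shapes are invented for illustration and are not the actual nOS API.

```python
# Hypothetical sketch of the three nOS tool sets chained into one pipeline.

def ingest_from_tms(rows):
    """Data ingestion: pull shipment rows from a legacy system (here, a
    TMS-style export), keeping only records that are ready to publish."""
    return [row for row in rows if row.get("status") == "shipped"]

def create_assets(rows):
    """Asset creation: wrap each row as a standards-tagged asset, keyed by
    its shipment id so it is discoverable and linkable."""
    return {
        row["shipment_id"]: {"standard": "GS1 EPCIS 2.0", "data": row}
        for row in rows
    }

def query_assets(assets, shipment_id):
    """Asset querying: look up an asset for use in a business application,
    such as a track-and-trace view."""
    return assets[shipment_id]

raw = [
    {"shipment_id": "S-1", "status": "shipped"},
    {"shipment_id": "S-2", "status": "draft"},   # not yet published
]
assets = create_assets(ingest_from_tms(raw))
print(query_assets(assets, "S-1")["standard"])  # GS1 EPCIS 2.0
```

The real system adds the decentralized parts (publishing to the DKG, verifiability, ownership), but the ingest/create/query shape of the workflow is the same.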

With the help of nOS, turning siloed data into discoverable and verifiable assets on OriginTrail has never been easier.

So what’s next?

OriginTrail DKG introduces a new, asset-focused paradigm to global supply chains that can be a significant enabler for a variety of other metaverse technologies, ranging from augmented reality and AI to Internet of Things (IoT) and next-generation digital twins. All these technologies require a strong foundation to deliver business value and this is where OriginTrail DKG plays an important role. And with nOS facilitating the transition from siloed data in legacy IT systems to verifiable Web3 assets on OriginTrail DKG, we are on a strong path towards the vision of metaverse-enabled global supply chains.

👇 More about Trace Labs👇

Web | Twitter | Facebook | LinkedIn

👇 More about OriginTrail 👇

Web | Twitter | Facebook | Telegram | LinkedIn | GitHub | Discord

From Data to Assets: Transforming Global Supply Chains with OriginTrail Decentralized Knowledge… was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


We Are Open co-op

We are thankful for Keep Badges Weird


It is Thanksgiving week for all the people who celebrate it. We Are Open folks live in the UK and Europe though, so we’re just going to use this time to say thank you!

CC BY-ND Bryan Mathers

Thank you for this awesome first year of Keep Badges Weird!

Thank you for all the great discussions, community calls and reaching out to each other!

And thank you to all the community members that have joined and shared:

- Don for contributing so much to Badge Wiki,
- Alex for having all the questions and links and comments,
- Philippe for sharing his great work around Open Recognition,
- Anabella for Keeping Badges Weird in her keynotes and work,
- Esther for sharing her projects online and offline,
- Simone for all his great weird badge ideas,
- Mark and Julie for supporting this work and their sparkling ideas,
- Matt for rhyming his badge framework,
- Justin for rocking the rich skills descriptors work,
- and also many other people that we can’t name, for all your work around Open Badges, Open Recognition and Keeping Badges Weird!

One year of Keep Badges Weird in blog posts

We always write blog posts to document processes and reflect on our work. In this first year of Keep Badges Weird we wrote lots of reflections, promising practices, invitations and other posts. It was great fun to head into our archive and see where we came from and where we got with the community. Here are four lists of all the posts we’ve written so far.

Community Building

These posts dive into the nuts and bolts of community building.

- Catalysing the KBW community — Keeping Badges Weird in a way that’s sustainable: 3 tips on how to build a sustainable community with the example of Keep Badges Weird.
- 🔥 Towards a maturity model for online, networked communities (v0.1) — Adapting and applying existing work to combine the best elements of each: Showcasing our work around community maturity and how we intend to grow KBW.
- Steps to Success when building a Community of Practice — 🤩 Convening systems for maturity and development: Collecting all the theoretical work we’ve done around advocacy, systems convening and community development into a handy guide for other communities.
- How to be a great moderator — A simple moderation process: A good community needs good moderators. In this post we explain how to be one.

Activism

KBW isn’t just a community of practice, it’s a community advocating for open recognition. These posts explain some of how we think about that purpose.

- Advocating for learner-centric badge systems — Some thoughts on campaigning for the right things: What is KBW advocating for?
- What is Open Recognition, anyway? — Going beyond credentialing and the formal/informal divide: Opening the discussion around Recognition vs. Credentialing again.
- How badges can change the world — Part 1: The Two Loops Model for Open Recognition advocacy: “As one system begins to deteriorate, a new system begins to emerge. This got us thinking about how this model applies to the world of Open Recognition.”
- How badges can change the world — Part 2: Why we need to transition: “To heal ourselves and our world, we need lifelong learners who are curious about the world and actively try to make it better.”
- Keep Badges Weird is about breaking boundaries — How the KBW community is convening systems: Breaking down the ways in which the KBW community is convening systems, through the lens of the Wenger-Trayner Model.
- Open Recognition is for every type of learning — From cold hard credentialing to warm fuzzy recognition: Building on a previous post, a list of scenarios to help explain what we mean when we talk about Open Recognition.

Badge Design

These posts showcase how we build badges for the community and beyond.

- Designing Badges for Co-creation and Recognition — Individual Learner, Communities of Practice: Reflection on co-creating badges with WAO’s intern (that was me!) but also how to co-create badges in a Community of Practice.
- WTF are ‘Stealth Badges’? — The case of the O.G. Badger: Did you earn one of the stealth badges yet?
- Keep Badges Weird: helping people understand the badges landscape — Our easy-to-repeat workshop for The Badge Summit 2022! A workshop explaining how to design equitable and emergent badge systems with the help of pizza metaphors. With CC-licenced resources.

Events

KBW hosts events as well as attending them. These posts are invitations but also reflections.

- Keep Badges Weird… — at the Badge Summit: Invitation to join us for the launch of Keep Badges Weird on 27th October 2021 at the remote Badge Summit.
- Join the Keep Badges Weird Community Call! Invitation to KBW’s first community call.
- Emergent community building: Reflecting on the first community call.
- Summing up Badge Summit — More system convening, community recognition and a dose of whimsy and weirdness: Reflection on what we experienced at the Badge Summit in Boulder, Colorado. Spoiler: It was a lot!!

Thank you for this year!

We are looking forward to another great year, new ideas, more discussions, more advocacy and all your inputs.

To end this year, we created a 1st Birthday survey, because vibes and feels are great, but data is also important. We’ll share back our findings, but we need at least 10% of the community to fill in the survey for it to be meaningful.

🎉 Fill in the KBW 1st Birthday Survey! 🎉

Fill in the survey

See you all in the community!

We are thankful for Keep Badges Weird was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 22. November 2022

The Engine Room

Join our community call: thinking about tech through the lens of digital resilience


This year we started an organisational project to improve our digital resilience. We use the term “digital resilience” to refer to a set of practices that enable an organisation to protect itself from — and respond to — digital security threats, to ensure the wellbeing of its people and to adopt infrastructures that respond to ever-changing needs and contexts.

Throughout this project we mapped digital resilience issues that social justice organisations we work with are facing, as well as strategies to approach them. One of the overall findings was that organisations are going through constant change both internally and externally due to factors outside their control. Challenges such as organisational changes, contextual shifts in the region, pandemic effects on human resources and even new tools can result in fundamental shifts to the ways we work internally.

For this reason, The Engine Room will be hosting a Community Call on December 8th at 10am ET to create a peer-to-peer space where we can share what we’ve learned in the last year. We will present our own learnings from our attempts to adapt to change, as well as facilitate a conversation to reflect on challenges and ways to address and adapt to them going forward.

Register HERE.

All are welcome! Whether you’re a practitioner, a civil society organisation working with impacted communities, or just keen to learn more, join us at our Community Call. See you on December 8th!

Image by Pawel Czerwinski via Unsplash.

The post Join our community call: thinking about tech through the lens of digital resilience first appeared on The Engine Room.

EdgeSecure

Evolving Beyond OPMs

The post Evolving Beyond OPMs appeared first on NJEdge Inc.



GS1

Maintenance release 3.1.22

daniela.duarte… | Tue, 11/22/2022 - 16:54

GS1 GDSN accepted the recommendation by the Operations and Technology Advisory Group (OTAG) to implement the 3.1.22 standard into the network on 25 February 2023.

Key Milestones:

See GS1 GDSN Release Schedule

As content for this release is developed, it will be posted to this webpage, followed by an announcement to the community to ensure visibility.

Data Pools should contact the GS1 GDSN Data Pool Helpdesk to understand the plan for the update. Trading Partners should work with their Data Pools on understanding the release and any impacts to business processes.

Business Message Standards including Message Schemas Updated For Maintenance Release 3.1.22

Trade Item Modules Library 3.1.22 (Nov 2022)

GS1 GDSN Code List Document (Nov 2022) 

Delta for release 3.1.22 (Nov 2022)

Delta ECL for release 3.1.22 (Nov 2022) 

Validation Rules (Dec 2022)

Delta for Validation Rules 3.1.22 (Dec 2022)

Unchanged for 3.1.22

BMS Shared Common Library (Dec 2021)

Approved Fast Track Attributes (Dec 2021)

BMS Documents Carried Over From Previous Release

BMS Catalogue Item Synchronisation

BMS Basic Party Synchronisation

BMS Price Synchronisation 

BMS Trade Item Authorisation

 

Schemas

Catalogue Item Synchronisation Schema including modules 3.1.22 (Nov 2022)

Changed Schemas for 3.1.22 (Nov 2022)

Party Synchronisation Schema

Price Synchronisation Schema

Trade Item Authorisation Schema

Release Guidance

Packaging Label Guide (Nov 2022)

GS1 GDSN Attributes with BMS ID and xPath (Dec 2022) 

Deployed LCLs (Nov 2022)

Approved WRs for release (Nov 2022)

Unchanged for 3.1.22

GPC to Context Mapping 3.1.19 (Dec 2021)

Delta GPC to Context Mapping 3.1.19 (Dec 2021)

GS1 GDSN Unit of Measure per Category (Apr 2022)

Migration Document (Dec 2021)

GS1 GDSN Module by context (May 2021)

Flex Extension for Price commentary (Dec 2018)

Any questions?

We can help you get started using GS1 standards.

Contact your local office


FIDO Alliance

FIDO Alliance Announces Authenticate Virtual Summit focused on Securing IoT


Industry experts to share insights into how FIDO and related technologies can bring passwordless authentication to IoT

Mountain View, Calif., November 22, 2022 – The FIDO Alliance today announces its latest Authenticate Virtual Summit: Securely Onboarding All the Things: The FIDO Fit in IoT, sponsored by Daon and Nok Nok. Responding to rising industry demand for more insight into the role of FIDO and passwordless technology in IoT, the free event will offer attendees expert perspectives and education from leading industry organizations and solution providers on strengthening authentication in IoT. The program will take place virtually on December 7, 2022, from 8:00am – 12:00pm PT, and will be made available to registrants on-demand following the event.

Lack of IoT security standards and outdated processes, such as shipping with default password credentials and manual onboarding, leave devices and the networks they operate on open to large-scale attacks. As the IoT market continues to grow, projected to surpass the $1 trillion mark in 2022, the FIDO Alliance formed the IoT Technical Working Group to address these challenges – aiming to provide a comprehensive authentication framework for IoT devices relying on passwordless authentication. 

Launched in 2021, the FIDO Device Onboard (FDO) specification is the working group’s first output: an open IoT standard which enables devices to simply and securely onboard to cloud and on-premise management platforms. The upcoming virtual summit will delve into this specification and FIDO’s role in IoT with speakers from Intel, Qualcomm, FIDO Alliance and more:

- Introduction: The FIDO Fit in IoT
- Introduction to FIDO Device Onboard
- FIDO Device Onboard: Technical Deep Dive
- FDO Demo
- FDO Case Study
- FDO Certification 101

Register for the event here

Sponsorship Opportunities 

The Authenticate 2022 Virtual Summit series is accepting applications for sponsorship, offering a number of lead generation and brand visibility opportunities. Visit the Authenticate sponsorship page for more information or contact authenticate@fidoalliance.org.

About the FIDO Alliance

The FIDO (Fast IDentity Online) Alliance, www.fidoalliance.org, was formed in July 2012 to address the lack of interoperability among strong authentication technologies, and remedy the problems users face with creating and remembering multiple usernames and passwords. The FIDO Alliance is changing the nature of authentication with standards for simpler, stronger authentication that define an open, scalable, interoperable set of mechanisms that reduce reliance on passwords. FIDO Authentication is stronger, private, and easier to use when authenticating to online services.

PR Contact
press@fidoalliance.org

The post FIDO Alliance Announces Authenticate Virtual Summit focused on Securing IoT appeared first on FIDO Alliance.


Digital ID for Canadians

Video Gamers, Say Hello to Digital Identity


By Chris Ferreira, Senior Program Manager, DIACC. Additional contributions made by members of DIACC’s Outreach Expert Committee

I remember the first time I played a video game as if it were yesterday. My older brother spent an entire summer working so he could afford to buy an Atari 2600 game console, which was the pinnacle of home gaming entertainment at the time. News of this purchase spread like wildfire, and the next thing we knew, we had a room full of kids and adults huddled around a TV playing Pac-Man and Asteroids for hours. From that moment forward, I was a gamer for life.

Today, the video game industry is larger than the movie and music industries combined, drawing in over 2 billion gamers worldwide. The industry generates a staggering amount of revenue annually. In 2020, players spent $4.5 billion USD on immersive games, also known as Virtual Reality gaming, alone. When considering console sales, gaming subscriptions, mobile gaming, and micro-transactions (i.e., in-game purchases with real-world money), that revenue figure skyrockets.

By the end of 2021, the video game market generated more than $180 billion USD in revenue and it’s predicted that the global gaming market is set to reach over $268 billion USD by 2025.

So how does Digital Identity come into play?

Digital Identity can be defined as a digital representation that uniquely identifies a person and can be used to verify their identity when they want to access services. Look at it as a way to prove who you are online without the need for paper documents such as a driver’s license. People who use Google or Facebook profiles to create accounts or login to a particular service already have a form of Digital Identity. When used properly, Digital Identity can even help ensure video gaming is done securely and safely by protecting personal information and virtual treasures from being stolen.

Facial recognition systems are being deployed to prevent young gamers from playing age-gated content and from playing between certain hours of the day. The hugely successful Roblox game has introduced optional age verification for its users, combining an ID check with a selfie scan, which will be required to access in-game voice chat features and will allow developers to “create new experiences that will rely on identity verification in the future”. Facial recognition technology can provide a much more secure and trusted way of verifying a person’s identity than simply using a username and password. However, these tactics, although stemming from good intentions, are being scrutinized by some privacy groups concerned with how securely a person’s data is stored, where the data is kept, and who can access it.

With an industry that generates such significant global revenue, it’s bound to attract the attention of cyberattackers looking to steal company and gamer data for malicious use. Video game companies are targeted frequently because they don’t need to adhere to the same security and regulatory requirements as other organizations such as banks or hospitals that are mandated to protect client data. As with most people, gamers are exposed to security risks as they often use the same or weak passwords across multiple sites which makes it easy for hackers to obtain their login credentials. It’s an awful feeling to log into your game to see all your virtual items and in-game currency – that took hours, days, and months to accumulate – stolen by an unknown virtual bandit. Or worse, hackers often steal gamers’ personal information and even financial information like credit card details.

Fortunately, game studios have started taking these risks and threats seriously by ensuring that seamless account security mechanisms are in place. Game studios are also raising players’ awareness of identity management controls so they understand the risks associated with password and account sharing and with purchasing game add-ons from unapproved vendors. Two-factor authentication (2FA) is becoming more common practice among gamers to improve security and reduce the chances of account takeover.

Major game studios are stepping up to do their part to protect their customers’ information while at the same time being more aggressive with their internal operational security measures to prevent their game code from being stolen. Concepts like Self-Sovereign Identity (SSI) are also coming into play, giving gamers even more control over how their personal information is used. When game studio Midnight Society released their video game DeadDrop, they integrated Digital Identity verified via blockchain. During the game’s testing period, players were issued digital credentials that allowed them to access the beta through the Polygon blockchain network. Those digital credentials let users validate their identity and play the game before its release.

For gamers, their data is under constant threat but there are things that can be done to protect themselves from cyber-villains such as: 

Using strong passwords or a password manager, and avoiding reusing the same password across several accounts, to prevent brute-force attacks.
Being suspicious of emails appearing to be from their game’s studio asking them to log in to their game account. No company should ever ask a person to share their password under any circumstances.
Using a VPN (Virtual Private Network) to obscure the computer’s location for added protection.
Setting up two-factor authentication to ensure that a hacker can’t access their account even if they manage to steal their password.
Only downloading game add-ons from trusted and verified sources.

Digital Identity and video games will continue to be embedded into our social fabric. And much like gamers wanting to protect their virtual characters from countless dangers, the same care should be taken with their real-life personal data. Evolving Digital Identity solutions, policies, and governance will be essential to ensuring gamers’ virtual, and real-world loot, remains safe and sound.

Monday, 21. November 2022

Elastos Foundation

Elastos Bi-Weekly Update – 18 November 2022

...

Friday, 18. November 2022

MOBI

Integration between Mobility and Energy | 7 December 2022

This lecture took place on 7 December 2022.

Current global climate and environmental emergencies are pushing regulatory frameworks, business models, and marketing toward standardizing units of measurement. In this global framework, the carbon footprint is emerging as the standard global unit of measurement toward a net-zero society.

German Insurtech company Wefox Group has developed an insurance platform that leverages data from energy consumption to determine which appliances are in use, enabling innovative and precise coverage.

Wefox adopted MOVENS technology developed by Henshin Group. MOVENS is a cloud-agnostic, standards-based IoT platform designed to be the integration hub for the smart city, connecting all layers and related entities involved in the smart city ecosystem.

About the Speakers

Domenico Mangiacapra has been working around new technology mobility platforms for many years, holding several executive positions in various IT companies. Currently, he is CEO of Henshin Group, a tech company focused on developing IoT infrastructure designed to be an open-source platform ecosystem for mobility and energy.

Tomaso Mansutti has long-established experience in automotive through many partnerships with OEMs and is currently Head of International Partnerships at Wefox Group, a leading player in the global InsurTech landscape. Wefox Group has been named Europe’s number 1 digital insurer, providing OEMs with seamless and integrated digital solutions capable of delivering pay-per-use insurance for electric and combustion vehicles, with a competitive advantage in ADAS-ready solutions.

The post Integration between Mobility and Energy | 7 December 2022 appeared first on MOBI | The New Economy of Movement.


Origin Trail

GS1 Healthcare Conference: Paris was more than just croissants

This week I had the opportunity to take part in the GS1 Healthcare Conference in Paris together with BSI’s Global Director for Healthcare, Courtney Soulsby. As usual, the GS1 event brought together many people and organizations looking for innovative approaches to solving the industry’s biggest challenges. Speaking with some of the largest players in the space, a few recurring themes came up across the majority of our conversations, showcasing how Web3 assets can play a key role in addressing them.

Courtney Soulsby, BSI’s Global Director for Healthcare, and Jurij Skornik, Trace Labs’ General Manager at Global GS1 Healthcare Conference

While today the Web3 industry might be appearing in the news for a plethora of reasons, I’m certain that the true and biggest opportunities are being seized by those focusing on education and building tangible value. By understanding how we can leverage this technology, we can address some of humanity’s biggest challenges, like those present in the industry that comprises ⅔ of the world’s GDP — supply chains. Here are 3 pain points that were mentioned often in our conversations in Paris and can very effectively be addressed using Web3 technology:

Access to information: actionable knowledge derived from information that is currently scattered across supply chain networks can unlock precise alignment of supply and demand; quality/origin claims backed by verifiable data; forward tracking to minimize diversions and gray market sales; and many other high-impact solutions.
Digital twins: creating a Web3 asset representation of a physical product, a digital certificate/credential, or even a service is a game changer. Not only do you gain a much more precise understanding of asset ownership (chain of custody), but you also get a single place to access key information about each product. This is useful for ensuring product authenticity (counterfeit medicine is a huge problem), providing precise patient support during complex treatments, and pinpointing critical items in the unfortunate event of a recall.
Expanding the supply chain: no supply chain today ends at the point of sale; it extends to the consumer and, in circular approaches, can even end up back at the producer. In healthcare, extending the supply chain towards patients is increasingly critical. Receiving real-time information about the actual effects of medicines and medical devices can not only hugely increase the speed of innovation but also save lives. However, there is a big challenge in handling patient data and providing direct incentives for data sharing, and Web3 assets can be the right solution for both.

The discussions we had also served as great validation for the solutions we’re building at Trace Labs. With the new OriginTrail Decentralized Knowledge Graph version 6 (DKG V6) enabling us to describe any supply chain using Web3 assets as building blocks, we can tackle all these huge challenges head-on.

Before signing off: Trace Labs is also releasing a big update to our Network Operating System (nOS), allowing us to address challenges like those above with efficient deployments. Entirely transformed to fully exploit the new and more efficient DKG V6, the nOS now allows any user to evolve data structured according to GS1 and other important data standards into Web3 assets that can be made discoverable (available) and verifiable for a plethora of business cases (some solutions built on top of the nOS can be seen here). The nOS also abstracts away all the complexity around the digital assets required to use the public decentralized infrastructure, so all you have to focus on is creating value for your business.

Network Operating System (nOS)

Subscribe to our newsletter to be among the first ones to receive news about the latest technology updates!

P.S.: The croissants in Paris were amazing, too.

👇 More about Trace Labs👇

Web | Twitter | Facebook | LinkedIn

👇 More about OriginTrail 👇

Web | Twitter | Facebook | Telegram | LinkedIn | GitHub | Discord

GS1 Healthcare Conference: Paris was more than just croissants was originally published in OriginTrail on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ceramic Network

Soulbonds Launch Dynamic NFTs on Ceramic

Ceramic enables Soulbonds to store data in a decentralized way without users incurring transaction fees.
Introduction

Soulbonds are evolving NFTs that reflect a Web3 user's on-chain activity. Users can construct, customize and upgrade their Souls (using 220+ hand-drawn traits) to represent adventures they’ve embarked on and tailor the NFT to their unique experience.

"The Soul becomes their reflection in the decentralized web, a true reflection of their digital identity," the Soulbonds team says.

Soulbonds chose to build on Ceramic to significantly improve usability. With Ceramic, users can apply new traits to their NFTs when they obtain on-chain achievements. Leveraging Ceramic to update Soulbonds metadata avoids all costs related to these data transactions.

"Storing Soulbonds data on Ceramic also offers users a decentralized gateway that allows Soulbonds the same accessibility as IPFS, removing any centralization risk or server damage, and ensures that user content remains permanently available," the Soulbonds team says.

Ultimately, Ceramic enables Soulbonds to store data in a decentralized way without incurring transaction fees for users.

Before creating Soulbonds, the team noticed a lack of popular projects that allow the customization of NFTs. Using a centralized server with APIs was one possible solution to create evolving NFTs. Zerion DNA, for example, offers customizable NFTs but it comes at the cost of losing decentralization. The Soulbonds team also considered another option: upgrading NFTs via on-chain transactions; however, this approach would become prohibitively expensive for users frequently updating their data.

The team chose to build Soulbonds on Ceramic because of its ability to perform off-chain data updates in a decentralized environment.

Ceramic Offers Mutability and Decentralization

NFTs are a set of links between blockchain addresses, NFT metadata, and media content. These links could lead to:

Centralized servers of the project company (i.e. CryptoKitties) Decentralized storage (i.e. Arweave or IPFS)

Decentralized storage solutions like Arweave or IPFS are immutable: the stored content cannot be changed, which rules out certain use cases, such as an ever-evolving NFT. From a security point of view, immutable storage is important, but it limits the range of possible applications (like the ability for NFTs to evolve). On Arweave or IPFS, a change in content means the link also changes, which requires paying gas fees on the blockchain every time the on-chain reference is updated. This can be a big issue for NFT projects that allow frequent customization, since they need to use a proxy that contains the latest version of the NFT.
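The link-changes-on-update point can be demonstrated in a few lines: with content addressing, the address is derived from the content itself, so any edit necessarily produces a new address. This is a simplified stand-in for a real IPFS CID, not the actual CID encoding:

```python
# Why immutable content-addressed storage can't do in-place updates:
# the address IS a hash of the content, so new content = new address.
import hashlib

def content_address(data: bytes) -> str:
    # Shortened hex digest, standing in for a real CID.
    return hashlib.sha256(data).hexdigest()[:16]

v1 = content_address(b'{"trait": "sword"}')
v2 = content_address(b'{"trait": "shield"}')
assert v1 != v2  # editing the metadata moves it to a different address
```

Any system that wants a stable reference to mutable content therefore needs some indirection layer on top, which is exactly the gap a mutable-stream protocol fills.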

Ceramic allows decentralized access to the entire history of changes affecting the Soulbonds NFT (including new features), as well as the most recent version of the metadata. With Ceramic, users do not have to pay for updates or sacrifice decentralization.

How Soulbonds Built With Ceramic

For Soulbonds, Ceramic stores a sequential stream of metadata along with links to media stored on IPFS. Using one of the API endpoints, projects can request the current state of the content in the stream. Since Ceramic is decentralized, there is no threat that content owned by users will someday be lost.
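One way to picture the stream model described above is an append-only log whose latest entry ("tip") is the current NFT metadata, while the full history stays readable. This is a sketch of the general idea only, not Ceramic's actual commit format or API; the trait names and IPFS paths are hypothetical:

```python
# Toy model of a mutable metadata stream: updates append commits,
# the tip is the current state, and history is never lost.
class Stream:
    def __init__(self, genesis: dict):
        self.commits = [genesis]

    def update(self, metadata: dict) -> None:
        self.commits.append(metadata)

    @property
    def tip(self) -> dict:
        return self.commits[-1]

soul = Stream({"traits": ["base"], "media": "ipfs://example-base"})
soul.update({"traits": ["base", "crown"], "media": "ipfs://example-crown"})
assert soul.tip["traits"] == ["base", "crown"]  # current state
assert len(soul.commits) == 2                   # history preserved
```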

In addition to dynamic NFTs, the Soulbonds team continues to push new features, including a leaderboard to rank users via on-chain activity and bootstrapping mechanisms to reward Soulbonds token holders.

Get in Touch

The Soulbonds team completed beta testing and encourages the ecosystem to check them out on these channels.

Join the Discord to learn about how you can build your application on Ceramic!


FIDO Alliance

L’Eclaireur FNAC: Passkey: towards the end of passwords in 2023?

The FIDO Alliance is developing an alternative authentication system that will eliminate complex passwords that are difficult to remember. We explain how.

The post L’Eclaireur FNAC: Passkey: towards the end of passwords in 2023? appeared first on FIDO Alliance.


ZDNet: Phishing-resistant multifactor authentication

Multifactor authentication (MFA) can be compromised by phishing. The key is to make MFA more resistant. There are several ways to implement phishing-resistant MFA; the most common approach is called FIDO.
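Why is FIDO-style authentication phishing-resistant where one-time codes are not? Roughly, because the authenticator signs over both the server's challenge and the origin it actually saw, so an assertion captured on a look-alike site fails verification at the real site. The sketch below illustrates the idea only: HMAC stands in for the per-site key pair real FIDO uses, and the origins are hypothetical.

```python
# Conceptual sketch of origin-bound challenge signing (FIDO-style).
import hashlib
import hmac
import os

def sign_assertion(key: bytes, challenge: bytes, origin_seen: str) -> str:
    # The authenticator binds the signature to the origin it observed.
    return hmac.new(key, challenge + origin_seen.encode(),
                    hashlib.sha256).hexdigest()

def verify(key: bytes, challenge: bytes, expected_origin: str,
           assertion: str) -> bool:
    expected = sign_assertion(key, challenge, expected_origin)
    return hmac.compare_digest(expected, assertion)

key = os.urandom(32)
challenge = os.urandom(16)
legit = sign_assertion(key, challenge, "https://bank.example")
phished = sign_assertion(key, challenge, "https://bank-example.evil")
assert verify(key, challenge, "https://bank.example", legit)
assert not verify(key, challenge, "https://bank.example", phished)
```

A one-time code, by contrast, carries no origin binding: whatever the user types on the phishing page works equally well on the real one.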

The post ZDNet: Phishing-resistant multifactor authentication appeared first on FIDO Alliance.


Tech Radar: Apple’s announcement could spell the end for passwords – and the beginning for biometrics

Apple’s Passkey technology uses established industry standards from the FIDO Alliance, which Apple helped develop and is working with other technology companies and service providers around the world to reduce the collective reliance on passwords. The FIDO Alliance passwordless login standards are already supported by billions of devices and all modern web browsers.

The post Tech Radar: Apple’s announcement could spell the end for passwords – and the beginning for biometrics appeared first on FIDO Alliance.


Heise: PayPal: Passkey instead of password for Apple users

PayPal is the first major service to jump on the FIDO passkey bandwagon: iPhone users will be able to log in to the payment service without a password. Users of Apple devices are able to exchange their password with the service for a passkey. The login is then carried out via a public-key encryption procedure in accordance with the FIDO standard; the password stored by the user becomes superfluous.

The post Heise: PayPal: Passkey instead of password for Apple users appeared first on FIDO Alliance.


Teiss: Security by obscurity keeps us password-dependent

We need security, by community. Andrew Shikiar of the FIDO Alliance calls on more businesses to see that sharing is caring when it comes to cyber security.

The post Teiss: Security by obscurity keeps us password-dependent appeared first on FIDO Alliance.


Financial IT: FIDO Alliance study reveals password usage still dominates financial services – and is proving costly

The FIDO Alliance published its second annual Online Authentication Barometer, which gathers insights into the state of online authentication in 10 countries across the globe. New to the Barometer this year, the FIDO Alliance has begun tracking authentication in the metaverse and plans to incorporate the utilization of technologies like passkeys in future editions of the report.

The post Financial IT: FIDO Alliance study reveals password usage still dominates financial services – and is proving costly appeared first on FIDO Alliance.


Dark Reading: Microsoft’s Certificate-Based Authentication enables phishing resistant MFA

Microsoft has removed a key obstacle facing organizations seeking to deploy phishing-resistant multifactor authentication (MFA) by enabling certificate-based authentication (CBA) in Azure Active Directory. This comes as experts anticipate advanced phishing attacks will rise next year. “I think social engineering and MFA bypass attacks will continue to grow in 2023, where some other major service providers suffer meaningful breaches like we did this year,” Andrew Shikiar says.

The post Dark Reading: Microsoft’s Certificate-Based Authentication enables phishing resistant MFA appeared first on FIDO Alliance.


Axios: 1 big thing: passkeys enter the mainstream

Poor password hygiene is the root cause of more than 80% of data breaches, according to the FIDO Alliance. Efforts to ditch easy-to-guess, phrase-based passwords are gaining more traction with the adoption of passkeys, paving the way for the passwordless future cybersecurity pros dream of. Growing passkey adoption requires widespread availability, industry collaboration and regulatory support, says FIDO Alliance executive director Andrew Shikiar.

The post Axios: 1 big thing: passkeys enter the mainstream appeared first on FIDO Alliance.


Intelligent CISO: Biometric technology revolutionises identity space

The use of biometric technology has taken off in recent years, which could largely be down to loopholes in password security becoming more apparent. Biometrics represents the most convenient and easy form of multi-factor authentication and is therefore very well placed to increase security. It’s easy to combine biometric patterns (a fingerprint combined with facial recognition, for example) and to complement them with another authentication method, as per FIDO Alliance standards.

The post Intelligent CISO: Biometric technology revolutionises identity space appeared first on FIDO Alliance.


Security Insider: The future of user authentication: password methods on the brink of extinction?

For some time now, the ‘big three’ – Google, Apple and Microsoft – have been working together with the FIDO Alliance to implement a new, cross-platform login standard: the ‘Passkey’ system.

The post Security Insider: The future of user authentication: password methods on the brink of extinction? appeared first on FIDO Alliance.


Nyheder fra WAYF

Islands Kunstakademi joins WAYF

Islands Kunstakademi (“LHI”) has now joined WAYF as a user organization. Employees and students at LHI can now identify themselves as such to a wide range of web services in both WAYF and eduGAIN.


OpenID

Unmet Authentication Requirements is now a Final Specification

The OpenID Foundation membership has approved the following OpenID Connect specification as an OpenID Final Specification:

OpenID Connect Core Error Code unmet_authentication_requirements

A Final Specification provides intellectual property protections to implementers of the specification and is not subject to further revision.

The Final Specification is available at:

https://openid.net/specs/openid-connect-unmet-authentication-requirements-1_0-final.html
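For illustration: an OpenID Provider that cannot satisfy the client's requested authentication requirements (for example, a requested acr it cannot fulfill) returns this error code in its authorization error response. A hedged sketch of what constructing such an error redirect might look like; the callback URL, description text, and state value are hypothetical:

```python
# Building an authorization error redirect carrying the
# unmet_authentication_requirements error code (illustrative values).
from urllib.parse import urlencode

params = {
    "error": "unmet_authentication_requirements",
    "error_description": "Requested ACR could not be satisfied",
    "state": "af0ifjsldkj",  # echoed back from the client's request
}
redirect = "https://client.example.org/cb?" + urlencode(params)
assert "error=unmet_authentication_requirements" in redirect
```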

The voting results were:

Approve – 45 votes
Object – 0 votes
Abstain – 11 votes

Total votes: 56 (out of 263 members = 21.3% > 20% quorum requirement)

— Michael B. Jones – OpenID Foundation Board Secretary

The post Unmet Authentication Requirements is now a Final Specification first appeared on OpenID.

Wednesday, 16. November 2022

DIF Blog

DIF Monthly #30

Our November 2022 round of updates from DIF: Stay on top of developments at our Working Groups, news from our members, events and much more.

Website | Mailing list | Meeting recordings

Table of contents: 1. Foundation News; 2. Group Updates; 3. Member Updates; 4. Digital Identity Community; 5. Events; 6. Metrics; 7. Get involved! Join DIF 🚀

Foundation News

DIF's Community Manager, Limari Navarrete: Joining the digital identity community can be overwhelming. DIF's Limari Navarrete has been documenting her personal journey. Check out her series on Medium here: What is Decentralized Identity? Let’s Figure This Out; Why Decentralized Identity is Needed.

DIF Monthly All-Hands: DIF's monthly All-Hands call next takes place on Wednesday 7th December 2022 at 5pm CET | 8am PT (see DIF calendar). This public call is open to anyone interested in getting an overview of all the work happening in and around DIF, and is home to some lively discussions! Please join to ask us your questions, as well as to discuss some end-of-year highlights and IIW takeaways.

Newsletter RSS: The DIF monthly newsletter has an RSS feed. Find it here: DIF Newsletter RSS.

🛠️ Group Updates

☂️ InterOp WG (cross-community)

Timing update: now meeting every second week. Nov. 16 was skipped due to IIW; the next meeting is November 30, 2022: Overlays Capture Architecture (OCA) v1.0 with Paul Knowles (Human Colossus Foundation).

Nov. 2: Peter Langenkamp (TNO) presented on a cluster of interoperability centered on TNO EASSI. "eassi strives to improve the usability of SSI in real life by providing basic functionality that assists in issuing and verifying credentials to and from multiple wallets." https://eassi.ssi-lab.nl/docs/about (recording of the session here).

Survey insights (results deck here): find the pockets of interop (which companies interop with which other companies) and build bridges between the pockets.

Interop testing: don't boil the ocean; start with small targets, and let the question "what are we trying to accomplish with interop?" guide them. There is interest in a public assessment framework; Bonnie Yau of IDLab mentioned they're working on one, starting from (or modelled after) the Aries Agent Test Harness.

Approaches to localization: BC Gov gave a demo of a localized wallet app with credential localization using OCA (link to demo pending). What other approaches exist?

What does "interoperable" mean to different communities? The term often gets thrown around as marketing jargon, but what are some real metrics for deciding whether something is "interoperable"?

Wallet UX: there might be other working groups already focused on UX, with some overlap with interop.

💡 Identifiers & Discovery

Nikos Fotiou presented on the did:self method (meeting recording). Discussion around the W3C DID Interoperability Test Suite. The Universal Resolver is better than ever: we have been making behind-the-scenes stability and capacity improvements. Read more about this interoperability tool on the DIF Blog here.

💡 DID Authentication WG

This ongoing collaborative work is co-hosted with DIF by the OpenID Foundation (OIDF) as a work item of the OpenID Connect WG. Latest: OpenID for Verifiable Presentations; Self-Issued OpenID Provider v2 (latest update Sept 2022).

🛡️ Claims & Credentials

Next meeting 11 Nov 2022 (1300 ET). TPAC W3C DID WG + VC WG update. Pre-IIW discussion topics of interest: https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/verfiable-credentials-holder-binding.md and https://github.com/w3c/vc-data-model/issues/929

📻 DIDComm

Open Wallet DIDComm v2 library contributions? Light but steady use; Python / JVM languages / Rust / Swift wrappers; SICPA. Repo location to be determined later.

IIW sessions: DIDComm KERI (maybe Stephen); W3C-compatible HL AnonCreds (Stephen); DIDComm v2 Intro & Basics (Steve and Sam); ToIP Trust Spanning Protocol; maybe DIDComm vs DWN messages (interaction vs data).

DIDComm Open User Group (GitHub link): meets on Discord (invite is here), using the UnSync format, described here, to engage asynchronously within a set time period (typically 4-12 hours) at regular intervals to keep conversation and work moving.

Discussion of the group writing a DIDComm Guidebook (feedback and contributions welcome!), potentially covering: What is DIDComm and why would I use it? (super high level); why developing a protocol on top of DIDComm is a great idea; protocol design basics; protocol design best practices.

🌱 Wallet Security WG

Meetings are now every two weeks; the next meeting is Tuesday 30th November.

Upcoming topics

eIDAS ARF update (should come out imminently): what impact this has on wallet vendors.
FIDO activity: updates on work items that come out of this liaison.
DIF & ToIP Trust Registry and Trust Agreements groups (and hopefully DIACC) to combine on specific work items, and what this may mean for wallets (specifically wallet security).
Open Wallet Foundation Architecture: collating input and activity happening in this group.
IIW debrief.

Recent Discussions

Backup and recovery - new methods combining Shamir Secret Sharing (SSS)/threshold signatures with liveness checks

Portability (Passkey V2 initiatives)

Multi-device registration/recovery (traditional recovery mechanisms)

e.g. Collaborative Key Management - BlockchainCommons GitHub

Authentication of the verifier - OASIS Secure QR Code Authentication 1.0 (ratified)
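
The backup-and-recovery discussion above mentions combining Shamir Secret Sharing (SSS) with threshold signatures and liveness checks. As a minimal sketch of the SSS half of that idea, the following toy Python implementation splits a secret into shares over a prime field (the field size, share count, and threshold here are illustrative choices, not a wallet-grade implementation):

```python
import secrets

# Prime defining the finite field GF(p). Real systems pick a field
# sized to the key material; this Mersenne prime is just for a demo.
PRIME = 2**127 - 1

def _eval_poly(coeffs, x):
    """Evaluate coeffs[0] + coeffs[1]*x + coeffs[2]*x^2 + ... mod PRIME (Horner)."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % PRIME
    return acc

def split_secret(secret, threshold, num_shares):
    """Split `secret` into `num_shares` points; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 with the secret as constant term.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    return [(x, _eval_poly(coeffs, x)) for x in range(1, num_shares + 1)]

def recover_secret(points):
    """Lagrange interpolation at x=0 over GF(PRIME)."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # Multiply by the modular inverse of den (Fermat's little theorem).
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total
```

For example, splitting a key 3-of-5 lets a holder recover from any three shares, so losing one or two backup locations is not fatal; the liveness-check and threshold-signature pieces discussed in the WG would layer on top of a scheme like this.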

📦 Secure Data Storage

Discussion:
- User Controlled Authorization Network (UCAN) model and how it contrasts with decentralized approaches
- Value of JWTs?
- CACAOs (containers for a chain-agnostic object capability (OCAP))

🌱 Applied Crypto WG

Thursday 6 October 2022 (3pm ET)

Work item status:
- BBS signatures
- JSON Web Proof: activity is paused at DIF while we work within IETF on establishing a working group. The next step is an interim birds-of-a-feather meeting scheduled for Wednesday, October 12th.
- Revocation methods for verifiable credentials: we now have a strawman of Andrew's method - https://identity.foundation/revocation/non-revocation-token/
- New meeting time: 8am PST every Tuesday. We would like to bolster attendance - would another meeting time change help more people attend?
- Spartan ZKP signatures

✈️ Hospitality & Travel

Continued discussion around the creation of a passenger/customer profile for holistic travel experiences - join the next call to discuss!

Added an SSI Activity Tracker section to the H&T SIG site. Two trackers listed, additions welcome!

🏦 Finance & Banking

Meeting 38: Financial Privacy & The U.S. Constitution
Presentation from Nick Anthony, a policy analyst at the Cato Institute's Center for Monetary and Financial Alternatives. Nick will discuss some of the reasons financial privacy is so lacking in the United States, highlighting the Bank Secrecy Act, the third-party doctrine, and the Right to Financial Privacy Act.

Meeting 37: Five Lessons Learned from 3 Years of 'Open Banking'
In meeting 37 we will host David O'Neill from APImetrics. APImetrics has been monitoring Open Banking APIs since the earliest days of UK Open Banking. Monitoring production endpoints and FAPI consent flows provides a lot of insight into how the Open Banking rollout has evolved. David will provide comprehensive insights into the state of the standard, its adoption, and its performance.
Topics touched on include:
- UK Open Banking adoption differs from the rest of the world
- Consequences of non-compliance
- Banking data is "yours" in the UK and Australia
- "Synthetic Calls" - expand
- Technology Debt (COBOL)
- Sandbox offer

Meeting 36: Working Meeting "AML Survey"
Meeting 35: Decentralized Identifiers and the State
Meeting 34: Public Voter Registries
Meeting 33: Beneficial Ownership
Meeting 30: Intro of New Co-Chair: Lennart Lopin

🌏 APAC/ASEAN Open Call

Our APAC/ASEAN Community Call is a collaborative initiative between DIF and the Trust over IP Foundation (ToIP). We invite you to attend the next meeting, Thursday Nov. 24th at 9am CET / 3pm SGT. See the DIF Calendar here for meeting details.

Last meeting, October 27th:
- ToIP Task Force updates: Governance Architecture task force, SSI Harms paper, Technical Architecture work, AI Task Force
- Biometric Update analyses South Korea's government plan for a top-down, although decentralized, blockchain-backed digital ID scheme using mobile devices

🌍 Africa Open Call

Join us on the next call: 1st December 2022

Nigerian Minister of Communications and Digital Economy, Professor Isa Ali Ibrahim (Pantami), wins Dubai World Trade Centre Award

💼 Product Managers

Thursday Nov 10th at 11am ET (8am PST, 5pm CET)

Juan Caballero, Standards Coordinator at Centre Consortium, joined to discuss Web3 and Verite solutions

🦄 Member Updates

Trinsic

Trinsic Ecosystems, the next generation of our platform for building identity products, is live and will be the default product for new users. Key features include W3C-compliant verifiable credentials, built-in governance based on the ToIP spec, and more performant credential exchange. Check it out here

DanubeTech

Announced at IIW35, DanubeTech has launched a dashboard of statistics on the volume of Decentralized Identifier (DID) transactions across various distributed ledgers. Take a deep dive and see the growth and experimentation across mainnets, testnets and more!

Affinidi

Digital Pilipinas and Affinidi sign MOU to accelerate workforce digitalisation in the Philippines

Affinidi has signed a Memorandum of Understanding ("MOU") with Digital Pilipinas regarding the issuance of verifiable credentials ("VCs"), an open standard for tamper-evident digital credentials, to help facilitate the digital upskilling of the local workforce in the Philippines. This partnership aims to empower individuals by recognising their achievements in upgrading their skills and knowledge, through the issuance, verification and authentication of cryptographically verifiable VCs that attest to such achievements.
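
The credentials in announcements like this one generally follow the W3C Verifiable Credentials Data Model. As a rough sketch of that shape (all DIDs, type names, and field values below are hypothetical placeholders, not Affinidi's actual schema), a skills credential might look like:

```python
import json

# Illustrative skeleton of a W3C Verifiable Credential for a skills
# achievement. Identifiers and values are made up for this example.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "SkillCredential"],
    "issuer": "did:example:issuer123",           # the issuing organisation
    "issuanceDate": "2022-11-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder456",           # the upskilled worker
        "achievement": "Data Analytics Fundamentals",
    },
    # A real credential also carries a `proof` (e.g. a Data Integrity
    # proof or a JWT signature) that makes it tamper-evident and
    # cryptographically verifiable.
}

print(json.dumps(credential, indent=2))
```

Verification then amounts to resolving the issuer's DID to its public keys and checking the proof, which is what makes such attestations portable across employers and training providers.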
🌎 Digital Identity Community

Linux Foundation

Get up to speed on self-sovereign identity with this brand new, entirely free course from the Linux Foundation: Getting Started with Self-Sovereign Identity (LFS178x)! This 6-7 hour course aims to prepare you for informed business discussions around digital identity, and particularly self-sovereign identity, with a good understanding of how identity systems work and influence our lives. It will also enable you to identify innovative ideas and solutions for leveraging SSI, and better position you for further technical learning around digital identity.

Trust over IP

The ToIP Foundation is pleased to announce the release of the first public review draft of the ToIP Technology Architecture Specification V1.0. This initial draft is open for feedback now! More info on the feedback and review process is here

ESSIF-Lab

This month, ESSIF-Lab interviewed David Chadwick, CEO at Verifiable Credentials, about SSI, the W3C recommendation, and funding opportunities.

EBSI

September 2022 - Self-sovereign technologies are part of Europe's digital transformation. The European Blockchain Services Infrastructure (EBSI) supports SSI through three key principles:
- Ecosystem growth
- Privacy by design
- Market development & collaboration