Last Update 10:26 AM September 19, 2021 (UTC)

Organizations | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!!

Sunday, 19. September 2021

Commercio

250 FREE blockchain membership APIs for participants in the blockchain hackathon organized by Var-Group


As part of the Var Group Convention in Riccione, Var Group has organized a blockchain-themed hackathon from September 19 to 21: www.vargroup.it/hackathon

 

In 2018, Var Group entered the capital of commercio.network, one of the most innovative European blockchains. In fact, commercio.network is focused not on cryptocurrencies but rather on three crucial topics of digital transformation for companies:

E-identity: identity management that respects privacy
E-signature: management of advanced electronic signatures
E-delivery: certified delivery of documents and proof of their existence

Var Group, through its BlockIT company, became one of 100 independent validator nodes on July 4, 2020. Instead of expending electricity to validate its network, the commercio.network blockchain uses a consensus algorithm called Proof-of-Stake (PoS). This consensus mechanism consumes in a year what Bitcoin consumes in a minute, making it, the project claims, the most sustainable blockchain in the world and consequently the cheapest, as each transaction costs only one euro cent plus VAT.
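Taken literally, the "year versus minute" energy claim implies a fixed ratio that is easy to check with simple arithmetic (this is just the claim restated, not independent measurement):

```python
# If the PoS chain uses in a YEAR what Bitcoin uses in a MINUTE,
# the implied energy ratio is the number of minutes in a year.
MINUTES_PER_YEAR = 365 * 24 * 60

print(f"Implied energy ratio: {MINUTES_PER_YEAR:,}x")  # → Implied energy ratio: 525,600x
```

In other words, the claim amounts to saying the network is roughly half a million times less energy-intensive than Bitcoin.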

From July 4, 2021, Commerc.io srl offers all independent developers the opportunity to use its commerce.app platform to create applications. commerce.app solves the problem of wallet security through a fully-hosted platform based on HSM devices and programmable via API.

For developers participating in Var Group’s hackathon, Commerc.io has provided two free online seminars, each consisting of 10 one-minute videos, that explain respectively:

How to do business with blockchain (in Italian only): explains the substantial difference between blockchain and the Internet, along with 36 use cases for applications to be developed, tied to 36 big problems that can be solved only through the commercio.network blockchain. The video is available at https://commerc.io/farebusiness using the invitation code VarGroup.

How to make apps with blockchain (in Italian only): a seminar explaining how to use commerce.app, a managed wallet platform that provides a set of APIs for developing any application on commercio.network. The platform lets developers save 99% of the time and cost of creating an application, going live in minutes rather than months. The video explains how to access the 250 bronze memberships in the test-net version, made available for free to participants in the Var Group hackathon. The video is available at https://commerc.io/fareapp

 

The article 250 FREE blockchain membership APIs for participants in the blockchain hackathon organized by Var-Group appeared first on commercio.network.


OpenID

Third Public Review Period for OpenID Connect Federation Specification Started


The OpenID Connect Working Group recommends approval of the following specification as an OpenID Implementer’s Draft:

OpenID Connect Federation 1.0

This would be the third Implementer’s Draft of this specification.

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This note starts the 45-day public review period for the specification draft in accordance with the OpenID Foundation IPR policies and procedures. Unless issues are identified during the review that the working group believes must be addressed by revising the draft, this review period will be followed by a seven-day voting period during which OpenID Foundation members will vote on whether to approve this draft as an OpenID Implementer’s Draft. For the convenience of members, voting will actually begin a week before the start of the official voting period.

The relevant dates are:

Implementer’s Draft public review period: Saturday, September 18, 2021 to Tuesday, November 2, 2021 (45 days)
Implementer’s Draft vote announcement: Wednesday, October 20, 2021
Implementer’s Draft voting period: Wednesday, November 3, 2021 to Wednesday, November 10, 2021 (7 days)*

* Note: Early voting before the start of the formal voting period will be allowed.

The OpenID Connect working group page is https://openid.net/wg/connect/. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration. If you’re not a current OpenID Foundation member, please consider joining to participate in the approval vote.

You can send feedback on the specification in a way that enables the working group to act upon it by (1) signing the contribution agreement at https://openid.net/intellectual-property/ to join the working group (please specify that you are joining the “AB/Connect” working group on your contribution agreement), (2) joining the working group mailing list at https://lists.openid.net/mailman/listinfo/openid-specs-ab, and (3) sending your feedback to the list.

— Michael B. Jones – OpenID Foundation Board Secretary

The post Third Public Review Period for OpenID Connect Federation Specification Started first appeared on OpenID.

Third Public Review Period for OpenID Connect for Identity Assurance Specification Started


The OpenID eKYC and Identity Assurance Working Group recommends approval of the following specification as an OpenID Implementer’s Draft:

OpenID Connect for Identity Assurance 1.0

This would be the third Implementer’s Draft of this specification.

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. This note starts the 45-day public review period for the specification draft in accordance with the OpenID Foundation IPR policies and procedures. Unless issues are identified during the review that the working group believes must be addressed by revising the draft, this review period will be followed by a seven-day voting period during which OpenID Foundation members will vote on whether to approve this draft as an OpenID Implementer’s Draft. For the convenience of members, voting will actually begin a week before the start of the official voting period.

The relevant dates are:

Implementer’s Draft public review period: Saturday, September 18, 2021 to Tuesday, November 2, 2021 (45 days)
Implementer’s Draft vote announcement: Wednesday, October 20, 2021
Implementer’s Draft voting period: Wednesday, November 3, 2021 to Wednesday, November 10, 2021 (7 days)*

* Note: Early voting before the start of the formal voting will be allowed.

The eKYC and Identity Assurance working group page is https://openid.net/wg/ekyc-ida/. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration. If you’re not a current OpenID Foundation member, please consider joining to participate in the approval vote.

You can send feedback on the specification in a way that enables the working group to act upon it by (1) signing the contribution agreement at https://openid.net/intellectual-property/ to join the working group (please specify that you are joining the “eKYC and Identity Assurance” working group on your contribution agreement), (2) joining the working group mailing list at https://lists.openid.net/mailman/listinfo/openid-specs-ekyc-ida, and (3) sending your feedback to the list.

— Michael B. Jones – OpenID Foundation Board Secretary

The post Third Public Review Period for OpenID Connect for Identity Assurance Specification Started first appeared on OpenID.

Friday, 17. September 2021

ResofWorld

Streamers are beating traditional sports broadcasters at their own game

A Spanish streamer went from online commentator to dining with Messi and Shakira.
In early August, after months of speculation, Argentine football star Lionel Messi did something that seemed unimaginable: he ended a two-decade stint at...

Six things you need to become a certified tech bro in India

Presenting Rest of World’s starter kit for fitting in Bengaluru.
So, you want to break into the Indian tech industry. And why not? Technology is reshaping the country and changing the way people live and work. From ordering food to...

We Are Open co-op

Strategic Starfish: Another Co-op Day


In the before times, co-op members came together in person for a couple days about twice a year to spend time on our cooperative’s strategy and culture. We haven’t seen each other IRL since our London meet-up in January 2020. Nowadays, we do our very best to make sure that we all meet online for at least a half a day each month to focus on our internal stuff. Like every organisation, We Are Open Co-op has to set time aside to focus on our internal strategy, culture, processes and setup.

Over the summer, we got out of our monthly rhythm because we didn’t want to exclude any members, and we had some time off scheduled. Now, however, autumn has arrived, and we’ve run our September Co-op day.

After checking in, saying hello and discussing the phrase “brand new second hand desk chair”, we got to work and focused on two main areas of the co-op: Operations and Marketing.

Operations

Digital stuff, workflows and processes are not static components of business. Technical decisions need to be reviewed on a regular basis no matter what shape or size your organisation is.

We spoke about our digital infrastructure and making improvements to our server setup. We’re going to track downtime on our current servers, have a look at whether or not specific web apps are wobbling and review our energy consumption with those servers. Then, we’ll set to work revamping our infra.

The other operational tidbit we addressed was reviewing our Divvy Up template. The Divvy Up sheet tracks our days and budgets on individual clients. This template has been changing about every six months as our administrative processes evolve and become more efficient.

Marketing

Most, if not all, of our clients come through referrals, so we have a long-standing negligence towards our own marketing.

We recently onboarded a media education grad student, Anne, and she is having a strategic look at our work. In our co-op day, Anne shared some of the things she’s been working on that loosely connect to our “marketing strategy”. She also showed us thinking she’s been doing regarding Learnwith.weareopen.coop, our library of free resources for working openly and collaboratively. In response, John got excited about Wardley Mapping, and all of us started geeking out on the methodology. John and Anne are going to do some Wardley Mapping for WAO in the coming days.

We also spoke about running an end-of-year campaign. It would be something experimental, creative, and fun for us; additionally, it would serve as some sort of marketing. We have some ideas :)

Proposals

Our Strategic Starfish (image CC-BY-ND Bryan Mathers)

Earlier this year we participated in Outlandish’s Sociocracy Workshop, and now use Sociocracy principles to run our organisation. In our co-op day we ran proposals to allot budget towards two budding ideas. We decided to support someone going to the CoTech meet-up in London later in the month.

We also passed a proposal to do a Strategic Starfish exercise in our co-op days as an experiment. Once that proposal was passed, we did the exercise. Our Strategic Starfish was great fodder for all the little to-dos and ideas people have about co-op work. We should have done that at the beginning of the day because we would have produced even MOAR things. Oh well, next time! We’ll be meeting for another half-day soon ;)

Strategic Starfish: Another Co-op Day was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Own Your Data Weekly Digest

MyData Weekly Digest for September 17th, 2021

Read in this week's digest about: 9 posts, 1 question

Thursday, 16. September 2021

Digital ID for Canadians

DIACC Project Coordinator (Bilingual)


Open date: September 23, 2021

Closing date: October 14, 2021

DIACC is seeking a highly motivated English and French bilingual Project Coordinator to support projects, meetings, events, community engagement, project plans, and action tracking. The Project Coordinator will work in collaboration with DIACC community partners and stakeholders, report to the DIACC Senior Program Manager, and will work closely with the DIACC French markets lead. 

Digital Identity is at the forefront of innovation and digital transformation and the Project Coordinator is ideally a person who is passionate about enabling technology that works for the benefit of society. 

This is a 30-40 hour a week position with compensation based on commensurate experience. 

Please submit your resume and CV to info@diacc.ca with “Application: DIACC Bilingual Project Coordinator” in the subject line.

Thank you.

Duties and Responsibilities:

Coordination of assigned groups, meetings, projects, events, and other related requirements.
Coordination of community partners and stakeholders related to assigned projects.
Monitor and report on project progress (e.g., minutes of meetings, evaluation/feedback notes, focus group transcripts, etc.).
Communication of any project concerns to the DIACC team.
Support the Senior Program Manager and French market lead to ensure projects are delivered on time, within scope, and within budget.
Provide assistance in the evaluation and prioritization of projects.
Keep teams updated regarding deliverables and timelines.

Qualifications:

University or college degree, or a minimum of three years of subject-matter-relevant project coordination experience.
Minimum 2 years working in a non-profit organization or other sectors preferred.
Strong organizational, planning, and time management skills, with the ability to multitask and coordinate projects with tight timelines.
Excellent interpersonal, verbal, and written communication skills in French and English.
Ability to work effectively and unsupervised in a wide range of settings with people from diverse backgrounds, including clients and co-workers.
Knowledge of concepts, approaches, and practices related to digital identity is a plus.
Experience working with the innovation ecosystem and digital media is considered an asset.
Works well independently in team-oriented, remote, and collaborative environments.
Strong computer skills utilizing Microsoft Office and G Suite required; advanced level of proficiency.
Ability to interact professionally with multiple stakeholders and partners from the public and private sectors.
Bilingual (English and French, reading and writing) is required.

BrightID

BrightDAO is Here!

A community for decentralized proof-of-uniqueness

We’ve grown. At 30,000 sponsored users, we need to imagine more clearly how a decentralized proof of unique human system can manage itself.

We need to imagine our movement as a collection of teams self-organizing and rallying to contribute: verification helpers, anti-sybil researchers, user-experience experts, and many more.

A DAO powered by 1Hive Gardens and Aragon technologies is the best structure to achieve this.

$BRIGHT is the token that makes BrightDAO work.

A fairdrop and ongoing fair distribution mechanisms (think BrightID-enabled faucets) are the best ways to allocate $BRIGHT. (A “fairdrop” simply means some or all rewards are given per person instead of per address.)
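The per-person (rather than per-address) idea behind a fairdrop can be sketched in a few lines. This is a hypothetical illustration, not BrightID's actual contract logic; the names, amounts, and data shapes are invented for the example:

```python
# Minimal fairdrop sketch: one reward per verified PERSON, even if that
# person controls several addresses. All names here are hypothetical.
REWARD_PER_PERSON = 100  # tokens

# address -> verified person ID (e.g., from a BrightID-style uniqueness check)
claims = {
    "0xaaa": "alice",
    "0xbbb": "alice",  # second address, same person: no extra reward
    "0xccc": "bob",
}

def fairdrop(claims: dict) -> dict:
    """Allocate one reward per unique person, paid to their first claiming address."""
    allocations = {}
    seen_people = set()
    for address, person in claims.items():
        if person not in seen_people:
            seen_people.add(person)
            allocations[address] = REWARD_PER_PERSON
    return allocations

print(fairdrop(claims))  # → {'0xaaa': 100, '0xccc': 100}
```

A plain per-address airdrop would instead pay all three addresses, letting one person collect twice; deduplicating on proof-of-uniqueness is what makes the distribution "fair" in this sense.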

We’ve written several guides to help explain $BRIGHT, BrightDAO, and the BrightID-enabled fairdrop process.

$BRIGHT guide
BrightDAO
$BRIGHT fairdrop
Launch Party / AMA

Still have questions?

Today at 11:00 a.m. Pacific, CurlyBracketEffect of RabbitHole and DAOnToEarth podcast fame will be our MC to introduce us to the wonders of BrightDAO and $BRIGHT.

Join us on our discord stage for this epic gathering.
* BrightID discord

BrightDAO is Here! was originally published in BrightID on Medium, where people are continuing the conversation by highlighting and responding to this story.


ResofWorld

Streamers are beating traditional sports broadcasters at their own game

Spanish streamer Ibai Llanos went from video game commentary to dinner with Messi and Shakira.
In early August, after months of speculation, Argentine football star Lionel Messi did the unthinkable: He ended a nearly two-decade-stint with FC Barcelona, announcing that he would be joining Paris...

Inequality is growing between gig workers and employees

Only a small group will retain the privileges that workers fought to gain over the last 150 years.
At the end of 2018, when it was a little less than 10 years old, Uber announced that each month more than 91 million people were using its services to...

SelfKey Foundation

SelfKey Join Hands with Blockster


We’re excited to announce that SelfKey will join hands with the upcoming social media platform Blockster.

The post SelfKey Join Hands with Blockster appeared first on SelfKey.

Wednesday, 15. September 2021

aNewGovernance

Japan-based Dixon Siu to join the Board of aNewGovernance AISBL

I am proud to say our Board now spans 4 continents now that Dixon has joined us. This is a testimony to human-centric data infrastructure being on track to become a global standard.

As the Data Strategy and the Data Spaces are being put in place in Europe, as the new US Administration questions the operating practices of global platforms, and as new digital public and private services are created the world over, it is critical that our approach to Personal Data Sharing is truly global. It must bring elements that are not just influenced by the framework of the European Data Strategy we helped shape, but that mirror the actual needs of individuals, tech-savvy or not, and the realities of a very competitive market. This is why we are thankful Dixon has agreed to join us, bringing his experience as Chief Evangelist of Personium, the only open-source personal data store platform in Japan.

In parallel to his role at Personium, Dixon is a software engineer at Fujitsu Limited. Currently, he is also leading the MyData movement within Fujitsu.

After attending MyData 2018, he was moved by the passionate people involved and is now heavily involved in the overall MyData movement in Europe, Asia, and the Global South. He also drives human-centric projects in areas like data interoperability, child data protection with UNESCO, and a common data platform for the next pandemic.

Given his breadth of experience and alignment with a number of strategic sectors where aNewGovernance is currently developing ecosystems, I am sure he will make an incredible contribution.

Eric Pol, Chairman

Brussels, Tokyo, 14 September 2021


ResofWorld

The shopping app celebrated by its couriers. Until Uber arrived

Everything changed when the U.S. giant bought the Chilean unicorn Cornershop.
Angélica Salgado cried with joy when she received her first check from the Chilean company Cornershop in 2017. After two years without work in Santiago, Salgado was offered a...

Beijing’s breakup of Alipay was inevitable

The real question is how it was allowed to get so powerful in the first place.
On Monday, I was chatting with a friend who heads up capital markets at a Chinese unicorn. I apologized for my slow replies; I was busily fielding questions about the...

How one coder became Indonesia’s misinformation guru

In a “dystopian nightmare” internet, Ismail Fahmi holds a strange monopoly on debunking false facts.
It was November 4, 2016, and a protest was raging in the Indonesian capital of Jakarta. As many as 200,000 opponents of the city governor Basuki Tjahaja Purnama (“Ahok”) spilled...

Blockchain Commons

Principal Authority: A New Perspective on Self-Sovereign Identity


This summer, we’ve been iterating through an article intended to talk about the success that Blockchain Commons has had working with the Wyoming legislature to help to define a first-in-the-country legal definition of digital identity.

The Digital Identity Working Group for the Wyoming Select Committee on Blockchain meets again next week, on September 21-22, 2021. I will be providing testimony there at 2pm MST. As a result, we’ve decided to release the current draft of this article on digital identity and how Wyoming has defined it using Principal Authority, with the goal of helping to shape the agenda for digital identity for the next year, both in Wyoming and elsewhere.

—Christopher Allen

In 2016, I wrote “The Path to Self-Sovereign Identity” to talk about the historic evolution from centralized identity to user-centric identity and to define the next step: a self-sovereign digital identity that was controlled by the user, not some third party. In it I also offered 10 Self-Sovereign Identity Principles which have been widely accepted by the decentralized identity community.

Self-sovereign identity has matured and grown considerably since, as I chronicled in “Self-Sovereign Identity: Five Years On”. There are now specifications, products, and entire companies supporting the concept. However, recent legal efforts to define self-sovereign identity may be just as important for catapulting it into the mass market.

Defining Identity

Defining identity is by no means easy. That core topic could encompass a paper much longer than this. The following definitions of identity are drawn from the RWOT Glossary:

Identifier: A label that refers to an entity and can be used to establish or maintain an identity. For example, a name or UID.

Identity: A conceptual construct that enables the differentiation between distinct entities that are individually considered to be unique, but which may possess class or group characteristics. An identity gives these entities the ability to interact as peers in collaborative or competitive ways. It is not the entity that it represents.

Identity, Digital: A digital representation of an entity, managed by digital tools, over which that entity has personal or delegated control.

Identity, Functional: How we recognize, remember and respond to specific people and things.

SSI: Self-sovereign identity. A decentralized digital identity that does not depend on any centralized authority and whose information is portable.

Digital identity is just one aspect of a complex, interconnected web of different digital models. It’s not the same thing as identification (where you prove you are a distinct entity with unique characteristics), authentication (where you prove you were the same entity as before), or personal data (which is information related to an identified or identifiable entity).

Those other elements all need to be considered, but it’s digital identity, and now self-sovereign identity, that gives us the linchpin to do so.

Turning Digital Identity into Law

For self-sovereign identity to truly achieve international success, I feel that it needs to not just be embraced by the technological sector, but also to have a basis in law. In recent years, I’ve been progressing toward that goal through work with various state and national legislatures.

Collaborating with the Wyoming legislature has borne the first fruit. This year they passed SF0039 on digital identity, which the Governor signed into law and which went into effect on July 1, 2021. It defines digital identity as follows:

(xviii) “Personal digital identity” means the intangible digital representation of, by and for a natural person, over which he has principal authority and through which he intentionally communicates or acts.

So where’s the self-sovereign identity in that?

As with much legislation, it’s all about the careful selection of words.

Defining Principal Authority

To understand how Principal Authority relates to self-sovereign identity requires insight into what Principal Authority is. The concept comes out of English Common law. It appears in most Commonwealth countries but has also found its way into the laws of other countries, including the United States. It’s primarily used in the Laws of Agency, an area of commercial law where an agent is empowered to take on certain tasks.

As the name would suggest, Principal Authority first requires a Principal: a person or entity. It then requires that entity have Authority: some power. Principal Authority is thus the power derived from a person or entity, which they can use or which they can delegate out to others. When applied to digital identity, Principal Authority says that a Principal has Authority over his identity — which is a clear restatement of self-sovereign principles.

In fact, the recognition of a Principal is itself a statement of the first of the principles of self-sovereign identity: existence. It asserts that digital identity is always a representation of an actual entity, who predates any digital representation, and who is also the first and foremost beneficiary of that representation.

However, in drawing on the Laws of Agency, the concepts of Principal and Principal Authority go beyond that. Because the person at the heart of an identity has the ultimate power to control the self-sovereign digital representation that they’ve created (and/or that they use), this means that any others who exert Principal Authority over that identity data are doing so only as agents of the Principal.

By focusing on Agency, the concept of Principal Authority also ensures that the Principal always has the ability to revoke their delegation to the agents whom they have temporarily offered Authority regarding their identity. This is a requirement for other self-sovereign principles such as portability, and it’s a real necessity in the digital world, where we might need to delete personal data or to cancel Terms & Conditions that we signed without real, informed consent.
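The delegate-and-revoke relationship described above can be modeled as a small data structure. This is a conceptual sketch only, with hypothetical names; it is not drawn from the Wyoming statute or from any SSI implementation:

```python
from dataclasses import dataclass, field

@dataclass
class PrincipalIdentity:
    """A Principal holds ultimate authority over an identity and may
    delegate that authority to agents, then revoke it at will."""
    principal: str
    delegations: set = field(default_factory=set)

    def delegate(self, agent: str) -> None:
        # Authority is derived from the Principal and merely loaned out.
        self.delegations.add(agent)

    def revoke(self, agent: str) -> None:
        # The Principal can always withdraw a delegation.
        self.delegations.discard(agent)

    def may_act_for_principal(self, actor: str) -> bool:
        # The Principal always retains authority; agents hold it only
        # while their delegation stands.
        return actor == self.principal or actor in self.delegations

identity = PrincipalIdentity(principal="alice")
identity.delegate("wallet-service")
print(identity.may_act_for_principal("wallet-service"))  # → True
identity.revoke("wallet-service")
print(identity.may_act_for_principal("wallet-service"))  # → False
```

The key property the sketch captures is asymmetry: agents can act only while the Principal permits it, while the Principal's own authority never depends on anyone's grant.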

Altogether, this new definition of Principal Authority adds a lot of nuance to self-sovereign identity, and much of that comes thanks to the implicit incorporation of Agency.

Defining Control

In saying that a Principal has the ultimate authority to control their digital identity, care also needs to be taken to define what that control means. It means that a Principal has (or can delegate) the Principal Authority to identify as that identity; to authenticate as that identity; and to know the contents of the data held by that identity.

However, any digital identity also exists as part of a larger ecosystem, and the Principal does not have control over the larger ecosystem. They cannot control how other entities rate, rank, or note the reputation of their identity; and they cannot control comments, flags, or other notes that other entities might attach to their identity.

Further, a Principal cannot necessarily prevent other entities from creating new digital identities related to them, which may or may not link to an existing identity. (Though states are increasingly recognizing the limits of voluntary disclosure of information, digital identity laws will ultimately never prevent a police station from creating their own identity record related to a criminal, or a civic authority from creating government-related identity records.)

In other words, self-sovereign identity, and the establishment of Principal Authority over it, lays down boundaries for what the Principal controls — boundaries that are much wider than those established for digital identities controlled by third parties. However, those boundaries still exist.

Fundamentally, control of a digital identity means that the Principal can expect to maintain the continuity of that identity for as long as they see fit, but in doing so must accept the battle scars inflicted by interactions in a larger ecosystem that are implicit to the identity itself.

Agents & Their Duty

The ability to delegate Principal Authority, as revealed by the Laws of Agency, may be as crucial to self-sovereign identity as the concept of a Principal itself. It allows the empowerment of agents by a Principal — who might be physical agents or, in the increasingly digital world, virtual agents. But, it also institutes the core requirement that those agents be working for the good of the Principal when they are exerting Principal Authority over the identity holder’s identity data.

This concept of “duty” is crucial to the Laws of Agency. Duty requires that an Agent only use their delegated Principal Authority as the Principal requests, in a way that benefits the Principal, and with care and due diligence, while frequently reporting back what they’ve done.

This is a notable change from the way that digital identities have been treated traditionally. Compare that to banks, who represent you in financial transactions, and then sell your spending data; compare that to Facebook, who collects as much personally identifiable data and other information as you’re willing to give them, then sells that to advertisers; or compare it to Google, who infers personally identifiable and demographic data based on the information you input into their tools and the choices you make. In large part, you don’t even know what these identity representatives and data holders are doing. In the world before Europe’s GDPR or California’s CCPA, you had little input into their actions. Even now, with those early and rough attempts to protect digital self-sovereignty, you’re typically opting out, not opting in — which is barely agency at all — and you’re still not protected against actors who are self-serving, nor against those who infer information from scattered pieces of data.

That’s because any duties currently required of the entities to whom you grant agency over your data are quite minimal. Maybe there’s a duty of privacy, maybe there’s a duty of safety, but in general they don’t need to work in your best interest. That’s why we need to ensure that new definitions of digital identity, particularly self-sovereign identity, follow the Laws of Agency in ways that our current systems do not.

This sort of agent agreement needs to be part of delegation. To date, this has been true to a limited extent with federation protocols such as SAML and OAuth, but it needs to be extended to every person. Wyoming’s digital-identity law is the first example of legislation that focuses on Agency in this way, and that’s much of its power.

However, this isn’t a simple issue. Even with Agency-focused legislation, we need to determine a source for duties. This article will try to outline some of them, using not just the traditional duties of agents, but also the self-sovereign principles themselves. However, it’s a mere starting point, with a stronger legal foundation required.

Principal Authority & The State

Before defining duties, it’s important to note one other interesting element of Principal Authority and its foundation in Agency: it focuses not just on a single person’s authority, but also on their ability to delegate to and require duties from other entities. In other words, it’s a peer-to-peer relationship; this relationship works within the context of a state that recognizes the concept of Principal Authority, respects its ability to enable Agency, and enforces its established duties.

However, though the state is involved, this peer-to-peer relationship still lies in stark contrast to traditional property law, where property is always in some way beholden to the state: the state might be the original source of property, it might be able to reclaim it by eminent domain, and it might be able to seize it through asset forfeiture. Those ideas all run counter to the idea of self-sovereignty — which is yet another reason that we choose to focus on the Agency of Principal Authority, not property law, as the core legal metaphor for self-sovereign identity.

Restating the Self-Sovereign Principles

The use of Principal Authority to empower self-sovereign identity provides a legal foothold for many of my original 10 principles.

What follows is a restatement of the self-sovereign principles that reorganizes the original ten as rights and duties that are suggested by customs, expectations, and best practices, but which need to be better codified to become true duties. It also proposes five additional duties that could come from customs implicit in the Laws of Agency.

Together, these ideas may allow us to both better understand how to turn the self-sovereign principles into concrete usage and also to more easily translate them into duties bound by legislation.

The Rights of Self-Sovereign Authority

Some principles of self-sovereign identity are implicit in the idea of a Principal.

Existence. The definition of Principal requires that there be a real entity at an identity’s heart.
Control. The definition of Principal Authority says that the Principal always retains control of an identity, within specifically defined boundaries, no matter who is holding it at a particular time.
Persistence. Because of their uncontested Principal Authority, a Principal may decide to have an identity last for as long as they see fit.
Consent. Anything that happens within the defined boundaries of the digital identity is implicitly with the consent of the Principal, who may delegate or revoke Principal Authority at any time.

Self-sovereign rights recognize that an identity exists to benefit its Principal. These core principles likely derive explicitly from a definition of digital identity such as that created by the Wyoming legislature.

The Duties of Self-Sovereign Identity

The remaining principles of self-sovereign identity can be stated as duties owed to a Principal by an agent who has been granted Principal Authority over an identity for certain purposes.

Access. An agent must provide the Principal with access to data related to their digital identity.
Transparency. An agent must be open about how they will administer a digital identity.
Portability. An agent must make a digital identity portable upon the request of the Principal.
Interoperability. An agent must use standard, interoperable protocols when making an identity portable, and should also use those interoperable protocols when interacting with other identity systems.
Minimization. An agent must minimize the data collected, stored, transmitted, and shared regarding an identity so that it only includes data that is strictly necessary in the context of a request made by the Principal.
Protection. An agent must place the protection of the identity above their own needs or desires.

Identity duties say that agents will tell you how they’re using your identity, use it in the most minimal way possible, and make it easy for you to reclaim the identity. However, legislation may be required to turn these best practices into duties bound by law.

The Duties of Self-Sovereign Agents

The idea of Principal Authority itself suggests additional duties that were not included on the original list of principles of self-sovereign identity, but which are generally defined in the Laws of Agency to be due from agents to Principals.

Specificity. An agent will use Principal Authority to serve specific tasks as instructed by the Principal, or as required by Custom, and do nothing more with an identity.
Responsibility. An agent will serve those tasks with reasonable care and skill, with due diligence, and without further delegation.
Representation. An agent will act in the best interests of the Principal, without secret profit, and will not take on other responsibilities that might conflict with that.
Fidelity. An agent will serve those tasks in good faith.
Disclosure. An agent will maintain accounts and report their actions back to the Principal.

Agent duties say that agents will be trustworthy in their use of your identity. These duties are more likely to be implicit in any legislation built atop the Laws of Agency.

Taking the Next Steps

Wyoming’s definition of personal digital identity helps us to lay more foundation for self-sovereign identity, but it’s still just a starting point.

There’s more to do.

Laws of Custom

To start with, the Laws of Agency are largely built on Laws of Custom, which are as likely to be common law as any formally codified legislation. When creating new laws related to self-sovereign identity, we’ll be creating new Laws of Custom for the digital frontier, an area that’s so fresh that the tradition of customs has been limited.

This creates real challenges, as we must decide what customs we want to create and then we must develop them from common law to legal customs to (eventually) codified duties. We can integrate these with the Laws of Agency, and we can figure out how that interrelates with old common laws such as the Use Laws. We may even need special courts to set these common laws and achieve remedies, such as the Court of Chancery.

Fundamentally, there’s a lot of work to be done here; recognizing the existence of a Principal and the use of delegatable Principal Authority bound by the Laws of Agency is just a starting place. New customs, even though understood as best practices, will not automatically become legal duties.

Open Questions

Beyond that, I’m not a lawyer. There may be other legal elements that can support our definition of digital identity. Are there additional duties that we could bring in? Are there fiduciary or agency laws that we could leverage? Are there other legal models of interest we can draw from, such as the UNCITRAL Model Law on Electronic Commerce approach, which says that “The actions, promises, rights and obligations of a person shall not be denied legal validity on the sole ground they are effectuated through their digital identity”? These possibilities need to be studied, preferably with the help of legal experts.

Even once we’ve fully defined digital identity, we still must consider how digital identity may need to be more carefully protected. Are there ways we can give specific protection to private keys used for signatures and to express authority? Can we protect against the theft of private keys that might allow impersonation or false witness? Can we prevent the misuse of digital biometric or genetic information? Can we protect against other “crimes of authority”?

There’s also a flipside: digital identity should give us some new advantages not found in traditional identity. For example, there have always been problems with individuals with low market power being at a disadvantage when negotiating with larger parties. Can new digital identity laws help start to resolve that imbalance?

Final Notes

One of the most important steps going forward will be to continue working with the Digital Identity subcommittee in the Wyoming legislature. However, I’d also welcome discussions with other states and nations, to ensure that we have great definitions of digital identity that support self-sovereign identity everywhere.

If this is important to you too, consider supporting Blockchain Commons to make this a reality.

Offering Some Thanks

This article was written by Christopher Allen with Shannon Appelcline. Thanks to commentators who made time to talk to us about it, including Joe Andrieu, Dazza Greenwood, and Clare Sullivan. (Our conclusions are ours; they may or may not agree with them.)

Many thanks to Wyoming State Senator Chris Rothfuss, who invited me to join the Wyoming Digital Identity subcommittee, and to the other members of the Digital Identity subcommittee in the Wyoming legislature, including Brittany Kaiser, Carla Reyes, Diedrich Henning, Scott David, and once more Clare Sullivan and Dazza Greenwood. Thanks to their hard work, Wyoming now offers the first definitions of personal digital identity in the United States, laying the foundation for these additional ideas.

Tuesday, 14. September 2021

EdgeSecure

Network & Systems Engineer (Level 1)


Position Summary
New Jersey’s Higher Education and Research Network (NJEdge.Net) provides connectivity and transport services, access to Internet2 and support for various on-network applications and services to more than 50 connected members throughout the State of New Jersey and the region. 

Reporting to the Principal Network Architect, the Network & Systems Engineer (Level 1) manages the daily operational activities of the NJEdge.Net network and systems. This position is responsible for the systems maintenance of the switching/routing infrastructure for production and maintenance systems. Responsibilities include remote hands-on installation, moves, adds, and repairs of network hardware; supporting ongoing member installation and upgrade projects; and troubleshooting core network, member access, performance, and systems issues.

Job Focus
The Network & Systems Engineer (Level 1) works in concert with the Principal Network Architect, the AVP of Programs and Services, NJEdge.Net’s technical team, contracted resources, and member network specialists on the ongoing planning and design of the network infrastructure and enterprise network solutions; troubleshooting, installing, implementing, and administering the management network, production network, and associated server systems; and providing support and problem resolution to the membership. The Network & Systems Engineer (Level 1) also analyzes capacity issues, develops capacity planning models, and ensures planned testing activities are executed. In addition, this position responds to technical inquiries, logs service and repair activities, creates manuals and guides, and updates and maintains all network documentation.

People Specifications

A self-starter who is intellectually curious, innovative, and excited about “creating” in a fast-paced environment.
Ability to manage an aggressive agenda in a fast-paced and rapidly changing environment.
Strategic and future-oriented, with a tangible, demonstrated, robust commitment to the mission and growth strategies of Edge.
Superior communication skills; expert at translating and communicating complex financial and operational information to varying audiences.
An effective, creative problem-solver.

Duties and Responsibilities 

Primary responsibility is to manage day-to-day operations of the NJEdge network and associated systems.
Strong working familiarity with Linux, Cisco IOS/XR, and Nexus OS, with hands-on experience configuring Cisco routers and switches.
Network Connectivity: Provides support for network connectivity or related network issues for the user community. Manages trouble ticket resolution in the NJEdge service desk platform.
Troubleshoot all network- and systems-related problems.
Ability to work with multiple vendors concurrently to resolve problems.
Assists in the installation of network and associated production systems.
Installation of hardware and cabling in support of network installations, moves, adds, and changes.
Network Monitoring: Analyzes network activity and network problems to discover and prevent systematic errors.
Issue Resolution: Troubleshoots, diagnoses, and resolves network problems. Researches, analyzes, and recommends the implementation of software or hardware changes to rectify network deficiencies or enhance network performance.
Ability to provision services on various platforms.
Manage inventory and documentation.
Ability to work in a dynamically changing environment.
Perform required maintenance on all network hardware and software platforms.
Network Performance Assessment: Assesses network performance to ensure that it meets the present and future needs of the membership.
Knowledge of commercial enterprise-level tools and products to provide network services, including: ASA firewalls, IPS/IDS, DNS, DHCP, web security, TACACS+, VPN, and NAC.

Additional Duties and Responsibilities

This role is designated as a remote personnel position, and the incumbent is expected to function autonomously, manage time efficiently, and in so doing achieve high levels of productivity. These are essential capabilities and will be an area of evaluation in personnel reviews and performance plans.
Document and communicate assigned work and progress to the immediate project team, management, and member stakeholder personnel.
Communicate problems and/or issues in a timely manner and in accord with established protocols for incident management and problem resolution.
Establish and maintain effective working relationships with NJEdge staff, vendor partners, and the membership.
Requires excellent verbal and written communication skills, with the ability to provide nontechnical descriptions of technology issues for nontechnical managers, directors, and other stakeholders.
Provide customer support with high levels of professional conduct.
Performs related duties and fulfills responsibilities as directed by the Principal Network Architect and senior management of the NJEdge organization.
Will be required to travel to NJEdge POPs and member locations in support of NJEdge operational activities.
Ability to operate Optical Power Meters, OTDRs, and Light Source Meters.

Qualifications Required

Minimum of 5 years of operational IT experience, with at least 3 years of prior experience working with Cisco networking devices, including routers and switches, as well as other networking gear.
CCNA certification.
Proven knowledge, based on work experience, of BGP, OSPF, and IPv6.
Knowledge of CentOS/RHEL.
Experience providing support for large enterprise or service provider networks.
Experience with network diagnostic tools.
Knowledge of Python, PHP, and Bash.
Knowledge of Linux routing and networking in general for high-throughput applications.
Must have a valid driver’s license in good standing.
Hardware configuration and server assembly for custom configurations.
Must be capable of lifting up to 50 lbs. and able to climb ladders, bend, and stoop to perform equipment maintenance in co-location facilities when necessary.

Qualifications Preferred

Related Cisco certifications, including Cisco Certified Network Professional (CCNP) and CCDA; RHEL certification.

Education

Bachelor’s degree or equivalent work experience.

Compensation and Benefits
An annual salary of $100,000. Medical, dental, vision, short-term and long-term disability and life insurance, 401(k) plan, three weeks paid time off (PTO), and 10 paid holidays.

Apply

The post Network & Systems Engineer (Level 1) appeared first on NJEdge Inc.


Digital ID for Canadians

Be a Digital ID Champion


Canadian leaders must prioritize and champion digital ID to unlock capabilities for social inclusion and economic recovery

TORONTO, Sept. 14, 2021 — More than 100 financial institutions, telcos, technology companies, consulting companies, SMEs, academic partners, international organizations, and nonprofits have declared they are digital ID champions in a campaign by the Digital Identification and Authentication Council of Canada (DIACC), launched today. The DIACC calls on all political parties and private sector leaders to prioritize digital ID to unlock the digital economy as a critical path forward to pandemic recovery.

The COVID-19 pandemic has dramatically accelerated the pace at which Canada is shifting online. Many things we used to do ‘in person’ — from grocery shopping to medical appointments — we now do without ever leaving our home. At the same time, a global army of hackers are working to exploit online security vulnerabilities to access sensitive personal information.

“Canadians need digital ID credentials that empower them to use digital capabilities, while knowing their personal information is respected and secured,” says DIACC president Joni Brennan. “Technology alone can’t solve today’s challenges. Digital ID that works for everyone must be a public policy priority. Canadians want their governments to work with stakeholders to unlock digital ID multi-use credentials that protect their personal privacy while accelerating economic recovery.”

Now more than ever Canada needs Digital ID Champions. Digital ID Champions:

Promote digital ID to accelerate economic recovery and secure equitable social inclusion.
Work with DIACC and others to establish privacy-protecting digital ID that empowers individuals, businesses, the public sector, and civil society.

Digital ID has foundational importance for every Canadian and every political party.  

The Liberal Party platform’s proposed “digital policy task force” would strongly benefit from secure, privacy-respecting, and citizen-empowering digital identity if it wants “to position Canada as a leader in the digital economy and shape global governance of emerging technologies, including with respect to data and privacy rights, taxation, online violent extremism, the ethical use of new technologies, and the future of work.”

The Conservative platform acknowledges the need for Canadians to understand how their data is being used. Digital ID provides a tool for making this possible. As the platform notes, “Digital data privacy is a fundamental right that urgently requires strengthened protection through legislation and enforcement. Canadians must have the right to understand and control the collection, use, monitoring, retention, and disclosure of their personal data.”

The NDP’s promise to include a digital bill of privacy rights in its platform will gain much-needed utility from a citizen-empowering digital ID ecosystem.

The Bloc’s platform commitment “to improve support for the digital transition of businesses to enable a local, inclusive, and secure economy” is also important in both a local and global digital economy. Digital ID can help Quebec and other communities enable business transformation while enabling capabilities to verify the authenticity of Quebec’s existing and emerging cultural IP.

For these commitments to be implemented effectively, a strong and secure digital ID ecosystem must urgently be established. Creating a digital identity ecosystem now, supported by a pan-Canadian Trust Framework, will enable an effective transition to secure Canada’s full and beneficial participation in the digital economy, whether it’s introducing digital vaccine proofs, receiving government benefits, updating an immigration status, entering the workforce and more. 

“Digital ID is the key to effective economic recovery that will grow social inclusion and equity,” says Brennan. “We urge all Canadian leaders to be champions for digital ID by working with the DIACC and others to ensure social inclusion and accelerate economic recovery through secure and privacy protecting digital ID that empowers individuals, businesses, the public sector and civil society.”

For more information on becoming a digital ID champion, please visit: https://diacc.ca/champion/  

About DIACC

The DIACC (Digital Identification and Authentication Council of Canada) is a non-profit coalition of more than 110 leading public and private sector organizations that are committed to developing a Canadian framework for digital identification and authentication. Our member organizations range from small 1-5 person shops to large multinational corporations with over 15,000 employees. We advocate for an inclusive, equitable, and privacy-enhancing digital ID strategy that works for all Canadians.

For more information

Krista Pawley 

Krista@ImperativeImpact.com

416-270-9987


ResofWorld

3 minutes with Oluwasoga Oni 

Founder/CEO of MDaaS Global 🇳🇬
Oluwasoga Oni co-founded MDaaS, a Nigeria-based health tech company, with Opeyemi Ologun and Genevieve Oni in 2016. MDaaS provides access to affordable healthcare using an annual subscription plan and a...

SelfKey Foundation

SelfKey Gets Listed on 7b

We’re thrilled to announce that SelfKey’s native token, the $KEY token, is now listed on the crypto broker platform 7b. The post SelfKey Gets Listed on 7b appeared first on SelfKey.



ResofWorld

The world’s biggest VCs are now vying for stakes in Nigeria’s tech sector

But to maintain that attention, the work of local investors is more critical than ever.
In 2010, Idris Ayodeji Bello gave up his lucrative job at a major oil company in Texas to return home to Nigeria. Bello believed that Nigeria’s tech sector was poised...

Monday, 13. September 2021

MyData

Personal data holds the key for sustainable city life


Personal data is a key building block for innovative smart city services. However, as seen in cases like the Sidewalk Labs’ effort in Toronto, concerns about governance and unclear usage of personal data can create pushback from the public and erode trust. In response, cities around the world are adopting approaches to personal data management...


The post Personal data holds the key for sustainable city life appeared first on MyData.org.


ResofWorld

This delivery app went above and beyond for its workers. Then Uber took over

Everything changed after Chilean unicorn Cornershop was bought by the American ride-hailing giant.
Angelica Salgado cried tears of joy when she received her first paycheck from Chilean startup Cornershop in 2017. After two years of unemployment in Santiago, Salgado was given a 30-hour...

Friday, 10. September 2021

Elastos Foundation

Elastos Bi-Weekly Update – 10 September 2021


ResofWorld

Why Amazon really built a new warehouse on the U.S.-Mexico border

The fulfillment center in Tijuana is a symbol of how the pandemic has changed the way the world shops.
Stark photos of a new Amazon warehouse in Tijuana standing directly beside a dilapidated housing development with dirt roads have gone viral over the last few days. The pictures, taken...

Ceramic Network

HOPR introduces data verifiability with Ceramic

How the HOPR network is using Ceramic to provide off-chain logging information to node runners, while keeping the data private.

The HOPR protocol is a layer-0 privacy foundation for the new generation of decentralized applications. The incentivized HOPR mixnet lets any application send data without leaking data or metadata. HOPR nodes will rely on Ceramic to track node payments without sacrificing user privacy.

Interested in learning more about the privacy protocol? Visit HOPRnet.org to run a node yourself.

Encryption is not enough for privacy

Standard end-to-end encryption does not provide sufficient privacy, because it still leaks important metadata, such as who is exchanging data, when, and how often. Given enough metadata, global adversaries and private companies can identify your behavior across multiple applications, even when the content of the traffic you exchange remains encrypted. Services like VPNs can hide your information from your internet service provider (ISP), but they can’t protect you from the day-to-day fingerprinting of your browser, your mobile devices, and the websites you visit. Over time, this online behavior creates a profile that can be traced.

HOPR addresses this challenge by ensuring all the data and metadata you produce and consume online stays private. To do this, HOPR feeds content through a state-of-the-art mixnet: the HOPR network.

The HOPR protocol

HOPR incentivizes node runners who relay packets using its own currency, the HOPR token. Messages routed through the network are embedded with HOPR tokens in the form of “tickets” to pay each node along the route. HOPR’s proof-of-relay mechanism ensures a node can’t claim a ticket until the data packet is relayed to the next downstream node. This creates positive incentives for nodes to be good actors in the network. Tickets are cashed out via an Ethereum-compatible blockchain (EVM) but are not always valid. This probabilistic payment system ensures a node’s online behavior cannot be analyzed using timing attacks.
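The ticket mechanics above can be sketched in outline. This is an illustrative model only, not HOPR's actual implementation: here a ticket "wins" when its hash falls below a threshold derived from the win probability, and a winning ticket pays out face_value / win_prob, so the expected payout per relayed packet is unchanged even though most tickets pay nothing.

```python
import hashlib

def ticket_wins(ticket_seed: bytes, win_prob: float) -> bool:
    """A ticket 'wins' when its hash, read as an integer, falls below
    a threshold proportional to the win probability."""
    digest = int.from_bytes(hashlib.sha256(ticket_seed).digest(), "big")
    threshold = int(win_prob * 2**256)
    return digest < threshold

def expected_payout(face_value: float, win_prob: float) -> float:
    # Winning tickets pay face_value / win_prob, so the expected
    # payout per relayed packet stays equal to face_value.
    return win_prob * (face_value / win_prob)

# Over many tickets, roughly win_prob of them should win.
wins = sum(ticket_wins(f"ticket-{i}".encode(), 0.25) for i in range(10_000))
assert 2000 < wins < 3000
assert expected_payout(0.01, 0.25) == 0.01
```

Because a node can't predict which tickets will win, observing when payouts happen reveals little about when individual packets were relayed.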

To ensure privacy at any level of network usage, the HOPR network is constantly fed with cover traffic: arbitrary data which provides cover for real users. This is particularly important in the early years of the network, when usage will understandably be lower.

The HOPR Association has allocated 250M HOPR tokens to cover traffic, to be released over four years. These will be issued anonymously by nodes sponsored by the HOPR Association and routed through nodes based on multiple parameters including HOPR tokens staked, amount of channels open, and general connectivity.
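The parameter-weighted routing of cover traffic can be illustrated roughly as weighted random selection. The node names, numbers, and scoring formula below are invented for the example; HOPR's real weighting over stake, open channels, and connectivity is its own.

```python
import random

# Hypothetical node registry: stake in HOPR tokens and open channels.
nodes = {
    "node-a": {"stake": 50_000, "channels": 12},
    "node-b": {"stake": 10_000, "channels": 3},
    "node-c": {"stake": 2_000, "channels": 1},
}

def weight(node: dict) -> int:
    # Illustrative scoring: stake plus a bonus per open channel.
    return node["stake"] + 1_000 * node["channels"]

def pick_cover_traffic_target(rng: random.Random) -> str:
    names = list(nodes)
    weights = [weight(nodes[n]) for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

rng = random.Random(42)
picks = [pick_cover_traffic_target(rng) for _ in range(1_000)]
# Higher-stake, better-connected nodes receive proportionally more cover traffic.
assert picks.count("node-a") > picks.count("node-c")
```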

Verify off-chain activity without compromising privacy

This incentivized cover traffic system, combined with proof of relay, is what sets HOPR apart from other privacy networks, which either lack the proper incentives to scale or end up compromising on privacy or decentralization. Privacy networks are very challenging to develop and then test: how can you monitor and verify node activity without sacrificing privacy?

This is particularly important for the development of cover traffic and the long-term economic balancing of the network. Since running HOPR nodes incurs electricity and internet bandwidth costs, it’s important to be able to distinguish between the kickstarter cover traffic (artificially high for the first four years) and real traffic (initially minimal, but growing as the network scales).

Tickets issued by cover traffic can be tagged as such and shared off-chain. This would allow analysis but prevent inspection by other nodes. The HOPR team needed a reliable way to implement this approach, since simply logging this information to a decentralized IPFS node, to which anyone can add data, would allow cover traffic metrics to be skewed or manipulated by any party in the network.

Using Ceramic to create a decentralized monitoring tool for HOPR nodes

Using Ceramic, HOPR can propagate user-specific data and allow node runners to inspect and share their node's information as needed. To do this, nodes create a log entry in a Ceramic Stream, a DAG-based data structure for storing continuous, mutable streams of content on IPFS, every time they receive a particular message (for example, one with the cover traffic tag). Since this information is published with the secp256k1 private key used by the HOPR nodes, this information can be connected to a HOPR node and verified as such. This prevents outside manipulation of data.
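The tamper-resistance argument above can be sketched as a node-signed log. Since standard-library Python has no secp256k1 support, an HMAC stands in for the node's signature here, and the "stream" is just a list; the names and structure are illustrative, not Ceramic's API.

```python
import hashlib
import hmac
import json

NODE_KEY = b"hopr-node-private-key"  # stand-in for a node's secp256k1 key

def sign_entry(entry: dict, key: bytes) -> dict:
    """Serialize the entry deterministically and attach the node's tag."""
    payload = json.dumps(entry, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"entry": entry, "sig": tag}

def verify_entry(record: dict, key: bytes) -> bool:
    payload = json.dumps(record["entry"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(record["sig"], expected)

# The node appends signed entries each time it sees a cover-traffic message.
log = [sign_entry({"seq": i, "cover_traffic": True}, NODE_KEY) for i in range(3)]
assert all(verify_entry(r, NODE_KEY) for r in log)

# An outside party who rewrites an entry invalidates its signature.
tampered = dict(log[0], entry={"seq": 0, "cover_traffic": False})
assert not verify_entry(tampered, NODE_KEY)
```

Because only the holder of the node key can produce valid tags, anyone reading the stream can check that the metrics really came from that node.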

HOPR node runners can use the HOPR dashboard to see their node data, and by checking the Ceramic Streams pinned by their nodes, obtain meaningful information about the cover traffic sent to their nodes. In this way, Ceramic not only helps HOPR developers, but also allows users themselves to verify the state of the network without involuntarily compromising their privacy.

Previous iterations of HOPR testnets issued unverifiable cover traffic via a bot, which was cumbersome and of limited utility. Thanks to Ceramic, HOPR nodes can now easily compare notes and assess their own ability to connect to other nodes.

The HOPR network is the first of its kind to use Ceramic to log decentralized data for an open-source protocol, and we believe this approach can be used by any project that relies on peer-to-peer off-chain data, which would otherwise be unverifiable.

Using IDX to connect HOPR nodes to external accounts

Since HOPR nodes have their own private keys, which need to quickly sign and submit transactions on-chain, each node has a unique private identity. This is extremely important for privacy, but it also makes engaging with a community of node runners challenging. As a result, it wasn't trivial for the HOPR team to link HOPR node runners to another digital identity, enrich node runner data, or perform promotional actions linked to their staking program, which would be trivial in a centralized (but not private) setting. This is where Ceramic provides a lot more flexibility.

The identities of HOPR node runners are private, but they don’t have to be anonymous. It’s possible to connect node runners with another digital identity, as long as that connection is verifiable.

The HOPR team has already started using Ceramic’s identity protocol, IDX, for this use case in a recent testnet on the Polygon network. HOPR nodes leverage IDX to connect to external Ethereum accounts outside of the HOPR network. With IDX, HOPR nodes can now be linked, via a signed claim, to an Ethereum account, enabling users to connect their nodes with an existing pseudonymous digital identity, real or not.
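The linking idea can be sketched as a verifiable claim signed with the node's key. Again, an HMAC stands in for a real secp256k1/Ethereum signature, and the account string is a made-up placeholder; IDX's actual mechanism differs.

```python
import hashlib
import hmac

NODE_KEY = b"hopr-node-private-key"  # stand-in for the node's real key
ETH_ACCOUNT = "0xexample-account"    # hypothetical external Ethereum account

def link_claim(node_key: bytes, eth_account: str) -> dict:
    """The node attests: 'the holder of this key claims this account'."""
    proof = hmac.new(node_key, eth_account.encode(), hashlib.sha256).hexdigest()
    return {"account": eth_account, "proof": proof}

def verify_link(claim: dict, node_key: bytes) -> bool:
    expected = hmac.new(node_key, claim["account"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(claim["proof"], expected)

claim = link_claim(NODE_KEY, ETH_ACCOUNT)
assert verify_link(claim, NODE_KEY)

# A proof copied onto a different account string does not verify.
assert not verify_link({"account": "0xother", "proof": claim["proof"]}, NODE_KEY)
```

The design point is that the node's private identity stays private: nothing is revealed beyond the one account the node runner chooses to claim.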

What's next for HOPR?

In the coming weeks, the HOPR team will start to test our cover traffic nodes, which will become the cornerstone for the HOPR network. The HOPR Association will continue to run testnets with the community to finetune the HOPR protocol. From there, node adoption and development of the protocol will continue alongside the community. The HOPR team will continue to use Ceramic to strike the right balance between data gathering and privacy.

We’re excited about what’s to come for the HOPR team and the impact they'll have on data privacy. Learn more on the HOPR website, follow them on Twitter or jump into the Telegram group to get involved.



Website | Twitter | Discord | GitHub | Documentation | Blog | IDX Identity


Rest of World

Rest of World’s (very international) streaming guide

Our fall viewing guide has everything from Filipino anime to a supernatural drama from Senegal.
The world of entertainment has never been as accessible as it is today. Whether it’s anime, telenovelas, or Chinese action dramas, the expansion of streaming services and their mad rush...

Own Your Data Weekly Digest

MyData Weekly Digest for September 10th, 2021

Read about 14 posts in this week's digest.

Thursday, 09. September 2021

SelfKey Foundation

SelfKey Partners with CrypTalk

We’re delighted to announce that SelfKey will be partnering with the upcoming messaging service platform CrypTalk. The post SelfKey Partners with CrypTalk appeared first on SelfKey.


Energy Web

Protocol Labs and Energy Web complete first showcase of an open-source solution to decarbonize…

Protocol Labs and Energy Web complete first showcase of an open-source solution to decarbonize Filecoin

In support of the Crypto Climate Accord (CCA), Protocol Labs and Energy Web today announced a successful showcase of an open-source solution for decentralized renewable energy purchasing by crypto miners. In this solution showcase, six Filecoin storage providers (i.e., miners) purchased verified renewable energy from 3Degrees using Energy Web Zero, a public renewable energy search engine. The solution will eventually be applicable across any blockchain to make it easy for any crypto miner to deliver proof of green mining.

The CCA is a private sector led initiative to decarbonize the crypto sector with open-source decentralized solutions and promote best industry practices. Energy Web is a co-convener of the CCA and both Protocol Labs and 3Degrees are CCA Supporters.

In the showcase announced today, renewable energy purchases are recorded in Filrep, a reputation system used by clients aiming to store data on the decentralized Filecoin network. Integrating renewable energy data from Energy Web Zero into the reputation system allows clients to factor renewables into their decision when choosing a storage provider. Each record on Filrep points to a verification page in Energy Web Zero, showing where the energy was produced, documenting the renewable source, and providing an attestation certificate proving ownership of the corresponding renewable energy credits.
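A renewable-purchase record of the kind described above might look something like the following sketch. The field names and values are invented for illustration and do not reflect the actual Filrep or Energy Web Zero schemas.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RenewablePurchase:
    """Hypothetical shape of one record: a storage provider's verified
    renewable purchase, pointing to its Energy Web Zero attestation."""
    provider_id: str       # Filecoin storage provider (miner) ID
    energy_mwh: float      # quantity of verified renewable energy
    source: str            # e.g. "wind", "solar", "hydro"
    attestation_url: str   # verification page in Energy Web Zero

def total_verified_mwh(records, provider_id: str) -> float:
    """Sum verified renewable energy for one storage provider, the kind
    of figure a client could factor into provider selection."""
    return sum(r.energy_mwh for r in records if r.provider_id == provider_id)

records = [
    RenewablePurchase("f01234", 120.0, "wind",
                      "https://zero.energyweb.org/attestation/1"),  # hypothetical URL
    RenewablePurchase("f01234", 80.0, "solar",
                      "https://zero.energyweb.org/attestation/2"),  # hypothetical URL
    RenewablePurchase("f05678", 50.0, "hydro",
                      "https://zero.energyweb.org/attestation/3"),  # hypothetical URL
]
assert total_verified_mwh(records, "f01234") == 200.0
```

The point of each record carrying its own attestation URL is exactly what the paragraph describes: a client can follow any entry back to where the energy was produced and to the certificate proving ownership of the renewable energy credits.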

Over the next year, Protocol Labs and Energy Web will create a package of open-source solutions that make it easy for all Filecoin storage providers to search for and purchase renewable energy on Energy Web Zero. Both Filecoin storage providers and clients will be able to verify procurement to substantiate their environmental claims because transactions will be anchored on the public Energy Web Chain, and in turn be traceable to existing renewable energy registries as well as individual solar, wind and hydroelectric power projects.

“We are excited to develop open-source decentralized technologies to support investments in renewable energy and track energy use throughout the supply chain,” said Alan Ransil, the technical lead for Filecoin Green at Protocol Labs. “By serving as an early adopter of these technologies, the decentralized network of Filecoin storage providers will set an example for both crypto miners and other industries to follow.”

Filecoin is an open-source cloud storage marketplace, protocol, and cryptocurrency that serves as a decentralized alternative to traditional cloud storage. Traditional cloud storage providers like Apple, Amazon, Google, and others have been moving aggressively toward renewables; now decentralized storage and blockchain networks are following suit.

Many members of the Filecoin Storage Provider community voiced support for the new project:

“The dream of a decentralized storage network powered by renewable energy has been a motivating force behind the company since our first prototype,” said Kevin Huynh, the CEO of PiKNiK, a San Diego, California-based storage provider. “We believe it is imperative for the future of Web3 to grow our industry in a way that protects nature, while setting a sky-high standard for transparency as to our environmental footprint and renewable energy sourcing.”

“I was deeply impressed by The Paris Agreement and Kyoto Protocol, which established the need to act quickly to support the environment,” said Alex Ma, a Filecoin storage provider based in Asia. “In my view, linking Filecoin to Energy Web Zero brings the community closer to a sustainable data storage solution.”

“I’m happy to do my part in making Filecoin more environmentally clean and sustainable,” said Nelson Chen. “This approach allowed me to support renewable energy and prove my contribution easily, without having to change my power provider or install new equipment.”

Stuart Berman, who also serves as CTO at PiKNiK, said: “We started participating in Filecoin because we believe in the long-term vision. This proof of concept connecting Filecoin to renewable energy markets is a major first step towards solving the challenge of environmental sustainability within Filecoin and crypto more broadly.”

Alex Altman, the COO of Seal Storage Technology Inc., said: “As we enter a new age of corporate environmental responsibility, Seal Storage Technology is delivering on its commitment of carbon neutrality with the ambitious goal of achieving carbon negativity in 2022. Filecoin already empowers Seal to offer decentralized cloud storage using significantly less energy than other blockchain-based networks, and the integration with Energy Web will allow our renewable energy use to be tracked and verified.”

The full cross-chain solution to be deployed in mid-2022 will create a unique decentralized identifier (DID) on the Energy Web Chain for each Filecoin storage provider who chooses to participate and link storage providers’ renewable energy purchases to their respective DIDs. This linking of a given Filecoin storage provider’s DID with verified green Filecoin mining will enable that storage provider to reflect their zero carbon status in the Filecoin Reputation System, and allow energy attributes to be traced from client to storage provider to individual renewable energy projects.

This cross-chain solution being developed can be extended to miners in other ecosystems because the underlying technical architecture will leverage the verifiable, interoperable hash links that are the basis of Web3 technologies such as IPFS and all blockchains. The project will therefore support the development of verifiably renewable crypto mining projects such as Green Hashrate.

Protocol Labs and Energy Web will share progress updates over the months ahead in advance of releasing commercial ready open-source solutions.

— — —

About Protocol Labs

Protocol Labs is an open-source research, development, and deployment laboratory. Our projects include IPFS, Filecoin, libp2p, and many more. We aim to make human existence orders of magnitude better through technology. We are a fully distributed company. Our team of more than 100 members works remotely and in the open to improve the internet — humanity’s most important technology — as we explore new advances in computing and related fields.

Protocol Labs is an Energy Web Member.

Contact: alan@protocol.ai

About Energy Web

Energy Web is a global, member-driven nonprofit accelerating the low-carbon, customer-centric energy transition by unleashing the potential of open-source, digital technologies. We enable any energy asset, owned by any customer, to participate in any energy market. The Energy Web Chain — the world’s first enterprise-grade, public blockchain tailored to the energy sector — anchors our tech stack. The Energy Web ecosystem comprises leading utilities, grid operators, renewable energy developers, corporate energy buyers, IoT / telecom leaders, and others.

For more, please visit https://energyweb.org.

Contact: CCA@energyweb.org

Protocol Labs and Energy Web complete first showcase of an open-source solution to decarbonize… was originally published in Energy Web Insights on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 08. September 2021

Oasis Open

Managed Open Projects


A New Way For Open Source and Open Standards To Collaborate

I recently pointed out in a TechCrunch contribution that the open source and open standards communities need to find ways to team up if they are to continue driving innovation and  development of transformative technologies to push our society forward.

The challenge I posed to organizations like ours, OASIS Open, and others in the open source and standards ecosystems was to find ways to collaborate and encourage our stakeholders to do the same. My opportunity to “walk the talk” arrived immediately.

A Case in Point

The Enterprise Ethereum Alliance (EEA), with whom OASIS Open has had a long and productive relationship, came to us and described a need. EEA has years of experience as a forum for building standards for enterprise-ready, interoperable blockchain implementations.  However, to promote broader community participation in the Ethereum ecosystem, EEA also wanted to offer the wider Ethereum community the ability to start open source projects and to do so in a proven setting that provides solid support for governance, standards, reference implementations and a path to cooperation with international high-level standardization as well.

And that’s the expertise of OASIS Open. So we joined forces and announced EEA Community Projects. The work is led by EEA, managed under the OASIS Open Projects process, and driven by the community at large. We’re describing this partnership as a “Managed Open Project,” and we think this approach may be useful for other modern, agile development projects hosted with consortiums as well. 

Traditional OASIS Open Projects provide the pathway for open source projects to attract broad buy-in and become recognized international open standards as well. Of course, any developer can always build their own Git repo and open it to pull requests. But a stable, trustable project often requires more than that. Stakeholders want to know that their contributions or feature requests will be noted. Contributors want to know that their joint work will be run fairly. Commercial, open source, and government users (and international standards bodies) want to know that a project is properly licensed, vendor-neutral, and will remain available.  

The critical components needed for those goals are a tested process, expert facilitators with a light touch, and well-documented provenance. These elements are required by global treaties and many national practices to assure standards quality; they also happen to create the best, fairest open technologies. That’s precisely what OASIS Open offers with its well-known, verified, and accredited development governance process. And that’s why works produced here are widely accepted as open and reliable by so many industries and governments.

Our technical committees have co-developed their standards with open source reference implementations and proofs of concept for over 20 years. By creating our Open Projects program we encouraged future projects to operate on a “FOSS-first” basis, adding open source license defaults to our other routine process protections.  

Open Projects have also been the approach elected for most of our newest projects. But until recently, those were solely OASIS operations. So, what’s the difference between those and a ‘Managed Open Project’?

The Value-Add of Managed Open Projects

The primary difference is in branding, but it’s also more than that. OASIS has long enjoyed the reputation as our industry’s leader in sharing our work. We’ve been co-approving outputs with international standards bodies like ISO, ITU, IEC, and others since 2004; our strategy has always been radical transparency, the opposite of “not invented here.”  

So, let’s say, you participate in or support a software technology ecosystem. You might have member-focused and proprietary work, or community-based and open source efforts, or both. Either way, you rely on your active collaborators. If those contributors recognize the value of being well governed and having the opportunity for broader approval and certification, they will look for some of the qualities described above: support for open source code implementations, stability of releases and versions, open public process, eligibility for certifications, and assurances that support broad global acceptance. It takes a lot to build that, and most projects don’t readily have access to the skill sets in-house to shepherd a project to that kind of success.  So you are faced with a build versus buy decision. 

Some huge foundations, or one-off smaller projects, may opt to build their own capabilities and seek certifications—or just decide to do without them. But for everyone else in the non-profit sector, outsourcing the process and development platforms may make a lot of sense. EEA essentially decided, after due diligence and some negotiated ground rules, to outsource their need for an open source and open standards development platform to us. They absolutely could have built their own native capacity for Ethereum enterprise-facing open source work. But the immediate up-front costs of time and personnel, and the longer-lead-time issues of seeking independent accreditation, made it worth it for EEA to combine forces with OASIS Open. 

The deliverables created in that joint program will still be EEA works, branded from the EEA community and whoever else may choose to contribute to their open source projects, but administered by our team and supported by our processes and infrastructure. 

Through OASIS Open’s Managed Open Projects program, other existing and new development organizations or consortiums can also add these self-branded capabilities to their existing hosted activities. They also will be eligible, at their option, to submit for international standards approval, without having to build from scratch all those resources, relationships and critical processes.  

This Managed Open Project approach is readily adaptable to any open source code, API, transaction or data structure in all industries. This includes blockchain, such as with EEA, but also can comprise a larger, open-source-powered tent for collaboration in real estate, petroleum, healthcare or any other industry with shared data and open standards needs. 

A Dream Come True

The reason I came to OASIS Open is that I saw in this organization an ability and a desire to share and collaborate with other organizations, and the OASIS Open team is passionate about doing what’s best for the community. Now, after almost two years here, I’m more convinced than ever that we have the expertise and contributor base to make it happen; we’re nimble enough to be effective change catalysts; and, most importantly, we are doing the work. I couldn’t be more excited about this new Managed Open Project approach and how it is fulfilling the vision of harmonizing open source and open standards communities.

Our Managed Open Projects program specifically, and most cooperation and sharing of work across organizational boundaries generally, allow development projects to leverage broad community input while embracing contributions, feature requests, and bug fixes from a much larger potential group of stakeholders. Doing so in a transparent and open fashion and, increasingly, with widely-understood open source tools has been a great benefit to our own projects.

OASIS Open encourages all projects—of any size and housed anywhere—to look for cooperation and sharing opportunities across industries, consortia and open source foundations. It is a strategy that has served us very well since our own launch at the dawn of the internet in 1993.

If you share this vision and have a project or consortium that wants a reliable path to open source development, broader cooperation, or de jure standards, I hope you will reach out to us. Together we can make it happen.

The post Managed Open Projects appeared first on OASIS Open.


LegalRuleML Core Specification V1.0 OASIS Standard published

Building on RuleML to represent the legal norms and rules with a rich, articulated, and meaningful mark-up language.

The specification for representing legal norms and rules is now an OASIS Standard.

OASIS is pleased to announce the publication of its newest OASIS Standard, approved by the members on 30 August 2021:

LegalRuleML Core Specification Version 1.0
OASIS Standard
30 August 2021

Legal texts, e.g. legislation, regulations, contracts, and case law, are the source of norms, guidelines, and rules that govern societies. As text, it is difficult to label, exchange, and process content except by hand. In our current web-enabled world, where innovative e-government and e-commerce are increasingly the norm, providing machine-processable forms of legal content is crucial.

The objective of the LegalRuleML Core Specification Version 1.0 is to define a standard (expressed with XML Schema and RELAX NG, on the basis of Consumer RuleML 1.02) able to represent the particularities of legal normative rules with a rich, articulated, and meaningful mark-up language.

LegalRuleML models:

defeasibility of rules and defeasible logic;
deontic operators (e.g., obligations, permissions, prohibitions, rights);
semantic management of negation;
temporal management of rules and temporality in rules;
classification of norms (i.e., constitutive, prescriptive);
jurisdiction of norms;
isomorphism between rules and natural language normative provisions;
identification of parts of the norms (e.g., bearer, conditions);
authorial tracking of rules.
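As a rough illustration of two of these concepts, deontic operators and defeasibility, here is a small Python model. LegalRuleML itself is an XML vocabulary, so this sketch only mirrors the ideas; the rules and field names are invented.

```python
from dataclasses import dataclass, field
from enum import Enum

class Deontic(Enum):
    """The deontic operators named in the list above."""
    OBLIGATION = "obligation"
    PERMISSION = "permission"
    PROHIBITION = "prohibition"
    RIGHT = "right"

@dataclass
class Rule:
    rule_id: str
    bearer: str               # the party the norm applies to
    operator: Deontic
    action: str
    defeasible: bool = True   # may be overridden by a stronger rule
    overridden_by: list = field(default_factory=list)

def in_force(rule: Rule, applicable_ids: set) -> bool:
    """A defeasible rule holds unless an overriding rule also applies."""
    if not rule.defeasible:
        return True
    return not any(r in applicable_ids for r in rule.overridden_by)

r1 = Rule("r1", "driver", Deontic.OBLIGATION, "stop at red light")
r2 = Rule("r2", "ambulance driver", Deontic.PERMISSION,
          "proceed through red light")
r1.overridden_by = ["r2"]

assert in_force(r1, applicable_ids={"r1"})            # no exception applies
assert not in_force(r1, applicable_ids={"r1", "r2"})  # r1 defeated by r2
```

This mirrors why defeasibility matters for legal mark-up: whether a norm applies can depend on which other norms are in play, not on the rule text alone.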

The project received 4 Statements of Use from Livio Robaldo, Swansea University, CSIRO Data61, and CirSFID-AlmaAI.

URIs

The OASIS Standard and all related files are available here:

HTML (Authoritative):
https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/legalruleml-core-spec-v1.0-os.html

PDF:
https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/legalruleml-core-spec-v1.0-os.pdf

Editable source:
https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/legalruleml-core-spec-v1.0-os.docx

XSD schemas:
https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/xsd-schema/

RelaxNG schemas:
https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/relaxng/

XSLT transformations:
https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/xslt/

XSD-conversion drivers:
https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/generation/

RDFS metamodel:
https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/rdfs/

Metamodel diagrams:
https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/diagrams/

Examples:
https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/examples/

Distribution ZIP files

For your convenience, OASIS provides a complete package of the prose specification and related files in a ZIP distribution file. You can download the ZIP file here:

http://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/os/legalruleml-core-spec-v1.0-os.zip

Our congratulations to the members of the OASIS LegalRuleML TC on achieving this milestone.

The post LegalRuleML Core Specification V1.0 OASIS Standard published appeared first on OASIS Open.

Tuesday, 07. September 2021

EdgeSecure

IT End User Support Specialist (Contract Services)


Position Summary:

Under the direction of the Chief Information Officer at Georgian Court University (GCU), the IT End User Support Specialist provides instructors, students, and faculty with technical support and training on the use of software, personal computers, printers, peripheral equipment, and network systems hardware. The IT End User Support Specialist resolves computer application problems and troubleshoots hardware malfunctions; provides support and installation for departmental and desktop computer applications and Internet/Intranet services, including e-mail; and installs, configures, and repairs personal computer hardware and software.

Duties and responsibilities will include:

Evaluates, responds to, and resolves requests for technical support from instructors, students, and departmental staff experiencing problems with hardware, software, networking, and other computer related technologies.
Diagnoses problems, performs remedial actions to correct problems, and/or recommends and determines solutions.
Researches, resolves, and follows up on user problems; refers more complex problems to specialized or higher-level personnel.
Delivers, installs, or assists personnel in the installation of personal computers, software, and peripheral components.
Responds to inquiries concerning operating systems and diagnoses system hardware, software, and operator problems; installs, maintains, and upgrades operating systems and software packages across disparate platforms.
Tests, loads, and configures specified software packages onto computers and mobile devices; may modify specific applications for use by department; deploys software, settings, scripts, and batch files to workstations remotely.
Maintains documentation database as used by the department.
Instructs users in software applications usage and basic computer navigation; advises users on best security practices.
Creates baseline software sets for various makes and models of computers.
Performs user data migration and recovery due to hardware/software upgrades or disasters.
Assists in coordinating activities with the help desk, network services, or other information systems staff.
Trains users on software and hardware usage by providing instruction and documentation.
Provides updates, status, and completion information to personnel and/or users via voicemail, e-mail, or in-person communication.
Recovers technology assets and evaluates/repurposes viable hardware; decommissions obsolete hardware; collects, strips, and prepares used equipment for salvage, including coordinating delivery to warehouse.
Assists new staff as required.

Must Have Knowledge Of

Academic technology and peripherals, such as personal computers, network hardware, and mobile devices.
Personal computer, mobile device, and network system application software packages specific to the area of the assigned department, learning laboratory, or academic discipline.
Principles and practices used in the operations, maintenance, and administration of network operating systems, personal computer system hardware, mobile devices, and related software systems.
Techniques and methods of computer and mobile device hardware and software evaluation, implementation, and documentation.
Troubleshooting, configuration, and installation techniques.
Ability to use English effectively to communicate in person, over the telephone, and in writing.

Education and Experience

Bachelor’s degree in an applicable field or equivalent combination of education and experience. Two years of related experience in IT.

Please note: this is a 1099 contract position and requires you to be on-site at GCU. 


The post IT End User Support Specialist (Contract Services) appeared first on NJEdge Inc.

Monday, 06. September 2021

Omidyar Network

Reimagining Capitalism Series: Building Counterweights to Power

Photo Credit: Rochelle Hartman, https://creativecommons.org/licenses/by/2.0/legalcode

This post expands on the third key pillar for building a new economic paradigm as outlined in Our Call to Reimagine Capitalism. Read the first post in this series, “An Introduction to Ideas, Rules, and Power and How They Shape Our Democracy and Economy” here.

By Audrey Stienon, Associate, Reimagining Capitalism

Labor Day represents an important moment to commemorate the essential contributions of working people to our country.

The holiday was created at the height of the Gilded Age, a time when the American economy was booming, the likes of John D. Rockefeller, Andrew Carnegie, and J.P. Morgan were making their fortune, and most people toiled in exhausting, unsafe jobs for paltry wages. Working people, including children as young as five, labored in 12-hour shifts, seven days a week — and still were not paid enough to escape poverty. Individuals had little power to change the status quo, but when they banded together in labor unions, they were able to effectively demand better pay, hours, and working conditions from their employers.

This shift in power didn’t happen quietly or easily. The wealthy few who profited by exploiting labor wielded their money and political connections to mobilize private security forces, police, and the judicial system against working people. In turn, the labor movement used all the tools at its disposal — from strikes and boycotts to protests and marches — to fight for change, even in the face of violence. In 1894, after twelve working people were killed during the Pullman railroad strike, President Cleveland tried to repair the government’s frayed ties with working Americans by declaring the first Monday in September “Labor Day,” a holiday commemorating the essential contributions of working people to our country.

A new holiday, of course, did not suddenly balance power between working people and their employers. Today, working people — along with the labor unions, grassroots movements, small business associations, and community organizations that represent them — continue to engage in many of the same battles for equal power that they fought more than a century ago.

Equal power means freedom of choice for working people. That freedom is what makes possible the American Dream, with its promise that everyone should have the equal right to build the life they desire by participating in the economy in whichever way they choose — through hard work, risk-taking, innovation, or entrepreneurship.

Like democracies, capitalist markets — especially those intended to support democratic societies — work best when participants enjoy a high degree of equality in decision-making power. Ideal markets, after all, depend on individuals’ ability to buy what they choose as well as to walk away from any exchange that does not satisfy them — something true for people entering a store just as much as it is for people assessing how they want to spend their time and labor.

But whereas democratic power is shared through the vote, markets’ power often comes from money. Any capitalist system intended to support a democratic society depends on checks and balances that prevent any one group from writing rules that benefit themselves at the expense of everyone else. A society in which this power of choice is not balanced — in which the rules of the economy leave working people with no choice but to accept jobs that leave them in poverty, and with no way to achieve meaningful change — cannot be truly democratic.

We know American capitalism has routinely failed to live up to this ideal. The two-pronged restriction on voting access and market participation by race, gender, disability, and class ensured that whole swaths of our society were (and in many cases, continue to be) robbed of both the political power to write economic rules that protect their interests and the economic power to gain enough resources to live with dignity. And yet, by learning from the combined struggles of the labor movement, civil rights movement, and alliances between them, we know that it is possible to build an economy truer to our democratic values.

We are currently experiencing two crises simultaneously: both American democracy and capitalism are in critical condition owing to the increasingly unequal concentration of political and economic power throughout our society.

Of the many failed ideas US policymakers have implemented in recent decades, one of the most dangerous has been to systematically erode the checks and balances within both government and markets. Lax regulations have made markets less competitive, allowing corporate behemoths to block the entry of new competitors, robbing consumers of the freedom to buy alternatives to products they do not like — including digital products that require them to give up personal data — and depriving working people of the ability to find better jobs. Right to work laws and anti-union employers have produced a steady decline in the wages paid by industries that used to be pillars of the middle class. Our political system has been corrupted by private money that fuels election campaigns and lobbying efforts, giving the wealthiest in our society an unparalleled ability to influence the policymaking process and ensure that the rules continue to work to their advantage.

All of this means that, even as most of us recognize the socio-economic status quo is harming many Americans, we lack the power to enact the change necessary to bring it more in balance.

We at Omidyar Network believe that if we want our democracy to survive, we need a new form of capitalism that promotes and protects balanced power. As in our democracy, we need checks on those with the most power to make sure they cannot rig the rules to their benefit, and we need to build counterweights to concentrated power so that the people who suffer under unfair rules are certain to have their voices heard.

One of our priorities is to help build the power of working people so they can advocate for the wages, working conditions, and policies needed to thrive. That’s why we endorsed the Protecting the Right to Organize (PRO) Act of 2021, which would expand people’s right to organize and engage in collective bargaining, ensuring their interests are well-represented when employers respond to economic shocks like the COVID-19 pandemic and prepare for future trends, such as automation.

In addition to the PRO Act at the federal level, many of the fights for worker power are taking place closer to the ground. As the pandemic increased people’s recognition of the importance of caregivers, hospital employees, and service workers — whose jobs typically pay low wages and provide little protection — we have supported the National Essential Workers Campaign, an initiative intended to coordinate state and local efforts fighting to give essential workers more voice and power, especially over health and safety. We have also supported the California Coalition for Worker Power, a new coalition anchored by a number of partners dedicated to ensuring that every worker in the state has the power to come together and improve their work conditions and communities.

We also recognize that a rapidly changing economy requires new strategies and tools to help workers organize. We support Unemployed Workers United (UWU) and their experimentation in the use of digital organizing tactics to engage people who have lost their jobs — especially in communities of color — in local grassroots campaigns. We also support an idea lab at The Aspen Institute’s Business in Society Program to allow a number of stakeholders — including corporate governance experts, business leaders, and worker organizers — to explore solutions for increasing worker voice in corporate governance structures.

Working people, however, are not the only group that needs a greater say in the future of our economy. Alongside the Ford Foundation and W.K. Kellogg Foundation, we launched the Carry on the Fight Fund to support grassroots organizations so they have support beyond Election Day to hold elected officials accountable to their campaign promises and build long-term power. We have also partnered with organizations such as Main Street Alliance, which advocates for policies that support small business and helps small business leaders become their own advocates for policy change. We’re working with Change the Chamber, a student movement taking on the Chamber of Commerce through social media campaigns and direct conversations with companies and policymakers in response to the Chamber’s continued lobbying efforts on behalf of fossil fuel interests to hinder essential climate change regulation.

We are also focused on ensuring economic policies do not concentrate wealth — and the power that comes with it — in the hands of a few individuals and corporations. To do so, we support groups such as Better Markets, a watchdog organization that advocates on behalf of the public interest for a fairer financial sector and serves as a check on Wall Street’s immense lobbying power on financial regulatory matters. We are also working with several organizations pushing to reform the outsized impact large corporations currently have on the American political processes. Notably, the Center for Political Accountability (CPA) is working to increase accountability and transparency for companies’ political spending. They developed the CPA-Wharton Zicklin Model Code of Conduct for Political Spending, which is intended to guide companies on how to ethically engage in political activities and helped mobilize shareholders to win an unprecedented number of fights, resulting in requirements that companies disclose their political spending.

We recognize that many of these efforts have been guided by organizations that provide the research and ideas for how different stakeholders of the economy can work together toward a more equal distribution of economic power. The Economic Analysis and Research Network (EARN), for example, is a network of research, policy, and public engagement organizations that provide resources and technical assistance to local groups. Through their Worker Power Project, they aim to bolster the ability of working people to achieve racial, gender, and economic justice through organizing, collective bargaining, and advocating for state and local policies that promote worker power. Meanwhile, the merger of the Center for Responsive Politics and the National Institute on Money in Politics created OpenSecrets, which will build on the extensive datasets of its two parent organizations to track the money being invested in political campaigns and lobbying at both the federal and state level. This information is crucial for efforts to hold policymakers accountable to the public.

We celebrate this Labor Day amid a pandemic that has acted as a shock to the system and has given many people the power — often for the first time — to make a conscious choice about how they want to participate in the economy, triggering a “Great Reassessment of work in America.” Many are choosing not to return to jobs that paid low wages and had erratic schedules or poor working conditions. These decisions are proof that the economy people want is very different from the one we have. Yet even a pandemic has not been enough to disrupt the systemic power imbalance that exists in both our economy and democracy. We must continue the legacy of the working people and labor leaders who have stood up to hoarders of power; we must continue to fight for a future in which each of us has the power and voice to determine how we live our lives.

Reimagining Capitalism Series: Building Counterweights to Power was originally published in Omidyar Network on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 03. September 2021

Digital ID for Canadians

Spotlight on Credivera


1. What is the mission and vision of Credivera?

Mission: To bridge the way the world manages personal and professional credentials. We make people’s lives easier, safer, and more productive by seamlessly connecting trustworthy information to where it needs to go.

Vision: Transform the way we provide proof of people, products, and processes.

2. Why is trustworthy digital identity critical for existing and emerging markets?

A global economy requires a shared, trusted framework for understanding that the information we are receiving is true, untampered, and current.

As we look at traditional industries, we see an urgent need to securely share information about an individual that respects both their personal data and the information required for operating a compliant organization.

As these markets interconnect, the skills of a workforce may be hybrid, remote, or highly accessible, but organizations fundamentally still require the capacity to analyze, assess, and deploy talent associated with an identity based upon its current status.

Digital ID transitions the way we validate skills from subjective to objective, placing the emphasis on the work itself and removing stereotypes and biases that limit the advancement of new markets.

3. How will digital identity transform the Canadian and global economy? How does your organization address challenges associated with this transformation?

Digital identity will create a common language and roadmap for securely sharing important personal information that honours the individual being in control of their identity. As their identity expands to involve personal and career credentials, the nature of work and repositioning of talent globally becomes a reality.

Canada will be able to attract the talent our industries require, share expertise in emerging geographies, and remove blockages in the way current economic systems operate that require verified identification in a manual way.

Credivera is developing an open, standards-based identity platform that provides proof of certifications and work experience using verified digital credentials. As we build a digital ecosystem of businesses, issuers, and individuals, we are supporting industries to adopt new ways to capacity plan, manage compliance, and identify opportunities for new career and skill development.

4. What role does Canada have to play as a leader in this space?

Canada can be agile and advocate for creating and mandating a shared standard of digital identity among world leaders. Through fundamental systems that enable both public and private trust services, people, markets, and personal data can flow faster and more securely.

5. Why did your organization join the DIACC?

As a participant in emerging advancements in digital identity for the government, Credivera learned of the Pan Canadian Trust framework and aligned with DIACC’s initiatives. As a member, we are excited to share knowledge and assist in creating the future of digital identity together.

6. What else should we know about your organization?

Credivera is currently working with the Government of Canada on User Centric Verifiable Credentials, supporting the interoperability of foundational identity documents, and will be available on the Microsoft Marketplace for its digital wallet in the coming months.


Own Your Data Weekly Digest

MyData Weekly Digest for September 3rd, 2021

Read in this week's digest about: 12 posts, 2 questions, 1 Tool

Thursday, 02. September 2021

MyData

Catalysing transformative change: new project to produce innovative services in smart cities


A new project by MyData Global and the Uusimaa region helps companies and cities capitalise on personal data and develop better digital services on citizens’ terms. MyData Global and Uusimaa Regional Council are launching the H3C project – Human centric companies and cities. H3C will provide training and networking opportunities for cities, companies and organisations...

Read More

The post Catalysing transformative change: new project to produce innovative services in smart cities appeared first on MyData.org.


Elastos Foundation

Elastos Financial Report – First Half 2021

...

Energy Web

AEMO announces open-source operating system for world-leading distributed energy marketplace design…

AEMO announces open-source operating system for world-leading distributed energy marketplace design trial

World-first global partnership includes Energy Web (to provide the open-source data exchange system), Microsoft (to provide cloud services), and PXiSE (to provide market logic software), with local aggregators (Mondo) and network operators (AusNet)

Zug, Switzerland — 2 September 2021 — 
Australia’s power system and market operator, the Australian Energy Market Operator (AEMO), has announced the architecture and technology partners for Project EDGE — a flagship initiative enabling distributed energy resources (DER) to provide both wholesale and local network services at enterprise scale within an off-market trial environment. The DER Marketplace solution will be developed in partnership with local market participants AusNet Services and Mondo and global technology vendors, including Energy Web, PXiSE, and Microsoft.

Australia leads the world in adoption and deployment of distributed energy resources (DER), with nearly one in four homes featuring rooftop solar, and 40% of homes in some states (South Australia and Queensland).

Combined with the rapid growth of energy storage, demand-side management, and utility-scale renewable generation, the Australian grid is transforming into a decentralised system in which consumer-owned DER play a pivotal role. This has created challenges for AEMO and distribution network operators in balancing and protecting the grid, but it also creates new opportunities for consumers and other market participants to create value by supporting the energy transition with their DER.

In response, AEMO has established a DER program to enable the transition from one-way energy supply to a world-leading system that maximises the value of DER for all consumers through digitisation and integration of DER into Australia’s power systems and markets.

Under Project EDGE, AEMO is collaborating with Mondo and AusNet Services, with input from the broader energy industry, to demonstrate, via a proof-of-concept trial, how aggregated fleets of DER can deliver multiple energy services at scale at both the wholesale power system and local network levels.

This project will provide AEMO and its partners with technical and operational experience to inform evidence-based changes to regulatory and operational processes to effectively manage Australian electricity grids and markets with increasing levels of DER participation. Additionally, EDGE will focus specifically on understanding consumers’ perspectives and preferences in selling their DER capacity to Aggregators for use in energy markets.

Project EDGE is focused on streamlining data exchange among market participants to maximise the utilisation and coordination of DER in providing multiple grid services, and on informing how this capability can scale. Project EDGE is testing this capability with up to 10 MW of DER capacity, made up of approximately 1,000 customers.

The DER Marketplace will be underpinned by Energy Web’s open-source operating system, which will be used to establish digital identities for all assets and market participants in the program. This identity-based architecture facilitates secure, efficient exchange and validation of DER data amongst market participants using digital infrastructure shared amongst Australian market participants.
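The identity layer described above can be pictured with a W3C-style DID document for a DER asset. The sketch below is illustrative only: the DID method (`did:example`), asset identifier, and key material are invented for this example and are not taken from Project EDGE or Energy Web's actual implementation.

```python
# Illustrative W3C-style DID document for a DER asset. The method name
# ("did:example"), asset id, and key value are hypothetical placeholders.

def der_asset_did_document(asset_id, public_key_hex):
    """Build a minimal DID document identifying one DER asset."""
    did = f"did:example:{asset_id}"
    return {
        "@context": "https://www.w3.org/ns/did/v1",
        "id": did,
        "verificationMethod": [{
            "id": f"{did}#key-1",
            "type": "EcdsaSecp256k1VerificationKey2019",
            "controller": did,
            "publicKeyHex": public_key_hex,
        }],
        # Other participants could verify data signed by the asset against
        # this key before accepting it into a shared data exchange.
        "authentication": [f"{did}#key-1"],
    }

doc = der_asset_did_document("rooftop-solar-0001", "02ab34cd56ef")
```

In an architecture like this, every asset and market participant resolves to such a document, so data exchanged between aggregators and network operators can be validated against a known identity rather than a shared secret.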

The market intelligence and logic layer will be provided by PXiSE, and the solution will leverage Microsoft Azure cloud computing resources. EDGE’s DER Marketplace will deliver three core function sets:

1. Efficient and scalable DER data exchange between actors through the use of self-sovereign identities anchored to a digital infrastructure that, in future, could be shared between market participants;

2. Wholesale integration: DER fleets will be dispatched as if they are participating in existing wholesale markets, while considering distribution network limits in the dispatch process; and

3. Local services exchange: the Marketplace will facilitate visible, scalable and competitive trade of local DER services that enables Distribution Network Service Providers (DNSPs) to manage local power security and reliability and enables DER Aggregators to stack local and wholesale value streams efficiently.

As DER deployment accelerates across Australia, coordinating DER across market participant operating systems and participating prosumers has become a significant challenge. Project EDGE will provide evidence to support decisions relating to investment in digital operating systems and the underlying design of data exchange systems.

Australia’s EDGE project will begin testing services in April 2022, with project completion in March 2023.

AEMO Chief Markets Officer, Member Services, Violette Mouchaileh, said “Project EDGE aims to build understanding of and inform the most efficient and sustainable way to integrate DER into the electricity system and markets, allowing all consumers to benefit from a future with high levels of DER. There is no existing product that meets all high-level requirements for the DER Marketplace, so Project EDGE is at the leading edge of product development in this space.”

The EDGE Project is being supported by the Australian Renewable Energy Agency (ARENA), with AEMO receiving almost $13 million in funding for the project. ARENA CEO Darren Miller said at the time that this landmark trial would provide the blueprint for integrating DER into the grid.

AEMO announces open-source operating system for world-leading distributed energy marketplace design… was originally published in Energy Web Insights on Medium, where people are continuing the conversation by highlighting and responding to this story.


Commercio

New SDK in C sharp


The new prerelease of the #sdk in #C# for version v2.2.0-pre.1 (pre-release) of the commercio.network chain software has been published.

IMPORTANT NOTICE: It is absolutely essential that all companies with live projects on the commercio.network chain test their solution on the test-net using #SDK v2.2.0-pre.1.
The new #sdk prerelease targets chain software version v2.2.0-pre.1 (pre-release), on which the current testnet is based: https://lnkd.in/eGg9D6J.
The new #sdk prerelease is version v2.2.0-pre.1 (pre-release) and includes the following new features:
Modification of the helpers for the Docs and ID modules to make them compatible with the new version of the chain.
Rewrite of the Mint module, introducing helpers to mint and burn CCC tokens.
Ability to create multiple mint messages in the same transaction.
Rewrite of the Accreditation module, now renamed KYC.
Added data integrity checks to avoid invalid transactions on the chain.
Added endpoint querying of trade.network modules.
Code improvements with new types to avoid duplication in collections.
Documentation updates.
Example updates.
Added tests.
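To picture the "multiple mint messages in the same transaction" feature, the sketch below shows one way such a batch could be shaped. This is a hypothetical Python illustration: the message type string, field names, and DID address are invented for this example and are not the actual C# SDK API.

```python
# Hypothetical illustration of batching several mint messages into one
# signed transaction; names and fields are placeholders, not the SDK's API.

def mint_msg(depositor, amount_ucommercio):
    """One hypothetical CCC mint message from a depositor account."""
    return {
        "type": "commercio/MsgMintCCC",          # placeholder type string
        "depositor": depositor,
        "deposit_amount": [{"denom": "ucommercio",
                            "amount": str(amount_ucommercio)}],
    }

def build_tx(msgs, fee_ucommercio="10000"):
    """A single transaction envelope carrying several mint messages."""
    return {
        "msgs": list(msgs),
        "fee": {"amount": [{"denom": "ucommercio",
                            "amount": fee_ucommercio}]},
        "memo": "",
    }

tx = build_tx([
    mint_msg("did:com:1exampleaddress", 100),
    mint_msg("did:com:1exampleaddress", 250),
])
```

Batching this way means one signature and one fee cover several mints, which is the practical benefit the release notes point to.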

Library publication: https://lnkd.in/gRE-uPGj
Soon the packages will be published on https://www.nuget.org, the official package manager of the .NET platform, for both Commercio-sdk and Sacco.
#blockchain

The article New SDK in C sharp appeared first on commercio.network.

Wednesday, 01. September 2021

OpenID

OpenID Connect Client-Initiated Backchannel Authentication (CIBA) Core is now a Final Specification


The OpenID Foundation membership has approved the following MODRNA specification as an OpenID Final Specification:

OpenID Connect Client-Initiated Backchannel Authentication Flow – Core 1.0

A Final Specification provides intellectual property protections to implementers of the specification and is not subject to further revision.

The Final Specification is available at:

https://openid.net/specs/openid-client-initiated-backchannel-authentication-core-1_0-final.html
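For context, CIBA decouples authentication from the consumer device: the client POSTs to the OP's backchannel authentication endpoint to start authentication on the user's device and, in poll mode, exchanges the returned `auth_req_id` at the token endpoint. The sketch below builds the two form bodies the spec defines; endpoint URLs, credentials, and the hint value are placeholders, not taken from any real deployment.

```python
# Sketch of the two request bodies a CIBA client sends in poll mode, per
# OpenID Connect CIBA Core 1.0. Hint and auth_req_id values are placeholders.

def backchannel_auth_request(scope, login_hint, binding_message=None):
    """Body for POST to the OP's backchannel authentication endpoint."""
    body = {
        "scope": scope,            # must include "openid"
        "login_hint": login_hint,  # identifies the end-user to authenticate
    }
    if binding_message is not None:
        # Short message displayed on both devices to bind the session.
        body["binding_message"] = binding_message
    return body

def token_request(auth_req_id):
    """Body for polling the token endpoint with the CIBA grant type."""
    return {
        "grant_type": "urn:openid:params:grant-type:ciba",
        "auth_req_id": auth_req_id,  # returned by the authentication endpoint
    }

req = backchannel_auth_request("openid profile", "+14155550123", "W4SCT")
tok = token_request("1c266114-a1be-4252-8ad1-04986c5b9ac1")
```

The spec also defines ping and push delivery modes, where the OP notifies the client instead of being polled; the authentication request body is the same shape in all three.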

The voting results were:

Approve – 63 votes
Object – 0 votes
Abstain – 4 votes

Total votes: 67 (out of 310 members = 21.6% > 20% quorum requirement)

— Michael B. Jones – OpenID Foundation Board Secretary

The post OpenID Connect Client-Initiated Backchannel Authentication (CIBA) Core is now a Final Specification first appeared on OpenID.

Me2B Alliance

Flash Guide #10: Data Flow & the Invisible Parallel Dataverse


Version 1.0 | September 1, 2021

Author: Me2B Alliance

#InvisibleParallelDataverse #RespectfulTech #DystopianMetaverse

IN A NUTSHELL
The reality of online data flows is nothing like what we expect. Our personal data flows do not start light and increase with time and trust. Instead, a firehose of personal information is released – and shared with a host of unseen third parties – as soon as we open an app or website. Me2BA’s Respectful Tech Specification V.1 is largely focused on testing for these invisible parallel dataverse data flows.

So far in this series of Me2B 101 Flash Guides, we’ve focused on the digital world as we interact with and experience it. We now turn our attention to the flow of data during the Me2B Relationship lifecycle.  

Figure 10.1 illustrates typical user expectations about data sharing over the course of the Me2B Lifecycle. It reflects our expectations – based on the relationship norms in the physical world – that the flow of personal information would start off very slowly, only to build up over time as our trust builds. 

Figure 10.1  

This is the [somewhat] idealized view of how our digital Me2B Relationships should be – reflecting the increasing trust between the Me and the B over time, and a proportional increase in information sharing over time. It would be more ideal if the flow of information ceased more cleanly at the end of a Me2B Marriage, but that doesn’t appear to be the case for most services. 

Moreover, this idealized lifecycle is not at all the current reality of digital Me2B Relationships.  

Instead, as soon as we so much as open a website or an app, there is a firehose of information about us being shared with countless undisclosed entities—Data Processors (or Hidden B2B Affiliates, as introduced in Flash Guide #6).i The reality looks more like Figure 10.2 than 10.1.

Figure 10.2 Invisible Parallel Dataverse 

How Does This Happen? 

A major culprit in the tsunami of unwitting information sharing is the digital advertising infrastructure (described in Flash Guide #5), which systematically, and at massive scale, shares personal data with multiple third parties (aka Data Processors). So long as this infrastructure exists, the personal data floodgates are open.ii

Digital advertising is just one of the ways that websites and apps invisibly share data with third party Data Processors. Until all these avenues are exposed, and businesses and other Data Controllers are made responsible for respectful data processing, people are at risk. Version 1.0 of the Me2B Respectful Tech Spec is largely dedicated to testing for [outbound] invisible parallel dataverse data flows.  

i. See Flash Guide #6 for how GDPR terms such as “Data Processor” and “Data Controller” map onto Me2B Alliance terminology.
ii. See our July 4, 2021 blog post on Freedom in the Digital World: https://me2ba.org/freedom-in-the-digital-world/

© Me2B Alliance 2021


Flash Guide #9: The 10 Attributes of Respectful Me2B Commitments


Version 1.0 | September 1, 2021

Author: Me2B Alliance

#Me2BRelationship #Me2BDeals #RespectfulTech

IN A NUTSHELL
The Me2B Respectful Tech Specification measures technology behavior against 10 attributes that respectful Me2B Commitments should possess. These attributes represent how technology should treat us and our data at every step along the Me2B Relationship Lifecycle.

Flash Guide #8 described the types of Me2B Commitments that may occur throughout the digital Me2B Lifecycle. The Me2B Alliance has identified 10 high level attributes that respectful Me2B Commitments should possess. Our Respectful Tech Specification measures technology behavior against these 10 attributes, while taking into consideration the specific context and stage for each commitment.

The 10 Attributes of Respectful Me2B Commitments:
1. Clear Data Processing Noticei
2. Viable Permission
3. Identification Minimization
4. Data Collection Minimization
5. Private by Default
6. Reasonable Data Use & Sharing / Me2B Deal in Action
7. Data Processing Behavior Complies with Data Subject’s Permissions and Preferences
8. Data Processing Behavior Complies with Policies
9. Reasonableness of Commitment Duration
10. Commitment Termination or Change Behavior

Each of the 10 attributes is described in more detail below: 

1. Clear Data Processing Notice: Measures if the app or website provides adequate notice on how data is collected, used, shared, monetized, etc. by the Data Controller(s) and all Data Processors.

2. Viable Permission: The Me2BA uses legal scholar Nancy Kim’sii three requirements for legally viable permission to enter into a commitmentiii, and checks for the following:
Understandability: Can the Data Subject understand the Me2B Deal?
Freely Given: Is the Data Subject coerced or manipulated in any way into providing permission for this commitment?
Intentional Action: Does the Data Subject perform a distinct act to provide permission, and is it recorded?

“Viable Permission” also evaluates the commitment’s permission flow to all Co-Data Controllers and Data Processors (which we refer to as “Transitive Permissions”).

3. Identification Minimization: Measures if the identifications constructed by the service are appropriate for the particular Me2B Commitment. Note that “appropriate” means that the identification level reflects the social norms described in Flash Guide #8.

4. Data Collection Minimization: Tests whether the data collected is proportional to, and appropriate for, the particular Me2B Commitment. We measure across three types of data:
Volunteered Data – entered by the Data Subject,
Observed Data – automatically collected by the website or app without the individual’s awareness, and
Derived Data – data that is derived by the Data Controller(s) or Data Processors.

5. Private by Default: Measures whether the Data Subject must modify any website or app settings in order to have a private experience.

6. Reasonableness of Data Use & Sharing Behavior: Measures if the observed data use and sharing behavior is appropriate for the particular Me2B Commitment.

7. Data Processing Behavior Complies with Data Subject’s Permissions and Preferences: Measures if the observed data processing behavior matches the Data Subject’s permissions and preferences.

8. Data Processing Behavior Complies with Policies: Measures if the observed data processing behavior matches the promised behavior as stated in the privacy policy and terms of service or terms of use.

9. Reasonableness of Commitment Duration: Measures if the duration of the commitment is appropriate for the particular Me2B Commitment and the industry sector.

10. Commitment Termination Behavior: Tests three commitment termination behaviors:
Usability: Is it easy for the Data Subject to change or end the commitment?
Record: Is the commitment termination recorded and provided to the Data Subject?
Data Removal: Is the Data Subject’s data forgotten/deleted by all Data Controllers and Data Processors?

Like in the case of “Viable Permission” (attribute 2), this attribute assesses whether changes to “Commitment Termination”-related permissions or settings cascade down to all Co-Data Controllers and Data Processors. 

i. Note that the Me2B Alliance and our Respectful Tech Specification use GDPR terminology. In particular, “data processing” under GDPR Article 4, item (2) includes collection and all other behaviors relating to data: “‘processing’ means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction.” Art. 4 GDPR – Definitions | General Data Protection Regulation (GDPR) (gdpr-info.eu)
ii. https://www.ali.org/members/member/344957/
iii. “Consent has a variety of meanings in the law, but it is typically a conclusion based upon the presence or absence of three conditions: an intentional manifestation of consent, knowledge, and volition/voluntariness.” Kim, Nancy S., Consentability: Consent and its Limits (p. 9). Cambridge University Press, 2019.

© Me2B Alliance 2021


EdgeSecure

IEEE and Edge Announce Partnership to Enhance Research Data Management and Collaboration with IEEE DataPort

More clearly than before, we are seeing the recent pandemic as a catalyst and accelerator for Digital Transformation, or Dx as the term is now designated. As many higher education institutions have been on a Dx trajectory, there is now no denying that higher education’s future is digital.

Newark, NJ, August 31, 2021 – Edge, a nonprofit research and education network and technology partner, has announced a partnership with IEEE, the world’s largest technical professional organization dedicated to advancing technology for humanity. The two organizations will collaborate to offer increased awareness of institutional subscriptions to IEEE DataPort — a web-based, cloud services platform supporting the data-related needs of the global technical community — making it available to academic, government, and not-for-profit institutions across the United States.

IEEE DataPort provides a unified data and collaboration platform which researchers can leverage to efficiently store, share, access, and manage research data, accelerating institutional research efforts. Researchers at subscribing institutions will gain access to the more than 2,500 research datasets available on the platform and the ability to collaborate with more than 1.25 million IEEE DataPort users worldwide. The platform also enables institutions to meet funding agency requirements for the use of and sharing of data. 

“As research in nearly all domains becomes more data intensive, providing institutions with the ability to store, share, access, and manage high quality data is critical.  This agreement facilitates that opportunity. Edge and its peer Regional Research and Education organizations have a substantial overlap in technical domains of interest with IEEE, and will leverage their very wide and deep relationships with research organizations across North America to promote IEEE DataPort,” said Dr. Forough Ghahramani, Associate Vice President for Research, Innovation, and Sponsored Programs at Edge.

The partnership advances the mission of EdgeDiscovery, a research and discovery framework providing access to leading-edge technology to support collaborative research and educational opportunities.

“IEEE is pleased to have the opportunity to work with Edge in bringing IEEE DataPort to researchers and institutions. IEEE and Edge both understand the critical need of the research community for replication and extension of data intensive studies, to meet funding agency requirements, and for a secure platform on which to store, manage and provide accessibility to research data. IEEE DataPort is a full-service data platform that can serve all researchers and all institutions and help them meet the requirements for managing research data,” according to Dr. David Belanger, Chair, IEEE DataPort Project.  

Existing Edge members and other North American academic and government institutions interested in learning more about IEEE DataPort can contact Edge at ieeedataport@njedge.net

Corporations and institutions outside North America can subscribe to IEEE DataPort directly through IEEE.  Learn more at IEEE DataPort | Discovery & Open Science 

About Edge:
Founded in 2000, Edge, a 501(c)(3) nonprofit corporation, serves as a purpose-built research and education wide area network and technology solutions partner. Edge connects members with affordable, easy access to high-performance optical networking, commodity Internet and Internet2 services, and a variety of technology-powered products, solutions, and services. The Edge member consortium consists of colleges and universities, K-12 schools and districts, government entities, healthcare networks, and businesses spread throughout the continental US. The group is governed by the New Jersey Presidents’ Council with offices in Newark, Princeton, and Wall Township, NJ. For more information, please visit: www.njedge.net.

Media Contact:
Adam Scarzafava
AVP for Marketing and Communications, Edge
855-832-3343
adam.scarzafava@njedge.net 

About IEEE

IEEE is the world’s largest technical professional organization dedicated to advancing technology for the benefit of humanity. Through its highly cited publications, conferences, technology standards, and professional and educational activities, IEEE is the trusted voice on a wide variety of areas ranging from aerospace systems, computers, and telecommunications to biomedical engineering, electric power, and consumer electronics. Learn more at http://www.ieee.org.

The post IEEE and Edge Announce Partnership to Enhance Research Data Management and Collaboration with IEEE DataPort appeared first on NJEdge Inc.


Me2B Alliance

Flash Guide #8: Digital Me2B Commitments & Deals


Version 1.0 | September 1, 2021

Author: Me2B Alliance

#Me2BRelationship #Me2BLifecycle #Me2BCommitments #Me2BDeals

IN A NUTSHELL
Over the course of the digital Me2B Lifecycle, individual “Me-s” (Data Subjects) will have the choice of deepening the relationship through a series of Me2B Commitments with the online vendor, “B” (Data Controller). This guide provides examples of common Commitments and Deals, and shows how they map to the stages of a Me2B Lifecycle. It also reflects social norms for being anonymous, recognized, or known at each stage.

Flash Guide #7 introduced the Me2B Lifecycle, both online and offline. This Flash Guide takes a deeper dive into the digital Me2B Lifecycle, with a focus on the types of Me2B Deals and Commitments that define each state. 

As in any relationship, ongoing two-way transactions ultimately shape the arc of the relationship lifecycle over time. Over the course of the digital Me2B Lifecycle, the individual (“Me”) will be asked or will seek to enter into specific bargains with the vendor (“B”). These opportunities to move along the relationship arc are called Me2B Commitments.

What’s the difference between a Me2B Deal and Me2B Commitment?

A Me2B Commitment is typically presented to the individual “Me” through the user interface. It is an opportunity to transact with the B: by agreeing, the Me steps along the relationship arc into a specific commitment state, changing the intensity of the relationship.

 

The Me2B Deal describes the terms of the commitment, i.e. what our “typical Me”, Mia, gives to the B and what she receives in exchange. (see Flash Guide #5 Me2B Deal Currencies).

Examples of Me2B Commitments & Deals, as experienced by Mia, our representative “Me”,  are provided in Table 8.1, below.

Me2B Commitment | Description | Me2B Deal
Cookie Commitment | Mia accepts cookies. | Mia allows the vendor (the “B”) to write cookies (and other data) to local storage in exchange for a better user experience.
Location Commitment | Mia approves location sharing. | Mia shares her location information for a better, location-aware experience.
One-off Transaction | Mia performs a one-off transaction, like buying something from a retailer without creating a log-in. | Mia provides a cash equivalent in exchange for a tangible good or service.
Promotional Commitment | Mia enrolls for promotional communications from the vendor. | Mia shares her email address and potentially other information in exchange for timely promotional communications from the B.
Loyalty Commitment | Mia enrolls in a loyalty program. | Mia shares personal information and ongoing transactional information in exchange for valuable discounts from the B.
Me2B Marriage | Mia creates an account on the site/app. | Mia shares personal information and ongoing transactional and behavioral information in exchange for valuable, personalized service.

Table 8.1 – Me2B Commitments and Me2B Deals 

Each of the stars in Figure 8.1, below, depicts the point where a Me2B Commitment begins. Note that each of these commitments could also have a corresponding end point overlaid on the lifecycle. For simplicity we have not shown those end points – except for the cessation of the “Me2B Marriage” which is shown as “Close Account”.    

Figure 8.1 Me2B Commitments over a Digital Me2B Relationship Lifecycle 

As described in Flash Guide #7, Me2B Deals (and the associated Me2B Commitments) should follow the identification norms — Anonymity, Recognition, and being Known —  that match the stage of the relationship where they occur. Figure 8.2 illustrates how the behavioral norms for identification for each stage map onto the Me2B Commitments during the course of the Me2B Relationship Lifecycle.   

Figure 8.2 Digital Me2B Lifecycle with Behavioral Norms 

These identification norms provide a crucial social context that is presently missing from technology standards and privacy regulation. This missing context can be understood as the state of the Me2B relationship, and ignoring it accounts for substantial harm to Me-s, such as rampant profiling and manipulation.

This contextual overlay on the Me2B Lifecycle gives us the language and framework – whether as Me-s, technology makers, policy makers, or standards makers – to better describe how technology should treat us and, importantly, when it departs from expected social norms, as nearly all technology does today. In particular, it serves as a foundation from which the Me2B Alliance works to develop rigorous standards for respectful and safe technology.

See also the Me2B Alliance Digital Harms Dictionary for a fuller list of harms, as well as the Data Justice Lab’s Data Harm Record, https://datajusticelab.org/data-harm-record/

© Me2B Alliance 2021


Introducing the Me2B 101 Flash Guide Series

When we started drafting the Respectful Tech Specification a couple of years ago, it was immediately obvious that we didn’t have an adequate vocabulary to describe personal experiences in the digital world—never mind measure them.  


Prior to even the formation of the Me2B Alliance, one of the first things I did was examine all of the interpersonal relationship categories we have in our lives to see if there was a generic term to describe the customer/purveyor relationship, because this relationship is at the heart of many of our digital interactions. After much exploration, I came up dry on a simple, pre-existing term to describe this kind of relationship. And it baffled me, since in our very capitalistic culture in the US, we have literally scores of these kinds of relationships in our everyday lives. How is it that we don’t have a category or class name for it? 

My early presentations describing this relationship had over 100 slides. I remember spinning through an early version at the IIW (Internet Identity Workshop), posing this question:  what do we call these customer/purveyor relationships? Doc Searls serenely offered, “why not Me2B Relationships?” And thus, a term was born.  

Words matter. If we don’t have words to describe it, we can’t possibly understand it. And we certainly can’t measure it.   

Me2B refers to a class of relationships between customers (Me-s) and purveyors (B-s) of virtually any sort, in both the physical and the digital world. This concept was the starting point for a new set of terms that enabled us to be more specific about human interactions in the digital world. These included: 

Me2B Deals
Me2B Relationship Lifecycle
Me2B Marriage
Me2B Commitments
Me2P Relationships (where P = product)
Me2T Relationships (where T = technology enabler)
B2B Hidden Affiliates

Once we recognize that we are in fact in a two-way relationship with purveyors of things and services — not unlike other inter-personal relationships we have in our lives — we can better see the full tapestry of analogs. We can, for instance, see clearly that data is not the new oil, and that it’s harmful to think of it like that. Data isn’t fungible, it is not a commodity, nor labor, nor [strictly speaking] is it property. Data is about us – it is, simply, our very lives. Our data is us.  

It is more appropriate to think of data like blood, or better yet, DNA—it is that sensitive and uniquely identifying. We shed data like microscopic flecks of skin as we go about living our lives. We don’t expect someone to be gathering our hair and skin debris as we wander about, and similarly, we don’t want people gathering our digital “DNA” detritus on the internet. It’s just not ok.   

The Me2B Alliance has recently published a number of short tutorials (Flash Guides) in a “Me2B 101 Series” that describe the concepts underpinning our mission and approach, and tell the story of our journey from ethical principles to metrics and measurement. We hope that you find these illuminating and helpful.

READ FLASH GUIDES

OpenID

OpenID Foundation Hosting Workshop at EIC 2021


The OpenID Foundation is pleased to announce it is hosting a workshop at EIC 2021 in Munich. The Foundation’s workshop is part of the pre-conference workshops at EIC on Monday, September 13, 2021 from 9am-1pm CEST. As EIC is a hybrid event, the workshop is available to those participating virtually as well.

This workshop will include a panel discussion on the ongoing global adoption of the Financial-grade API (FAPI) security profile while reviewing global open banking initiatives. The workshop will also include updates on the OpenID Certification Program as well as all active Foundation working groups.

If attending EIC in person, the workshop will take place in the BODENSEE II room. Current workshop agenda including presenter bios can be found here: https://www.kuppingercole.com/sessions/4592/1 


Workshop Agenda

TIME (CEST) | PRESENTATION | PRESENTER(S)
9:00-9:10 | Welcome & Introduction | Nat Sakimura – NAT Consulting & OIDF Chairman; Gail Hodges – OIDF Executive Director
9:10-9:30 | WG Update – AB/Connect | Michael Jones – Microsoft
9:30-9:50 | OpenID Connect for SSI (OIDC4SSI) Update | Kristina Yasuda – Microsoft; Torsten Lodderstedt – yes.com
9:50-10:10 | WG Update – FAPI + FAPI 2.0 | Nat Sakimura – NAT Consulting & OIDF Chairman; Torsten Lodderstedt – yes.com
10:10-10:30 | OpenID Certification Program Update | Joseph Heenan – OIDF & Authlete
10:30-10:45 | Break & Networking |
10:45-11:15 | Panel Discussion: Securing the Future of the FAPI Ecosystem – Global Open Banking Interoperability and Roadmaps | Panelists: Gail Hodges – OIDF Executive Director; Don Thibeau – OIDF Non-Executive Director; Daniel Goldscheider – yes.com; Danillo Branco – Finansystech
11:15-11:35 | WG Update – eKYC-IDA | Mark Haine – Considrd Consulting
11:35-11:55 | WG Update – MODRNA (Mobile OpenID Connect Profile) | Bjorn Hjelm – Verizon & OIDF Vice Chairman
11:55-12:15 | WG Update – Shared Signals & Events + CAEP | Tim Cappalli – Microsoft
12:15-12:30 | Open Q&A and Closing Remarks |
12:30-1:00 | Networking |

The post OpenID Foundation Hosting Workshop at EIC 2021 first appeared on OpenID.

Me2B Alliance

Flash Guide #7: The Me2B Lifecycle: Overlaying Social Norms on the Digital World


Version 1.0 | September 1, 2021

Author: Me2B Alliance

#Me2BRelationship

IN A NUTSHELL
Key to creating a standard to measure the behavior of technology is the ability to take several contexts into consideration, including the current status of the Me2B relationship. The Me2B Lifecycle model provides a framework and vocabulary to articulate and account for the dynamic “relationship context” over time when evaluating the behavior of technology. This real life social context is currently missing in both existing privacy regulation and in industry standards models for ethical technology, but it is crucial to the Me2B Respectful Tech Specification. Our model helps course-correct connected technology by pinpointing how the digital Me2B experience deviates from important social behavioral norms.

The Me2B Alliance has developed a model for the lifecycle of Me2B Relationships that is rooted in interpersonal psychology. Specifically, we built on psychologist George Levinger’s ABCDE relationship model [i], which asserts that every relationship eventually traverses five stages:

Acquaintance
Build-up
Commitment
Deterioration
Ending

The Me2B Relationship Lifecycle follows a similar arc. It starts with acquaintance and moves through a build-up of repeated interactions. Over time, if these interactions deepen in intimacy and trust, they may culminate in a deliberate deep commitment, which we call the “Me2B Marriage”: the point of creating an online account. Eventually, the relationship may deteriorate to the point of termination. (See Figure 7.1)

Figure 7.1 – The Me2B Relationship Lifecycle  
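For the technically minded, the five-stage arc can be sketched as a simple ordered model in Python. The stage names come from Levinger's model above; the class and function names are our own illustrative inventions, not part of any Me2B specification:

```python
from enum import Enum


class Me2BStage(Enum):
    """Levinger's ABCDE stages, as adapted for the Me2B Lifecycle."""
    ACQUAINTANCE = 1
    BUILD_UP = 2
    COMMITMENT = 3      # the "Me2B Marriage": creating an online account
    DETERIORATION = 4
    ENDING = 5


def next_stage(stage: Me2BStage) -> Me2BStage:
    """Advance one step along the relationship arc; Ending is terminal."""
    if stage is Me2BStage.ENDING:
        return stage
    return Me2BStage(stage.value + 1)
```

For example, `next_stage(Me2BStage.ACQUAINTANCE)` yields `Me2BStage.BUILD_UP`, while `next_stage(Me2BStage.ENDING)` stays at `Me2BStage.ENDING`.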

Each stage in the Me2B Relationship Lifecycle represents a unique context that reflects the trust level and intensity of the relationship from the individual’s point of view. Figure 7.2 provides examples of how each stage of the lifecycle may look in the physical world and in the digital world. 

Figure 7.2 – Me2B Relationship Lifecycle in the Physical & Digital Worlds 

In the physical world, there are specific behavioral social norms and expectations for each of these stages of the Me2B Lifecycle. One doesn’t expect to be greeted by name, for instance, before any introductions have been made. Similarly, we don’t expect store employees to know our home address unless we’ve given it to them for a specific reason (such as delivery).

This real life social context is currently missing in both existing privacy regulation and in industry standards models for ethical technology, but it is crucial to the Me2B Respectful Tech Specification. Our work translates appropriate and respectful behavioral norms from the physical world onto our experiences online. Figure 7.3, below, illustrates how social norms for identification –for being anonymous, recognized, and “known” or remembered– operate in the physical world, and how they should operate in the digital world.  

Figure 7.3 Identification Behavioral Norms of Me2B Relationships 

What is the Me2B Marriage?

The Me2B Marriage reflects the most intimate, trusted phase of the Me2B relationship, a stage where the individual chooses to be recognized, “known” or remembered, and personally responded to. In the physical world, this implies a level of being known by the business (such as loyalty program enrollment) and the agents of the business (such as cashiers or salespeople). In the digital world, the Me2B Marriage is the legal ceremony of agreeing to the vendor’s terms of service or terms of use, which most people never read [ii]. This is, in fact, the act of entering a legal contract between the “Me” and the “B”. Ideally, the individual is not coerced or manipulated into signing up for an account, such that the individual freely signals to the B that they wish to be remembered, recognized, and personally responded to.

Currently, respectful behavioral norms from Me2B Relationships in the physical world are neither reflected nor supported in digital Me2B Relationships. The Me2B Lifecycle model serves as a tool that can help course-correct connected technology to reflect appropriate social norms and behaviors for each stage in the Me2B Relationship Lifecycle.  

The Me2B Respectful Tech Specification is specifically designed to address the disconnect between offline and online behavior. The tests in the specification evaluate connected technologies for how well they are translating respectful behavioral norms from the physical world into the digital world. 

[i] https://en.wikipedia.org/wiki/George_Levinger
[ii] https://www.businessinsider.com/deloitte-study-91-percent-agree-terms-of-service-without-reading-2017-11#:~:text=A%20new%20Deloitte%20survey%20found,and%20conditions%20without%20reading%20them

© Me2B Alliance 2021


Flash Guide #6: Online Me2B Relationships


Version 1.0 | September 1, 2021

Author: Me2B Alliance

#Me2BRelationship

IN A NUTSHELL
Me2B Relationships in the digital world have even more layers than those in the physical world. In addition, our relationship with connected technology includes a set of “hidden affiliates” (third party integrations) that most of us are not aware of. This guide describes how these relationships – conscious or not – emerge as we interact with digital technologies.

In Flash Guide #4, we introduced Me2B Relationships and examined how they may be experienced in the physical world. In this Flash Guide, we’ll explore Me2B Relationships in the digital world. 

All Me2B Relationships have a number of different facets. We introduced this layering in the Flash Guide #4, explaining the difference between the relationship our typical “Me,” Mia, has with a retailer or brand, and the “Me2P” relationship Mia has with a particular product. This same kind of layering exists—with even more layers—in the digital world.  

Let’s walk through a fictional account of Mia unboxing a brand new iPhone:

Figure 6.1 

Stage 1: The first thing that happens is that Mia powers up her phone and is immediately required to agree to the Apple Terms of Service—i.e. to enter into a Me2B Legal Relationship with the company (or “B”), Apple. As Mia uses the phone, she develops a disposition (love/hate, e.g.) towards the phone; she has an experiential relationship with the phone, not dissimilar to the experiential relationship she has with the OXO brand containers described in FG #4. This relationship with her phone is a Me2P or “Me-to-Product” relationship.  

There is a critical difference, however, between connected technology like Mia’s iPhone and static products like her OXO containers. Connected technology relies on a number of integrated third-party technologies, depicted as “B2B Hidden Affiliates” in Figure 6.1. These entities may have access to significant pieces of information about Mia and her usage of the phone. So in essence, Mia is having a relationship with those hidden affiliates—she just doesn’t know it.  

Stage 2: As Mia starts configuring her phone, she decides to install the Chrome browser app:
 

 

Figure 6.2 

Similar to Stage 1, as soon as Mia opens the browser, she is required to establish a Me2B Legal Relationship with Google, LLC. And over time, as she uses the browser, she builds a Me2P relationship with the browser itself. She is aware that she is using the browser—essentially having an ongoing dialogue with the app. Note that, even though she’s using the phone, it’s not at the forefront of her consciousness, and it has become an Enabling Technology Relationship (or Me2T relationship). We reserve the term “Me2P” for the app or website with which the individual is directly interacting. Finally, in this stage, notice that the list of B2B Hidden Affiliates continues to grow—this time, with all the third parties included in the browser app. 

Stage 3: Mia opens the Instagram website from the Chrome browser: 

 

Figure 6.3 

Similar to the previous stages, Mia is required to enter into a legal relationship with the company responsible for Instagram (Facebook). Her interaction with Instagram occupies her attention and she is in a two-way dialogue with the website. The list of B2B Hidden Affiliates continues to grow, and the browser is demoted to an Enabling Technology, like the phone.  

With just a few steps, Mia has developed a network of entities with whom she is in some kind of relationship—wittingly or not.  
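The layering pattern in Stages 1–3 – the foreground app holds the Me2P relationship, earlier technology is demoted to Me2T, and B2B Hidden Affiliates only ever accumulate – can be sketched in a few lines of Python. The class and the affiliate names here are hypothetical illustrations, not actual trackers or Me2B Alliance tooling:

```python
class Me2BNetwork:
    """Tracks Mia's layered relationships as she opens new technologies."""

    def __init__(self):
        self.me2p = None                 # product she is directly interacting with
        self.me2t = []                   # enabling technologies (demoted former Me2Ps)
        self.hidden_affiliates = set()   # B2B third parties she never sees

    def open_product(self, name, affiliates=()):
        if self.me2p is not None:
            self.me2t.append(self.me2p)  # demote the previous foreground product
        self.me2p = name
        self.hidden_affiliates.update(affiliates)  # affiliates only accumulate


# Mia's walkthrough from Figures 6.1-6.3 (affiliate names are made up):
mia = Me2BNetwork()
mia.open_product("iPhone", {"analytics-sdk"})
mia.open_product("Chrome", {"ad-network", "crash-reporter"})
mia.open_product("Instagram", {"ad-network", "tracking-pixel"})
```

After the three steps, `mia.me2p` is `"Instagram"`, the phone and browser sit in `mia.me2t` as enabling technologies, and four distinct hidden affiliates have accumulated.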

A word about B2B Hidden Affiliates: while these are depicted in this explanation, we do not consider these to be Me2B Relationships since Mia has no direct relationship with those entities. We will go deeper into the network of hidden affiliates in a subsequent Flash Guide. 

Mapping Me2B Relationships to GDPR Terminology

The Me2B Respectful Tech Specification leverages GDPR terminology and concepts for consistency and clarity. [i] We acknowledge that the GDPR’s term for people, “Data Subject”, is not particularly empowering from a semantic perspective, but it has the benefit of being clear and legally explicit. There is no doubt that people are more than “data subjects”. The terminology we’ve introduced above maps to GDPR terms as follows:

 

Me2B TERM | GDPR TERM
Mia | Data Subject
Business or “B” in a Me2B Legal Relationship | Data Controller
Product or “P” in a Me2P Relationship | Data Controller
Technology or “T” in a Me2T Relationship | Data Controller
B2B Hidden Affiliate | Data Processor

[i] See GDPR definitions here: https://gdpr-info.eu/art-4-gdpr/

© Me2B Alliance 2021


Flash Guide #5: Online Me2B Deals: Currencies in the Digital World and the Price of “Free”


Version 1.0 | September 1, 2021

Author: Me2B Alliance

#Me2BDeal #TANSTAAFL

IN A NUTSHELL
The Me2B Deals or transactions that occur online typically involve three types of “currency”: money, attention or data. The individual consumer (Me) exchanges one of these currencies for goods or services online. Data monetization has emerged as the primary method of income generation to subsidize so-called “free” digital services, with little to no regulation or oversight. What sets online data monetization apart from the other two currencies is that often, customers have no idea what they are paying with – or that they are paying at all.

As introduced in Flash Guide #4, Me2B Deals are the value exchanges that occur in the course of a Me2B Relationship. Me2B Deals represent bi-directional transactions between the individual customer (“Me”) and the business or purveyor (“B).   

In the physical world, we typically experience these deals as cash or credit transactions. The customer (“Me”) gives the business (“B”) some amount of money in order to receive some good or service. In addition, since the advent of advertising-supported media such as magazines, radio and television, attention has also evolved to be a type of currency used in Me2B Deals for media goods and services. 

As we traverse the digital world, online Me2B Deals also involve exchanges of money and attention for goods or services. In addition, “data monetization” [i] – the ability to make money off of data – has evolved into a powerful new type of currency in online digital transactions, resulting in three primary currencies for online Me2B Deals:

Money. Credit card transactions (and increasingly cryptocurrencies) are a typical occurrence in the digital world. In the 2020 pandemic, many of us turned increasingly to online retailers, grocers and delivery services, for which we pay using cash equivalents.

Attention. When radio and television came of age, another kind of currency matured: attention, i.e. people literally “paying” attention for some duration of time to watch or listen to advertisements. In exchange, we received broadcast radio and TV for “free”. When early online companies were grappling to find a viable business model, advertising was the obvious choice, and digital advertising was born. Banner ads and pop-up ads were the earliest examples. Soon there was a “race for eyeballs” fueled by the “CPM” (cost per 1000 views of an ad), the desire for “sticky apps” [ii], and a burgeoning advertising technology infrastructure of Real Time Bidding systems [iii].

Data. The early evolution of the internet in the 2000s heavily relied on the attention-fueled, ad-supported business model described above, and soon took it a step further with the monetization of data, which in turn reinforced the newly popular “free” and “freemium” [iv] business models. It wasn’t long before app and website publishers realized that they could observe and record every single movement a person makes online. (Shoshana Zuboff calls this information “behavioral surplus” in her masterwork, “The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power” [v].) The sharing of this information in exchange for money or other value is one example of “data monetization”. The irresistible allure of converting free, observable data into revenue has become the de facto business model for online services. It existed before the internet with credit companies, direct marketing, and the segmenting and selling of contact lists. But the current digital version is far more pervasive, hidden, and unchecked.
What sets online data monetization apart from the other two currencies is that often, customers have no idea what they are paying with – or that they are paying at all. 
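To make the “CPM” footnote concrete, ad revenue under this model is simple arithmetic: impressions are priced in blocks of 1,000 views. A minimal sketch, with hypothetical figures:

```python
def ad_revenue(impressions: int, cpm_usd: float) -> float:
    """Revenue at a given CPM (cost per 1,000 ad views)."""
    return impressions / 1000 * cpm_usd


# e.g. a "free" app serving 250,000 banner views at a $2.00 CPM
revenue = ad_revenue(250_000, 2.00)  # 500.0 USD
```

This is why the “race for eyeballs” mattered: revenue scales linearly with views, so more attention captured means more blocks of 1,000 impressions to sell.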

Let’s pause here to take a step back for a thought experiment to imagine how Data Monetization and a similar “free” business model might look in the physical world. 

“Free” in the Grocery Store

Mia goes to her friendly neighborhood grocery store and sees a sign that says, “Free Cantaloupes! (up to two per customer)”. No other information about what Mia is trading for the free cantaloupe is shown, so she adds a couple cantaloupes to her cart. What Mia doesn’t know is that there is a network of video cameras following her around the grocery store. As soon as Mia places the cantaloupes in her cart, the video network begins recording. It observes and notes her behavior throughout the store. It notices, for instance, that she puts only cantaloupes and oranges in her cart and notes to itself, “Subject X023985FRL prefers cantaloupes and oranges. Subject X023985FRL spent 2.3 minutes selecting oranges. Subject X023985FRL only picked up blemish free cantaloupes. Subject X023985FRL walked past cantaloupes that were on sale.” Unbeknownst to Mia, this recording system shares all that information with a variety of other businesses—including ones that had nothing to do with the grocery supply chain. And the video cameras may continue to follow Mia after she leaves the store. The cost of the “free” cantaloupes was actually a total loss of privacy while shopping, but Mia had no way of knowing that. Free comes at a price.

One may ask: Who has a right to know the details of my behavior? How much of my behavior should be tracked? These are the kinds of questions we’re tackling in the Me2B Alliance. In general, we think that no one has a right to track your behavior in the digital world in the current rampant, ungoverned, disrespectful and ultimately harmful way.

It used to be that companies built products that competed on the value that they provided to customers. Today, the digital world is all about data monetization and a more exploitive relationship with customers.  

While online services today are “free”, we are anything but.

[i] “Data monetization…may refer to the act of generating measurable economic benefits from available data sources (analytics).” https://en.wikipedia.org/wiki/Data_monetization
[ii] Defined by how long people spent engaged with and returning to apps.
[iii] “Real-time bidding (RTB) is a means by which advertising inventory is bought and sold on a per-impression basis, via instantaneous programmatic auction, similar to financial markets.” https://en.wikipedia.org/wiki/Real-time_bidding
[iv] https://en.wikipedia.org/wiki/Freemium
[v] “The Age of Surveillance Capitalism: The Fight for A Human Future at the New Frontier of Power”, Shoshana Zuboff, 2019, PublicAffairs Hachette Book Group, New York, NY.

© Me2B Alliance 2021


CU Ledger

Bonifii joins Indicio Network Node Operator Consortium


Denver-based Bonifii, the financial industry’s first verifiable exchange network for financial cooperatives, today announced it has joined the Indicio Network as a Node Operator to support the continued expansion of decentralized identity across the globe.

The Indicio Network is the world’s only globally available, professionally managed, enterprise-grade network for decentralized identity. With nodes on five continents, Bonifii joins a consortium of diverse, forward-thinking companies driving the use of decentralized identity to improve privacy and security in fintech, healthcare, travel, and the Internet of Things (IoT).

Bonifii’s president/CEO John Ainsworth said the company decided to support the Indicio Network after seeing it used to support a growing number of successful solutions over the past year.

“We are excited to join the Indicio Network and participate in the development of next-generation identity technology alongside some of the world’s fastest-growing innovators in the identity space,” said Ainsworth. “This has the potential to open up entirely new ways of conducting business for credit unions and enable compelling new experiences for their members.”

“As the leading provider of decentralized identity for credit unions, Bonifii is laying the foundations for the future of financial infrastructure,” said Heather Dahl, CEO, and co-founder of Indicio.tech. “We’re thrilled to see such an innovative company support the Indicio Network, and we look forward to seeing what they will build to make decentralized identity a reality for millions of consumers.”

Bonifii’s digital strategy focuses on credit unions and providing them with a digital network for peer-to-peer financial exchange. The company’s flagship product is MemberPass®, which allows credit union members convenient access to their financial accounts while allowing control and privacy of their personal information.

“The market for application-based consumer identity is wide open with Indicio’s superfast, secure, and low-cost distributed network,” Ainsworth said. “It’s in our interest to be part of the Indicio Network’s Node Operator collaborative. These companies are incubating new identity technologies and commercializing them in a global context. Being a Node Operator means having access to the combined technical insight of all the other members to improve our speed to market.”

Bonifii joins a growing network of diverse global entities, including IdRamp, Liquid Avatar Technologies, GlobaliD, Sapper Future Technologies, Verses Labs, and others who are driving the deployment of decentralized identity.

Indicio Node Operators are responsible for running the nodes of the Indicio Network, as well as guiding both strategy and ecosystem development. This contributes to robust network stability and helps maintain the diversity and decentralization of the network. Node Operators also support Indicio’s comprehensive and easy-to-use governance model, which provides safeguards for the entire ecosystem using the Indicio Network and preserves the integrity of the network.

 

For more information see indicio.tech

For more information about Bonifii and MemberPass, visit www.memberpass.com

 

The post Bonifii joins Indicio Network Node Operator Consortium appeared first on Bonifii.

Tuesday, 31. August 2021

omidiyar Network

The growing demand for digital public infrastructure requires coordinated global investment and an…

The growing demand for digital public infrastructure requires coordinated global investment and an ethical lens

Global leaders weigh in on the ethical facets of digital public infrastructure at a virtual conference held Monday, Aug. 30, 2021.

The 10 must-reads to keep pace with the digital public infrastructure movement

By Govind Shivkumar, Director of Responsible Technology, Omidyar Network

You don’t need me to tell you that 2020 was an arduous year, or that the year that followed has been just as fraught. We all had a collective experience of the pandemic as a destabilizing force. It exacerbated and highlighted societal inefficiencies and inequities as no other single event in recent memory has, and it did so on a global scale.

Underlying many of the achievements and disappointments in pandemic response was digital public infrastructure (or lack thereof). Where some nations were equipped to deliver cash transfers to citizens, collect and utilize comprehensive public health data, and distribute vaccines in an orderly fashion, others struggled. What made the difference was often preexisting digital public infrastructure (or DPI).

Our friends at The Rockefeller Foundation explain this well in a new report, showing that as more of our society becomes digital (especially the economy), it becomes increasingly imperative that we treat digital infrastructure like we do roads and bridges. They outline six key areas of cooperation. The bottom line? We need coordinated global investment in digital public infrastructure so that it’s in place before we need it. And we need an ethical lens to ensure that any digital systems reinforce safeguards such as inclusion, trust, competition, security, and privacy.

A shared vision and call to action

The need for more and better digital public infrastructure has not gone unnoticed by government and philanthropy. This week, a group of nearly 20 global leaders laid out a shared vision for co-developing broad-based, flexible digital systems to support an equitable recovery from the Covid-19 pandemic while strengthening inclusion and human rights.

Hosted by the Digital Public Goods Alliance, Norwegian Ministry of Foreign Affairs (Norad), and The Rockefeller Foundation, the virtual event elicited meaningful ideas and commitments from The United Nations, European Commission, the Bill and Melinda Gates Foundation, The Rockefeller Foundation, Africa Digital Rights Hub, USAID, India, Sri Lanka, Togo, Sierra Leone, Estonia, Norway, Germany, and many other champions.

Omidyar Network’s Senior Vice President of Programs, Michele Jawando, moderated the conversation, drawing out pledges of funding and other support to MOSIP and Mojaloop (two of Omidyar Network’s grantees), the health data exchange DHIS2, Digital Square, Digital Public Goods Alliance, technical assistance to implementing nations, research, and proposals for new funding structures such as The Giving Pledge or creating a GAVI-like alliance that pools demand and resources from the public and private sectors.

“More than 400 people tuned in to learn more about digital public Infrastructure (DPI) — the basic systems like digital identity, payments and data exchanges that can connect us at a societal scale,” Jawando reflected. “Some heard for the first time how innovators built on DPI to slow disease spread and deliver help. They heard calls for caution and the need for safeguards for inclusion and protection. And they saw that we have a shared and coordinated agenda for equitable global co-development, based on digital public goods.”

As the speakers reminded us, this is just the beginning. We have a chance to bake ethics and inclusion into society’s next great socio-technological transformation; the first that is truly global AND digital. Let’s use that opportunity wisely.

An ethical framework

If history has shown us anything, it’s that creating systems that have society-wide implications without prioritizing ethical considerations is a fool’s errand at best; at worst it’s catastrophic. As the demand for digital public infrastructure becomes more widespread, it’s important to prioritize the ethical considerations when designing such systems.

That’s the objective of a new white paper funded by Omidyar Network and developed by Harvard University’s Edmond J. Safra Center for Ethics. To provide governments with a roadmap for the ethical deployment of digital public infrastructure, the Center’s Justice, Health and Democracy Impact Initiative produced a set of best practices for organizations that covers the design and deployment considerations for technologists, national governments, and philanthropic funders.

“After taking a hard look at existing work on this subject, we realized much of the debate is happening at the wrong level,” said co-author Josh Simons, who holds a joint fellowship at the Carr Center for Human Rights Policy and the Edmond J. Safra Center for Ethics. “There is too much uncertainty, change, and dynamism for hard and fast rules. What we need are concrete structures of decision-making and participation that hold actors to account for how they build digital infrastructure.”

Specifically, the paper advises technology developers to design digital public infrastructure that can adapt to emerging needs, concerns, and technological developments. Developers should consult directly with advocates of end users at all stages of development to ensure systems do not undermine their rights. Developers should also ensure their systems are able to be accessed by a variety of users, regardless of background or experience with technology, and regularly revise and maintain the systems once they’re built.

On the policy side, it is critical that policymakers integrate deliberation into the design and deployment of digital public infrastructure. These deliberation processes should also seek out input and feedback from groups that have historically been marginalized. Additionally, policymakers should continuously evaluate how digital systems are affecting the communities they serve and establish clear ways to reform the systems, protect user rights, and establish transparency and accountability for politicians and technology developers.

Finally, philanthropists who are hoping to support the development of digital public infrastructure should identify and support developers and policymakers who have clear commitments to responsible digital public infrastructure development. They should moreover establish accountability measures for themselves such as adopting multidisciplinary ethics committees to help with oversight and navigating uncertain situations.

And the paper stresses regular discussions among all stakeholders involved to assess the merits, feasibility, and limitations of digital public infrastructure.

“Our guiding principles for stakeholders involved in developing and deploying digital public infrastructure are revisability of technology design, deliberative governance to create feedback loops, and accountability of funders and policy-makers,” said Jeff Behrends, Director of Ethics and Technology Initiatives at the Edmond J. Safra Center for Ethics and co-author on the project. “We advocate an approach to the ethics of digital public infrastructure that we believe is robust, resilient, and applicable to countries across the world.”

A growing movement

If you feel like this has only scratched the surface, you’d be right! As with so many things with social, political, economic, and technological consequences, digital public infrastructure is a complex topic. Below are links to further reading that highlights the need for digital public infrastructure and how to make it a reality everywhere, for everyone. We’re confident that once others in government, international organizations, civil society, business, and philanthropy learn more, they will want to help overcome some of the remaining hurdles.

We welcome them into a rapidly growing international community of organizations working to change how countries are supported in their digital transformation journeys. There are no quick fixes. Only through better coordination, vastly more resources, and a clear vision of what “good DPI” is and why it matters can we accelerate deployments, strengthen national digital sovereignty, and support local value creation.

How to bring digital inclusion to the people who need it most (World Economic Forum, Aug. 2021)

Reimagining digital public infrastructure is no longer just a development agenda (Omidyar Network, June 2021)

A digital agenda for the Eastern Partnership (European Council on Foreign Relations, June 2021)

Exploring digital public goods (An eight-part series written by Richard Pope and supported by Omidyar Network, May-July 2021)

Opinion: The time is now for digital public goods (Devex, May 2021)

Financing the digital public goods ecosystem (Digital Public Goods Alliance, March 2021)

Covid-19 spurs national plans to give citizens digital identities (The Economist, Dec. 2020 — subscriber access required)

‘Fire hose’ of health innovation risks going down the drain (Financial Times, Nov. 2020)

The growing demand for digital public infrastructure requires coordinated global investment and an… was originally published in Omidyar Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Me2B Alliance

12 ways a human-centric approach to data can improve the world

Twenty-five quintillion bytes of data are generated every day. That’s 25,000,000,000,000,000,000. In this era of data abundance, it’s easy to think of these bytes as a panacea – informing policies and spurring activities to address the pandemic, climate change or gender inequality – but without the right systems in place, we cannot realize the full potential of data to advance a sustainable, equit

EdgeSecure

Safe and Accelerated Procurement via Lead Agencies and Shared Services

The post Safe and Accelerated Procurement via Lead Agencies and Shared Services appeared first on NJEdge Inc.

Webinar

September 22, 2021
3 pm EDT

Lead Agencies and shared services exist to provide public entities with a collaborative model for accelerated procurement. Procurement decision makers with an in-depth understanding of how to leverage these options bring an added dimension of value to their organizations.

In this session you’ll learn:

The legal foundations of lead agency procurements and shared services agreements
The basis upon which education institutions and public entities can share contracts and resources
How Lead Agencies provide contractual and pricing benefits
How a Lead Agency like Edge can procure contracts for use by our members
How Shared Services Agreements allow education institutions and public entities to share resources to accelerate positive organizational outcomes

Register Today

The post Safe and Accelerated Procurement via Lead Agencies and Shared Services appeared first on NJEdge Inc.


We Are Open co-op

The End of the Beginning

Reflecting on WAO’s digital support of charities during the Catalyst Continuation programme Image: Clay LeConey
“Now this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning.” (Winston Churchill)

A couple of months ago, we announced that we’d been asked to help eight charities with some digital support through Catalyst’s ‘Continuation’ programme. This has now come to an end, so we thought we’d wrap up what happened and what we learned from the process.

First of all, we tracked everything we did. This was vital not only to account for our time but also to co-ordinate with one another and fairly distribute our support amongst our eight charities. Doug and Laura collaborated to provide advice and support over the course of the programme.

The work being done by the charities was diverse, and we didn’t have much time to plan what kind of support we were going to offer. As our eight charities were in different places, we needed a flexible plan. Consequently, we decided to run weekly office hours in which charities could drop in, meet whoever else happened to have shown up, and ask questions or just chat. In addition, we offered weekly 1:1 sessions for each charity.

Our digital support was optional, which meant that some charities used their full allocation of time, whereas others checked in occasionally or used us as accountability partners.

From WordPress training to Typeform remixing, advice on SSL certs and hosting packages to analytics, GDPR, CRMs or discovering if a particular tool had its own API, over the ten weeks, we helped charities with a wide range of technical issues.

I don’t believe we would have been as effective or achieved as much in this phase without the help, support and direction of you as our Digital Mentors. (Christine Brown, Ideal For All)

We also gave advice on community engagement, peer to peer learning, architectures of participation, user testing, UX and many other socio-technological phenomena. All of this kind of consultancy can only happen by understanding the wider context a charity is working in, so we are pleased that our participation in previous Catalyst programmes had given us that insight and understanding.

Really appreciate everything you’ve taught us and that goes for the rest of the team too. (Lindsay Woodward, CFF)

We’re so proud to have had the opportunity to work with such a wonderful group of charities and human beings. As we wrapped up our participation in the Catalyst Continuation programme, we made sure to let our charities know that they are always welcome to reach out and share with us what they’ve been working on. After all, the relationships don’t end just because the beginning does!

We Are Open were as clever as they were time efficient, a valuable resource throughout. (Kat Lewis, The Margate Bookie)

The End of the Beginning was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Kantara Initiative

Kantara lays out trust-building recommendations for mDLs


A global digital ID association has published steps vendors and others need to take in order to build effective mobile driving license services that also put ID holders in control of their identity. The Kantara Initiative’s report starts from the premise that trust in mobile driving licenses grows with the degree of control that license holders have over the documents, their privacy and their biometric identifiers. Read more -> https://www.biometricupdate.com/202108/kantara-lays-out-trust-building-recommendations-for-mdls

The post Kantara lays out trust-building recommendations for mDLs appeared first on Kantara Initiative.

Monday, 30. August 2021

GLEIF

#4 in the LEI Lightbulb Blog Series - Soaring Regulatory Confidence puts LEI at Center of Trust in Payments Ecosystem


We do not have to look back further than the global economic collapse of 2008 to fully understand the worst-case scenario of unverified legal entities engaging in financial transactions. The LEI was created at the request of the G20 and Financial Stability Board (FSB) in response to this global catastrophe. Its objective is to provide a means to uniquely identify any legally distinct entity that engages in a transaction and subsequently reduce fraud and mitigate risk within the ecosystem.

With the LEI’s origin and objective in mind, it follows naturally that it offers potentially immeasurable value too when it comes to delivering transparency and trust in payment transactions. This value is widely recognized among many payment industry insiders, and in recent years, there has been burgeoning appetite and advocacy for LEI usage in a broad range of payments use cases. There has been significant industry-wide progress with two concrete mandates for LEI usage recently emanating from the Reserve Bank of India and the Bank of England in specific payment applications. Additionally, many regions and regulators are consulting on transformation within the ecosystem, and considering what role the LEI could play as a result. Since early 2020, GLEIF has shared information with regulatory authorities and organizations in the payments space on the LEI and its credentials in identity management via nine public consultations. Many more are anticipated in the years ahead.

Below is a summary of key recent developments within the payments arena, which demonstrate widening acceptance of the LEI as a tool capable of delivering an enhanced global payments ecosystem.

Mandate: LEI for Large Value Transactions

The Reserve Bank of India (RBI) has introduced a new mandate for the use of the LEI for all payment transactions of the value ₹50 crore (approximately €5.5 million) and above for entities using the Reserve Bank-run Centralized Payment Systems viz. Real Time Gross Settlement (RTGS) and National Electronic Funds Transfer (NEFT). By 1 April 2021, banks had to ensure that all remitter and beneficiary information in RTGS and NEFT payment messages were populated with the LEI. Data showed that in Q1 2021, India was ranked fifth in terms of the jurisdictions with the highest rate of LEI growth. This offers an indication of the mandate’s impact on LEI issuance.
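For systems that must populate payment messages with LEIs, a format check is a natural first step. Per ISO 17442, an LEI is 20 upper-case alphanumeric characters whose final two characters are check digits computed with ISO 7064 MOD 97-10 (the same scheme IBANs use). The sketch below is illustrative only — the function names are hypothetical and not drawn from any official GLEIF tooling:

```python
import string

# Illustrative sketch: validating an LEI's check digits per ISO 17442,
# using the ISO 7064 MOD 97-10 scheme. Function names are hypothetical.

_ALPHABET = string.digits + string.ascii_uppercase

def _as_number(s: str) -> int:
    # Map '0'-'9' to 0-9 and 'A'-'Z' to 10-35, concatenate the decimal
    # digits, and read the result as a single (large) integer.
    return int("".join(str(int(c, 36)) for c in s))

def lei_check_digits(base18: str) -> str:
    # Compute the two trailing check digits for an 18-character prefix:
    # append "00", take the value mod 97, and subtract from 98.
    return f"{98 - _as_number(base18 + '00') % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    # A valid LEI is 20 upper-case alphanumerics whose numeric form is
    # congruent to 1 modulo 97.
    if len(lei) != 20 or any(c not in _ALPHABET for c in lei):
        return False
    return _as_number(lei) % 97 == 1
```

A check like this only verifies well-formedness; confirming that an LEI is actually issued and current requires a lookup against GLEIF's published reference data.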

A strong proponent for the LEI, the RBI had earlier introduced it as a requirement in over the counter (OTC) derivative and non-derivative markets and large corporate borrowers.

The RBI’s use of the LEI to verify the identity of participants in large value transactions is the first use case of its kind. Yet the potential risk management advantages it offers make it a compelling tactic that other central banks may wish to consider adopting.

Mandate: LEI in CHAPS Payments Message Standard (ISO 20022)

In December 2020, The Bank of England (BoE) published its ‘Policy Statement: Implementing ISO 20022 Enhanced data in CHAPS’. Within the document, the BoE reaffirms its proactive position “to support the wider uptake of the LEI - beyond the financial sector - to corporates. […] the Bank believes wider uptake of the LEI could unlock a number of key benefits and is working with Government and national and international stakeholders to promote LEI use cases. These include their potential role in enhancing cross-border payments as part of the Financial Stability Board’s roadmap, as well as the role the LEI can play to help tackle financial crime.”

The paper confirms the BoE’s long-held intention to “introduce the LEI into the CHAPS payment message standard when migrating to ISO 20022, in line with industry views and international consensus, including HVPSplus and CBPRplus guidance.”

It then sets out the following initial phasing timeline:

February 2023: LEIs will be introduced into ISO 20022 standard CHAPS payment messages on an ‘optional to send’ basis. While the BoE encourages all CHAPS Direct Participants to start using LEIs as early as possible, this will not become mandatory until spring 2024.

Spring 2024: The BoE will begin mandating the use of LEIs in certain circumstances, with a vision to widen the requirement to all participants over time. The BoE will mandate the use of the LEI where the payment involves a transfer of funds between financial institutions.

Within the report the BoE notes that the earlier firms adopt the LEI, the sooner they will derive its benefits. It also states an intention for the BoE to monitor the use of LEI for all transactions, with a view to assessing whether the mandatory requirement to include LEI data should be extended to all CHAPS payments. Industry will be provided with at least 18 months’ advance notice if the BoE extends any mandatory LEI requirements.

Consultation: Cross Border Payments

In February this year, GLEIF published a blog exploring the FSB’s support for the LEI in its Stage 3 roadmap on Enhancing Cross-Border Payments. In that publication, the FSB lists several focus areas that require global coordination and action to overcome the challenges and frictions in cross-border payments. Of particular note is the identification of ‘Establish Unique Identifiers with proxy registries’ as a key building block in the FSB’s roadmap to enhanced cross-border payments.

An action-oriented framework is laid out in the roadmap, which commits the FSB and GLEIF to collaborate in consultation with other prominent stakeholders to: “explore the scope for, and obstacles to develop, a global Unique Identifier (UI) for cross border payments and potentially other financial transactions, that takes into account existing identifiers including the LEI for legal entities….” This collaborative work effort is scheduled to run from October 2020 until December 2021 and GLEIF welcomes the opportunity to engage with it.

Additionally, as a further action mapped to the same ‘Establishing Unique Identifiers…’ building block, GLEIF will work in close coordination with the FSB, the Regulatory Oversight Committee (ROC) and national authorities to explore options for improving LEI adoption. This work effort will run from June 2021 to June 2022.

Reassuringly, the FSB roadmap makes the association between an enhanced payments ecosystem and legal entity identification. For cross-border payments, the ability for identity verification to happen across borders is critical and this is why the LEI is perfectly poised to provide a solution. Its universality makes it the perfect candidate for bestowing transparency in relation to entity identification across the global payments landscape.

Consultation: Instant Payments

In March 2021, the European Commission (EC) published its Consultation Strategy on Instant Payments in the EU. The purpose of the consultation is to identify obstacles to the creation of efficient pan-European instant payment solutions, to assess the effectiveness of potential solutions and to measure the potential benefits and costs of those solutions. The first part of this consultation involves surveying payment service providers (PSPs) and providers of supporting technical services. Of relevance to the LEI are the questions regarding sanctions screening. The survey asks whether there is a need for alleviated screening of transactions by PSPs involving clients vetted or whitelisted beforehand, or if a common EU-wide list of false hits and/or the use of the LEI for firms and digital IDs for individuals could solve any sanctions screening-related issue that instant payments may create.

GLEIF welcomes the EC’s consideration of the LEI as a potential solution to support the screening of instant payment transactions against sanction and watch lists and has a keen interest in the outcome of the survey.

The ‘LEI Lightbulb Blog Series’ from GLEIF aims to shine a light on the breadth of acceptance and advocacy for the LEI across the public and private sectors, geographies and use cases by highlighting which industry leaders, authorities and organizations are supportive of the LEI and for what purpose. By demonstrating how success derived from strong regulatory roots is giving rise to a groundswell of champions for further LEI regulation and voluntary LEI adoption across new and emerging applications, GLEIF hopes to educate on both the current and future potential value that ‘one global identity’ can deliver for businesses, regardless of sector, world-wide.


Oasis Open

Invitation to comment on Specification for Transfer of OpenC2 Messages via MQTT v1.0


First public review of this draft specification - ends September 29th

OASIS and the OASIS Open Command and Control (OpenC2) TC are pleased to announce that Specification for Transfer of OpenC2 Messages via MQTT Version 1.0 is now available for public review and comment. This is the first public review for this specification.

Open Command and Control (OpenC2) is a concise and extensible language to enable the command and control of cyber defense components, subsystems and/or systems in a manner that is agnostic of the underlying products, technologies, transport mechanisms or other aspects of the implementation. Message Queuing Telemetry Transport (MQTT) is a widely-used publish / subscribe (pub/sub) transfer protocol. This specification describes the use of MQTT Version 5.0 as a transfer mechanism for OpenC2 messages.
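As a rough illustration of the pattern the specification describes — OpenC2 commands serialized as JSON and carried between producers and consumers over MQTT publish/subscribe — the sketch below constructs a minimal "deny" command inside a message envelope. The field names and topic string here are simplified assumptions, not the normative layout; consult the CSD04 draft linked below for the authoritative structure. The paho-mqtt calls are shown in comments only:

```python
import json
import uuid
from datetime import datetime, timezone

# Illustrative sketch only: a minimal OpenC2 "deny" command wrapped in a
# message envelope for transfer over MQTT. Field and topic names
# approximate the draft specification and are not authoritative.

command = {
    "action": "deny",
    "target": {"ipv4_net": "198.51.100.0/24"},  # RFC 5737 example range
    "args": {"response_requested": "complete"},
}

message = {
    "headers": {
        "request_id": str(uuid.uuid4()),
        "created": int(datetime.now(timezone.utc).timestamp() * 1000),
        "from": "producer.example.com",
    },
    "body": {"openc2": {"request": command}},
}

payload = json.dumps(message).encode("utf-8")
topic = "oc2/cmd/all"  # hypothetical broadcast command topic

# Against a real broker, publishing might look like this (paho-mqtt,
# not executed here):
#   client = mqtt.Client(protocol=mqtt.MQTTv5)
#   client.connect("broker.example.com", 1883)
#   client.publish(topic, payload, qos=1)
```

A consumer would subscribe to the corresponding command topic, deserialize the payload, act on the command, and publish its response on a response topic — the broker-mediated pub/sub model is what lets one producer address many actuators without knowing their addresses.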

The documents and related files are available here:

Specification for Transfer of OpenC2 Messages via MQTT Version 1.0
Committee Specification Draft 04
18 August 2021

Editable source (Authoritative):
https://docs.oasis-open.org/openc2/transf-mqtt/v1.0/csd04/transf-mqtt-v1.0-csd04.md

HTML:
https://docs.oasis-open.org/openc2/transf-mqtt/v1.0/csd04/transf-mqtt-v1.0-csd04.html

PDF:
https://docs.oasis-open.org/openc2/transf-mqtt/v1.0/csd04/transf-mqtt-v1.0-csd04.pdf

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/openc2/transf-mqtt/v1.0/csd04/transf-mqtt-v1.0-csd04.zip

How to Provide Feedback

OASIS and the OpenC2 TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

The public review starts 31 August 2021 at 00:00 UTC and ends 29 September 2021 23:59 UTC.

Comments may be submitted to the TC by any person through the use of the OASIS TC Comment Facility, which can be used by following the instructions on the TC’s “Send A Comment” page (https://www.oasis-open.org/committees/comments/index.php?wg_abbrev=openc2).

Comments submitted by TC non-members for this work and for other work of this TC are publicly archived and can be viewed at:
https://lists.oasis-open.org/archives/openc2-comment/

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries the same obligations at least as the obligations of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1] applicable especially [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the OpenC2 TC can be found at the TC’s public home page:
https://www.oasis-open.org/committees/openc2/

Additional information related to this public review can be found in the public review metadata document [3].

Additional references

[1] https://www.oasis-open.org/policies-guidelines/ipr

[2] https://www.oasis-open.org/committees/openc2/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr#Non-Assertion-Mode
Non-Assertion Mode

[3] Public review metadata document:
– https://docs.oasis-open.org/openc2/transf-mqtt/v1.0/csd04/transf-mqtt-v1.0-csd04-public-review-metadata.html

The post Invitation to comment on Specification for Transfer of OpenC2 Messages via MQTT v1.0 appeared first on OASIS Open.


Good ID

Amazon is Giving Free FIDO Security Keys to AWS Customers to Encourage Better Account Security


By Andrew Shikiar, Executive Director & CMO, FIDO Alliance

Leaders from Amazon, Apple, Google, Microsoft and IBM met with President Joe Biden at the White House last week to discuss strategies the government and private sector can use together to improve the nation’s cybersecurity. 

Following the meeting, Amazon announced that it will provide eligible AWS customers with access to free FIDO Security Keys. Not only will this protect the burgeoning number of businesses that run on AWS, but it will help instill better authentication practices as these keys can be used across many other business (e.g., G Suite, Github, Dropbox, Stripe) and consumer (Facebook, Twitter, Coinbase, Bank of America) services.

Amazon has been a leading stakeholder in FIDO Alliance for several years now – it is wonderful to see their leadership extended to the market at large. As more businesses move to the cloud, it is absolutely critical that cloud service providers follow suit to protect this critical infrastructure. Threats and attackers are growing in sophistication, and the impacts are non-trivial. Hundreds of millions of personal records are being stolen and resold on the dark web on an alarmingly regular basis. This is a clear and present threat to our economy, our national security and our society.

It’s difficult to name a breach from the past five years that wasn’t tied to stolen credentials. 

The latest prominent attack, carried out on Colonial Pipeline, used a single stolen password to essentially cripple the U.S. eastern seaboard.

It is important that all businesses take steps to educate and protect their employees and customers from such threats. “Traditional” means of multi-factor authentication (such as OTPs) simply aren’t fit-for-purpose to protect against these attacks, which can financially cripple a company or organization. 

Ultimately, credential-based breaches (like Colonial Pipeline’s) wouldn’t be possible if accounts were protected with FIDO Authentication, which requires local possession of a device with no knowledge-based authentication credentials passed over the network. 

The FIDO Alliance has come a long way since our inception. What started as a whiteboard concept has evolved into technology that is becoming part of the web’s DNA. Virtually every platform and device can now support FIDO Authentication, and there are public SDKs and tools, plus a rich ecosystem of FIDO Certified vendor products and services that can help companies implement FIDO for their sites and apps. 

Amazon’s move to provide free FIDO Security Keys sets a strong – and important – example. We encourage all other cloud service providers to urgently consider following suit by at a minimum enabling FIDO authenticators for admin access to networks.

The post Amazon is Giving Free FIDO Security Keys to AWS Customers to Encourage Better Account Security appeared first on FIDO Alliance.



Digital ID for Canadians

DIACC Women in Identity: Julianne Trotman

DIACC is hosting a series of spotlights showcasing our amazing female DIACC members in the digital identity space, noting the importance of diversity. These spotlights…

DIACC is hosting a series of spotlights showcasing our amazing female DIACC members in the digital identity space, noting the importance of diversity. These spotlights will be regularly socialized through DIACC’s LinkedIn and Twitter channels as well as our monthly member newsletters.

If you’re a DIACC member and would like us to feature your spotlight, contact us today to learn more!

What has your career journey looked like?

It has not been what I would call a straight line. I started out in accounting and financial services before transitioning to what I call my second career, marketing. I will say that every discipline and role I have had has given me a wide range of experiences and expertise. These have ultimately made me a more well-rounded marketer who views my role and responsibilities from a wider business perspective.

When you were 20 years old, what was your dream job and why?

I wanted to be a backup dancer for Janet Jackson. I was a competitive dancer and wanted to make dance my career; however, dance is unforgiving on the body, and ultimately only a very small few make it to that level :)


As a female leader, what has been the most significant barrier in your career?

Imposter Syndrome. For me, I have had to fight through the “I don’t belong at this table” feeling, both from self-doubt and from men in the room who have made the environment one where I have not felt like I belong. Over time, I have learned to overcome this and to let my skills and experience speak for themselves.

How do you balance work and life responsibilities?

This has been a challenge for most of my career, however over the past few years I have learned to find balance. I have carved out time to do the things I enjoy such as travelling, road riding, golf, and photography. It is not always easy or practical (depending on your role) to be able to turn things off, from a work perspective, at 5 or 6 pm, but I do believe that you have to create a division between work and home. When I am hanging out with my family and friends I try to be present and in the moment, and when I am working I am fully committed and engaged in my work and my team.

How can more women be encouraged to pursue careers in the digital ID/tech space?

I think introducing girls to STEM at an early age is a great way to get them comfortable with, and inspired by, these disciplines. Early exposure to and education in the tech space are key so that girls can see that these disciplines are not just for boys and that anyone can do it.

What are some strategies you have learned to help women achieve a more prominent role in their organizations?

A couple of things: (1) Have a deliberate plan for how you intend to advance in your career. For some this may come easily without much forethought, but for many of us being purposeful helps to provide a roadmap to our goals. (2) Don’t be afraid to highlight your achievements. For women especially, this tends to go against the way we are hardwired; however, if you don’t take control of your career, you cannot expect others to. And finally, (3) find an advocate: not a mentor, but someone you can confide in, get advice from, and who knows your abilities and will campaign for you as you move through your career.

What will be the biggest challenge for the generation of women behind you?

As far as we think we have progressed with regards to women breaking the glass ceiling, in many respects it is still one step forward, two steps back. You will know they have overcome the biggest challenge when they are described as tech professionals and not prefaced as women in tech.

What advice would you give to young women entering the field?

Be curious, never stop learning, and be bold.

Julianne Trotman is the Growth Marketing Lead at Vaultie

Follow Julianne on LinkedIn



SelfKey Foundation

The Infrastructure Bill and What it Holds for Crypto

In this article, we’ll try to summarize the key points surrounding the infrastructure bill and the effect it has on crypto. We will keep this article updated as and when new information regarding the infrastructure bill becomes available. The post The Infrastructure Bill and What it Holds for Crypto appeared first on SelfKey.

In this article, we’ll try to summarize the key points surrounding the infrastructure bill and the effect it has on crypto. We will keep this article updated as and when new information regarding the infrastructure bill becomes available.

The post The Infrastructure Bill and What it Holds for Crypto appeared first on SelfKey.

Sunday, 29. August 2021

EdgeSecure

How to Leverage Purchasing Cooperatives to Execute “Audit-Proof” Procurements

The post How to Leverage Purchasing Cooperatives to Execute “Audit-Proof” Procurements appeared first on NJEdge Inc.

Cooperative pricing systems (“co-ops”) enable participants to make purchasing decisions with confidence and peace of mind. By leveraging a knowledgeable, responsible co-op such as EdgeMarket, your institution can accelerate the speed of purchasing and onboard impactful services and products while fulfilling all legal obligations. In this session you’ll learn:

What a co-op is, and why legislation has been passed to authorize co-ops such as EdgeMarket
How co-ops can save your institution time and money
How co-ops aggregate demand and buying power for participants
How EdgeMarket fulfills the legal obligations of procurement on behalf of participants to produce “audit-proof” procurements
How your institution can leverage co-ops to rapidly onboard in-demand solutions

Complete the Form Below to Access Webinar Recording

The post How to Leverage Purchasing Cooperatives to Execute “Audit-Proof” Procurements appeared first on NJEdge Inc.

Saturday, 28. August 2021

EdgeSecure

“How to Learn Like Your Students”: Applying Best Practices for Online Learning to Professional Development

The post “How to Learn Like Your Students”: Applying Best Practices for Online Learning to Professional Development appeared first on NJEdge Inc.

Webinar

September 28, 2021
3 pm EDT

As best practices for online learning are identified and applied to courses and students, professional development must keep pace. By applying online learning best practices to professional development, including asynchronous, self-paced learning, faculty can better prepare to deliver engaging learning experiences, both online and in-person.

In this session, you’ll learn:

How platform-agnostic professional development prepares instructors to optimize the online learning experience
How asynchronous, self-paced learning for professional development can reinforce the application of online learning principles
How asynchronous learning and collaboration can enable your institution to customize and improve professional development over time

Register Today

The post “How to Learn Like Your Students”: Applying Best Practices for Online Learning to Professional Development appeared first on NJEdge Inc.

Friday, 27. August 2021

Elastos Foundation

Elastos Bi-Weekly Update – 27 August 2021

...

Kantara Initiative

Kantara Releases Report on Identity and Privacy Protection For mobile Driver’s Licenses

Washington, D.C., – August 26, 2021 — The Kantara Initiative released its Privacy and Digital Identity Protections in the Mobile Driver’s License (mDL) Ecosystem Report.  The report outlines how to implement mDL systems as Privacy Enhancing Technologies. It provides guidance on protecting people’s individual privacy and the digital identifiers of an individual who carries or uses an mDL. 

Washington, D.C., – August 26, 2021 — The Kantara Initiative released its Privacy and Digital Identity Protections in the Mobile Driver’s License (mDL) Ecosystem Report.  The report outlines how to implement mDL systems as Privacy Enhancing Technologies. It provides guidance on protecting people’s individual privacy and the digital identifiers of an individual who carries or uses an mDL.  Kantara is the global consortium improving trustworthy use of identity and personal data through innovation, standardization and good practice.   With approximately 1.5 billion vehicles and three billion smartphones in the world, mDLs and identity credentials stored and displayed on mobile devices…

The post Kantara Releases Report on Identity and Privacy Protection For mobile Driver’s Licenses appeared first on Kantara Initiative.


Own Your Data Weekly Digest

MyData Weekly Digest for August 27th, 2021

Read in this week's digest about: 14 posts, 4 questions, 1 Tool

Thursday, 26. August 2021

Velocity Network

Interview with Ivan Basart, CTO and Co-Founder of Validated ID

The Velocity Network Foundation's CEO, Dror Gurevich sat down with Ivan Basart, CTO and Co-Founder of Validated ID to discuss why they joined the Velocity Network™. The post Interview with Ivan Basart, CTO and Co-Founder of Validated ID appeared first on Velocity.

Interview with Jean-Marc Laouchez, President of Korn Ferry Institute of Korn Ferry

Velocity Network Foundation's CEO, Dror Gurevich sat down with Jean-Marc Laouchez, President of Korn Ferry Institute of Korn Ferry to discuss their participation in the Velocity Network™. The post Interview with Jean-Marc Laouchez, President of Korn Ferry Institute of Korn Ferry appeared first on Velocity.

Digital Identity NZ

Identity + Security + Privacy = Trust

All the latest news from the Digital Identity New Zealand community The post Identity + Security + Privacy = Trust appeared first on Digital Identity New Zealand.

Early this month we were hosted by our good friends at Marsh New Zealand to run a joint Connect event with FinTechNZ on Digital and Cyber Risk.  Overall, we see cyber security as one of three key elements, alongside identity and privacy which in combination make up Digital Trust – providing the confidence needed for people, technology and processes to create a truly safe and effective digital world.  

The world has seen an unprecedented rise in cyber security incidents since the beginning of 2020 – a trend which is shaking up everything we thought we knew about prevention and management of this insidious risk.

We had four insightful presentations, given by Jono Soo (Marsh NZ), Jonathon Berry (InPhySec), Andy Prow (Red Shield) and Paul Platen (SSS) which provided a deep-dive into what has been going on in the world of cyber security and insurance over the past 12 months, how this has impacted on businesses in all sectors both globally as well as in New Zealand, and what we might expect looking forward.

Their presentations can be found here.

The key take homes for me were:

More human interactivity now happens at the digital layer rather than the human layer (according to the CIA!)
The frequency and severity of ransomware attacks continue to climb; as a result, cyber security insurance premiums are rising quickly and cover is going to be tougher to acquire.
CERT NZ is a great place to get a whole host of cyber security info. However, when CERT NZ was set up it was given a budget of only $22m over 4 years, while the NZ Defence Force spends around $4bn per annum; the comparison between what is spent to protect ourselves physically and what is spent to protect us digitally is stark. We need to take the protection of the digital realm much, much more seriously, and orders of magnitude more resources need to be focused here.

Ngā Mihi,

Michael Murphy
Executive Director

To receive our full newsletter including additional industry updates and information, subscribe now

The post Identity + Security + Privacy = Trust appeared first on Digital Identity New Zealand.

Wednesday, 25. August 2021

Oasis Open

SAM Threshold Sharing Schemes v1.0 from SAM TC approved as a Committee Specification

Specification is intended for developers and architects designing systems and applications that utilize threshold sharing schemes in an interoperable manner. The post SAM Threshold Sharing Schemes v1.0 from SAM TC approved as a Committee Specification appeared first on OASIS Open.

CS02 is ready for testing and implementation

OASIS is pleased to announce that SAM Threshold Sharing Schemes Version 1.0 from the OASIS Security Algorithms and Methods (SAM) TC [1] has been approved as an OASIS Committee Specification.

This document is intended for developers and architects who wish to design systems and applications that utilize threshold sharing schemes in an interoperable manner. Committee Specification 02 (CS02) clarifies that the implementation examples in Appendix E are non-normative.

This Committee Specification is an OASIS deliverable, completed and approved by the TC and fully ready for testing and implementation.

The prose specifications and related files are available here:

SAM Threshold Sharing Schemes Version 1.0
Committee Specification 02
19 August 2021

Editable source:
https://docs.oasis-open.org/sam/sam-tss/v1.0/cs02/sam-tss-v1.0-cs02.docx (Authoritative)
HTML:
https://docs.oasis-open.org/sam/sam-tss/v1.0/cs02/sam-tss-v1.0-cs02.html
PDF:
https://docs.oasis-open.org/sam/sam-tss/v1.0/cs02/sam-tss-v1.0-cs02.pdf
Non-material changes between CS01 and this CS02 are marked in:
https://docs.oasis-open.org/sam/sam-tss/v1.0/cs02/sam-tss-v1.0-cs02-DIFF.pdf

Distribution ZIP file
For your convenience, OASIS provides a complete package of the prose specification and related files in a ZIP distribution file. You can download the ZIP file here:
https://docs.oasis-open.org/sam/sam-tss/v1.0/cs02/sam-tss-v1.0-cs02.zip

Members of the SAM TC [1] approved this specification by Special Majority Vote. The specification had been released for public review as required by the TC Process [2]. The vote to approve as a Committee Specification passed [3], and the document is now available online in the OASIS Library as referenced above.

Our congratulations to the TC on achieving this milestone and our thanks to the reviewers who provided feedback on the specification drafts to help improve the quality of the work.

Additional references:

[1] OASIS Security Algorithms and Methods (SAM) TC
https://www.oasis-open.org/committees/sam/

[2] Public review:
* 30-day public review, 02 June 2021:
https://lists.oasis-open.org/archives/members/202106/msg00002.html
https://lists.oasis-open.org/archives/members/202106/msg00003.html
– Review metadata:
https://docs.oasis-open.org/sam/sam-tss/v1.0/csd01/sam-tss-v1.0-csd01-public-review-metadata.html
– Comment resolution log:
https://docs.oasis-open.org/sam/sam-tss/v1.0/csd01/sam-tss-v1.0-csd01-comment-resolution-log.txt

[3] Approval ballot:
https://www.oasis-open.org/committees/ballot.php?id=3639

The post SAM Threshold Sharing Schemes v1.0 from SAM TC approved as a Committee Specification appeared first on OASIS Open.


SelfKey Foundation

SelfKey Now Listed on Coin Social Story

We’re excited to announce that SelfKey’s native token $KEY has been listed on the popular crypto tracking platform Coin Social Story. The post SelfKey Now Listed on Coin Social Story appeared first on SelfKey.

We’re excited to announce that SelfKey’s native token $KEY has been listed on the popular crypto tracking platform Coin Social Story.

The post SelfKey Now Listed on Coin Social Story appeared first on SelfKey.

Tuesday, 24. August 2021

Oasis Open

JSON Abstract Data Notation v1.0 from OpenC2 TC approved as a Committee Specification

JADN is an information modeling language for defining data structures, validating data instances, informing user interfaces for structured data, and facilitating protocol internationalization. The post JSON Abstract Data Notation v1.0 from OpenC2 TC approved as a Committee Specification appeared first on OASIS Open.

JADN v1.0 is ready for testing and implementation

OASIS is pleased to announce that Specification for JSON Abstract Data Notation Version 1.0 from the OASIS Open Command and Control (OpenC2) TC [1] has been approved as an OASIS Committee Specification.

JSON Abstract Data Notation (JADN) is an information modeling language. It has several purposes including defining data structures, validating data instances, informing user interfaces working with structured data, and facilitating protocol internationalization. JADN specifications consist of two parts: abstract type definitions that are independent of data format, and serialization rules that define how to represent type instances using specific data formats. A JADN schema is itself a structured information object that can be serialized and transferred between applications, documented in multiple formats such as text-based interface definition languages, property tables or diagrams, and translated into concrete schemas used to validate specific data formats.
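The two-part split described above (format-independent type definitions plus serialization rules) can be illustrated with a toy model in Python. Note that this is a sketch of the idea only, not the normative JADN type-definition syntax:

```python
import json

# Toy abstract type definition: field tags, names and base types,
# independent of any serialization format. (Illustrative only; real
# JADN defines types as JSON structures per the specification.)
PERSON = {
    "type": "Record",
    "fields": [(1, "name", "String"), (2, "id", "Integer")],
}

def validate(instance: dict, typedef: dict) -> None:
    """Check an instance against the abstract type definition."""
    expected = {name: kind for _, name, kind in typedef["fields"]}
    for field, value in instance.items():
        kind = expected[field]  # KeyError -> unknown field
        assert isinstance(value, str if kind == "String" else int)

# Serialization rules map the same abstract instance to a concrete
# format: a verbose form keyed by field names, and a compact form
# ordered by field tags.
person = {"name": "Ada", "id": 42}
validate(person, PERSON)

verbose = json.dumps(person)                              # name-keyed JSON
compact = json.dumps([person[n] for _, n, _k in PERSON["fields"]])
print(compact)  # prints ["Ada", 42]
```

The point mirrored here is the one the announcement makes: the abstract definition stays fixed while different serialization rules (verbose JSON, compact JSON, CBOR, and so on) produce different concrete representations of the same instance.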

This Committee Specification is an OASIS deliverable, completed and approved by the TC and fully ready for testing and implementation.

The prose specifications and related files are available here:

Specification for JSON Abstract Data Notation Version 1.0
Committee Specification 01
17 August 2021

Editable source:
https://docs.oasis-open.org/openc2/jadn/v1.0/cs01/jadn-v1.0-cs01.md (Authoritative)
HTML:
https://docs.oasis-open.org/openc2/jadn/v1.0/cs01/jadn-v1.0-cs01.html
PDF:
https://docs.oasis-open.org/openc2/jadn/v1.0/cs01/jadn-v1.0-cs01.pdf

Distribution ZIP file
For your convenience, OASIS provides a complete package of the prose specification and related files in a ZIP distribution file. You can download the ZIP file here:
https://docs.oasis-open.org/openc2/jadn/v1.0/cs01/jadn-v1.0-cs01.zip

Members of the OpenC2 TC [1] approved this specification by Special Majority Vote. The specification had been released for public review as required by the TC Process [2]. The vote to approve as a Committee Specification passed [3], and the document is now available online in the OASIS Library as referenced above.

Our congratulations to the TC on achieving this milestone and our thanks to the reviewers who provided feedback on the specification drafts to help improve the quality of the work.

Additional references

[1] OASIS Open Command and Control (OpenC2) TC
https://www.oasis-open.org/committees/openc2/

[2] Public review and comment resolution timeline:
https://docs.oasis-open.org/openc2/jadn/v1.0/csd02/jadn-v1.0-csd02-public-review-metadata.html
– Most recent comment resolution log:
https://docs.oasis-open.org/openc2/jadn/v1.0/csd02/jadn-v1.0-csd02-comment-resolution-log.pdf

[3] Approval ballot:
https://www.oasis-open.org/committees/ballot.php?id=3638

The post JSON Abstract Data Notation v1.0 from OpenC2 TC approved as a Committee Specification appeared first on OASIS Open.


OpenID

Shared Signals: An Open Standard for Webhooks

New OpenID Foundation draft enables secure and privacy protected webhooks to power an “API-First” world Author: Atul Tulshibagwale   APIs are an increasingly important aspect of software today, and “API-First” is the mantra being followed in a lot of new software development. A critical aspect of efficient APIs is their ability to notify callers of […] The post Shared Signals: An Open Standard for Webhooks first appeared on OpenID.
New OpenID Foundation draft enables secure and privacy protected webhooks to power an “API-First” world Author: Atul Tulshibagwale

 

APIs are an increasingly important aspect of software today, and “API-First” is the mantra followed in a lot of new software development. A critical aspect of efficient APIs is their ability to notify callers of changes relating to something the caller is interested in. Such mechanisms are often referred to as “Webhooks”, sometimes as “Callbacks”. Webhooks in most APIs are proprietary implementations, although API standards such as aip.dev and OpenAPI have specified mechanisms such as “Long Running Operations” and “Callbacks” respectively. In both of these standards, the Webhook mechanism is specific to resources recently operated upon by the API client, and so is limited in its scope.

The Shared Signals and Events Framework

The OpenID Foundation formed the “Shared Signals and Events” (SSE) Working Group as a combination of the previous OpenID RISC working group and an informal industry group that was focused on standardizing Google’s CAEP proposal. These represented two distinct applications of the same underlying mechanism of managing asynchronous streams of events. Therefore the SSE Framework is now proposed to be a standard for managing such streams of events for any application, not just CAEP and RISC. In effect, it is a standard for generalized Webhooks.

The SSE Framework defines stream-based communication between Transmitters that generate events and Receivers that consume them. It defines an Event Stream Management API for obtaining the Transmitter configuration: which events it supports, how they can be verified, and where it delivers them. Note that since the call to get the Transmitter configuration can also be authorized using OAuth, a Transmitter may decide, based on which Receiver is requesting the configuration, which events it will support for that Receiver. The Receiver can specify the URL at which it wishes to receive events and the types of events it is interested in receiving. The API also provides mechanisms for adding or removing subjects about which events should be included in the stream, and for pausing, starting and stopping the stream. The actual events are delivered as Security Event Tokens (SETs), which are potentially signed JSON objects. The delivery mechanisms are defined by the IETF specifications for SET push and poll delivery.

In combination with OAuth, which can be used to authorize calls between the Receiver and the Transmitter, this SSE Framework provides a comprehensive mechanism for implementing secure, privacy protected webhooks.
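To make the SET delivery concrete, here is a minimal sketch in Python. The claim names (iss, jti, iat, aud, events) come from the Security Event Token spec; the event-type URI, hostnames and values are illustrative, and a real SET would be signed rather than using "alg": "none":

```python
import base64
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as used in JWT-shaped SETs."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def b64url_decode(text: str) -> bytes:
    return base64.urlsafe_b64decode(text + "=" * (-len(text) % 4))

# Illustrative SET payload: a session-revoked event about one subject.
payload = {
    "iss": "https://transmitter.example.com",
    "jti": "756E69717565206964656E746966696572",
    "iat": 1629800000,
    "aud": "https://receiver.example.com",
    "events": {
        "https://schemas.openid.net/secevent/caep/event-type/session-revoked": {
            "subject": {"format": "email", "email": "user@example.com"}
        }
    },
}

# Serialize as an unsigned JWT-shaped token (header.payload.signature);
# a real Transmitter would sign this with its key.
header = {"typ": "secevent+jwt", "alg": "none"}
token = f"{b64url(json.dumps(header).encode())}.{b64url(json.dumps(payload).encode())}."

# A Receiver splits the token and inspects the events claim.
claims = json.loads(b64url_decode(token.split(".")[1]))
event_type = next(iter(claims["events"]))
print(event_type.rsplit("/", 1)[-1])  # prints "session-revoked"
```

The events claim is keyed by event-type URI, so a Receiver can dispatch on that key without caring which profile (CAEP, RISC, or another) defined the event.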

How It Works

Many internet services today have a “multi-tenanted” architecture, meaning each service hosts a number of independent customers within its infrastructure. Using SSE, streams may be managed using host-level authorization or tenant-level authorization. Host-level authorization requires the Transmitter to place a high degree of trust in the Receiver, because an intentional or unintentional action by the Receiver can leak a tenant’s data to it. For example, if a Receiver uses host-level authorization to enable the stream for a customer that hasn’t authorized it, the Receiver will start receiving events for that customer without their knowledge.

It is therefore safer to use tenant level authorization, where the tenant administrator can authorize such management actions. This method is described below:

Setting up a Stream

The following example describes how an administrator authorization may be used to start a stream:
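Once the administrator's consent yields an OAuth access token, the stream-creation step reduces to the Receiver sending a stream configuration to the Transmitter's configuration endpoint. A sketch of that request in Python; the endpoint URL, token and event-type URIs are illustrative, and the HTTP call itself is stubbed out:

```python
import json

# Hypothetical OAuth token obtained via the tenant administrator's consent.
ADMIN_ACCESS_TOKEN = "admin-authorized-oauth-token"

# Illustrative stream configuration: where to deliver events (push
# delivery: the Transmitter POSTs SETs to endpoint_url) and which
# event types the Receiver wants.
stream_config = {
    "delivery": {
        "method": "https://schemas.openid.net/secevent/risc/delivery-method/push",
        "endpoint_url": "https://receiver.example.com/events",
    },
    "events_requested": [
        "https://schemas.openid.net/secevent/caep/event-type/session-revoked",
    ],
}

# Assemble the request a Receiver would send; in a real implementation
# this would be an HTTP POST (e.g. via urllib or httpx).
request = {
    "method": "POST",
    "url": "https://transmitter.example.com/sse/stream",  # from Transmitter metadata
    "headers": {
        "Authorization": f"Bearer {ADMIN_ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    "body": json.dumps(stream_config),
}

print(json.loads(request["body"])["delivery"]["endpoint_url"])
```

Because the bearer token was issued through the tenant administrator's consent, the Transmitter can scope the stream to that tenant only, which is the privacy advantage over host-level authorization described above.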

 

Adding a Subject

A Subject Principal in SSE can be a user, device, application or even a session. A subject can also be an aggregate subject such as an OU, group or tenant. A Receiver can explicitly add or remove subjects in a stream, and a stream will contain events only about subjects that are explicitly or implicitly included in it. In some cases it makes sense to implicitly include all applicable subjects in a stream (for example, if the events being exchanged concern the security of the subject). In other cases, a subject may explicitly authorize the use of their information in a stream, providing privacy protection (for example, if a user’s location needs to be shared). In the explicit case, the following steps can be taken to authorize the inclusion of events regarding a user subject in the stream:
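The add-subject call that concludes those steps is itself small: an authorized request naming the subject. A sketch in Python; the subject identifier shape ("format" plus format-specific claims) follows the Subject Identifiers for Security Event Tokens draft, while the user and the optional "verified" flag are illustrative here:

```python
import json

# Illustrative "add subject" request body a Receiver sends to the
# Transmitter's add-subject endpoint after the user has consented.
add_subject_request = {
    "subject": {
        "format": "email",          # other formats include e.g. "iss_sub"
        "email": "user@example.com",
    },
    # Illustrative optional flag: whether the Receiver has verified
    # its authorization to receive events about this subject.
    "verified": False,
}

body = json.dumps(add_subject_request)
print(json.loads(body)["subject"]["format"])  # prints "email"
```

Removing a subject (whether it was added explicitly or implicitly) uses the same request shape against the remove-subject endpoint, which is why the flows are described as identical in the text below.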

 

Removing a Subject

The SSE Event Stream Management API also supports removing subjects, whether or not they were explicitly added. The flow to remove a subject is the same as the “add subject” flow described above.

Other Features

The SSE Framework also defines ways for both Receivers and Transmitters to enable, disable and pause streams. It also defines a mechanism for Receivers to verify the stream is active, which could be used as a “heartbeat” mechanism.

Current Status and Call to Action

The SSE Framework draft has been adopted as an Implementer’s Draft by the OpenID Foundation. This is a great time to begin implementations based on this specification. In your APIs, consider adding the webhook capability using the SSE Framework instead of a proprietary way of providing such functionality.

For more information, please visit the Shared Signal & Event Working Group page.

The post Shared Signals: An Open Standard for Webhooks first appeared on OpenID.

MyData

Announcing the MyData Operator Awards 2021

The MyData Operator Award recognises personal data companies that have shown leadership by providing human-centric solutions that empower individuals to manage their personal data. The amount of personal data generated about people increases every day. This data holds the potential to unlock a range of benefits for individuals, businesses, governments, and society as a whole.... Read More The post Announcing the MyData Operator Awards 2021 appeared first on MyData.org.

The MyData Operator Award recognises personal data companies that have shown leadership by providing human-centric solutions that empower individuals to manage their personal data. The amount of personal data generated about people increases every day. This data holds the potential to unlock a range of benefits for individuals, businesses, governments, and society as a whole....

Read More

The post Announcing the MyData Operator Awards 2021 appeared first on MyData.org.


We Are Open co-op

Hello world!

Hi. My name is Anne Hilliger and I am the new intern at We Are Open Co-op. Not only am I new but also the first ever intern at WAO which means I’m also their guinea pig for a possible future intern program.

Hi. My name is Anne Hilliger and I am the new intern at We Are Open Co-op. Not only am I new but also the first ever intern at WAO which means I’m also their guinea pig for a possible future intern program.

Photo by Adam Solomon on Unsplash

As a little taste of what makes me me, here are some randomly collected facts about me: I am crazy about bike riding (nice gravel rides with coffee breaks and a good amount of ice cream are just the best); the first half of this year I spent living and studying in Lapland, in the same town where Santa lives (Rovaniemi, and yes, I met him, twice); I live in a flat with 5 other adults and two incredibly cute kids (not mine though); and I am definitely a cat person: if somebody gave me a basket full of kitties I would happily say yes.

I am studying cultural and media education at a small university in central/eastern Germany called Hochschule Merseburg. This semester it is time to do the mandatory internship, and, as my last name might suggest, I know Laura Hilliger and have always been very interested in what she tells me about her work. I therefore thought this internship might be a perfect opportunity to learn more about what WAO is doing, who her colleagues are and what it is like to work on projects with them.

Another incentive that made me want to work with WAO is that my curriculum at university is rather focused on teaching kids and teenagers, mostly in face-to-face, school or kindergarten settings. This is also fun and interesting, but in my world media education is even more than that. To have the opportunity to work with people who are specialized in teaching and learning in mostly online environments, who have developed interesting and innovative tools, and who know how to work with companies, charities and other collaborators is going to teach me so many cool things! I am really looking forward to that.

One project I worked on in 2016: creating a virtual dinosaur museum in Minecraft (CC BY-NC-SA 4.0)

After receiving a very warm welcome I am now in my fourth week already; the onboarding is at full speed and I am learning a lot about the projects, the processes and of course the philosophy of openness. I have worked my way through the email courses they developed (please go and check them out if you haven’t, they are great!), and I got particularly caught up in the topic of working openly, because it is something I never really did before and it is as exciting as it is challenging.

I come from an understanding of work that is very conservative and heteronormative, due to the capitalistic structures I grew up in. Therefore I find it challenging to actually pursue the approach of openness. In theory I understand the principles, but what do they actually mean? You have to unlearn quite a few patterns. For example, I noticed that in order to ask questions not only to Laura (who is a confidant) but in the channel where everybody from the Co-op can read them, I have to jump over my own shadow a lot (a German expression that means to overcome yourself). I am still thinking in patterns of “Am I allowed to do that?”, “Aren’t the others too important/busy to listen to what I have to say?”, “I am only the intern”. I am really looking forward to jumping over my own shadow more often and starting to integrate the philosophy of openness into my understanding of what work can also be like.

And this is me, very much frozen in Lapland (CC BY-NC-SA 4.0)

Hello world! was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


GS1

5 Glossary

For the most up-to-date definitions of the terms below, see www.gs1.org/glossary

Term

Definition

Application Identifier

The field of two or more digits at the beginning of an element string that uniquely defines its format and meaning.

Assay

An assay is an investigative (analytic) procedure in laboratory medicine, pharmacology, environmental biology and molecular biology for qualitatively assessing or quantitatively measuring the presence, amount or functional activity of a target entity (the analyte).

Barrier packaging

A type of packaging that prevents the contents from coming into contact with, or being affected by, anything outside the package. Depending on the material and the packaging process, barrier packaging can protect the contents from light, moisture or microorganisms, i.e. preserve the sterility of the contents.

Blister pack

A type of packaging in which material (usually plastic or metal foil) is formed to contain one or more blisters, each holding a single unit of product. The open base of each blister unit is usually sealed by bonding a layer of plastic, metal foil or paper, which is pierced to access the contents, leaving clear evidence of handling (i.e. a tamper-evident seal).

Brand owner

The organisation that owns the specifications of a trade item, regardless of where and by whom the item is produced. The brand owner is typically responsible for managing the Global Trade Item Number (GTIN) of the item.

Breather pack/pouch

An enclosure with one layer of fibrous material. A breather pack for surgical components (such as sutures or suture-needle assemblies) has one layer of fibrous material and one layer of plastic material, forming a pouch between the two.

Check digit

A final digit calculated from the other data of a GS1 identification key, used to check that the data has been correctly composed.

Caplet

A coated tablet for oral medication.

Co-branding

The presence on a product label or packaging of an additional recognisable brand (sub-brand), logo, trademark or registered mark coexisting with the primary brand. This is usually done under a contractual agreement with the original brand owner.

Direct part marking

Direct part marking is the process of marking a symbol on an item using an intrusive or non-intrusive method.

Dose

The quantity of medication administered each time and the frequency of administration.

Each/base unit

Within the packaging hierarchy of a trade item, the base unit or “each” refers to the retail consumer trade item level. The term “each” denotes the lowest level of trade packaging; this level may contain multiple units of use.

Electronic Product Code (EPC)

An identification scheme for the unique identification of physical objects (such as trade items, assets and locations) by means of RFID tags and other means. The standardised EPC data comprises an EPC (or EPC Identifier) that uniquely identifies an individual object, as well as an optional filter value where judged necessary to enable effective and efficient reading of EPC tags.

Element string

The combination of a GS1 Application Identifier and a GS1 Application Identifier data field.

Equivalent product

A product that can substitute for a trade item based on supplier-defined functional equivalence to that trade item.

Formulation

The definition of the combination of different chemical substances (including the active ingredient) that make up the final medicinal product.

Form, fit or function

A change to a product’s specifications or design that alters its intended purpose/use and needs to be communicated to customers.

GS1 Company Prefix (GCP)

A unique string of 4 to 12 digits used to create GS1 identification keys. The first digits are a valid GS1 Prefix, and the GCP is at least one digit longer than the GS1 Prefix. GS1 Company Prefixes are allocated by GS1 Member Organisations; because GS1 Company Prefixes vary in length, a longer GS1 Company Prefix must not begin with the same digits as a shorter one. (See U.P.C. Company Prefix.)

GS1 General Specifications

The GS1 system data and application standards that define the identification and automatic identification of trade items, locations, logistic units, assets and more, using barcodes, radio frequency identification and GS1 identifiers.

GS1 Global Office

Located in Brussels, Belgium and Princeton, USA; the body, belonging to the GS1 Member Organisations, that administers the GS1 system.

GS1 Member Organisation (GS1 MO)

The GS1 organisation responsible for administering the GS1 system in its country or assigned area. This includes, but is not limited to, ensuring the correct use of the GS1 system by users through education, training, promotion and implementation support, and actively participating in GSMP.

GS1 system

The specifications, standards and guidelines administered by GS1.

Global Trade Item Number® (GTIN®)

The GS1 identification key used to identify trade items. The key comprises a GS1 Company Prefix, an item reference and a check digit.

Global Data Synchronisation Network

(GDSN)

The Global Data Synchronisation Network® (GDSN®) is an interoperable network of data pools that enables collaborating users to securely synchronise master data based on GS1 standards.

Graphical user interface

(GUI)

A graphical user interface is a form of user interface that allows users to interact with electronic devices through graphical icons and audio indicators (such as primary notation), rather than text-based user interfaces, typed command labels or text navigation.

Indicator

A single digit, from 1 to 9, in the leftmost position of a GTIN-14.

Item reference

The part of the Global Trade Item Number (GTIN) allocated by the brand owner to create a unique GTIN.

Human readable interpretation (HRI)

Human readable interpretation (HRI) is information below, beside or above a barcode or tag that is encoded within that barcode or tag and represents the same characters carried in it. For more information, see the GS1 General Specifications, section on Human Readable Interpretation (HRI) rules.

组套

不同管制医疗产品的组合,用于单次治疗。

组套生产商

定义组套内容、规范和标签的品牌所有者。组套生产商可以组装或不组装组套,可能让第三方生产成品贸易项目。

单品以下级别

GS1系统中最低层级的贸易项目通常称为“单品”级别。“单品”级别的贸易项目可以包含多个使用单元。在这种情况下,可能需要标识低于“单品”级别的级别,直至单个单元或使用单元。在医疗行业中,可能有“较小”或“较低”的单元,通常在医疗点扫描,称为“单品以下级别”。

医疗器械

制造商单独或组合使用任何医疗用途的仪器、设备、器具、机器、装置、移植物、体外试剂或矫正器、软件、材料或其它类似或相关物品。

non-HRI

non-HRI 文本是包装、标签或项目上的所有除供人识读字符的文本。更多信息请参阅GS1 通用规范,供人识读字符(HRI)规则一节。

药品

药品是指用于医学目的的药物,如止咳糖浆或安眠药。

主品牌

主品牌是由品牌所有者确定的,最易于医护人员或患者识别的品牌,可以表示为标志、注册标志或商标。

初级包装

包装的最内层,即离产品(药丸、植入物、器械等)最近的一层。

管制医疗贸易项目

管制医疗贸易项目是指在受控环境(如零售药店、医院药房)内销售或分发的药品或医疗器械。

处方药(Rx)

处方药(Rx)是指需要医生处方或直接医疗干预的药物或医疗用品。典型的例子包括药物绷带、止痛药、注射剂等,通常只能通过执业医师的处方获得。

内外密封(SITO)

一种阻隔包装,由两层组成。在特殊的生产过程中,内层(通常是一个箔袋)最初保持打开状态,以便对内容物进行消毒,然后通过外层进行密封,外层之前处于封闭状态,并且始终保持完整。

二级包装

初级包装周围的包装层。可用于显示信息和展示品牌,或用于初级包装可能无法提供的附加机械保护。可能包含一个或多个初级包装。

独立单元包装/铝箔包装

医疗初级包装,由独立分离的药物剂型构成(如一个片剂,定量的溶液剂),或是某一医疗器械的直接包装(如注射器)。多个独立单元可相互连接,但是容易通过穿孔分离。

制剂规格

药物中有效成分的量。

贸易项目

能检索其预定义信息,并可在供应链各环节对其进行定价、订购或开具发票的任意项目(产品或服务)

U.P.C.厂商识别代码

将以一个零开头的GS1厂商识别代码删除前置零即变成UPC厂商识别代码。U.P.C.厂商识别代码用于创建GTIN-12。


4 Supplementary GTIN information


Global Trade Item Number (GTIN)

Companies can use the Global Trade Item Number (GTIN) to uniquely identify all of their trade items. GS1 defines a trade item as any item (product or service) upon which there is a need to retrieve pre-defined information and that may be priced, ordered or invoiced at any point in any supply chain. For more information on the GTIN, see the GS1 General Specifications.

Structure of the GTIN

Companies can obtain a licence for a GS1 Company Prefix from a GS1 Member Organisation, along with full documentation on how to allocate GTINs to their products.

A GTIN should be treated as a non-significant string of digits, meaning it should be recorded and processed as a whole; none of its digits relates to any classification or carries any meaning.

GS1 Company Prefix

A GS1 Company Prefix is licensed to a company by a GS1 Member Organisation and entitles the company to create any GS1 identification key, such as a GTIN, SSCC or GIAI.

Note: While a GS1 Company Prefix can be used to determine which GS1 Member Organisation allocated it, it cannot be used to determine where an item was manufactured or distributed.

U.P.C. Company Prefix

A U.P.C. Company Prefix is derived from a GS1 Company Prefix beginning with a zero ("0") by dropping that leading zero. The U.P.C. Company Prefix should only be used to construct 12-digit trade item identifiers (i.e., GTIN-12). When a leading zero is added to a U.P.C. Company Prefix, it becomes a GS1 Company Prefix, which can be used to create all other GS1 identifiers.

Note: For example, the 6-digit U.P.C. Company Prefix 614141 is derived from the 7-digit GS1 Company Prefix 0614141.
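
The prefix derivation described above is a simple string operation; as a minimal illustration (using the example values 614141 / 0614141 from the note, with illustrative function names):

```python
def upc_company_prefix(gs1_prefix: str) -> str:
    """Derive a U.P.C. Company Prefix by dropping the single leading
    zero of a GS1 Company Prefix that begins with '0'."""
    if not gs1_prefix.startswith("0"):
        raise ValueError("only GS1 Company Prefixes starting with '0' have a U.P.C. form")
    return gs1_prefix[1:]

def gs1_company_prefix(upc_prefix: str) -> str:
    """The inverse: re-adding the leading zero yields the GS1 Company Prefix."""
    return "0" + upc_prefix
```

For the documented example, `upc_company_prefix("0614141")` yields `"614141"`, and adding the zero back recovers `"0614141"`.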

Item reference

The item reference is the part of the Global Trade Item Number (GTIN) allocated by the owner of the GS1 Company Prefix or U.P.C. Company Prefix to create a unique GTIN. The number itself has no intrinsic meaning: no digit relates to any classification or conveys any information. The simplest way to allocate item references is sequentially, e.g., 000, 001, 002, 003, and so on.

Check digit

The check digit is the final digit, calculated from the other digits of the GTIN.
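
The calculation is the standard GS1 mod-10 scheme: alternating weights of 3 and 1 applied from the rightmost digit of the key body. A minimal sketch (the sample value is a known valid GTIN-13, 4006381333931, whose final digit is 1):

```python
def gs1_check_digit(digits: str) -> int:
    """Compute the GS1 check digit for the body of an identification key.

    `digits` is the key WITHOUT its check digit (e.g. the first 12 digits
    of a GTIN-13). Weights of 3 and 1 alternate from the rightmost
    position leftwards, starting with 3."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        weight = 3 if i % 2 == 0 else 1
        total += int(ch) * weight
    return (10 - total % 10) % 10
```

For example, `gs1_check_digit("400638133393")` returns `1`, matching the full GTIN-13 4006381333931.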

The figure below shows the four GTIN structures and how they should be stored in a database. GTINs should be right-aligned and padded with leading zeros.

Note 1: When encoded in a data carrier, any GTIN must be a fixed-length, 14-digit data string; a GTIN shorter than 14 digits must be padded with leading zeros. Padding does not change a GTIN-8, -12 or -13 into a GTIN-14. The filler zeros, however, are not encoded in EAN-8, U.P.C.-A or EAN-13 barcodes.

Note 2: The 14-digit GTIN format is used in business transactions, particularly in electronic commerce (e.g., electronic orders, invoices, price catalogues) and in GS1's Global Data Synchronisation Network (GDSN).
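
The right-alignment and zero-padding described above can be sketched as follows (the function name is illustrative):

```python
def to_gtin14_field(gtin: str) -> str:
    """Right-align a GTIN-8/12/13/14 in a fixed 14-digit field by
    left-padding with zeros, as used for element strings and
    database storage. Padding does not change the GTIN's type."""
    if not gtin.isdigit() or len(gtin) not in (8, 12, 13, 14):
        raise ValueError("expected a GTIN-8, -12, -13 or -14")
    return gtin.zfill(14)
```

For example, the GTIN-8 96385074 is stored as `00000096385074`, and a GTIN-13 gains a single leading zero.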

Indicator

The indicator digit is only used in the GTIN-14 data structure. It can be any digit from 1 to 8 (see note 1 below) and denotes a different packaging level of the product. The simplest way to allocate indicators is sequentially, i.e., 1, 2, 3, and so on, for each packaging configuration of the trade unit.

A standard trade item grouping is an orderable grouping of multiple identical trade items. The brand owner may allocate each grouping a unique GTIN-13 or GTIN-12, or a unique GTIN-14 with an indicator digit between 1 and 8. The GTIN-14 contains the GTIN of the contained trade item (excluding its check digit); the check digit of each GTIN-14 must then be recalculated. Since the values 1 through 8 are available, eight separate, unique GTIN-14s can be created from a single GTIN-13 or GTIN-12.

The indicator digit has no intrinsic meaning. Indicators need not be used in sequence, and some may be left unused. The GTIN-14 structure for standard trade item groupings expands numbering capacity. Indicators are allocated as required by the company creating the GTIN.
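
The construction described above (prepend an indicator digit, keep the first 12 digits of the contained item's GTIN-13, and recalculate the check digit) can be sketched as follows; the sample GTIN-13 is a known valid number and the function name is illustrative:

```python
def gtin14_from_gtin13(indicator: int, gtin13: str) -> str:
    """Build a GTIN-14 for a standard trade item grouping:
    indicator digit (1-8) + the first 12 digits of the contained
    item's GTIN-13 (its check digit is dropped) + a freshly
    calculated mod-10 check digit."""
    if not 1 <= indicator <= 8:
        raise ValueError("indicator must be 1-8 for standard groupings")
    body = str(indicator) + gtin13[:12]   # drop the old check digit
    total = sum(int(ch) * (3 if i % 2 == 0 else 1)
                for i, ch in enumerate(reversed(body)))
    return body + str((10 - total % 10) % 10)
```

For example, indicator 1 applied to GTIN-13 4006381333931 yields the GTIN-14 14006381333938; note the recalculated final digit differs from the original check digit.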

Note 1: The digit 9 is used to identify variable measure trade items. Variable measure: a trade item is variable measure if the measure of the items in the case is not pre-defined. Some attributes of such an item, for example the number of units contained or the weight, are not pre-defined and are only known at the time of manufacture. Determine in advance whether the item is a fixed measure item with pre-defined base attributes, or a variable measure item, i.e., one whose measure (such as the number of units contained) is not pre-defined but specific to each instance. Where an item is variable measure, a GTIN-14 with indicator 9 can be used together with Application Identifier (30). For more information, see the GS1 General Specifications section Count of trade items within a variable measure trade item: AI (30). For additional information on variable measure, see the following GS1 General Specifications sections: Variable measure trade items scanned in general distribution and Variable measure trade items scanned at point-of-sale.

Note 2: Some point-of-sale scanning systems may not be able to read and interpret barcode symbols other than EAN/UPC, and EAN/UPC symbols cannot encode a GTIN-14.

Note 3: For more information, see the GTIN section of the GS1 General Specifications.

Note 4: For more information on how to create GTINs, see How to create a GTIN on the GS1 website, or contact your local GS1 Member Organisation.


3 Clinical trials


Clinical trials are conducted to investigate the effectiveness and safety of a treatment, intervention or test in preventing, managing or detecting a disease or other medical condition. A clinical trial may compare a new therapy with an existing one, test different combinations of existing therapies, or even study other lifestyle factors and their impact on patients' health.

Clinical trials involve product identification complexities not found in today's commercial healthcare supply chain. The uniqueness of trial products, in many cases intended for a single patient, means that instances of the product need to be identified. In blinded trials, the blinded party must not be able to tell from the label whether the trial product is the investigational product, a comparator or a placebo. Applying GS1 standards to clinical trials must take these complexities and industry needs into account.

If the appearance of a product changes, refer to the Clinical Trials application standard.

■ For specific information on GTIN allocation rules, see section 7, Identification of trial products, in the Clinical Trials application standard.

■ For general information, refer to Identification of investigational drugs in the Clinical Trials application standard.


2.10 Price on pack


Price on pack means the brand owner includes a pre-set price as part of the packaging artwork. This rule does not apply to prices marked on price tags, stickers, hang tags or anything else that could be removed from the packaging or product.

Any addition, change or removal of a price marked directly on the product packaging (not recommended) requires the allocation of a new GTIN.

Level of GTIN allocation

■ The GTIN change occurs at the base unit level.

■ A unique GTIN is allocated at each existing packaging level above the base unit.

Examples of business scenarios requiring a new GTIN:

■ The pre-printed price on the packaging changes from 3 euros to 2 euros.

■ A selling price of 8 euros is added to the product packaging artwork.

Additional information

In trade, pre-pricing on packaging is discouraged because it complicates the maintenance of trade item documentation throughout the supply chain. There is also a risk that the price stated to the consumer (on the pack) differs from the price charged (by the retailer or healthcare system). Regulators, however, may mandate pre-pricing.

Related guiding principles:

GTIN rule name

Is there an expectation by stakeholders (e.g., healthcare providers, consumers, patients, regulators and/or trading partners) to distinguish the changed/new product from the prior/old product?

Is there a regulatory or liability requirement to disclose the change to consumers and/or trading partners?

Is there a substantial change affecting the supply chain (e.g., how the trade item is shipped, stored, received or handled in a clinical setting)?

Price on pack


2.9 Pre-defined assortment


A pre-defined assortment is a group of two or more different trade items combined and sold as a single trade item.

Changing, adding or replacing one or more trade items in a pre-defined assortment requires the allocation of a new GTIN.

Related guiding principles:

GTIN rule name

Is there an expectation by stakeholders (e.g., healthcare providers, consumers, patients, regulators and/or trading partners) to distinguish the changed/new product from the prior/old product?

Is there a regulatory or liability requirement to disclose the change to consumers and/or trading partners?

Is there a substantial change affecting the supply chain (e.g., how the trade item is shipped, stored, received or handled in a clinical setting)?

Pre-defined assortment

2.9.1 Kits

A kit is a non-homogeneous, separable grouping of products for a specific clinical or commercial purpose that can be identified, purchased and supplied as a single trade item.

There are two main types of kit:

■ Finished kit: a kit composed of finished components, each of which is a trade item identified with a GTIN. The components are not packaged individually, but are identified separately from the kit at their own packaging level (i.e., they can be sold, identified and traded).

■ Assembled kit: a kit completed during an assembly process. Such a kit contains at least one component that is not a finished trade item and therefore has no GTIN.

Level of GTIN allocation

The kit manufacturer or kit producer is responsible for allocating a GTIN to the kit.

■ The GTIN change occurs at the kit level and all levels above it.

The following GTIN change rules apply:

■ Adding a component to a kit requires a new GTIN. See figure 2-18, Adding a component to a kit.

■ Removing a component from a kit requires a new GTIN. See figure 2-19, Removing a component from a kit.

■ For a specified kit component with a GTIN and/or a brand owner item number, the kit's GTIN must change when that component is replaced in the pack. See figure 2-20, Kit with specified components.

■ When a kit component is described in words only (i.e., it has no GTIN or brand owner item number), and the kit producer replaces the component in the pack (keeping form, fit and function unchanged), the kit's GTIN need not change. See figure 2-21, Kit with non-specified components.

Examples of business scenarios requiring a GTIN change:

Figure 2-18 Adding a component to a kit

Figure 2-19 Removing a component from a kit

Figure 2-20 Kit with specified components

Example of a business scenario not requiring a GTIN change:

Figure 2-21 Kit with non-specified components


2.8 Pack/case count


This rule deals with pre-defined trade item groupings with pre-defined content. Changing the count of pre-defined trade items contained in a pack or case (i.e., a trade item grouping) requires new GTINs to be allocated at the changed level and at all affected levels above it. Changing the number of cases in a pre-defined pallet configuration requires a new GTIN.

Level of GTIN allocation

■ A unique GTIN is allocated at the lowest changed level and at each existing level above it.

Examples of business scenarios requiring a unique GTIN at a higher packaging level (e.g., pack, case, pallet):

■ The number of trade items in a case changes from 8 to 12; a unique GTIN is required to identify the case.

■ The number of trade items on a pallet changes from 12 to 16; a unique GTIN is required to identify the pallet.

Additional information

See section 2.1.2 for GTIN allocation for eaches, lower-than-each levels, single units and higher packaging levels.

Related guiding principles:

GTIN rule name

Is there an expectation by stakeholders (e.g., healthcare providers, consumers, patients, regulators and/or trading partners) to distinguish the changed/new product from the prior/old product?

Is there a regulatory or liability requirement to disclose the change to consumers and/or trading partners?

Is there a substantial change affecting the supply chain (e.g., how the trade item is shipped, stored, received or handled in a clinical setting)?

Pack/case count

2.8.1 Pallet as a trade item

This rule applies when a pallet is a trade item and a GTIN is used to identify it for ordering and invoicing. In that case, the GTIN allocation rules, including those for packaging levels, apply.

Changing the number of cases in a pre-defined pallet configuration requires a new GTIN. The pallet layout does not affect the GTIN allocation of the trade items packed on the pallet, so a change to the pallet's GTIN does not necessarily change GTINs at lower packaging levels.

Level of GTIN allocation

■ When the pallet is an orderable item, a unique GTIN is allocated to each pallet carrying a different number of cases.

Example of a business scenario requiring a GTIN change:

Figure 2-17 Ordering an additional or new pre-defined pallet requires a different GTIN.

Additional information

■ A GTIN is only needed when the pallet is a trade item (i.e., it can be priced, ordered or invoiced).

■ When the pallet is a logistic unit (e.g., for shipping, transport, storage), it is identified with a Serial Shipping Container Code (SSCC).

■ For more information on logistic labels and the SSCC, see the GS1 General Specifications.


2.7 Time-limited or promotional products


A promotion is typically a short-term modification to the way a product is presented.

A change to a product (including its packaging) promoted for a specific event or date requires the allocation of a new GTIN if the change affects the product's supply chain in order to ensure availability for the specified period.

Level of GTIN allocation

■ No GTIN change is required at the base unit level.

■ A unique GTIN is allocated for the time-limited promotional item at the existing packaging levels above the base unit.

Example of a business scenario requiring a unique GTIN at a higher packaging level (e.g., pack, case, pallet):

■ During the promotional period, a free trial item (not marked with its own GTIN) is attached to an existing product; the declared net content of the original product is unchanged, and packaging dimensions and gross weight change by no more than 20%.

Examples of business scenarios not requiring a GTIN change:

■ A "buy two, get one free" promotion.

■ The pattern on a bandage rotates quarterly. The patterns have no seasonal or time-critical association, and the product is treated as a running item.

Note: Any promotion affecting the product's content, or requiring a new regulatory filing, is considered a substantial change and must be allocated a new GTIN.

Additional information

■ For time-limited or promotional products, the GTIN at the trade item/base unit level need not change, but unique GTINs are required at higher packaging levels for tracking in the supply chain.

■ Local, national or regional regulations may require more frequent GTIN changes. Such regulations take precedence over the rules in the GTIN allocation rules for healthcare.

Related guiding principles:

GTIN rule name

Is there an expectation by stakeholders (e.g., healthcare providers, consumers, patients, regulators and/or trading partners) to distinguish the changed/new product from the prior/old product?

Is there a regulatory or liability requirement to disclose the change to consumers and/or trading partners?

Is there a substantial change affecting the supply chain (e.g., how the trade item is shipped, stored, received or handled in a clinical setting)?

Time-limited or promotional products


2.6 Primary brand


The primary brand is determined by the brand owner and is the brand most recognisable to healthcare providers or patients; it can be represented as a logo and/or wording, registered mark or trademark.

A change to the primary brand appearing on a trade item requires the allocation of a new GTIN.

Level of GTIN allocation

■ The GTIN changes at the trade item/base unit level, or at the lower-than-each level where applicable.

■ A unique GTIN is allocated at each existing packaging level above the trade item/base unit.

Example of a business scenario requiring a GTIN change:

A company's primary brand name changes from "Healthcare Products Inc." to "Cutting-Edge Healthcare Products".

Additional information

Co-branding: the practice of a company applying a second brand (the "co-brand") under a contractual agreement with the original brand owner.

■ The company that owns the co-brand is responsible for GTIN allocation.

■ The applied co-brand should be prominent and clearly visible to consumers, establishing the product's association with the co-brand, which is then regarded as the "primary brand" of the co-branded product.

Note: The contractual relationship may stipulate that the "primary brand" is not the co-brand, in which case the first, original brand should be displayed prominently on the packaging. In that case, the original brand owner is responsible for allocating the GTIN.

Distributor: a product for which an agreement exists between the brand owner and the distributor named on the label. The brand owner remains responsible for GTIN allocation; adding a "distributed by" statement to the label does not require a new GTIN.

Note: The "distributed by" statement must not include any registered mark or trademark and may only be rendered as plain text.

Private label: a product for which an agreement exists between the original manufacturer and the brand owner named on the label. The brand owner is responsible for GTIN allocation, keeping brand and GTIN allocation consistent.

Related guiding principles:

GTIN rule name

Is there an expectation by stakeholders (e.g., healthcare providers, consumers, patients, regulators and/or trading partners) to distinguish the changed/new product from the prior/old product?

Is there a regulatory or liability requirement to disclose the change to consumers and/or trading partners?

Is there a substantial change affecting the supply chain (e.g., how the trade item is shipped, stored, received or handled in a clinical setting)?

Is there a substantial impact on the supply chain and trading partners (e.g., how the product is shipped, stored, received or handled in a clinical setting)?


2.5 Adding or removing a certification mark


There are many instances of certification marks in the healthcare industry. A certification mark is a mark, logo or wording declaring a product's conformance to a set of regulatory standards (e.g., the European CE mark). When a product change results in the adoption of a certification mark not previously present on the packaging or on the product itself, a new GTIN should be allocated if the certification mark is significant for certain markets. A key principle of GTIN allocation is that the GTIN uniquely identifies the product and its packaging configuration.

Adding a new certification mark to, or removing an existing one from, the packaging (e.g., the European CE mark) requires a new GTIN if it has a significant impact on regulators, trading partners or end consumers.

Level of GTIN allocation

■ The GTIN change occurs at the base unit level.

■ A unique GTIN is allocated at each existing packaging level above the base unit.

Example of a business scenario requiring a GTIN change:

A certification mark appears on, or changes in, the product labelling as a result of a national licence or registration, affecting global distribution channels; this must be communicated to trading partners and therefore requires a GTIN change.

Figure 2-16 Adoption of certification marks — new GTIN

Note also that if a certification mark is added to enable sale of the product in a new country/market, with no impact on the countries/markets where it was previously sold, a new GTIN is not necessary.

Additional information

The brand owner is responsible for internal control of its inventory and any returns systems. For such systems, and for the logistics of starting and stopping stock, it is important to be able to distinguish "new" from "old" product. Where this can be achieved effectively using, for example, batch/lot or model numbers, and the external supply chain is unaffected, a new GTIN is not required in the scenarios above.

Note: If this approach is taken, be aware of target market, regulatory and customer requirements.

Related guiding principles:

GTIN rule name

Is there an expectation by stakeholders (e.g., healthcare providers, consumers, patients, regulators and/or trading partners) to distinguish the changed/new product from the prior/old product?

Is there a regulatory or liability requirement to disclose the change to consumers and/or trading partners?

Is there a substantial change affecting the supply chain (e.g., how the trade item is shipped, stored, received or handled in a clinical setting)?

Adding or removing a certification mark


2.4 Dimensional or gross weight change


A change of more than 20% in physical dimensions along any axis (e.g., height, width, depth), or in gross weight, requires the allocation of a new GTIN.

Note: For changes below 20%, the brand owner may allocate a new GTIN at its discretion.

Level of GTIN allocation:

■ GTIN allocation occurs at the trade item or base unit level.

■ A unique GTIN is allocated at each existing packaging level above the trade item/base unit level.

Examples of business scenarios requiring a GTIN change:

■ Due to a packaging material change, the product's gross weight increases by 50%, from 0.34 kg (0.75 lb) to 0.51 kg (1.125 lb).

■ The orientation of a case or pallet changes (with no change in the number of trade items), altering the dimensions along one or more axes.

■ To reduce the variety of folding carton styles, a 47 × 18 × 127 mm folding carton is changed to 62 × 20 × 115 mm.

Figure 2-15 Dimensional or gross weight change

Note: The 20% change applies to each individual axis, not to the cube/volume.
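
The per-axis check can be illustrated with the folding-carton and gross-weight figures from the examples above (the function and parameter names are illustrative, not part of the standard):

```python
def needs_new_gtin(old_dims_mm, new_dims_mm, old_weight_g, new_weight_g,
                   threshold=0.20):
    """Apply the 20% rule to each individual axis (height, width, depth)
    and to gross weight - never to the combined volume."""
    def changed(old, new):
        return abs(new - old) / old > threshold
    # any single axis exceeding the threshold triggers a new GTIN
    axis_change = any(changed(o, n) for o, n in zip(old_dims_mm, new_dims_mm))
    return axis_change or changed(old_weight_g, new_weight_g)
```

For the carton example, 47 mm to 62 mm is roughly a 32% change on one axis, so a new GTIN is required even before considering the other axes or the weight; changes of 10% on every axis and on weight would not trigger the rule.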

Additional information:

■ This part of the standard applies only to changes in product dimensions and gross weight. Any change to declared net content is governed by the declared net content rules in section 2.3.

■ Accumulating changes without changing the GTIN in order to stay under the 20% threshold is not acceptable. All dimensional changes should be communicated to trading partners. Accumulated changes can cause problems for trading partners and impede the flow of product.

■ For a consistent, repeatable process for determining the measurements of a given product, see the GS1 Package Measurement Rules Standard.

Related guiding principles:

GTIN rule name

Is there an expectation by stakeholders (e.g., healthcare providers, consumers, patients, regulators and/or trading partners) to distinguish the changed/new product from the prior/old product?

Is there a regulatory or liability requirement to disclose the change to consumers and/or trading partners?

Is there a substantial change affecting the supply chain (e.g., how the trade item is shipped, stored, received or handled in a clinical setting)?

Dimensional change

Monday, 23. August 2021

DIF Medium

DIF Grant #1: JWS Test Suite


The Decentralized Identity Foundation announced recently a new mechanism for rewarding and funding work in the common interest of its membership, the DIF Grants Program. The Steering Committee ratified an addendum defining these collaborations between a specific DIF working group and a grant sponsor, and this shiny new tool sits ready to be debuted in the toolkit available to DIF community work. DIF’s work is leveling up with this new mechanism for transparently funding work that supports DIF’s mission and benefits the membership as a whole.

The first such grant was initiated by Microsoft, which has a long history of collaborations in DIF. Of the various modalities described by the grant program, the Claims and Credentials Working Group will be overseeing a new work item, open to all DIF members, to create and harden a JWS test suite, with this grant funding a lead editor to drive the work and keep it to a pre-determined timeline, paid upon stable and complete release.

History

Pamela Dingle, co-chair of the Interoperability WG, identified a recurring theme in conversations about interoperability across the great “JSON”/”JSON-LD” divide: while considerable cross-vendor, cross-stack alignment had happened over the last years to address many other aspects of credential exchange and identifier resolution, “translation issues” and varying interpretations of the JSON sections of the VC data model specification lead to divergent ways of structuring, defining, signing, and parsing VCs in JWT form. The kinds of signature suite definitions that define Linked Data Proofs made strange bedfellows with the in-built mechanisms of JWT, which were hardened and commoditized earlier. This results in a slightly “balkanized” landscape of VC-JWTs that make different concessions to the expectations of JSON-LD-native parsers and systems.

Initially, the idea of iterating one or more foundational specifications seemed a natural solution: a few key sections of any specification could be made more explicit, more normative, or less ambiguous. But just as the plural of anecdote is not data, so too is the plural of specifications not disambiguation. Another tack was chosen: implementers of the existing specification would come together and compare their running code to identify every difference and incompatibility, defining a test suite that interpreted the current specification. They might still produce a list of changes they would like to see in a future specification, and that list might contain some contentious items that take a long time to align. But in the meantime, they would have a practical roadmap to alignment and a benchmark for conformance to drive interoperability amongst each other and clarity for external interoperability.

A signature suite that spans two worlds

The specification under test in this new work item is JSON Web Signature 2020 (“JWS2020” for short), a CCG signature suite that defines how the signature mechanisms and data structures native to “detached signature” JWS tokens can be parsed as Linked Data Proofs. The signature mechanisms of the JWT world and the LDP world are quite different – bridging them requires a deep understanding of both: differing security models, canonicalization and serialization complexities, explicit reliance on IANA registries, URDNA dataset normalization, and the like. Advanced topics, even for this blog!
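
For readers new to the mechanics: a detached JWS carries an empty payload segment between its two dots, and the payload travels separately, bound to the signature via the `b64: false` header parameter of RFC 7797. The sketch below illustrates only that mechanism; it uses HS256 and the Python standard library so it stays self-contained, whereas JWS2020 itself pairs detached JWS with asymmetric key types, and all names here are illustrative assumptions.

```python
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def detached_jws(payload: bytes, key: bytes) -> str:
    """Produce a compact JWS with a detached payload: the middle segment
    is empty, and 'b64': False signals the raw payload bytes are signed
    without base64url-encoding them first."""
    header = b64url(json.dumps(
        {"alg": "HS256", "b64": False, "crit": ["b64"]},
        separators=(",", ":")).encode())
    signing_input = header.encode() + b"." + payload
    sig = b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    return f"{header}..{sig}"          # note the empty payload segment

def verify_detached(jws: str, payload: bytes, key: bytes) -> bool:
    header, middle, sig = jws.split(".")
    if middle:                         # payload segment must be empty
        return False
    expected = b64url(hmac.new(key, header.encode() + b"." + payload,
                               hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)
```

Verification recombines the header with the externally supplied payload, so any change to the credential bytes invalidates the signature even though they never appear in the token.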

Understanding such a specification on a deep intellectual level and understanding its design decisions requires a firm grasp on all these complexities spanning two very different engineering traditions. Implementing the specification, however, should not; to be blunt, our community cannot afford such understanding gating off successful implementation, particularly if industrial adoption of decentralized identity is a shared goal. Before the specification gets iterated, it needs to be testable, and even more implementable than it already is.

This is where a test suite offers a pedagogical tool for first-time implementers, as well as a negotiation tool for seasoned veterans and large corporations managing vast roadmaps and production development pipelines. While many DIF members do not encode VCs in JWT form, this grant is clearly in the common interest because mainstream adoption will indisputably benefit from a clearer, more explicitly-defined, and definitively-testable “normative VC-JWT”.

Selection Process

From 26 July 2021 to 9 August 2021, the chairs fielded applications through a Google form and submitted all valid applications to GitHub for posterity. The selection criteria for the work item lead were as follows:

familiarity with JWS mechanics and common JOSE libraries/best-practices
experience with automated testing, vector design, and test scripts/suites
familiarity with LD Proofs and signature suites generally
familiarity with JSON/JSON-LD interop problems, at the semantic, signature-verification, and representation/parsing levels

Grantee Announcement

On 12 August 2021, the Claims & Credentials (C&C) Working Group at DIF gladly announced Transmute Industries as the recipient of the first DIF Grant, to provide a JWS test suite for Verifiable Credentials. One of the chairs, Martin Riedel, summarized the chairs’ decision thusly: Transmute has been a thought leader in the SSI space for years and uniquely fits the requirement profile laid out in the grant announcement.

Orie Steele, the CTO of Transmute, is the initiating author of, lead editor of, and main contributor to the (LD-) JSON Web Signature 2020. Transmute has a strong background in TDD and is a strong proponent of delivering test vectors with specifications in order to achieve greater implementation-level interoperability. Examples include Test Vectors in Sidetree.js and did-key.js. The company already provides integrated libraries to support (LDS)-VC and JWT-VC side-by-side, demonstrating their familiarity with both representations. See vc.js. Lastly, Transmute has a strong track record of making specifications and libraries accessible to the general public in a variety of deployed web-projects (https://did.key.transmute.industries/, https://wallet.interop.transmute.world/ and others…)

Therefore we regard Transmute Industries as the ideal candidate to provide a comprehensive JWS test suite for LDS-VC and JWT-VC and support the community interactions around this project within DIF’s C&C Group.

Next Steps

The new work item will be announced and regular meetings initiated at the 23 August meeting of C&C – if you are interested in participating, please attend and/or drop a note in the C&C Slack channel.

DIF Grant #1: JWS Test Suite was originally published in Decentralized Identity Foundation on Medium, where people are continuing the conversation by highlighting and responding to this story.

Saturday, 21. August 2021

OpenID

Implementer’s Drafts of Two SSE Specifications Approved


The OpenID Foundation membership has approved the following Shared Signals and Events (SSE) specifications as OpenID Implementer’s Drafts:

OpenID Shared Signals and Events Framework Specification
OpenID Continuous Access Evaluation Profile

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. These are the first SSE Implementer’s Drafts.

The Implementer’s Drafts are available at:

https://openid.net/specs/openid-sse-framework-1_0-ID1.html
https://openid.net/specs/openid-caep-specification-1_0-ID1.html

The voting results were:

Approve – 61 votes
Object – 0 votes
Abstain – 8 votes

Total votes: 69 (out of 303 members = 22.7% > 20% quorum requirement)

— Michael B. Jones – OpenID Foundation Board Secretary

The post Implementer’s Drafts of Two SSE Specifications Approved first appeared on OpenID.

Kantara Initiative

Lexisnexis risk solutions gets kantara certification for risk defense platform


LexisNexis’s digital ID platform has been vetted and approved by the Kantara Initiative, receiving the NIST SP 800-63 rev.3 Class of Approval for its digital ID and biometric solutions. Click below to see the full publication.
https://blog.executivebiz.com/2021/08/lexisnexis-risk-solutions-gets-kantara-certification-for-risk-defense-platform/
https://www.biometricupdate.com/202108/lexisnexis-digital-id-platform-meets-nist-standard-according-to-kantara-compliance-test

The post Lexisnexis risk solutions gets kantara certification for risk defense platform appeared first on Kantara Initiative.

Friday, 20. August 2021

Ceramic Network

Ceramic + Chainlink VRF: the toolset for building more dynamic NFTs

Learn how to create NFTs with evolving content, randomized drops, and owner-remixable content.

NFTs let anyone provably own a unique digital property. This technology is revolutionizing collectibles, digital art, gaming, media, and even science. However, to date, the type of property that could be managed through NFTs has been limited to constant, static, hardcoded properties. This is great for art, collectibles, and other use cases where immutability is a feature.

Most types of property benefit from evolution, mutability, or gamification. Together, Ceramic and Chainlink give creators the tools to create more dynamic NFT experiences. Ceramic allows for hosting more flexible forms of content, while Chainlink can supply external inputs that change and gamify NFTs, creating what’s called dynamic NFTs.

With these new tools, NFT-based experiences can expand to include:

Evolving art and collectibles, changed by the original creator, current owner, or new data like the weather
Collectible packs of NFTs with special edition traits randomly chosen upon opening
Property that gains value from each owner depending on how they use it, like an in-game item that keeps its own history of how it’s used in gameplay
Ownership rights that let the current NFT holder change the content associated to the NFT, for example for land parcels in AR, VR, or games

“When you buy a physical piece of land, the house doesn’t disappear. There’s a foundation there,” says Geo Web founder Graven Prest. “We need to replicate that for NFTs in the digital world.”

CERAMIC is a sovereign data network for creating, storing and sharing data under the direct control of any identity. Because data in Ceramic is mutable, and mutable only by the controller of that data object, it provides not only a fully decentralized option for storing NFT content and metadata but also the first way to attach dynamic content to an NFT, with strict permissions and access control rights. Because Ceramic has a cross-chain identity system built in, this can be deployed easily for any blockchain.

There are two ways to use this in NFT experiences. First, Ceramic streams can be updated only by the original creator. This would let artists sell NFTs and then update and enhance their work over time. The history is preserved, so this mutability is purely a gain. Athletes could sell NFTs that evolve as their stats do.

NFT:DID, a recently released feature in Ceramic, gives the current owner of an NFT exclusive rights to update content. For example, Geo Web is using Ceramic to store content displayed in their AR metaverse. The owner of each parcel of land, represented by an NFT, can change the content being displayed. This could be used for owner-only logbooks, storing critical metadata or permissions, profile pics with changeable attire, collective playlist creation, and more.

CHAINLINK provides more dynamic ownership models for NFT experiences based on reliable external data feeds and trust-minimized off-chain computations. Most notably, Chainlink Verifiable Random Function (VRF) generates a tamper-proof and auditable source of randomness, which users can reference to offer fair drops of limited edition NFTs, evolve NFTs over time via pure entropy, and retain rarity odds in NFT distribution models.

Given the historical security and reliability of Chainlink VRF, we recommend that any developer launching NFTs on Ceramic who needs randomness use Chainlink VRF. Ultimately, this will help them create a more exciting, transparent, and fraud-proof user experience, as well as help gamify NFTs by enabling them to change over time.

About Chainlink

Chainlink is the industry standard oracle network for powering hybrid smart contracts. Chainlink Decentralized Oracle Networks provide developers with the largest collection of high-quality data sources and secure off-chain computations to expand the capabilities of smart contracts on any blockchain. Managed by a global, decentralized community, Chainlink currently secures billions of dollars in value for smart contracts across decentralized finance (DeFi), insurance, gaming, and other major industries.

Chainlink is trusted by hundreds of organizations, from global enterprises to projects at the forefront of the blockchain economy, to deliver definitive truth via secure, reliable oracle networks. To learn more about Chainlink, visit chain.link, subscribe to the Chainlink newsletter, and follow @chainlink on Twitter. To understand the full vision of the Chainlink Network, read the Chainlink 2.0 whitepaper.

Solutions | Docs | Discord | Reddit | YouTube | Telegram | GitHub

About Ceramic

Ceramic is a network for sovereign data that lets developers easily build rich applications on top of blockchains and IPFS. Ceramic's permissionless data streaming network stores streams of information and ever-changing files directly on the decentralized web, eliminating the need for a backend. Because all data is managed directly by cross-chain identities, it’s easy to discover and share content across application boundaries.

Ceramic is in use by hundreds of applications building the future of finance, coordination, and Web3. To learn more about Ceramic, visit ceramic.network, follow @ceramicnetwork on twitter, or join the developer community.


Website | Twitter | Discord | GitHub | Documentation | Blog | IDX Identity


Own Your Data Weekly Digest

MyData Weekly Digest for August 20th, 2021

Read in this week's digest about: 22 posts, 2 questions, 1 Tool

Thursday, 19. August 2021

We Are Open co-op

Open standards should be developed openly

Equity, innovation, and community gardens

Imagine a community garden with abundant fruit and vegetables for anyone to come and pick and consume. Now imagine a factory creating standardised parts to be used by manufacturers. The two metaphors feel quite different, don’t they?

The first, the garden metaphor, feels welcoming. You can envisage suggestions being made about which fruit and vegetables to grow to meet the needs of the community. And it feels natural that people would be able to opt-in to come and work in the garden for the benefit of all.

The second, the factory metaphor, feels less welcoming. In fact, unless you’re an employee of the factory, or have been sanctioned for entry, then you don’t get to see how the standardised parts are made. And if you’re not a manufacturer, then why would you get a say in how they’re made?

Metaphors are imperfect tools for thought but do help us grasp at deeply-held assumptions about how groups of people should interact with one another. In WAO’s work with various organisations, we’ve found the garden metaphor comes up time and time again as one to strive towards. Too often, though, organisations treat people as ‘consumers’ within a ‘market’ — rather than community members within an ecosystem.

Standards are extremely important to our everyday modern lifestyles. They’re the reason that devices don’t explode when you plug them into electrical outlets; they’re the reason that you can go to shops and buy clothes and shoes that fit; they’re the reason you can visit any website using any web browser.

It’s important that experts help develop standards. For example, we want people who know a lot about electrical engineering designing standards for electrical outlets. But it’s no good having standards that no-one uses, or that they use begrudgingly. That’s why it’s crucial to have end-user input into standards development.

Different types of standards, of course, require different levels of input. If we’re talking about electrical wall sockets, there’s a level of mandating that applies: in this country, if you want to plug things in, it works like this. But when it comes to things that are international and have to work in a variety of contexts, then it’s important that standards aren’t limited to only what a small group of people are familiar with.

For people who have never really worked openly, who have always interacted within the confines of closed working groups (with people who look and talk a lot like them, come from similar backgrounds, and want similar things in life) end-user requests might come as a bit of a shock. Insiders might describe these requests as ‘unhelpful’ or dismiss the people themselves as ‘not understanding the bigger picture’.

The reason that the recent IMS debacle has rankled the Open Badges community so much is the stark contrast between the way the standard was developed by Mozilla and the way it is being stewarded by IMS. The only reason the community have any insight into the development process, which is otherwise open only to paying members, is because of a requirement from Mozilla that IMS retain an open repository.

At that repository, Kerri Lemoie has submitted a proposal to align Open Badges with the W3C standard for Verifiable Credentials. Unlike IMS working groups, anyone can turn up for W3C calls — as a number of Open Badges community members have done. The opposition to Kerri’s proposal by a faction of IMS members has not been explained clearly and much of the discussion is happening behind closed doors.

This is not how open standards should be developed. It might take time to develop standards in the open. You may have to deal with people different to you and that you don’t particularly understand or like. It’s also possible that you have to consider use cases outside your own experience. But in the end, bringing a wealth of diversity and experience to the table is the exact thing that brought Open Badges its initial success.

Western capitalist society conditions us to understand openness as a danger. However, as the Open Badges community (and indeed other Open Source communities) have shown time and time again, openness leads to innovation. We’re all here to help make standards better — for everyone, everywhere.

Returning to the factory metaphor, consumers don’t usually have a direct say in what is produced by manufacturers. That is in contrast to the community garden, where even those people who are unable to work the land can have a say in what is grown. Openness leads to equity.

Open standards should be developed openly because not enough people work to ensure that equity is central to innovation and development. We believe that openness is an attitude, and one which bears fruit over time from which everyone can benefit.

Open standards should be developed openly was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 18. August 2021

OpenID

Notice of Vote for Proposed FAPI Grant Management Implementer’s Draft


The official voting period will be between Monday, August 30, 2021 and Monday, September 6, 2021, once the 45-day review of the specification has been completed. For the convenience of members, voting will actually begin on Monday, August 23, 2021 for members who have completed their reviews by then, with the voting period ending on Monday, September 6, 2021.

The Financial-grade API (FAPI) working group page is https://openid.net/wg/fapi/. If you’re not already a member, or if your membership has expired, please consider joining to participate in the approval vote. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration.

The vote will be conducted at https://openid.net/foundation/members/polls/246.

– Michael B. Jones, OpenID Foundation Secretary

The post Notice of Vote for Proposed FAPI Grant Management Implementer’s Draft first appeared on OpenID.

EdgeSecure

System Administrator: Professional Service Specialist 2 (Contract Services)


Position Summary:
On behalf of our client, Kean University, Edge is seeking a highly motivated professional to serve as System Administrator. In this role, the incumbent oversees the day-to-day operations of Blackboard Learn (LMS), handles the system-programming work associated with maintaining and expanding LMS automation and integration with third-party systems, and uses LMS System Administrator expertise to troubleshoot user support problems, interacting with vendor and other technical staff to plan and devise problem resolutions. The Blackboard Systems Administrator supports course development projects, including design, production, and testing of online courses and programs. This position works with key internal and external stakeholders across the higher education community to further the vision and plan for the application of the LMS.

Duties and responsibilities will include:

– Manages the day-to-day operations of the Learning Management System
– Works directly with the LMS hosting provider to ensure the LMS remains continuously operational
– Provides multi-tier support for all escalations received from Blackboard
– Provides leadership and works with functional areas to implement enhancements to software by designing and testing new features and modifications of existing functionality
– Works with external vendors and functional areas to create and maintain new and existing integrations
– Develops, manages, and coordinates integration of the Learning Management System with external systems (Starfish, Pearson, McGraw-Hill, Cengage, VoiceThread, etc.) to support system-wide initiatives and transformations
– Coordinates with vendors to complete necessary software fixes, refreshes, upgrades, and planning for current and future initiatives
– Serves as subject matter expert on technical issues
– Represents Technology Solutions in workgroups, committees, focus groups, etc. for LMS product functionality
– Performs other duties as assigned

Qualifications:
Required skills/qualifications:

– Computer/Internet savvy, comfortable operating in several applications simultaneously
– Strong technical acumen and problem-solving skills
– Stakeholder-focused mindset
– Ability to work in a team environment with minimal supervision
– Excellent written and oral communication skills, with strong stakeholder service and interpersonal skills
– Knowledge of Windows 10 and Apple products
– Knowledge of Learning Management Systems (Blackboard and Canvas), including their functional and technical requirements in design and implementation

Qualifications:
Preferred skills/qualifications:

– Bachelor’s degree
– 3+ years previous experience in the education industry and with e-learning technologies
– Relevant technical certifications
– Candidates able to provide their own equipment are preferred, but this is not a requirement for the position

Education and Experience:
Preferred:

A graduate of a computer science/technology, business science, or certain social science degree programs (Associate’s or higher)

Compensation:
This is initially a 1099 contractor position. The rate of payment will be a monthly amount of $5,833.00 for a 35-hour work week, on-site at Kean University.

Apply

The post System Administrator: Professional Service Specialist 2 (Contract Services) appeared first on NJEdge Inc.


Technician Specialist – Professional Service Specialist 3 (Contracted Services)


Position Summary
On behalf of our client, Kean University, Edge is seeking a highly motivated professional to serve as Technical Support Specialist.  In this role, the incumbent will help customers with account issues for Blackboard Learn (LMS) and provide exceptional service. 

Primary Responsibilities Will Include:

– Must have strong people skills to build a genuine connection with customers in a friendly and professional manner and provide quality service
– Listening attentively to customer needs and concerns and resolving technical issues
– Navigating applications and troubleshooting medium to complex technical issues while researching solutions with ease
– Providing customer support through the following channels: phone calls, in-person, and web tickets
– Working in a structured environment for the duration of your allotted, full-time schedule, taking high-volume calls from customers

Qualifications:
Required Skills/Qualifications:

– Computer/Internet savvy, comfortable operating in several applications simultaneously
– Strong technical acumen and problem-solving skills
– Stakeholder-focused mindset
– Ability to work in a team environment with minimal supervision
– Excellent written and oral communication skills, with strong stakeholder service and interpersonal skills
– Knowledge of Windows 10 and Apple products

Education and Experience:
Preferred

– Previous experience in the education industry and with e-learning technologies
– Relevant technical experience
– Graduate of a computer science/technology degree program (associate degree or higher)
– Candidates able to provide their own equipment are preferred, but this is not a requirement

Compensation:
This is initially a 1099 contractor position. The rate of payment will be a monthly amount of $4,333.00 for a 35-hour work week, on-site at Kean University.

Apply

The post Technician Specialist – Professional Service Specialist 3 (Contracted Services) appeared first on NJEdge Inc.

Tuesday, 17. August 2021

Digital ID for Canadians

Spotlight on the CREA


1. What is the mission and vision of the CREA?

Our mission is to support REALTORS® in service to their clients through the provision of services and standards that enrich the REALTOR® profession and reputation and to advocate for public policy that ensures Canadians can fulfill their home and property rights and aspirations. Our vision is that REALTORS® are the chosen, trusted and respected professionals for consumer real estate needs.

2. Why is trustworthy digital identity critical for existing and emerging markets?

As technology continues to develop and touch all aspects of life, it’s more important than ever to create secure proof of identity and authentication in the digital space that enables individuals, organizations and governments to have confidence in their digital interactions. Ultimately, digital identity is critical to ensure people can trust the services they receive online and help reduce occurrences of cyber fraud that can erode this trust.

3. How will digital identity transform the Canadian and global economy? How does your organization address challenges associated with this transformation?

Digital identity is quickly becoming a key feature in the 21st century economy, as innovation and economic growth are, for the most part, happening online. Developing a strong digital identity framework in Canada will help build global trust and could lead to increased capacity for international collaboration in many different sectors.

4. What role does Canada have to play as a leader in this space?

Canada’s forward thinking and focus on the protection of personal information provide it with a unique opportunity to become a leader in the development and adoption of digital identity and authentication technologies. As technology companies continue to choose Canada as the destination to develop their products and services, Canada is well-positioned to be a strong international voice in this space.

5. Why did your organization join the DIACC?

As the real estate industry continues to shift online and adopt digital tools, the need for trusted digital identity for both REALTORS® and their clients has become essential. CREA joined the DIACC to become a part of the ongoing discussions and valuable work happening on this subject, with the goal of being part of the solution to this important issue.

6. What else should we know about your organization?

The Canadian Real Estate Association (CREA) is one of Canada’s largest single-industry Associations. Our membership includes more than 135,000 real estate brokers, agents and salespeople, working through 78 real estate Boards and Associations across Canada.


Ceramic Network

The Convo Space: Decentralizing the Conversation Layer of the Internet

Leveraging Ceramic to enable conversations and communities to flow across the Web.

Thanks to Billy Leudtke of Layer 0 Ventures for drafting this article.

The Convo Space is giving Web3 developers an easy way to add comments, content and conversations to their decentralized applications. Now live on Ceramic mainnet, The Convo Space enables content from across applications to be linked to a user’s unique decentralized identity, creating a shared, decentralized layer for Web3 conversations. The easy-to-use API abstracts away the complexity of Web3, paving the way for the defragmentation & democratization of the ‘conversation’ layer of the internet.

Check out https://theconvo.space/ and get started today!

Fragmented social context

Today, our conversations are fragmented across applications and platforms. Threads of conversation we have with our friends, family, coworkers and peers flow unconnected across a plethora of mediums such as text message, WhatsApp, Telegram, Discord, Facebook, Instagram and others, and each context of interaction is tied to a different one of our many application-specific digital identities.

These silos destroy our continuity online, forcing us to:

– Rediscover and reconnect with contacts on each new platform
– Carry conversations across platforms, eliminating context
– Choose applications based on access to our social graph, rather than purely for the product itself
– Relinquish control of our data, including the value it generates
– Trust platforms to protect our privacy

Platforms don’t talk, people do. Web3 promises an alternative model, with data, identity and reputation separated from individual applications and easily shared across platforms. This information is always controlled by the users, so they can bring reputation and data along when moving to different applications and interfaces, instead of starting from scratch.

The Convo Space: de-platformed conversations

The Convo Space is a decentralized conversation protocol with data separated from the application/interface, allowing a unified conversation layer that can work across applications and solve the fragmentation issue. Through this separation, conversations can continue to be generated organically on the incumbent platforms, while The Convo Space unifies that content from across platforms to create a cohesive conversation for the user.  

The team’s first product is a commenting solution similar to a decentralized Disqus. The developer product offers three ways to integrate:

– A 1-line integration that embeds an iframe for an out-of-the-box comments section that can be added to any website
– React components that let developers customize their own decentralized comments product
– APIs to build a fully custom social experience, without having to worry about all the complexities of Web3

Developers can easily leverage any of the above methods to begin building on the decentralized conversation layer of the internet. And because all the data on The Convo Space is associated with users rather than applications, any application can build on the same conversation graph and content.
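As a rough illustration of the iframe approach described above, an embed typically boils down to pointing a single iframe at a URL that identifies the conversation thread. The sketch below is hypothetical: the origin (`example-comments.invalid`) and the `threadId`/`theme` parameters are invented for illustration and are not The Convo Space’s actual embed endpoint.

```typescript
// Hypothetical sketch of a "1-line" comments embed. The embed origin and
// query parameters are illustrative only, not a real API.
export function buildEmbedUrl(
  threadId: string,
  theme: "light" | "dark" = "light"
): string {
  // threadId scopes the conversation to one page or post; theme is a
  // typical customization knob an embed might expose.
  const params = new URLSearchParams({ threadId, theme });
  return `https://example-comments.invalid/embed?${params.toString()}`;
}
```

A host page would then add a single tag such as `<iframe src="…" style="width:100%;border:0">` with this URL, which is what makes it a one-line integration: all rendering and Web3 plumbing stay inside the frame.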

Built on decentralized content, organized around decentralized identities  

The Convo Space is intended to be fully decentralized, making sure this key social infrastructure is not confined to, or controlled by, any single app or platform. The solution pulls together the latest decentralized technologies to achieve this, including Ceramic, IPFS, Textile, Pinata, and decentralized identifiers (DIDs).

Comments are stored in ThreadsDB databases, built by Textile. This content is made available through Pinata’s IPFS-pinning service and backed up to Filecoin to ensure fully decentralized availability and redundancy.

To complete the solution, the product needed to associate the comment threads with users and give them control of their content. For this, they turned to Ceramic. IDX, the decentralized identity protocol built with Ceramic, gives The Convo Space an easy way to link multiple addresses and disparate digital identities into a single user identity.

This open, unified data network lays the foundation for rich identities that can be used across platforms. IDX also provides an easy way to associate specific comment threads with a user directly. Says Anudit Nagar, founder of The Convo Space:

Ceramic truly allows your data to flow and aggregate across multiple different chains - it's truly a one-stop solution for all your storage and access needs, in a truly decentralized manner. That is really future proof.

The platform lets users aggregate all their social conversations into a single place, under a single identity. This lets developers focus on building new and amazing user experiences and worry less about data management. For users, it means conversational data is never lost or trapped in a platform simply because their contacts and ongoing conversations are there. This bridge from Web2 to Web3 enables users to leverage their existing social communities to create more seamless and flexible experiences.
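The core idea of linking multiple addresses and disparate digital identities into a single user identity can be sketched conceptually as follows. This is not the IDX or Ceramic API, just a minimal model of the mapping: the DID value and address formats are invented for illustration.

```typescript
// Conceptual sketch (not the actual IDX API): several chain-specific
// addresses resolving to one decentralized identity.
interface LinkedIdentity {
  did: string;            // the user's decentralized identifier
  addresses: Set<string>; // accounts proven to belong to that DID
}

// Record a newly proven address against the identity, normalizing case so
// the same account is never linked twice.
export function linkAddress(identity: LinkedIdentity, address: string): LinkedIdentity {
  identity.addresses.add(address.toLowerCase());
  return identity;
}

// Resolve: any linked address maps back to the single DID, which is what
// lets comments made in different apps share one conversation graph.
export function resolveDid(identity: LinkedIdentity, address: string): string | undefined {
  return identity.addresses.has(address.toLowerCase()) ? identity.did : undefined;
}
```

In the real system this mapping is stored on the decentralized network rather than in application memory, so every application sees the same links.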

What’s Next for The Convo Space?

Today’s solution is just the first step towards an unbundled conversation layer for Web3.  The team is actively working on rolling out new features to further improve user engagement and experience. Decentralized governance of the protocol is also in the works, which will allow the community to eventually manage itself and set up standards for content moderation, among other things. The team is also working on a Metamask Snap which will allow conversations to live natively in a Metamask wallet.

The Convo Space is currently onboarding NFT marketplaces, DAO aggregation platforms and others to begin opening the commenting layer of the Web. Follow their progress on Twitter and read more here to learn about adding a social layer to your app.

Website | Twitter | Discord | GitHub | Documentation | Blog | IDX Identity


Velocity Network

Interview with Andrew Cunsolo, Vice President of Product Management, Jobvite

We sat down with Andrew Cunsolo, Vice President of Product Management, Jobvite, to chat about their involvement in the Velocity Network Foundation. The post Interview with Andrew Cunsolo, Vice President of Product Management, Jobvite appeared first on Velocity.

Monday, 16. August 2021

Oasis Open

Call for Consent for LegalRuleML Core Specification V1.0 as OASIS Standard

Defining a standard that is able to represent the particularities of the legal normative rules in a rich, articulated, and meaningful mark-up language. The post Call for Consent for LegalRuleML Core Specification V1.0 as OASIS Standard appeared first on OASIS Open.

The markup language for expressing legal norms is now presented to the membership for approval

The OASIS LegalRuleML TC members [1] have approved submitting the following Committee Specification 02 to the OASIS Membership in a call for consent for OASIS Standard:

LegalRuleML Core Specification Version 1.0
Committee Specification 02
06 April 2020

This is a call to the primary or alternate representatives of OASIS Organizational Members to consent or object to this approval. You are welcome to register your consent explicitly on the ballot; however, your consent is assumed unless you register an objection [2]. To register an objection, you must: 

1. Indicate your objection on this ballot, and 

2. Provide a reason for your objection and/or a proposed remedy to the project. 

You may provide the reason in the comment box or by email to the LegalRuleML TC on its comment mailing list. If you provide your reason by email, please indicate in the subject line that this is in regard to the Call for Consent. Note that failing to provide a reason and/or remedy may result in an objection being deemed invalid.

Details

The Call for Consent opens on 17 August 2021 at 00:00 UTC and closes on 30 August 2021 at 23:59 UTC. You can access the ballot at:

Internal link for voting members: https://www.oasis-open.org/apps/org/workgroup/voting/ballot.php?id=3641

Publicly visible link: https://www.oasis-open.org/committees/ballot.php?id=3641

OASIS members should ensure that their organization’s voting representative responds according to the organization’s wishes. If you do not know who your organization’s voting representative is, go to the My Account page at

http://www.oasis-open.org/members/user_tools

then click the link for your Company (at the top of the page) and review the list of users for the name designated as “Primary”.

Information about LegalRuleML Core Specification V1.0

Legal texts, e.g. legislation, regulations, contracts, and case law, are the source of norms, guidelines, and rules that govern societies. As text, such content is difficult to label, exchange, and process except by hand. In our current web-enabled world, where innovative e-government and e-commerce are increasingly the norm, providing machine-processable forms of legal content is crucial.

The objective of the LegalRuleML Core Specification Version 1.0 is to define a standard (expressed with XML-schema and Relax NG and on the basis of Consumer RuleML 1.02) that is able to represent the particularities of the legal normative rules with a rich, articulated, and meaningful mark-up language.  

LegalRuleML models: 

– defeasibility of rules and defeasible logic; 

– deontic operators (e.g., obligations, permissions, prohibitions, rights); 

– semantic management of negation; 

– temporal management of rules and temporality in rules; 

– classification of norms (i.e., constitutive, prescriptive); 

– jurisdiction of norms; 

– isomorphism between rules and natural language normative provisions; 

– identification of parts of the norms (e.g. bearer, conditions); 

– authorial tracking of rules. 

The TC received 4 Statements of Use from Livio Robaldo, Swansea University, CSIRO Data61, and CirSFID-AlmaAI [3].

URIs

The prose specification document and related files are available here:

HTML (Authoritative): 

https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/cs02/legalruleml-core-spec-v1.0-cs02.html

DOCX: 

https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/cs02/legalruleml-core-spec-v1.0-cs02.docx

PDF: 

https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/cs02/legalruleml-core-spec-v1.0-cs02.pdf

Distribution ZIP file

For your convenience, OASIS provides a complete package of the specification document and related files in a ZIP distribution file. You can download the ZIP file at:

https://docs.oasis-open.org/legalruleml/legalruleml-core-spec/v1.0/cs02/legalruleml-core-spec-v1.0-cs02.zip

Additional information

[1] OASIS LegalRuleML TC

https://www.oasis-open.org/committees/legalruleml/

Project IPR page

https://www.oasis-open.org/committees/legalruleml/ipr.php

[2] Comments may be submitted to the TC through the use of the OASIS TC Comment Facility as explained in the instructions located at https://www.oasis-open.org/committees/comments/index.php?wg_abbrev=legalruleml

Comments submitted to the TC are publicly archived and can be viewed at https://lists.oasis-open.org/archives/legalruleml-comment/

Members of the TC should send comments directly to legalruleml@lists.oasis-open.org.

[3] Statements of Use:

The Statements of Use are packaged in https://www.oasis-open.org/committees/download.php/68193/StatementofUse.zip

The post Call for Consent for LegalRuleML Core Specification V1.0 as OASIS Standard appeared first on OASIS Open.

Friday, 13. August 2021

Elastos Foundation

Elastos Bi-Weekly Update – 13 August 2021

...

Oasis Open

Invitation to comment on Common Security Advisory Framework v2.0

CSAF v2.0 is the definitive reference for the CSAF language, which supports creation, update, and interoperable exchange of security advisories among interested parties. The post Invitation to comment on Common Security Advisory Framework v2.0 appeared first on OASIS Open.

First public review of this draft specification ends September 12th

OASIS and the OASIS Common Security Advisory Framework (CSAF) TC are pleased to announce that Common Security Advisory Framework Version 2.0 is now available for public review and comment.

The Common Security Advisory Framework (CSAF) Version 2.0 is the definitive reference for the CSAF language which supports creation, update, and interoperable exchange of security advisories as structured information on products, vulnerabilities and the status of impact and remediation among interested parties.

The OASIS CSAF Technical Committee is chartered to make a major revision to the widely-adopted Common Vulnerability Reporting Framework (CVRF) specification, originally developed by the Industry Consortium for Advancement of Security on the Internet (ICASI). ICASI has contributed CVRF to the TC. The revision is being developed under the name Common Security Advisory Framework (CSAF). TC deliverables are designed to standardize existing practice in structured machine-readable vulnerability-related advisories and further refine those standards over time.

The documents and related files are available here:

Common Security Advisory Framework Version 2.0
Committee Specification Draft 01
05 August 2021

Editable source (Authoritative):
https://docs.oasis-open.org/csaf/csaf/v2.0/csd01/csaf-v2.0-csd01.md

HTML:
https://docs.oasis-open.org/csaf/csaf/v2.0/csd01/csaf-v2.0-csd01.html

PDF:
https://docs.oasis-open.org/csaf/csaf/v2.0/csd01/csaf-v2.0-csd01.pdf

JSON schemas:
Aggregator JSON schema:
https://docs.oasis-open.org/csaf/csaf/v2.0/csd01/schemas/aggregator_json_schema.json
CSAF JSON schema:
https://docs.oasis-open.org/csaf/csaf/v2.0/csd01/schemas/csaf_json_schema.json
Provider JSON schema:
https://docs.oasis-open.org/csaf/csaf/v2.0/csd01/schemas/provider_json_schema.json

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:
https://docs.oasis-open.org/csaf/csaf/v2.0/csd01/csaf-v2.0-csd01.zip
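To give a feel for the structured advisories the JSON schemas above describe, a minimal CSAF document might look like the following sketch. The advisory ID, publisher details, and dates are invented for illustration; the csd01 CSAF JSON schema linked above is the normative reference for which fields are actually required.

```json
{
  "document": {
    "category": "csaf_base",
    "csaf_version": "2.0",
    "title": "Example Security Advisory",
    "publisher": {
      "category": "vendor",
      "name": "Example Corp PSIRT",
      "namespace": "https://psirt.example.com"
    },
    "tracking": {
      "id": "EXAMPLE-2021-0001",
      "status": "final",
      "version": "1.0.0",
      "initial_release_date": "2021-08-05T00:00:00Z",
      "current_release_date": "2021-08-05T00:00:00Z",
      "revision_history": [
        {
          "date": "2021-08-05T00:00:00Z",
          "number": "1.0.0",
          "summary": "Initial release"
        }
      ]
    }
  }
}
```

Real advisories add product trees, vulnerability entries, and remediation status on top of this envelope; the point of the framework is that all of it is machine-readable and validatable against the published schema.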

A public review announcement metadata record [3] is published along with the specification files.

How to Provide Feedback

OASIS and the CSAF TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of our technical work.

The public review starts 14 August 2021 at 00:00 UTC and ends 12 September 2021 at 23:59 UTC.

Comments may be submitted to the TC by any person through the use of the OASIS TC Comment Facility which can be used by following the instructions on the TC’s “Send A Comment” page (https://www.oasis-open.org/committees/comments/index.php?wg_abbrev=csaf).

Comments submitted by TC non-members for this work and for other work of this TC are publicly archived and can be viewed at:
https://lists.oasis-open.org/archives/csaf-comment/

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries at least the same obligations as those of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1] applicable especially [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about the specification and the CSAF TC can be found at the TC’s public home page:
https://www.oasis-open.org/committees/csaf/

Additional references

[1] https://www.oasis-open.org/policies-guidelines/ipr

[2] https://www.oasis-open.org/committees/csaf/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr#Non-Assertion-Mode

[3] Public review announcement metadata:
https://docs.oasis-open.org/csaf/csaf/v2.0/csd01/csaf-v2.0-csd01-public-review-metadata.html

The post Invitation to comment on Common Security Advisory Framework v2.0 appeared first on OASIS Open.


Own Your Data Weekly Digest

MyData Weekly Digest for August 13th, 2021

Read in this week's digest about: 17 posts

Thursday, 12. August 2021

Me2B Alliance

Me2B 101 Flash Guide Series

The Me2B 101 Flash Guide Series is Now Available. Learn More



Measuring the Ethical Behavior of Technology at RSA Conference 2021


Me2B Alliance’s Executive Director Lisa LeVasseur and Board member Cam Geer presented “Measuring the Ethical Behavior of Technology” at the 2021 RSA Conference. In this session they shared an overview of the Me2B Alliance’s ethical “yardstick” for measuring technology, and the most current learnings from its application to websites and mobile apps. If you’re new to the Me2B Alliance, this presentation is an excellent overview of our Respectful Tech Specification and preliminary testing results.   

Experts from around the world gather at the RSA Conference to share insights on cybersecurity. This is an important platform for innovation and partnership in the respectful technology space. Check out the presentation video to learn about Me2B Alliance’s progress toward creating a safe and respectful world.

WATCH VIDEO

Trust over IP

Release of the Good Health Pass (GHP) Interoperability Blueprint


The Trust Over IP (ToIP) Foundation together with the Good Health Pass Collaborative (GHPC) today announced the release of the Good Health Pass (GHP) Interoperability Blueprint V1.0.0 (PDF). Produced by over 125 participating companies and organizations spanning global travel, health, cybersecurity, privacy, and government, the Blueprint is an urgently needed solution that describes how to unify the widely disparate set of digital vaccination certificate solutions on the market.

“Over the past several months, different vaccination certificate formats have been announced by at least a dozen different governments, health authorities, and industry consortia around the world, including the European Union’s Digital COVID Certificate and the World Health Organisation’s Digital Document of COVID-19 Certificates,” said John Jordan, Executive Director of the ToIP Foundation. “Each of these is good and valuable in its own right, however because they are designed to be digital health documents, they share more information than is necessary simply to prove one’s COVID-19 status for purposes of travel or entry to a venue. The Good Health Pass Blueprint was designed from the ground up to provide an international trust framework that addresses the need for a simple, secure, standard, privacy-preserving health pass that works anywhere you need to prove your health status, just like a mobile boarding pass works with any airline.”

The Good Health Pass effort began with the Good Health Pass Collaborative (GHPC) organized by ID2020, a non-profit organization focused on ethical digital identity. In February 2021, GHPC published the Good Health Pass Interoperability Blueprint Outline, which specified the key problems that needed to be solved and the core design principles that needed to be followed. GHPC then formed a partnership with the ToIP Foundation to launch the Interoperability Working Group for Good Health Pass. Working Group leadership and cross-industry expertise were also contributed by Linux Foundation Public Health (LFPH), particularly its COVID Credentials Initiative (CCI). This combined effort resulted in a fully open and transparent process that created the full Blueprint in under eight weeks.

After a public review period during June with stakeholders in air travel, government, healthcare, hospitality, and other affected sectors, the Blueprint was finalized in mid-July for final approval and publication. “Publication of the V1.0.0 Blueprint is just the first step in seeing interoperable privacy preserving digital health passes adopted in order to support people being able to gather together again with lower personal and public health risk,” said Kaliya Young, chair of the Working Group and Ecosystems Director at CCI. “Our next task is collaborating with real world implementers to fill in any remaining gaps to get to an interoperable system and working with LFPH and other partners to deliver open source code that can be deployed.”

Judith Fleenor, Director of Strategic Engagements at the ToIP Foundation, adds “The best news is that the Good Health Pass Blueprint does not compete with or replace any of the publicly announced COVID-19 health certificates. It is compatible with all of them—and others still to come. With the right software, all those health certificates can be ingested and verified in order to issue a Good Health Pass-compliant health pass. This focus on interoperability will make life much simpler and safer for both users and verifiers.”

The ToIP Foundation is especially proud of this effort because it demonstrates how applying the core principles of interoperable digital trust architecture solves global problems and builds confidence in solutions that have data integrity, portability, and confidentiality built-in. This is the fundamental motivation for creating the ToIP Stack, the ToIP Foundation’s model for developing privacy-preserving, interoperable, and decentralized digital identity solutions that form and sustain digital trust relationships.

The high-speed collaboration enabled by the Interoperability Working Group for Good Health Pass also illustrates the value of the antitrust and royalty-free intellectual property rights protections that all our Working Groups enjoy as a Linux Foundation (LF) project. “Good Health Pass is a textbook example of the kind of project the LF was formed to support,” said Brian Behlendorf, LF General Manager of Healthcare, Blockchain, and Identity. “Being able to bring together this many global experts to work so intensively in such a short period is unprecedented—and shows the kind of confidence that the LF has built in an open public collaboration process.”

We encourage you to review the Blueprint, endorse it, and adopt it—and to join the Interoperability Working Group for Good Health Pass if you would like to collaborate with other industry leaders in creating the world’s first privacy-preserving health credential interoperability framework.

The post Release of the Good Health Pass (GHP) Interoperability Blueprint appeared first on Trust Over IP.


Velocity Network

On climate crisis and self-sovereign verifiable career credentials

Experts estimate that the near and medium term will be characterized by inevitable mass migration, as hundreds of millions of people relocate. Portable, self-sovereign verifiable career credentials will be critical to job security in an age of mass migrations. The post On climate crisis and self-sovereign verifiable career credentials appeared first on Velocity.

Tuesday, 10. August 2021

Good ID

FIDO Developer Challenge: Welcoming Teams to the Implementation Stage

By Joon Hyuk Lee, APAC Market Development Director Editor’s Note: This is the second blog covering the FIDO Developer Challenge.  To learn more about the background and process, please read […] The post FIDO Developer Challenge: Welcoming Teams to the Implementation Stage appeared first on FIDO Alliance.

By Joon Hyuk Lee, APAC Market Development Director

Editor’s Note: This is the second blog covering the FIDO Developer Challenge.  To learn more about the background and process, please read the earlier blog post, Announcing the FIDO Developer Challenge for Developers Across the Globe.

We are happy to announce that 14 teams from eight different countries (the U.S., Japan, Canada, France, India, Malaysia, Vietnam, and South Korea) have been invited to participate in the implementation stage of the 2021 FIDO Developer Challenge. Six of the teams are early-stage ventures and an equal number hail from academia; the other two are individual developers.

[Faces of participants, captured during online interviews in late July]

All of the teams share a commitment to using FIDO authentication to provide a smoother and more secure user experience across a variety of application areas. As was the case in our earlier Hackathons, we are seeing yet again that entrepreneurial vision coupled with the capabilities of FIDO Authentication can be realized in a wide array of use cases and industries. We will share more details on each of the submissions as the review process carries forward.

The teams are now engaged in designated virtual lounges for possible Q&As and support from the Developer Challenge sponsors and broader FIDO development community.  To that end, we would like to recognize and give special thanks to the W3C WebAuthn Adoption Community Group for managing the private Discord Channel to provide technical support for participating teams.

Implementations will be done by the end of August and the judges will evaluate the teams’ final presentations and demos by early September.  Please stay tuned for our announcement of the Top 3 by the middle of September – with the winner being announced at the Authenticate conference in Seattle on October 20.

The post FIDO Developer Challenge: Welcoming Teams to the Implementation Stage appeared first on FIDO Alliance.


Ceramic Network

Build scalable Web3 apps with Polygon and Ceramic!

Developers building on Polygon can now use Ceramic's sovereign data network for cross-chain identity and dynamic data.

Developers building multi-chain applications have a powerful new combination to work with, as Ceramic supports Polygon applications.  Polygon offers developers a massively scalable, EVM-compatible blockchain infrastructure. Ceramic adds support for advanced data management and cross-chain identity.

Together, this enables developers and users to depend on the same accounts and wallets to manage transactions, identities, and data across their applications. Get started building today.

A full stack for cross-chain applications

Since Polygon uses the same keypairs and wallets as Ethereum, and Ceramic’s identity system can be managed by Ethereum wallets, signing and authentication for Ceramic's data streams is supported out of the box for Polygon applications.
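A toy sketch can make the “one key, many payloads” idea concrete. Nothing below is real Polygon or Ceramic code; HMAC merely stands in for the wallet’s secp256k1 signature, and all identifiers are made up:

```python
# Toy illustration of the "same key, many payloads" idea: one wallet
# key can authorize both a chain transaction and a data-stream update.
# HMAC-SHA256 stands in for real secp256k1 wallet signatures - none of
# this is actual Polygon or Ceramic code.
import hashlib
import hmac

wallet_key = b"user-wallet-private-key"  # placeholder, not a real key

def sign(payload: bytes) -> str:
    return hmac.new(wallet_key, payload, hashlib.sha256).hexdigest()

polygon_tx = b'{"to": "0xabc", "value": 1}'                   # made-up tx
ceramic_commit = b'{"stream": "kjz123", "data": {"n": "a"}}'  # made-up commit

# The same key material authenticates both kinds of payloads, which is
# why one wallet can manage transactions, identity, and data together.
sig_tx = sign(polygon_tx)
sig_commit = sign(ceramic_commit)
print(sig_tx[:16], sig_commit[:16])
```

In the real stack the signing is done by the user’s existing Ethereum wallet, so no extra key management is required for Ceramic.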

Now that Polygon and Ceramic can easily be used together, developers can:

- Build data-rich user experiences and social features on fully decentralized tech
- Give users cloud-like backup, sync and recovery without running a centralized server
- Publish content on the open web without the need to anchor IPFS hashes on-chain
- Leverage interoperable profiles, social graphs and reputations across the Web3 ecosystem

Users can now perform transactions on Ceramic and Polygon with their existing wallets. For example, users of Boardroom’s DAO governance platform can participate in ideation forums and discussions with all content stored on Ceramic and managed by their decentralized identity.

Getting started with Ceramic and Polygon

- To get started on Polygon, visit their documentation site
- To add IDX to your project, follow this installation guide
- To use Ceramic for streams without IDX, follow this installation guide
- For questions or support, join the Ceramic Discord and the Polygon Discord

About Polygon

Polygon is the first well-structured, easy-to-use platform for Ethereum scaling and infrastructure development. Its core component is Polygon SDK, a modular, flexible framework that supports building and connecting Secured Chains like Plasma, Optimistic Rollups, zkRollups, Validium, etc, and Standalone Chains like Polygon POS, designed for flexibility and independence. Polygon’s scaling solutions have seen widespread adoption with 350+ Dapps, ~128M txns, and ~1M+ unique users.

If you’re an Ethereum Developer, you’re already a Polygon developer! Leverage Polygon’s fast and secure transactions for your Dapp, get started here.

Website | Twitter | Reddit | Discord | Telegram

About Ceramic

Ceramic provides developers with database-like functionality for storing all kinds of dynamic, mutable content. This finally gives developers a Web3 native way to add critical features like rich identities (profiles, reputation, social graphs), user-generated content (posts, interactions), dynamic application data, and much more.

Ceramic's stream-based architecture is designed for web-scale volume and latency and to handle any type of data. Built on top of open standards including IPFS, libp2p, and DIDs and compatible with any raw storage protocol like Filecoin or Arweave, all information stored on Ceramic exists within a permissionless cross-chain network that lets developers tap into an ever growing library of identities and data while using their preferred stack.

IDX (identity index) is a cross-chain identity protocol that inherits Ceramic's properties to provide developers with a user-centric replacement for server-siloed user tables. By making it easy to structure and associate data to a user's personal index, IDX lets applications save, discover and route to users' data.
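The index idea can be sketched abstractly: one mapping, keyed by the user’s DID, from well-known definition names to the records holding the data. The class and names below are invented purely for illustration and are not the real IDX API (which is a JavaScript library):

```python
# Conceptual sketch of the identity-index pattern described above.
# This is NOT the real IDX API; it only illustrates replacing
# per-app, server-siloed user tables with one index per DID.
from typing import Dict, Optional

class IdentityIndex:
    """Maps well-known definition names to the records holding a user's data."""

    def __init__(self, did: str):
        self.did = did
        self._index: Dict[str, str] = {}  # definition name -> record/stream id

    def set(self, definition: str, record_id: str) -> None:
        self._index[definition] = record_id

    def get(self, definition: str) -> Optional[str]:
        return self._index.get(definition)

# Any app that knows the user's DID and a shared definition name can
# discover and route to the same record - no server-siloed user table.
idx = IdentityIndex("did:3:abc123")                  # hypothetical DID
idx.set("basicProfile", "ceramic://profile-record")  # placeholder ids
print(idx.get("basicProfile"))
```

The design point is that the index, not each application, owns the association between a user and their data.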

Website | Twitter | Discord | GitHub | Documentation | Blog | IDX Identity


DIF Blog

🚀DIF Monthly #21 (Aug, 2021)

DIF "Frequently Asked Questions" Knowledgebase, and more news.
Table of contents: 1. Group Updates; 2. Member Updates; 3. Funding; 4. DIF Media; 5. Members; 6. Events; 7. Jobs; 8. Metrics; 9. Join DIF

🚀 Foundation News

DIF "Frequently Asked Questions" Knowledgebase

DIF has launched a massive knowledgebase, structured as a long series of frequently-asked questions and answers. This synthesizes a year of educational efforts in the Interop WG, blog posts, newsletters, and many other DIF inputs in a format we hope will be useful as a reference and onboarding document throughout the decentralized identity space. Please peruse it, particularly the sections covering your research focus or your company's specialty and products, and open issues or PRs on GitHub wherever you feel a correction or addition is needed. This is intended as a community resource, so PRs are open to your input!

🍾 DIF Welcomes Chris Kelly at Comms

Hello and greetings to all! I am delighted to introduce myself as DIF's new communications director. I am originally from Dublin, Ireland, and home is now Berlin, Germany. I'm excited to get involved at DIF, support members in their documentation efforts and bring the decentralised identity conversation to a wider audience. I've spent time in the advertising and non-profit sectors, as a video producer, a sysadmin, and I cohost a comedy drag talkshow on YouTube. At DIF I'll be working on things like promoting our blog and managing our social media channels, and I have some exciting ideas in store.
Visit my drop-in office hours Tuesdays at 8pm CET (11am PT, 2pm EST) for a chat, or reach out to me directly via Slack or email.
I am always happy to hear your feedback and ideas for articles and initiatives!

🛠️ Group Updates

☂️ InterOp WG (cross-community)
- New chairs of the Interop group: David Waite (PingIdentity) and Snorre Lothar von Gohren Edwin (Diwala).
- John Jordan - the ToIP vision and governance for decentralized identity.
- Conversation about SSI, "eIDAS 2.0" and possible cooperation with EBSI, the European Blockchain Services Infrastructure:
  - Is this about roots of trust or trust frameworks?
  - eIDAS 1.0 defined eIDs in a way that allows cross-border interop.
  - The EU digital ID & VC-holding wallets and EBSI should be seen as two distinct movements.
  - EU member states can still choose to adopt, fork, or ignore whatever EBSI does; they want to avoid potential misalignment of independent implementations across the EU.
  - The EBSI documentation homepage gets updated regularly.
- Report from the OIDF Working Groups with David Waite and Kristina Yasuda. Discussion about identity history, SIOP, WebAuthn, NFC support:
  - History of the past 12 years in the identity world: OAuth1, OAuth2, Keybase, etc. "Different kind of trust, can anchor with more sources of identity."
  - With SIOP, you do self-generated and pairwise keys - ephemeral, not rooted to anything.
  - One of the reasons "Sign in with Google" is so attractive is that I expect their Google account to outlast my account - so I don't have to do recovery.
  - OIDF is interested in using a VP as a recovery scheme (e.g. a driver's license). If shared during registration, it becomes a source of registration for recovery.
  - Repairing a damaged trust relationship is very time-, cost- and labor-intensive for a company. Using a VP can help reestablish the trust relationship and spares asking secret questions or extra data, e.g. mom's name, pets, etc.
- Discussion about OpenID Connect for Verifiable Presentations:
  - Previous term: portable identifiers. For an existing hosted provider, is there a way to include/assert DID ownership in a challenge with other parties?
  - Aggregated and Distributed Claims: upstream credentials without a defined retrieval mechanism. Potential overlap with Credential Manifest.

💡 Identifiers & Discovery
- New time for UR calls: Wednesdays 2PM CET (8AM ET).
- Discussion about EBSI ledger and trust: list of trusted registries, registry properties, "onboarding service", LoA, key security, eIDAS, current status of EBSI.
- Discussion about historical key resolution: a DID document could point to a hub, which contains a list of historical keys associated with the DID. This list is signed by a current DID controller key. Using DID URLs, it should be possible to point to a specific historical key at a specific point in time, and it can be dereferenced publicly by anyone. Discussion on implementing this in the Universal Resolver. The ID WG could start a new work item which defines the data structure of historical keys, as well as the format of DID URLs that point to them.
- Universal Resolver returning a key in a given format (e.g., JWK(S)).
- Discussion about the transform-key DID URL parameter. A PoC has been implemented in the Universal Resolver. What are the possible values: JWK, JWKS, base58, multibase, PEM, ...? Discussion of this approach vs. using media types.
- Discussion about Solid and DIDs.

🛡️ Claims & Credentials
- Work item status: WACI-PEx.
- Work item status: PE (maintenance) + Credential Manifest.
- Work item status: VC Marketplace.
- Discussion: COVID Vaccination Pass Story.

🔓 DID Auth
- Updates from the OIDF-DIF co-hosted work.
- The OIDC4VP draft, which specifies how VPs can be transported using any OpenID Connect flow, is maturing - this is an alternative to "empty VPs".
- The main piece left in OIDC4VP is how to use DIF PE as a query language. There are a few outstanding issues in the DIF PE GitHub that will be discussed on a call on 4th Aug - join us if interested.
- SIOP V2 (previously known as did-siop) is also making progress. The big discussion happening now is how to secure and perform origin verification in a cross-device flow where the verifier and the user wallet are on different devices.
- The Credential Provider draft is being integrated into the Claims Aggregation draft to define how VCs can be issued from the OpenID Identity Provider.

📻 DID Comm
- PRs: 198 - Typ/Cyt Language; 200 - Threading; 209 - Sequencing Extension; 211 - Attachment Format Attribute; 212 - ECDH 1PU Draft 4; 225 - THID / PTHID; 227 - Sender Key Protection.
- DIDComm v2 library interface comparison: DIDComm-rs (Rust, Jolocom, Ivan); did-jwt/veramo (TypeScript, Oliver); Go (SecureKey, Baha).
- SICPA support. Tracking repo: Gemini.

📦 Secure Data Storage
- EDV dedicated call:
  - Once a data vault has been created, can the owner/controller of the data vault be changed? (#22)
  - Is the Data Vault Configuration stored as a (plain text) Document/resource or as an EncryptedDocument in its data vault? (#32, #63)
  - What should the allowed action/capability for querying an index be? (#37)
  - All examples must include complete JSON. (#38)
- Identity Hub call:
  - Attempt to form a PR-ready opinion about the associations between logical objects: will the spec allow for both flat and tiered structures across logical objects? Can we specify it in such a way that other structures can be virtually overlaid on top? How might such logical object associations determine how tracking of objects/sync works between instances?
  - Discuss top-level API work/proposals, including DID-relative URLs and object-centric normalization of invocations.

🔧 KERI
- IPR boundaries are difficult to navigate and understand - substantive contributions are blocked. Proposal to map out specifications of common tooling.
- CESR - pointers to relevant KIDs other than 001? Possible subspec aligning TF work with KERI work? IIW session notes.
- How to define a "KERI-specific" subset of a general-purpose version of CESR - let's start descriptively and address that later; it is a significant problem. Input to the CESR spec might be this straw man.
- CESR doesn't work in strictly-typed languages; it was hard to implement in Rust. CESR <> Sam's CDE encoding mechanism?
- Roadmap proposal.

🌱 Sidetree

⚙️ Product Managers
- Specification of the eIDAS Toolbox is still early; where this toolbox will be specified seems unclear.
- The group is expanding and is looking for an additional chair.

✈️ Travel & Hospitality
- 2 major use cases:
  - Share Profile use case - developed by the Verifiable Credentials & Offers team.
  - Discount Entitlement - developed by the Verifiable Credentials & Offers team.
- 4 weekly sub-group meetings: Verifiable Credentials & Offers; Travel Change & Disruption; KYC / Customer Profile / Loyalty; Government Sanctioned Credentials.

🪙 Finance & Banking SIG
- Kevin Tussy, CEO @ FaceTec, gave a presentation and hosted the discussion.

🦄 Member Updates
- Affinidi: Selective disclosure, Share What You Want. Read more.

💰 Funding

NGI Open Calls (EU)

Funding is allocated to projects using short research cycles targeting the most promising ideas. Each of the selected projects pursues its own objectives, while the NGI RIAs provide the program logic and vision, technical support, coaching and mentoring, to ensure that projects contribute towards a significant advancement of research and innovation in the NGI initiative. The focus is on advanced concepts and technologies that link to relevant use cases, and that can have an impact on the market and society overall. Applications and services that innovate without a research component are not covered by this model. Varying amounts of funding.

Learn more here.

🖋️ DIF Media 🎈 Events & Promotions

The Business of SSI: An IIW Special Topic
Aug 04, 2021 | Virtual Event

Thought leaders, researchers, educators, and more will come together for intensive discussion and thought-provoking dialogue on Opening Up the Learning-Earning Ecosystem, and what that means for your leadership, community, and students. Participants will hear from leaders and drivers of change and be able to engage with panels of experts as well as participate in open discussions on the latest developments within skills-based learning and hiring. Conference attendees will also benefit from learning cutting edge tools and hearing from ongoing pilot projects across the United States.

Internet Identity Workshop XXIII
October 12 - 14, 2021 | Virtual Event

You belong at IIW this Fall! You’ll acquire the real-time pulse of genuinely disruptive technologies that are the foundation of today's important Internet movements. Every IIW moves topics, code, and projects downfield. Name an identity topic and it's likely that more substantial discussion and work has been done at IIW than at any other conference!

Books

Manning - 37% off on the book "Self Sovereign Identity"!

Manning is an independent publisher of computer books and video courses for software developers, engineers, architects, system administrators, managers and all who are professionally involved with the computer business. Use the code ssidif37 for the exclusive discount on all products for DIF members.

💼 Jobs

Members of the Decentralized Identity Foundation are looking for:

- Software engineer (Remote)
- Product Design (Austin)
- Fullstack engineer (Austin)

Check out the available positions here.

🔢 Metrics

Newsletter: 4.7k subscribers | 31% opening rate
Twitter: 4.6k followers | 11k impressions | 2.6k profile visits
Website: 20k unique visitors

In the last 30 days.

🆔 Join DIF!

If you would like to get involved with DIF's work, please join us and start contributing.

Can't get enough of DIF?
follow us on Twitter
join us on GitHub
subscribe on YouTube
read us on our blog
or read the archives

Got any feedback regarding the newsletter?
Please let us know - we are eager to improve.


News from WAYF

Marstal Navigationsskole joins WAYF

Today Marstal Navigationsskole joined WAYF as a user organisation. Students and staff there can now identify themselves as MARNAV users to the many web services in WAYF and eduGAIN relevant to research and education.


DIF Medium

Setting Interoperability Targets Part 2 of 2

Having shown in our last piece how interoperability “profiles” are designed, we now tackle some key technical problem areas ripe for this kind of profile-first interoperability work across stacks. In our last essay, we explored the means and ends of interoperability targets and roadmapping across stacks and markets. “Interoperability,” like “standardization,” can be a general-purpose tool or

Having shown in our last piece how interoperability “profiles” are designed, we now tackle some key technical problem areas ripe for this kind of profile-first interoperability work across stacks.

In our last essay, we explored the means and ends of interoperability targets and roadmapping across stacks and markets. “Interoperability,” like “standardization,” can be a general-purpose tool or an umbrella of concepts, but rarely works from the top-down. Instead, specific use-cases, consortia, contexts, and industries have to take the lead and prototype something more humble and specific like an “interoperability profile” — over time, these propagate, get extended, get generalized, and congeal into a more universal and stable standard. Now, we’ll move on to some technical problem areas ripe for this kind of profile-first interoperability work across stacks.

> What makes sense to start aligning now? What are some sensible scopes for interoperating today, or yesterday, to get our fledgling market to maturity and stability as soon as safely possible?

[Photo: Ponte Vecchio, Firenze, by Ray Harrington]

From testable goals to discrete scopes

The last few months have seen a shift in terminology and approach, as many groups turn their attention from broad “interoperability testing” to more focused “profiles” that test one subset of the optionalities and capabilities in a larger test suite or protocol definition. Decoupling test suites from the multiple profiles each suite can test keeps any one profile from ossifying into a “universal” definition of decentralized identity’s core featureset.

As with any other emerging software field, every use case and context has its own interoperability priorities and constraints that narrow down the technological solution space into a manageable set of tradeoffs and decisions. For instance, end-user identification at various levels of assurance is often the most important implementation detail for, say, a retail bank, and DID-interoperability (which is not always the same thing!) might be a hardware manufacturer’s primary concern in being able to secure hardware supply chains.

Every industry has its unique set of minimum security guarantees, and VC-interoperability is obviously front-of-mind for credentialing use-cases. For example, in the medical data space, “semantics” (data interpretation and metadata) might be a harder problem (or a more political one) than the mechanics of identity assurance, since high standards of end-user privacy and identity assurance have already made for a relatively interoperable starting point. Which exact subset of the many possible interoperability roadmaps is safest or most mission-critical for a given organization depends on many factors: the regulatory context, the culture of the relevant sectors, their incentive structures, and their business models.

Cutting across industry verticals, however, are structural issues in how decentralized identity stacks and architectures vary, which can already be seen today. By applying “first principles” (or, in our case, the “functions” of a decentralized identity system and its many moving parts) across use-cases and industrial contexts, certain shared problems emerge. As is our default approach in the DIF Interop WG, we applied DIF’s in-house mental model of the “5 layers” of decentralized identity systems, sometimes called “the 4+1 layers”. (See also the more detailed version.)

We call these the “4+1” layers because our group agreed that a strict layering was not possible, and that all the architectures we compared for using verifiable credentials and decentralized identifiers had to make major architectural decisions with consequences across all four of the more properly “layered” categories. This fifth category we named “transversal considerations,” since they traverse the layers and often come from architectural constraints imposed by regulation, industrial context, etc. Foremost among these are storage and authorization, the two most vexing and cross-cutting problems in software generally; these would justify an entirely separate article.

In a sense, none of the topics from this transversal category are good candidates for specification in the medium-term across verticals and communities — they are simply too big as problems, rarely specific to identity issues, and being addressed elsewhere. These include “storage” (subject of our newest working group), “authorization” (debatably the core problem of all computer science!), “cryptographic primitives”, and “compliance.” (These last two are each the subject of a new working group, Applied Cryptography and Wallet Security!). These interoperability scopes are quite difficult to tackle quickly, or without a strong standards background. Indeed, these kinds of foundational changes require incremental advances and broad cooperation with large community organizations. This is slow, foundational work that needs to connect parallel work across data governance, authentication/authorization, and storage in software more generally.

Similarly, the fourth layer, where ecosystem-, platform-, and industry-specific considerations constrain application design and business models, is unlikely to crystallize into a problem space calling out for a single specification or prototype in the medium-term future. Here, markets are splintered and it is unclear what can be repurposed or recycled outside of its original context. Even if there were cases where specification at this layer would be timely, DIF members might well choose to discuss those kinds of governance issues at our sister-organizations in the space that more centrally address data governance at industry-, ecosystem-, or national- scale: Trust over IP was chartered to design large-scale governance processes, and older organizations like MyData.org and the Kantara Initiative also have working groups and publications on vertical-specific and jurisdiction-specific best practices for picking interoperable protocols and data formats.

That still leaves three “layers”, each of which has its own interoperability challenges that seem most urgent. It is our contention that each of these could be worked on in parallel and independently of the other two to help arrive at a more interoperable community — and we will be trying to book presentation and discussion guests in the coming months to advance all three.

Scope #1: Verifiable Credential Exchange

The most clear and consensus-building, even urgent way forward is to bring clarity to Verifiable Credential exchange across stacks. This has been the primary focus of our WG for the last year. Given that most of the early ecosystem-scale interest in SSI revolves around credentialing (educational credentials, employment credentials, health records), it is highly strategic to get translation and unified protocols into place soon for the interoperable verification and cross-issuance of credentials.

In fact, there has been a lot of good progress made since Daniel Hardman wrote an essay on the Evernym blog making a pragmatic case for sidestepping differences in architecture and approach to exchange VCs sooner. This aligned with much of our group’s work in recent months, which has included a survey of VC formats among organizations producing “wallets” for verifiable credentials (be they “edge” wallets or otherwise). Our group has also sought to assist educational efforts at the CCG, in the Claims and Credentials working group, and elsewhere to make wallet-producing organizations aware of the relevant specifications and other references needed to make their wallets multi-format sooner and less painfully. Much of this work was crystallized into an article by co-chair Kaliya Young and crowd-edited by the whole group; this work was a major guiding structure for the Good Health Pass work that sought to make a common health record format (exported from FHIR systems) equally holdable and presentable across all of today’s VC systems.

One outgrowth of this effort and other alignments that have taken place since Hardman’s and Young’s articles is the work of prototyping a multi-community exchange protocol that allows a subset of each stack’s capabilities and modes to interoperate. This tentative, “minimum viable profile” is called WACI-PEx and is currently a work item of the Claims and Credentials working group. Work is ongoing on v0.1, and an ambitious, more fully-featured v1 is planned after that. This profile acts as an “extension” of the broader Presentation Exchange protocol, giving a handy “cheat sheet” for cross-stack wallet-to-issuer/verifier handshakes so that developers not familiar with all the stacks and protocols being spanned have a starting point for VC exchanges. Crucially, the results of this collaborative prototype will be taken as inputs to future versions of the DIDComm protocols and the WACI specification for Presentation Exchange.
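For readers unfamiliar with Presentation Exchange, a minimal “presentation definition” (the query a verifier sends so a wallet knows which credentials to present) looks roughly like the sketch below. The field names follow the shape of the DIF PE specification, but the identifier values and credential type are invented for illustration:

```python
# Illustrative sketch of a minimal DIF Presentation Exchange
# "presentation definition". Field names follow the PE spec's shape;
# the id values and the credential type are made up for this example.
import json

presentation_definition = {
    "id": "vaccination-check-01",  # hypothetical definition id
    "input_descriptors": [
        {
            "id": "covid-vaccination-credential",
            "name": "COVID-19 Vaccination Credential",
            "constraints": {
                "fields": [
                    {
                        # JSONPath into the credentials the wallet holds
                        "path": ["$.type"],
                        "filter": {
                            "type": "array",
                            "contains": {"const": "VaccinationCredential"},
                        },
                    }
                ]
            },
        }
    ],
}

# The wallet matches each input descriptor against its credentials and
# replies with a "presentation submission" mapping descriptor ids to
# the credentials it selected.
print(json.dumps(presentation_definition, indent=2))
```

WACI-PEx then specifies how such definitions and submissions travel between wallet and issuer/verifier during the handshake.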

Note: There has been some discussion of an OIDC-Presentation Exchange profile at some point in the future, but given that the alignment of DIDComm and Presentation Exchange started over a year ago, the most likely outcome is that work on this would not start until v1 of the “WACI-PEx” profile for DIDComm has been released.

Scope #2: Anchoring layer

Of course, other forms of alignment are possible as well in the “bottom 3” layers of the traditional stack, while we wait on the ambitious transversal alignment specifications and the ongoing work to align and simplify cross-format support for VCs (and perhaps even multi-format VCs, as Hardman points out in the essay above).

The Identifiers and Discovery WG at DIF has long housed many work items to align on the lowest level of the stack, including general-purpose common libraries and recovery mechanisms. It has also received many donations that contribute to method-level alignment, including a recent DID-Key implementation and a linked-data document loader donated by Transmute. The group has also served as a friendly gathering point for discussing calls for input from the DID-core working group at W3C and for proposing W3C-CCG work items.

One particularly noteworthy initiative of the group has been the Universal Resolver project, which offers a kind of “trusted middleware” approach to DID resolution across methods. This prototype of a general-use server allows any SSI system (or non-SSI system) to submit a DID and get back a trustworthy DID Document, without needing any knowledge of or access to (much less current knowledge of or authenticated access to) the “black box” of the participating DID methods. While this project only extends “passive interoperability” to DIDs, i.e., only allowing DID document querying, a more ambitious sister project, the Universal Registrar, strives to bring a core set of CRUD capabilities to DID documents for DID methods willing to contribute drivers. Both projects have dedicated weekly calls on the DIF calendar, for people looking to submit drivers or get otherwise involved.
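In practice, the Universal Resolver exposes a simple HTTP interface: a GET against an identifiers endpoint returns a resolution result containing the DID Document. A minimal sketch, assuming the commonly documented `/1.0/identifiers/` path and the DIF-hosted dev instance as the base URL (any self-hosted deployment would work the same way):

```python
from urllib.parse import quote

# Base URL of a Universal Resolver deployment; the DIF project runs a public
# dev instance, but this could equally be a self-hosted server.
RESOLVER_BASE = "https://dev.uniresolver.io/1.0/identifiers/"

def resolution_url(did: str) -> str:
    """Build the GET URL that asks the Universal Resolver for a DID Document."""
    if not did.startswith("did:"):
        raise ValueError("not a DID: " + did)
    # Percent-encode everything except the colons that delimit the DID method
    return RESOLVER_BASE + quote(did, safe=":")

# Fetching resolution_url("did:key:z6Mk...") with any HTTP client and parsing
# the JSON body yields the DID Document, with no knowledge of the method needed.
print(resolution_url("did:web:example.com"))
```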

Scope #3: “Agent”/Infrastructure Layer

There is another layer, however, in between DIDs and VCs, about which we haven’t spoken yet: the crucial “agent layer” in the ToIP/Aries mental model, which encompasses trusted infrastructure whether in or outside of conventional clouds. The Aries Project has scaled up an impressively mature ecosystem of companies and experimenters, largely thanks to the robust infrastructure layer it built (and abstracted away from application-layer developers and experimenters).

Until now, differences of strategy with respect to conventional clouds and infrastructures have prevented large-scale cooperation and standardization at this layer outside of Aries and the DID-Comm project. Partly, this has been a natural outgrowth of the drastically different infrastructural needs and assumptions of non-human, enterprise, and consumer-facing/individual use cases, which differ more at this level than above or below. Partly, this is a function of the economics of our sector’s short history, largely influenced by the infrastructure strategies of cloud providers and telecommunication concerns.

This is starting to change, however, now that agent frameworks inspired by the precedent set by the Aries frameworks have come into maturity. Mattr’s VIII Platform, the Affinidi framework SDK (including open-source components by DIF members Bloom, Transmute, and Jolocom), ConsenSys’ own modular, highly extensible and interoperable Veramo platform, and most recently Spruce ID’s very DID-method-agnostic DIDKit/Credible SDK all offer open-source, extensible, and scalable infrastructure layers that are driving the space towards greater modularity and alignment at this layer.

As these platforms and “end-to-end stacks” evolve into frameworks extensible and capacious enough to house ecosystems, DIF expects alignment and harmonization to develop. This could mean standardization of specific components at this layer, for example:

The DIDComm protocol could expand into new envelopes and transports
Control recovery mechanisms could be specified across implementations, or even standardized at a technical and/or UX level
Auditing or historical-query requirements could be specified to form a cross-framework protocol or primitive
Common usages of foreign function interfaces, remote procedure calls like gRPC and JSON-RPC, or other forms of “glue” allowing elements to be reused or mixed and matched across languages and architectures could be specified as a community

We are very much in early days, but some see on the horizon a day when frameworks that don’t cooperate with one another can’t compete with the ones that join forces. After all, adoption brings growing pains, particularly for the labor market — aligning on architectures and frameworks makes onboarding developers and transferring experience that much easier to do!

Next Steps

I would encourage anyone who has read this far to pick at least one of the three scopes mentioned above and ask themselves how they are helping along this alignment process in their day-to-day work, and if they truly understand what major players at that level are doing. Often large for-profit companies pay the most attention to what their competitors are doing, but here it is important to think outside of competition and look instead at non-profit organizations, regulators, and coalitions of various kinds to really see where the puck is heading. Sometimes consensus on one level is blocking compromise somewhere else. It can be pretty hard to follow!

In recent articles, DIF has encouraged its members to think about an open-source strategy as comparably important to a business plan, a living document and guiding philosophy. I would like to suggest that the subset of DIF companies working with VCs and DIDs should also think of interoperability strategy and conformance testing as the most crucial pillar of that strategy — if you cannot demonstrate interoperability, you might be asking people to take that strategy on faith!

Setting Interoperability Targets Part 2 of 2 was originally published in Decentralized Identity Foundation on Medium, where people are continuing the conversation by highlighting and responding to this story.


Digital Scotland

Transforming Scottish Education on the Blockchain

A keynote use case for the Scottish Credential Ecosystem that demonstrates the tremendous potential for this approach is Education, notably the awarding of academic credentials.… The post Transforming Scottish Education on the Blockchain appeared first on DigitalScot.net.

A keynote use case for the Scottish Credential Ecosystem that demonstrates the tremendous potential for this approach is Education, notably the awarding of academic credentials.

Innovations relevant to this use case include ‘Blockcerts’, an open-source blockchain project enabling a Universal Verifier that can verify any Blockcert issued by any institution, anywhere in the world.

In their Medium article, UniversaBlockchain explore the scenario of Blockchain in Education.

They highlight key problems, such as the high rate of medical-school diploma falsification, as pain points a technology like blockchain is well suited to tackle, among a wave of other transformative benefits for the sector as the technology ripples through workflow areas related to HR, resume checking, and more.

Athena builds on this, notably detailing the core signature process that underpins the integrity of the record, in comparison to traditional paper-based approaches:

Blockchain-enabled digital certificates are immutable and cannot be forged. The records are stored on a distributed ledger, hence certificates can be evaluated by anyone who has access to the blockchain. Since the records are stored in a shared distributed ledger, the certificate can still be validated even if the organization that had issued it no longer exists. The digital certificates stored in the ledger can only be destroyed if all the copies in every system are destroyed.
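The integrity property described above generally reduces to hash comparison: the issuer anchors a digest of the certificate on the ledger, and a verifier recomputes the digest from the copy in hand. A minimal sketch (this is the general technique, not Blockcerts' actual format or anchoring protocol):

```python
import hashlib

def certificate_digest(certificate_bytes: bytes) -> str:
    """SHA-256 digest of the certificate file; this is what gets anchored on-chain."""
    return hashlib.sha256(certificate_bytes).hexdigest()

def verify(certificate_bytes: bytes, anchored_digest: str) -> bool:
    """A copy of the certificate validates only if its digest matches the anchored value."""
    return certificate_digest(certificate_bytes) == anchored_digest

# Illustrative certificate payload
cert = b'{"name": "Jane Doe", "degree": "MD", "issuer": "Example University"}'
anchor = certificate_digest(cert)       # recorded in the ledger at issuance

assert verify(cert, anchor)             # an untampered copy verifies
assert not verify(cert + b" ", anchor)  # any modification breaks verification
```

Because the anchor lives on a shared ledger, this check works even if the issuing institution later disappears, which is exactly the survivability property the quoted passage describes.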

Countries like the Bahamas are now issuing Blockchain-based academic certificates.

Digitary is one vendor specializing in this sector, offering a suite of features for managing academic credentials and digital badges, and has teamed up with Evernym to integrate these with Self-Sovereign Identity.

As this news highlights, one of the first customers to harness this capability is the Association of the Registrars of the Universities and Colleges of Canada (ARUCC), which chose Digitary as the solution provider for the Made for Canada National Network.

This initiative means the Canadian higher education community is creating the very first online platform and national credential wallet for post-secondary learners. Once fully operational, the Network will enable 3 million learners across the country to access and share their official digitized post-secondary transcripts and credentials online – anytime, anywhere.

Digital Badges – Foundation for Gamified Micro Learning

This capability would ideally go hand in hand with ‘Digital Badges’, which can also be authenticated via the blockchain in the same way. As Hastac explains:

“A digital badge is a validated indicator of accomplishment, skill, quality, or interest that can be earned in many learning environments. Open digital badging makes it easy for anyone to issue, earn, and display badges across the web—through an infrastructure that uses shared and open technical standards.”

Organizations like Credly facilitate their universality across industries.
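Those shared technical standards boil down to small, linked JSON documents. The sketch below follows the general shape of an Open Badges 2.0 assertion (the record that says "this recipient earned this badge"); all URLs and identities here are invented for illustration.

```python
import json

# A minimal Open Badges 2.0 assertion: the earned-badge record that links a
# recipient to a BadgeClass. All URLs and the email address are illustrative.
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://badges.example.org/assertions/123",
    "recipient": {
        "type": "email",
        "identity": "learner@example.org",
        "hashed": False,
    },
    # IRI of the BadgeClass describing the achievement and its criteria
    "badge": "https://badges.example.org/badges/data-literacy",
    "issuedOn": "2021-08-01T00:00:00Z",
    # "hosted" verification means the assertion is checked by fetching its id URL
    "verification": {"type": "hosted"},
}

print(json.dumps(assertion, indent=2))
```

Because the assertion is plain JSON at a stable URL (or a signed equivalent), any platform can display and verify it, which is what makes badges portable across issuers like those in the Credly ecosystem.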

A pertinent example of how this can be applied in the corporate world is the Scottish Social Services Council’s use of badges to underpin workforce learning. The BCS describes this as the future of professional development, with many organizations like Siemens using them this way.

What if your CV was based on how many ‘#digitalbadges’ and skills you could list, rather than what degree you held? A new report from @theRSAorg labels this future-of-work scenario as the ‘Precision Economy’. Read more: https://t.co/MTf3ds5XRN #futureofwork #skills2035 pic.twitter.com/fzEAonIN58

— Skills Development Scotland (@skillsdevscot) March 11, 2020

The post Transforming Scottish Education on the Blockchain appeared first on DigitalScot.net.

Monday, 09. August 2021

Oasis Open

SAM Threshold Sharing Schemes v1.0 from SAM TC approved as a Committee Specification

This document is intended for developers and architects who wish to design systems and applications that utilize threshold sharing schemes in an interoperable manner. The post SAM Threshold Sharing Schemes v1.0 from SAM TC approved as a Committee Specification appeared first on OASIS Open.

New CS is ready for testing and implementation

OASIS is pleased to announce that SAM Threshold Sharing Schemes Version 1.0 from the OASIS Security Algorithms and Methods (SAM) TC [1] has been approved as an OASIS Committee Specification.

This document is intended for developers and architects who wish to design systems and applications that utilize threshold sharing schemes in an interoperable manner.
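For context, a threshold sharing scheme splits a secret into n shares such that any k of them suffice to reconstruct it, while fewer than k reveal nothing. The sketch below is textbook Shamir secret sharing over a prime field, shown purely to illustrate the concept; it is not the encoding or algorithm set specified by the SAM TC document.

```python
import random

P = 2**127 - 1  # a Mersenne prime, large enough for a small demo secret

def split(secret: int, k: int, n: int):
    """Split `secret` into n shares; any k of them reconstruct it."""
    # Random degree-(k-1) polynomial with the secret as constant term
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(1234567890, k=3, n=5)
assert reconstruct(shares[:3]) == 1234567890   # any 3 shares suffice
assert reconstruct(shares[1:4]) == 1234567890  # regardless of which 3
```

An interoperable specification such as this Committee Specification standardizes exactly the details this sketch glosses over: field choice, share encoding, and how shares are labeled and transported between implementations.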

This Committee Specification is an OASIS deliverable, completed and approved by the TC and fully ready for testing and implementation.

The prose specifications and related files are available here:

SAM Threshold Sharing Schemes Version 1.0
Committee Specification 01
04 August 2021

Editable source:
https://docs.oasis-open.org/sam/sam-tss/v1.0/cs01/sam-tss-v1.0-cs01.docx (Authoritative)
HTML:
https://docs.oasis-open.org/sam/sam-tss/v1.0/cs01/sam-tss-v1.0-cs01.html
PDF:
https://docs.oasis-open.org/sam/sam-tss/v1.0/cs01/sam-tss-v1.0-cs01.pdf

Distribution ZIP file
For your convenience, OASIS provides a complete package of the prose specification and related files in a ZIP distribution file. You can download the ZIP file here:
https://docs.oasis-open.org/sam/sam-tss/v1.0/cs01/sam-tss-v1.0-cs01.zip

Members of the SAM TC [1] approved this specification by Special Majority Vote. The specification had been released for public review as required by the TC Process [2]. The vote to approve as a Committee Specification passed [3], and the document is now available online in the OASIS Library as referenced above.

Our congratulations to the TC on achieving this milestone and our thanks to the reviewers who provided feedback on the specification drafts to help improve the quality of the work.

Additional references

[1] OASIS Security Algorithms and Methods (SAM) TC
https://www.oasis-open.org/committees/sam/

[2] Public review:
* 30-day public review, 02 June 2021:
https://lists.oasis-open.org/archives/members/202106/msg00002.html
https://lists.oasis-open.org/archives/members/202106/msg00003.html
– Review metadata:
https://docs.oasis-open.org/sam/sam-tss/v1.0/csd01/sam-tss-v1.0-csd01-public-review-metadata.html
– Comment resolution log:
https://docs.oasis-open.org/sam/sam-tss/v1.0/csd01/sam-tss-v1.0-csd01-comment-resolution-log.txt

[3] Approval ballot:
https://www.oasis-open.org/committees/ballot.php?id=3634

The post SAM Threshold Sharing Schemes v1.0 from SAM TC approved as a Committee Specification appeared first on OASIS Open.


Call for Consent for OSLC Core V3.0 and OSLC Query V3.0 as OASIS Standards

Two key OSLC Project Specifications enter the call for consent as OASIS Standards The post Call for Consent for OSLC Core V3.0 and OSLC Query V3.0 as OASIS Standards appeared first on OASIS Open.

Two Project Specifications from the OSLC Open Project begin the call for consent as OASIS Standards

The OASIS Open Services for Lifecycle Collaboration (OSLC) OP members [1] have approved submitting the following Project Specifications as candidates for OASIS Standard to the membership:

OSLC Core Version 3.0.
Project Specification 02
23 April 2021

OSLC Query Version 3.0
Project Specification 01
01 October 2020

This is a call to the primary or alternate representatives of OASIS Organizational Members to consent or object to this approval. You are welcome to register your consent explicitly on the ballot; however, your consent is assumed unless you register an objection [2]. To register an objection, you must:

Indicate your objection on this ballot, and
Provide a reason for your objection and/or a proposed remedy to the OP.

You may provide the reason in the comment box or by email to the Open Project on its general mailing list [2]. If you provide your reason by email, please indicate in the subject line that this is in regard to the Call for Consent. Note that failing to provide a reason and/or remedy may result in an objection being deemed invalid.

OASIS Open Services for Lifecycle Collaboration (OSLC) is an OASIS Open Project operating under the Open Project Rules [3]. Specifically, OSLC Core Version 3.0 and OSLC Query Version 3.0 have proceeded through the standards process defined in https://www.oasis-open.org/policies-guidelines/open-projects-process/#project-specifications and are presented for consideration as OASIS Standards following the rules in https://www.oasis-open.org/policies-guidelines/open-projects-process/#oasis-standard-approval-external-submissions [4].

Details

The Call for Consent is open now and closes on 23 August 2021 at 23:59. You can access the ballot at:

Internal link for voting members: https://www.oasis-open.org/apps/org/workgroup/voting/ballot.php?id=3636

Publicly visible link: https://www.oasis-open.org/committees/ballot.php?id=3636

OASIS members should ensure that their organization’s voting representative responds according to the organization’s wishes. If you do not know who your organization’s voting representative is, go to the My Account page at

http://www.oasis-open.org/members/user_tools

then click the link for your Company (at the top of the page) and review the list of users for the name designated as “Primary”.

Information about the candidate OASIS Standard and the OSLC Open Project

The OSLC initiative applies Linked Data principles, such as those defined in the W3C Linked Data Platform (LDP), to create a cohesive set of specifications that can enable products, services, and other distributed network resources to interoperate successfully.

OSLC Core defines the overall approach to Open Services for Lifecycle Collaboration based specifications and capabilities that extend and complement the W3C Linked Data Platform. The OP received 3 Statements of Use from KTH Royal Institute of Technology, SodiusWillert, and IBM [5].

OSLC Query provides a mechanism for a client to query or search for RDF resources that match a given criteria. The response to a successful query includes the RDF of a query result container that references the member resources found by the query, and optionally includes selected properties of each member resource. The OP has received 3 Statements of Use from KTH Royal Institute of Technology, Koneksys, and IBM [6].

URIs


The prose specification document and related files are available here

OSLC Core Version 3.0.

Part 1: Overview

HTML:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/oslc-core.html

PDF:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/oslc-core.pdf

Part 2: Discovery

HTML:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/discovery.html

PDF:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/discovery.pdf

Part 3: Resource Preview

HTML:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/resource-preview.html

PDF:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/resource-preview.pdf

Part 4: Delegated Dialogs

HTML:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/dialogs.html

PDF:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/dialogs.pdf

Part 5: Attachments

HTML:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/attachments.html

PDF:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/attachments.pdf

Part 6: Resource Shape

HTML:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/resource-shape.html

PDF:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/resource-shape.pdf

Part 7: Vocabulary

HTML:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/core-vocab.html

PDF:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/core-vocab.pdf

Part 8: Constraints

HTML:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/core-shapes.html

PDF:
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/core-shapes.pdf

Machine Readable Vocabulary Terms
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/core-vocab.ttl

Machine Readable Constraints
https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/core-shapes.ttl

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:

https://docs.oasis-open-projects.org/oslc-op/core/v3.0/ps02/core-v3.0-ps02.zip

OSLC Query Version 3.0

HTML:
https://docs.oasis-open-projects.org/oslc-op/query/v3.0/ps01/oslc-query.html

PDF:
https://docs.oasis-open-projects.org/oslc-op/query/v3.0/ps01/oslc-query.pdf

For your convenience, OASIS provides a complete package of the specification document and any related files in ZIP distribution files. You can download the ZIP file at:

http://docs.oasis-open-projects.org/oslc-op/query/v3.0/ps01/query-v3.0-ps01.zip

Additional information

[1] OASIS Open Services for Lifecycle Collaboration (OSLC) Open Project
https://open-services.net

OP IPR page
https://github.com/oasis-open-projects/administration/blob/master/IPR_STATEMENTS.md#oslc

[2] Comments may be submitted to the OP via the project mailing list at oslc-op@lists.oasis-open-projects.org. To subscribe, send an empty email to oslc-op+subscribe@lists.oasis-open-projects.org and reply to the confirmation email.

All emails to the OP are publicly archived and can be viewed at https://lists.oasis-open-projects.org/g/oslc-op/topics

[3] Open Project Rules
https://www.oasis-open.org/policies-guidelines/open-projects-process/

[4] Timeline Summary:

OSLC Core Version 3.0

PS02 approved as a candidate for OASIS Standard 08 June 2021: https://lists.oasis-open-projects.org/g/oslc-op-pgb/message/151

Project Specification 02 (PS02) approved for publication 23 April 2021: https://lists.oasis-open-projects.org/g/oslc-op-pgb/message/133

Project Specification 01 (PS01) approved for publication 17 September 2020: https://lists.oasis-open-projects.org/g/oslc-op-pgb/message/73

Project Specification Draft 04 (PSD04) approved for publication 20 December 2019: https://lists.oasis-open-projects.org/g/oslc-op-pgb/message/47

OSLC Query Version 3.0

PS01 approved as a candidate for OASIS Standard 08 June 2021: https://lists.oasis-open-projects.org/g/oslc-op-pgb/message/149

Project Specification 01 (PS01) approved for publication 01 October 2020: https://lists.oasis-open-projects.org/g/oslc-op-pgb/message/76

Project Specification Draft 01 (PSD01) approved for publication 03 July 2020: https://lists.oasis-open-projects.org/g/oslc-op-pgb/message/64

[5] OSLC Core Statements of Use:

KTH Royal Institute of Technology:
https://lists.oasis-open-projects.org/g/oslc-op/message/526
SodiusWillert:
https://lists.oasis-open-projects.org/g/oslc-op/message/530
IBM:
https://lists.oasis-open-projects.org/g/oslc-op/message/567

[6] OSLC Query Statements of Use:

KTH Royal Institute of Technology:
https://lists.oasis-open-projects.org/g/oslc-op/message/526
Koneksys:
https://lists.oasis-open-projects.org/g/oslc-op/message/519
IBM:
https://lists.oasis-open-projects.org/g/oslc-op/message/567

The post Call for Consent for OSLC Core V3.0 and OSLC Query V3.0 as OASIS Standards appeared first on OASIS Open.


Berkman Klein Center

​Lumen Researcher Interview Series: Andrea Fuller, Wall Street Journal

In the latest installment of the Lumen Researcher interview series, we spoke with Andrea Fuller, a reporter for The Wall Street Journal who specializes in data analysis. Prior to her work with the WSJ, Fuller was previously a data journalist for Gannett Digital, The Center for Public Integrity, and The Chronicle of Higher Education. Fuller’s work combines data, spreadsheets, and code with investi

In the latest installment of the Lumen Researcher interview series, we spoke with Andrea Fuller, a reporter for The Wall Street Journal who specializes in data analysis. Prior to her work with the WSJ, Fuller was previously a data journalist for Gannett Digital, The Center for Public Integrity, and The Chronicle of Higher Education.

Fuller’s work combines data, spreadsheets, and code with investigative journalism and storytelling. Last May, Fuller wrote an article for the Journal about her investigation into Google’s handling of DMCA takedowns. Using Lumen’s extensive database of copyright notices, Fuller exposed some of the shady copyright claims and abusive submitters that were flying under Google’s radar.

In this interview, Adam Holland, Project Manager at the Lumen Project, Anna Callahan and Gina Markov, Lumen’s 2021 summer interns, spoke with Fuller about her research and how the Lumen Database helped her develop her story about the bad actors in the world of DMCA takedowns.

Lumen: Could you give us a brief description of your bio?

Andrea Fuller: I’ve been at the Journal since 2014, so, seven years now. I’ve done tech stories, I’ve done a lot about nonprofit shenanigans, I’ve done a lot about education. So, I’m a little bit of a jack of all trades when it comes to subjects. But I mainly work on data projects at the Journal — I do spreadsheets, I do SQL, I do R, I write code, I call people and talk to them, and then I write stories. So, yeah, I do a little bit of everything.

Lumen: In 2020, you wrote an article for The Wall Street Journal about how Google approaches DMCA takedown requests. Could you give us some background about your process of researching and writing that piece?

Andrea Fuller: We started this article in late 2019 after it came out of some other Google reporting. We came across some of Lumen’s own blog posts and great work about the DMCA and we wanted to take this a step further. So we spent months and months downloading lots of DMCA requests from the Lumen database. We got the Lumen API key and started uploading the text of these requests and assembling our own database.

I set up a MongoDB database that we imported all these JSON files into, and we started collecting them incrementally. I actually wrote a bunch of code to turn it into a relational database so I could search them more easily. I was working in SQL. Google releases its own transparency report, which does not have all the detailed information of the text of these requests, but does say how many and which of the URLs were removed, so I was able to cross reference that.

I looked for any DMCA requests against major websites, whether they be major retail sites or news sites. It was interesting — there weren’t any successful DMCA requests from sites like The Wall Street Journal or The New York Times. What I started finding was more obscure, niche news sites — whether they be blogs or international news sites in foreign languages — [where] Google had complied with the take-down requests [it had received]. It became evident to me that Google was doing a really good job at filtering out bad takedown requests against major newspapers and major consumer sites, but not so much against more esoteric media.
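The workflow described here (notice JSON in, SQL queries out) can be sketched in miniature. Everything below is illustrative: the field names are invented for the sketch and are not Lumen's actual API schema, and SQLite stands in for the MongoDB-plus-relational setup Fuller describes.

```python
import json
import sqlite3

# Toy notices, standing in for JSON fetched via the Lumen API.
# Field names here are invented for illustration.
notices_json = '''[
  {"id": 1, "sender": "Acme Rights LLC",
   "urls": ["https://example.org/post-1", "https://example.org/post-2"]},
  {"id": 2, "sender": "Reputation Co",
   "urls": ["https://blog.example.net/expose"]}
]'''

# Flatten each notice into one row per targeted URL, so targets are queryable.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE targets (notice_id INTEGER, sender TEXT, url TEXT)")
for notice in json.loads(notices_json):
    for url in notice["urls"]:
        db.execute("INSERT INTO targets VALUES (?, ?, ?)",
                   (notice["id"], notice["sender"], url))

# Example query: who has filed takedowns against a given site?
rows = db.execute(
    "SELECT DISTINCT sender FROM targets WHERE url LIKE 'https://example.org/%'"
).fetchall()
print(rows)
```

At real scale the same idea applies: once takedown targets live in a relational table, questions like "which obscure news sites were delisted?" become ordinary SQL, which is what made Fuller's cross-referencing against the transparency report tractable.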

Lumen: Do you have any speculation as to why that is?

Andrea Fuller: One of the things we say in our article is that the team at Google that’s in charge of DMCA is not very big. To actually manually review all of these requests is a near impossible task as the volume of DMCA requests has increased dramatically. And it’s really hard to make those judgment calls, because whether something is a real news site or not can be a really difficult determination. You get into broader philosophical discussions about the credibility of media in particular.

In one of the most compelling interviews that I had, I talked to this journalist in the Ukraine who was affiliated with an international reporting organization. He described these shady characters who will post fake copies of webpages on live journals and then file DMCA take-down requests to remove the original pages. It works, forcing them to file counterclaims, but there’s a gap before you can have it restored. So it’s a pretty effective strategy for getting content removed if you are an alleged Ukrainian gangster.

I also talked to one girl who was a blogger in Singapore, and she had a blog that was taken down through a DMCA request. She was writing letters to Google saying, “Why? I don’t understand. What did I do?” She didn’t do anything. She wrote about this Instagram influencer who was promoting a questionable sham cryptocurrency product. And that [product’s] organization created a blog post and filed a take-down request, claiming that it wrote the article first, which was laughable because it was clearly written in her voice, in her person. So, it’s really hard for people to fight these. They can definitely file counterclaims, but there is a delay in the process and then sometimes people don’t necessarily have the resources.

It’s a really tricky issue. It would take a lot of manpower to manually review all of these. Even though I was able to use queries to identify likely problematic ones, this took me a lot of time. Eventually, I sent Google about 100 things that I pretty much knew were wrong. And from that, because they have much more data than I do internally, they were able to restore something like 50,000 pages that they had taken down erroneously. And that’s just out of my one little slice of research.

Lumen: Did you see any trends in how fraudulent notices looked as you were going through them?

Andrea Fuller: You would see examples of things that claim to have happened earlier, but the date was something that couldn’t possibly match with the dates in the context of the piece. And it was clearly just made up. So, it was pretty self evident. I think there were a lot of really clever, yet nefarious, tactics. But once you look at some of these articles it was really blatantly obvious: We had these LiveJournals that had been hacked and were suddenly all in Russian or Ukrainian, and had all these articles about oligarchs.

In other cases, it was harder. Both of these sites in Ukrainian look kind of scammy, which is the real one and which isn’t? And it was really hard to assess that in some cases.

Lumen: What was the most shocking or unexpected thing that you found during your research?

Andrea Fuller: There were a couple things that I think surprised me, one being the hacking of LiveJournals. The idea that LiveJournals played such a big role was so bizarre to me. I actually, at one point, started searching for any take-down requests that were based on claiming that a LiveJournal had copyright, because that was a way I was identifying them. I think I was also pretty surprised by that story where the girl’s blog actually got taken down because it was a Google Blogger product, and that was pretty stunning to me. Not only was it not available in search, but it was just gone from the internet entirely and that was pretty troubling.

Lumen: What alternative approach do you think you would have been able to take had the notices not been available to you via Lumen? Do you think that approach would have been as effective?

Andrea Fuller: I think it would have been really difficult just going off the URLs — I don’t think this would have even been possible without the Lumen data. I don’t think we could have found such a large volume of stuff. Having the actual text of the complaint was what made this analysis possible and I can’t even really fathom how we could have done it simply based on the transparency report data. We could have probably found a few anecdotes, but I didn’t know what websites to search for because I don’t know the name of Ukrainian blogs, so I think it would have been really difficult without Lumen.

Lumen: Do you think there is any appetite for a follow-up piece about this stuff?

Andrea Fuller: Yes. My coworker, Rachel Levy, did a great story a couple months before mine about reputation management. This hedge fund manager had created fake websites to promote himself and manipulate Google search results. So, it’s certainly a really interesting area, and I think DMCA is a larger piece of that puzzle.

Lumen: As a data journalist who often covers issues pertaining to tech and business, how important, in your opinion or experience, is transparency through notice sharing?

Andrea Fuller: As a journalist, we do our jobs based on this kind of transparency. If there had been no Google transparency report or Lumen database we wouldn’t have been able to do this. And so, you have to at least give Google credit for releasing that kind of data. And I think if there was more stuff like that out there in the tech world that would be really beneficial to researchers and journalists and people who can help with the accountability process. I think that the more transparent data we have, it makes stuff like that possible. And at the end of the day, it helps the public and consumers and it really helps the Ukrainian journalists. And that’s what we’re all here for.

[The quotes in this interview have been edited for clarity.]

​Lumen Researcher Interview Series: Andrea Fuller, Wall Street Journal was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 06. August 2021

Own Your Data Weekly Digest

MyData Weekly Digest for August 6th, 2021

Read in this week's digest about: 14 posts

Wednesday, 04. August 2021

Oasis Open

Invitation to comment on OData Extension for Temporal Data v4.0

This specification defines how to represent and interact with time-dependent data using OData. The post Invitation to comment on OData Extension for Temporal Data v4.0 appeared first on OASIS Open.

First public review ends September 3rd

OASIS and the OASIS Open Data Protocol (OData) TC are pleased to announce that OData Extension for Temporal Data Version 4.0 is now available for public review and comment.

This specification defines how to represent and interact with time-dependent data using the Open Data Protocol (OData). It defines semantics and a representation for temporal data, including operations for querying and modifying temporal data along with vocabulary terms to annotate which data depends on time, and how.

The documents and related files are available here:

OData Extension for Temporal Data Version 4.0
Committee Specification Draft 01
29 July 2021

Editable source (Authoritative):
https://docs.oasis-open.org/odata/odata-temporal-ext/v4.0/csd01/odata-temporal-ext-v4.0-csd01.docx
HTML:
https://docs.oasis-open.org/odata/odata-temporal-ext/v4.0/csd01/odata-temporal-ext-v4.0-csd01.html
PDF:
https://docs.oasis-open.org/odata/odata-temporal-ext/v4.0/csd01/odata-temporal-ext-v4.0-csd01.pdf

OData Temporal ABNF Construction Rules Version 4.0:
https://docs.oasis-open.org/odata/odata-temporal-ext/v4.0/csd01/abnf/odata-temporal-abnf.txt
OData Temporal ABNF Test Cases:
https://docs.oasis-open.org/odata/odata-temporal-ext/v4.0/csd01/abnf/odata-temporal-testcases.xml
OData Temporal Vocabulary:
https://docs.oasis-open.org/odata/odata-temporal-ext/v4.0/csd01/vocabularies/Org.OData.Temporal.V1.xml

For your convenience, OASIS provides a complete package of the specification document and any related files in a ZIP distribution file. You can download the ZIP file at:
https://docs.oasis-open.org/odata/odata-temporal-ext/v4.0/csd01/odata-temporal-ext-v4.0-csd01.zip

How to Provide Feedback

OASIS and the OASIS Open Data Protocol (OData) TC value your feedback. We solicit input from developers, users and others, whether OASIS members or not, for the sake of improving the interoperability and quality of its technical work.

The public review starts 05 August 2021 at 00:00 UTC and ends 03 September 2021 at 23:59 UTC.

Comments may be submitted to the TC by any person through the use of the OASIS TC Comment Facility which can be used by following the instructions on the TC’s “Send A Comment” page (https://www.oasis-open.org/committees/comments/index.php?wg_abbrev=odata).

Comments submitted by TC non-members for this work and for other work of this TC are publicly archived and can be viewed at:
https://lists.oasis-open.org/archives/odata-comment/

All comments submitted to OASIS are subject to the OASIS Feedback License, which ensures that the feedback you provide carries the same obligations at least as the obligations of the TC members. In connection with this public review, we call your attention to the OASIS IPR Policy [1] applicable especially [2] to the work of this technical committee. All members of the TC should be familiar with this document, which may create obligations regarding the disclosure and availability of a member’s patent, copyright, trademark and license rights that read on an approved OASIS specification.

OASIS invites any persons who know of any such claims to disclose these if they may be essential to the implementation of the above specification, so that notice of them may be posted to the notice page for this TC’s work.

Additional information about this review and any previous public reviews is published with the specification documents at:
https://docs.oasis-open.org/odata/odata-temporal-ext/v4.0/csd01/odata-temporal-ext-v4.0-csd01-public-review-metadata.html.

Additional information about the specification and the OData TC can be found at the TC’s public home page:
https://www.oasis-open.org/committees/odata/

Additional references

[1] https://www.oasis-open.org/policies-guidelines/ipr

[2] https://www.oasis-open.org/committees/odata/ipr.php
https://www.oasis-open.org/policies-guidelines/ipr#RF-on-RAND-Mode

The post Invitation to comment on OData Extension for Temporal Data v4.0 appeared first on OASIS Open.


ID2020

ID2020 Welcomes Learning Economy Foundation as Newest Alliance Partner

We are delighted to announce today that the Learning Economy Foundation has joined the ID2020 Alliance. The Learning Economy Foundation envisions a world in which learners can map their educational progress to achieve their academic, employment, and life goals. A U.S. based non-profit organization with a global mission, Learning Economy is accelerating progress toward 21st-century education

We are delighted to announce today that the Learning Economy Foundation has joined the ID2020 Alliance.

The Learning Economy Foundation envisions a world in which learners can map their educational progress to achieve their academic, employment, and life goals.

A U.S.-based non-profit organization with a global mission, Learning Economy is accelerating progress toward 21st-century education, addressing inequity in the workplace, and connecting a fragmented ecosystem of stakeholders into a unified hub of innovation — the Internet of Education.

They do this by fostering collaboration among a diverse array of stakeholders through regional “labs”, which offer a unified ecosystem for innovation, pilots, research, and collaboration for the future of education and work. Learning Economy is currently running labs in the Asia Pacific region; Broward County, Florida; Colorado; and beyond.

“Education and workforce development represent an exciting opportunity to apply digital ID technology, especially as we think about the potential of digitally verifiable educational credentials,” said ID2020 head of advocacy and communication, Ethan Veneklasen. “We look forward to learning more about the Learning Economy Foundation’s innovative collaboration model and to working together to promote the ethical development and implementation of digital ID technologies in this evolving space.”

Learning Economy was launched at the United Nations General Assembly in September 2018 by founders Chris Purifoy and Jackson Smith. In addition to a staff of 18, the Foundation has engaged more than 50 steering committee members and advisors and hundreds of private-public stewards who share their commitment to modernizing the world’s education and employment infrastructure through the thoughtful implementation of emerging digital technologies — including digital ID.

“Digital identity plays a key role in reshaping the education and employment landscape,” said Chris Purifoy, Chairman of Learning Economy Foundation. “By shifting to a learner and employee-centered approach, we can begin solving some of the toughest challenges in education. New open standards and technologies can allow learners to own their own data and carry their credentials with them like assets throughout their entire lives. We believe our alliance with ID2020 will be an important catalyst toward enabling a more equitable, self-sovereign future.”

About ID2020

ID2020 is a global public-private partnership that harnesses the collective power of nonprofits, corporations, and governments to promote the adoption and ethical implementation of user-managed, privacy-protecting, and portable digital identity solutions.

By developing and applying rigorous technical standards to certify identity solutions, providing advisory services and implementing programs, and advocating for the ethical implementation of digital ID, ID2020 is strengthening social and economic development globally. Alliance partners are committed to a future in which all of the world’s seven billion people can fully exercise their basic human rights and reap the benefits of economic empowerment and to protecting user privacy and ensuring that data is not commoditized.

About LEF

Learning Economy Foundation (LEF), a 501(c)(3) non-profit, serves as a thought leader in ecosystem-first approaches to education and workforce infrastructure. With years of experience building relationships within standards bodies, education institutions, and the private-public sector, LEF supports local/regional Co-Labs as onramps to the Internet of Education (IoE).

LEF Co-Labs act as collaborative, localized research and development hubs for piloting foundational IoE utilities. Co-Labs help to unify the education-to-employment supply chain by connecting systems into a fully unified but decentralized ecosystem. Co-Labs are guided with research to ensure informed, data-driven decisions; they are co-directed with local stakeholders to build community, align incentives, and establish shared systems of trust; they enable foundational pilots helping to confirm minimum system requirements, validate outputs, ensure interoperability, and refine and grow the local Co-Lab model; lastly, they act to support open, community-strengthening utility marketplaces, advance ongoing open source innovation, and ensure community and pilot sustainability.

Tuesday, 03. August 2021

Digital ID for Canadians

2022 Pre-Budget Submission

DIACC’s Written Submission for the Pre-Budget Consultations in Advance of the 2022 Budget Ahead of the 2022 federal budget, the House of Commons Standing Committee…
DIACC’s Written Submission for the Pre-Budget Consultations in Advance of the 2022 Budget

Ahead of the 2022 federal budget, the House of Commons Standing Committee on Finance has asked Canadians to share their input. 
DIACC is pleased to have submitted a brief, calling on the Federal Government to implement the following recommendations: 

That the government secure adoption of the Pan-Canadian Trust Framework by businesses and governments.
That the government act on the Finance Committee’s 2021 Pre-Budget Consultation Recommendations 128, “Implement a digital identity system that empowers Canadians to control their data that is held by the federal government,” and 129, “Create a national data strategy.”
That the government work with provincial and territorial partners and Immigration, Refugees and Citizenship Canada to ensure that all Canadians have access to an ISO-compliant government-issued digital ID credential with economy-wide utility by December 2022.
That the government make digital identity-enabled services available to all Canadians by December 2022.
That the government prioritize funding and integration of digital ID as part of the Digital Technology Supercluster Initiative.

The Key to Unlocking an Inclusive Digital Economy: Investing in Digital ID

To restart the economy and deliver inclusive services to all Canadians, governments must invest in unlocking digital services. Digital ID empowers Canadians with the choice to safely share their existing credentials (e.g., passports, driver’s licenses, health cards) for digital transactions.

Investing in digital ID offers economic benefits to citizens, businesses, and governments and also establishes digital tools to support societal trust, security, privacy, and fraud mitigation. This is a win for all.

Few budget items have the potential to impact every government initiative – digital ID is one such investment with broad impacts and encompassing benefits. Digital ID offers service improvements across all government services and priority areas. This initiative has the potential to empower individuals, increase government efficiency, strengthen companies, and unite communities across the country with secure access to resources, economic development, trust, and support. 

Canadians understand the potential. The pandemic has been an intense and polarizing experience, leading many Canadians to lose faith in institutions. The Edelman Trust Barometer reports that 46% believe that government leaders purposely misled them. At the same time, Canadians are relying more on technology, with the digital sector growing 3.5 percent in 2020, while the economy as a whole shrank by 5 percent. With digital transformation happening across the country, Canadians are aware that online privacy is crucial. A recent poll from The Office of the Privacy Commissioner of Canada reports that 89 percent of Canadians are concerned about people using information about them online to steal their identity. 

How can the government build trust, enhance privacy, and demonstrate that citizens’ rights are top priority? The answer is clear: 9 in 10 Canadians are supportive of digital ID. Citizen-centric, standards-aligned Digital ID offers an ecosystem that reopens doors closed by the pandemic and unlocks entirely new paths to economic resiliency, cohesion, and social trust.

🔑 Recommendation 1: Secure adoption of the Pan-Canadian Trust Framework by businesses and governments to ensure Canadians are empowered post-pandemic and have clarity in building a secure, interoperable, and privacy-respecting digital ID.

The Pan-Canadian Trust Framework™ (PCTF) is a co-created framework that any jurisdiction — federal, provincial, or international — and industry sector can work with to ensure business, legal, and technical interoperability to realize the full benefits of a digital ecosystem. Rather than seeking a single solution, the PCTF promotes choice and offers a shared hub and language that distinct solutions can interoperate through. Developed by public and private sector experts over a decade, the PCTF provides organizations of all sizes, across sectors, industries, and locations with shared principles and guidelines for a digital ID ecosystem. Built based on recommendations from the federal government’s Task Force for the Payments System Review in 2011, this work has been identified by the public and private sectors as key for Canada’s economic resilience but remains underfunded. 

As provinces, territories, and countries around the world set up COVID credentialing and proof-of-vaccination systems, the need for interoperability among these systems is urgent. The credentials issued must be designed with common principles and security to enable acceptance across various jurisdictional and sector-specific solutions for their unique context. The PCTF makes this possible, working as a flexible foundation to connect systems without dictating a single technological architecture. 

The PCTF includes adaptable recommendations that are currently being tested in-market, including standards for Notice and Consent, Authentication, Privacy, Verified Person, Verified Organization, Credentials (Relationship and Attributes), Infrastructure (Technology and Operations), and Assessment. A Model, Overview, and Glossary have been published for ease of use across industries and sectors. Developed with Canadians in mind, the PCTF is technology-agnostic, encouraging innovation while prioritizing privacy, safety, and security, and supporting digital economic growth on a global scale.

🔑  Recommendation 2: Put citizens first and integrate cross-government priorities. Act on the Finance Committee’s 2021 Pre-Budget Consultation Recommendations 128, Implement a digital identity system that empowers Canadians to control their data that is held by the federal government, and 129, Create a national data strategy.

Empowering individuals to control their data, understand available services, and have more convenient and secure access to government services offers a direct path to rebuild trust. A recent Leger survey commissioned by Postmedia reports that the pandemic has eroded trust in the federal government, either a little or a lot, for 63% of Canadians. After a challenging year, it is critical that the budget puts citizens first. Digital ID is a proactive initiative that offers immediate and long-term benefits. It has the potential to restore confidence, act on Canadian values, and empower citizens.

Providing Canadians with the digital ID credentials necessary to access, manage, and share their own data ensures citizens have control over the important information they need to manage their health, business(es), and digital services. A national data strategy ensures all Canadians benefit from these advances. It also clarifies accountability for those who seek to use technology and personal information with malicious intent. A pan-Canadian strategy evens the playing field for businesses looking to operate digitally across provincial, territorial, and global borders. This approach also enhances Canadians’ ability to compete economically on a global scale, travel, and seek care with the virtual mobility afforded by a secure, verifiable digital ID. 

🔑 Recommendation 3: Ensure all Canadians benefit from digital connections, opportunity, and the right to be recognized with digital ID. Work with provincial and territorial partners and Immigration, Refugees and Citizenship Canada to ensure that all Canadians have access to an ISO-compliant government-issued digital ID credential with economy-wide utility by December 31, 2022.

Digital ID is the key: the pandemic has opened new doors for Canadians navigating their safety, financial security, health, and relationships. According to a study by Brookfield Institute, 9 percent of Canadian businesses made 60 percent or more of their total sales online, up from 6 percent in 2019 — but this digital success has been difficult for small to medium enterprises to adopt. As digital service adoption grows, citizen and employee expectations have also shifted to demand more reliable and secure digital alternatives. Digital ID can encourage sustainable, long-term adoption of digital platforms and help organizations of all sizes to benefit from these systems. It also presents a more flexible and streamlined strategy for pan-Canadian notification systems, service delivery, and community safety initiatives.

Provinces and territories are establishing their own digital ID initiatives. Alberta and British Columbia have launched digital IDs, with BC including a mobile card and a Verify by Video option. Significant investments have been made in Ontario and Québec, where proof of vaccination credentials have been launched. Saskatchewan, Yukon, Nova Scotia, Newfoundland, Prince Edward Island, and New Brunswick are launching pilots, proofs of concept, and digital ID components. 

This prioritization demonstrates demand for this enabling capability across the country — but unequal funding and approaches developed in departmental silos pose a risk. Without cohesive federal leadership, these systems will be disjointed and miss the opportunity to be truly interoperable, efficient, and useful for all Canadians. Unlocking these opportunities in a synchronized and equitable manner will ensure Canadians can all access economic opportunities, required public services, and the chance to manage their own personal information.

🔑  Recommendation 4: Collaborate for the highest and most equitable impact. Make digital identity-enabled services available to all Canadians by December 2022.

As the provincial and territorial governments take action to simplify and secure digital identities, private companies are also taking note of this massive market opportunity. Notably, Apple is teaming up with the TSA to be a trusted source of ID for Americans and Stripe is pursuing digital ID services partnering with other apps, including Discord, for user verification. Many more companies are entering the digital ID space in hopes of earning users’ trust and capturing market share. As the issuer of identity in Canada, the public sector is uniquely positioned to empower Canadians and enable the private sector — but the government needs to act now. 

While offering numerous economic and social benefits locally and globally, a Canadian digital ID builds citizen trust and mitigates risk. As the Canadian Centre for Cyber Security noted, “the number of cyber threat actors is increasing, and… Cybercrime will almost certainly continue to be the cyber threat most likely to affect Canadians.” This vulnerability means that Canadians urgently require an encompassing, policy- and leadership-driven approach to implementing and enforcing Privacy by Design principles. A McKinsey report confirms this, suggesting that, for national governments to address the heightened risks presented by cyber threats, “organizations can move from a ‘trust but verify’ mindset to a ‘verify first’ approach.” Pressures and requirements for proof of vaccination, contact tracing, and social distancing are also made possible, digitally secure, and more user-friendly through universal data minimization standards. 

Digital ID offers the key to unlocking secure digital services and pathways. With opportunities to boost job creation, economic growth, citizen wellbeing, COVID-19 planning, support, and mitigation, and reconciliation efforts, digital ID is a budget line that prioritizes and directly benefits all Canadians. Digital ID offers Canadians more personalized control over personal information and convenient access to services. It can increase mobility and connect intra-provincial and territorial systems. It offers an opportunity to strengthen innovation and establish a secure foundation for international collaboration.

🔑  Recommendation 5: Embed within existing ecosystems. Prioritize the funding and integration of digital ID as part of the Digital Technology Supercluster Initiative. Digital ID supports and intersects with the Supercluster’s areas of focus, including health, sustainable natural resource applications, and digital training.

Strides are already being made by Canadians. Purpose-built solutions, like the COVID Alert App, demonstrate that Canada has the talent and innovation to adapt and develop market-leading solutions. Unfortunately, the $20 million price tag and the reactive nature of these innovations leave room for improvement. The app has also not been approved by data authorities in Alberta, British Columbia, Nunavut, and Yukon, making it an incomplete solution that doesn’t account for different provincial regulations. Due to the nature of the pandemic, a pan-Canadian solution isn’t a nice-to-have — it’s a must. Digital ID is a proactive investment that could provide similar benefits in contact tracing and offer lasting impacts on service delivery. 

Digital ID has the potential to add $4.5 billion in value to SMEs and reinvestment in the economy. It also directly meets the needs and preferences of consumers, with Signicat reporting that 68 percent of consumers expect 100 percent digital onboarding in the wake of COVID-19 and 60 percent would value digital identities to access services internationally. Canada has an opportunity to lead, recover, and take a future-focused position by making an investment in digital ID. 

Prioritizing digital ID is putting Canadians today and in the future first, and reflects responsible investment that offers benefits across departments. Its utility and impact apply during and beyond health or environmental crises. Digital ID delivers an adaptable foundation to deliver new services, security, citizen engagement opportunities, and economic growth.

DIACC members work in partnership with the Government of Canada and all levels of government and welcome further conversations and collaboration.

All sources may be referenced within the PDF version, accessible here or below.

DIACC_Pre-Budget-Consultations_August_2021


Ceramic Network

Ceramic Community Call - July 30, 2021

Catch the latest from the Ceramic core devs and community

Topics included

New core team members
NFT:DID update
Block reorg issues on Clay/Ropsten
Performance improvement updates
3ID Connect issues and plans
Running nodes in the next phase of ELP
HackFS sponsorship and bounties
EIP 712 signature suite

Monday, 02. August 2021

GLEIF

#3 in the LEI Lightbulb Blog Series - European Commission Weaponizes the LEI in its Battle Against Money Laundering and the Financing of Terrorism

In July 2021, the European Commission (EC) moved one step closer to a clampdown on illicit money flows; it published its long-awaited package of four legislative proposals intended to strengthen the EU’s rules on anti-money laundering and countering the financing of terrorism (AML/CFT). Within the package, the EC officially recognized the value of the Legal Entity Identifier (LEI) as a unique me

In July 2021, the European Commission (EC) moved one step closer to a clampdown on illicit money flows; it published its long-awaited package of four legislative proposals intended to strengthen the EU’s rules on anti-money laundering and countering the financing of terrorism (AML/CFT).

Within the package, the EC officially recognized the value of the Legal Entity Identifier (LEI) as a unique mechanism capable of supporting transparency within any ecosystem, by formalizing it as an important component of future AML/CFT efforts. Two of the four EC proposals call for the LEI to be used in certain customer identification and verification scenarios where it is available:

New Regulation on AML/CFT: Proposal for a Regulation of the European Parliament and of the Council on the prevention of the use of the financial system for the purposes of money laundering or terrorist financing;
Revision of the 2015 Regulation on Transfer of Funds: Proposal for a Regulation of the European Parliament and of the Council on information accompanying transfers of funds and certain crypto-assets (recast).

Inclusion of the LEI within such far-reaching, EU-wide proposals for a future-proof AML/CFT regulatory framework represents a huge leap forward in terms of realizing a key component of the G20 and Financial Stability Board’s vision to establish the LEI as a public good. Any role played by the LEI in helping to protect EU citizens from the impact of terrorism and organized crime can be deemed of significant societal benefit.

A brief analysis of why the EC is championing the LEI within the AML/CFT context highlights that there are many benefits. Consistent use of the LEI for entity identification at the Member State level and the EU level can reduce the margin of error associated with language ambiguity, human interpretation, and manual intervention. The LEI’s broad interoperability enables it to be integrated seamlessly into both centralized and decentralized digital identity management systems, together with the eIDAS-compliant digital certificates that are already harmonizing the use of e-signature technologies across the EU.

Beyond that, the new EC proposals are intended to create a much more consistent framework to ease compliance for operators subject to AML/CFT rules, especially for those active cross-border. Here too, and given the cross-border nature of money laundering and terrorism financing, the LEI can be used as the ‘Rosetta Stone’ for legal entities involved in financial transactions. Overseen by the Regulatory Oversight Committee (ROC), the Global LEI System is the only system that establishes a recognized, monitored and standardized global identity for legal entities, linked to the entity’s national ID system. Its universality across borders, combined with open and online access to annually verified business card information linked to each LEI, makes it uniquely placed to enable efficient information exchange between the ‘obliged parties’ defined in the AML/CFT proposal and all competent authorities.

And while this is the first time the LEI has been officially endorsed by the EC in its AML/CFT framework, the LEI has a long track record as a powerful tool in assisting the efforts of financial institutions to combat money laundering and terrorist funding. The LEI enables financial institutions to conduct fully automated, straight-through processing; by replacing outdated manual checks, the LEI increases both the speed and the effectiveness of client onboarding and ongoing compliance checks. This includes improving screening against sanctions and watch lists thereby enabling new efficiencies for both institution and client, lowering costs significantly.
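The automated checks described above begin with validating the identifier itself: per ISO 17442, an LEI is 20 alphanumeric characters whose last two characters are ISO 7064 MOD 97-10 check digits (the same scheme IBANs use). A minimal, self-contained sketch:

```python
def _digits(s: str) -> int:
    # Map 0-9 to themselves and A-Z to 10-35, then read the result
    # as one large decimal integer (ISO 7064 MOD 97-10 convention).
    return int("".join(str(int(c, 36)) for c in s))

def lei_check_digits(base18: str) -> str:
    """Compute the two check digits for an 18-character LEI prefix."""
    return f"{98 - _digits(base18.upper() + '00') % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    """True if `lei` is 20 alphanumeric characters with a valid checksum."""
    lei = lei.upper()
    if len(lei) != 20 or not lei.isalnum():
        return False
    return _digits(lei) % 97 == 1
```

A syntactically valid LEI is only the first step, of course; whether the identifier is actually issued, current, and tied to the right entity is established against the Global LEI System’s public reference data.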

The latest recommendation from the EC that the LEI is used, where available, for customer identification and verification in AML/CTF legislative reforms, has the potential to drastically increase the transparency of legal entities participating in financial transactions. Recent AML scandals have shown that financial criminals mostly operate through a network across borders to conceal their illicit transactions. The LEI, a global standard for unique legal entity identification, can make these connections and transactions across borders visible and easy-to-interpret to the financial institutions and Financial Intelligence Units. It can therefore contribute to the stronger authentication of entity clients, promote transparency and reduce illicit transactions.

By including the LEI in its AML/CFT legislative package, the EC strengthens efforts across Europe to use and reinforce global standards in the promotion of transparency and financial stability. The European Central Bank (ECB) recently recognized the benefits of extending LEI use to cover all future public reporting and financial transactions. As a result, the European Systemic Risk Board has recommended the establishment of an EU legislative framework for systemic and comprehensive adoption of LEIs across the EU by any organization involved in a financial transaction and to identify entities reporting financial information.

In the context of the AML/CFT proposals, for those ‘obliged entities’ that want to surpass transparency goals, becoming a Validation Agent offers a vast array of business benefits. The Validation Agent role was created by GLEIF within the Global LEI System to simplify LEI issuance for clients and to deliver a variety of cost, efficiency and customer experience benefits for the Validation Agent organizations themselves. More information on the Validation Agent role can be found on the GLEIF website.

How has the LEI Been Referenced Within the EC’s AML/CFT Legislative Package?

AML / CFT Regulation
As defined in Article 1, this proposal lays down rules concerning:

Measures to be applied by obliged entities to prevent money laundering and terrorist financing;
Beneficial ownership transparency requirements for legal entities and arrangements;
Measures to limit the misuse of bearer instruments.

Obliged entities are defined fully in Article 3 of the proposal, but include (with certain exceptions): credit institutions; financial institutions; and various natural and legal persons acting in a professional capacity, ranging from auditors, accountants and tax advisors to legal representatives involved in certain types of transactions, including property, precious metals and stones, gambling services, crypto-asset service providers and crowdfunding service providers.

Reference to the LEI is made in Article 18, which is titled ‘Identification and Verification of the Customer’s Identity’. The text makes clear that where available, an LEI should be obtained by obliged entities, in order to identify a legal entity customer.

Revision of the 2015 Regulation on Transfer of Funds
In a press release announcing the legislative package, the EC makes it clear that primary drivers for enhancements to the existing EU AML/CFT framework are “new and emerging challenges linked to technical innovation.” It names virtual currencies, more integrated financial flows in the Single Market and the global nature of terrorist organisations among these challenges.

The key focus of this Revision of the 2015 Regulation on Transfer of Funds is to ensure that EU AML/CFT rules are extended beyond their current remit to apply fully to the crypto sector. This will ensure full traceability of crypto-asset transfers and allow the prevention and detection of their potential use for money laundering or terrorism financing.

Against this backdrop, the LEI has been referenced twice within the package of legislative proposals.

In section (25), which outlines that transfers of funds or crypto-assets from the Union to outside the Union should carry complete information on the payer and payee, a new requirement has been introduced: “Complete information on the payer and the payee should include the Legal Entity Identifier (LEI) when this information is provided by the payer to the payer’s service provider, since that would allow for better identification of the parties involved in a transfer of funds and could easily be included in existing payment message formats such as the one developed by the International Organisation for Standardisation for electronic data interchange between financial institutions.”

In a later section that outlines the obligations on the payment service provider of the payer, Article 4 of the Revision proposal sets out the requirements for information that must accompany transfers of funds. In the latest revision, a new requirement for the payer’s current LEI has been added, “subject to the existence of the necessary field in the relevant payments message format, and where provided by the payer to the payer’s payment service provider….”
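As a sketch of what carrying the LEI in an existing payment message format could look like, the fragment below shows a payer’s LEI inside the organisation identification of an ISO 20022 customer credit transfer. The element names assume a recent pacs.008 message version and the LEI value is a deliberately fake placeholder, so treat this as illustrative only, not as a normative mapping from the EC proposal.

```xml
<!-- Illustrative only: debtor (payer) identification within an ISO 20022
     FIToFICustomerCreditTransfer (pacs.008) message. Element names assume
     a recent message version; the LEI value below is a fake placeholder. -->
<Dbtr>
  <Nm>Example Trading GmbH</Nm>
  <Id>
    <OrgId>
      <LEI>00000000000000000000</LEI>
    </OrgId>
  </Id>
</Dbtr>
```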

Kantara Initiative

Kantara Approves LexisNexis® Risk Solutions Risk Defense Platform Compliance With NIST SP 800-63 rev.3 (Technical) Class of Approval, as a Component Service at IAL2 

  August 2nd, 2021 – LexisNexis® Risk Solutions Risk Defense Platform service has been approved by Kantara Initiative as a Component Service, in compliance  with the requirements of NIST SP 800-63 rev.3 (Technical) Class of Approval, at Identity Assurance Level 2 (IAL2). Kantara’s Trust Mark is provided as part of their leading global consortium dedicated to improving trustworthy use of identity and personal data through innovation, standardization and good practice. As a Kantara Approved Service Provider, LexisNexis offers a trusted service to Federal, state and local agencies, financial institutions, insurance companies, and healthcare providers.  The Risk Defense Platform (RDP) is…

The post Kantara Approves LexisNexis® Risk Solutions Risk Defense Platform Compliance With NIST SP 800-63 rev.3 (Technical) Class of Approval, as a Component Service at IAL2  appeared first on Kantara Initiative.

Friday, 30. July 2021

Elastos Foundation

Elastos Bi-Weekly Update – 30 July 2021

...

EdgeSecure

Cybersecurity Insurance – Rising Costs and What You Need to Know


Webinar
October 19, 2021
3 pm EDT

Cybersecurity insurance premiums continue to rise, even while insurers reduce the amount of damages covered. In this session, we’ll explore the reasons for the change in rates and coverage, and how education institutions and other public sector organizations can use proactive cybersecurity strategies to minimize premium increases in the face of more frequent security threats.

In this session, you’ll learn:

Why cybersecurity premiums are climbing while coverage amounts are reduced
How trends in ransomware attacks and other security threats influence the cybersecurity insurance market
Proactive steps institutions can take to demonstrate cybersecurity preparedness, minimizing premium hikes

Register Today

The post Cybersecurity Insurance – Rising Costs and What You Need to Know appeared first on NJEdge Inc.


Ceramic Network

Build the dweb at HackFS

Ceramic is excited to sponsor the return of HackFS, with $175,000 of prizes available

We're thrilled to be sponsoring this year's virtual HackFS, working alongside our friends at Protocol Labs and ETHGlobal to support projects moving us closer to the reality of a decentralized web. From July 30th - August 20th, we'll be helping hackers build the foundations of Web3.

To get involved:

Register before the hackathon starts on July 30
Check out details on our Ceramic Bounties below 👇🏼
Jump into our Discord to tell us what you're thinking of building, ask questions, and get support!

$8,000 of Bounties up for grabs

We're giving away one pool prize and three named prizes, with $8,000 in total bounties up for grabs!

Pool prize: all Ceramic Network or IDX.XYZ implementations will split $3,000

Named prizes:

Best use of Ceramic data streams or IDX for identity: $2,500
Best new tooling or pattern for other developers to use IDX or Ceramic: $1,500
Best use of Ceramic alongside another sponsor: $1,000

Check out the full bounty details on the official HackFS site. We'll be on the lookout for unique project ideas and teams that show a deep understanding of Ceramic or IDX.

How to use Ceramic in your hack

Ceramic is a sovereign data network that lets you easily build rich applications on top of blockchains and IPFS. Your hack can include aggregated identity and reputation, social features and content, private user data storage, and rich data models by using Ceramic. Eliminate your need for a backend or custom contracts, and give users control over their own data.

Looking for inspiration? Some ideas for each bounty category from Ceramic's core developers:

Some hack ideas we'd be super excited about

Social media with adversarial interoperability
Our community has put up $15K for a tool that mirrors tweets onto Ceramic Network, paving the way for a new Web3-native social conversation without the need for a mass migration off Twitter.

Mutable metadata for NFTs
NFTs are unique, but do they have to be static? Mint NFTs with metadata stored on Ceramic so that the owner of the Ceramic document can change the properties of the NFT over time. An artist could keep updating the NFT metadata after it's minted, creating a unique evolving piece.
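The mutable-metadata idea above can be sketched as an owner-controlled, append-only document. Ceramic's real API is JavaScript (TileDocument in its stream-tile package); the class and field names below are purely illustrative, not Ceramic's.

```python
# Conceptual sketch: NFT metadata as an owner-controlled stream of commits.
# Only the owner may update; history stays available for verification.

class MetadataStream:
    def __init__(self, owner, genesis):
        self.owner = owner
        self.commits = [dict(genesis)]   # append-only commit log

    def update(self, signer, changes):
        if signer != self.owner:
            raise PermissionError("only the stream owner may update")
        state = dict(self.latest())
        state.update(changes)
        self.commits.append(state)       # new version appended, old kept

    def latest(self):
        return self.commits[-1]

# An NFT's tokenURI would point at the stream; the artist keeps evolving it.
art = MetadataStream(owner="artist-did", genesis={"name": "Seasons", "phase": "spring"})
art.update("artist-did", {"phase": "summer"})
print(art.latest()["phase"])  # -> summer
```

The design choice to keep the full commit log is what lets collectors audit every past state of the piece even though the current metadata is mutable.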

DeFi Annotations and Comments
Create a commenting system with Ceramic that lets users add notes to contracts, addresses, trades or anything else in DeFi. These notes could be accessible across any DeFi app. This could be a JavaScript library for apps to integrate, or a browser extension that users add themselves.

OmniContact List
Create a contact list using Ceramic that could be used and shared by every wallet, app and experience across the dweb.

Tooling for Ceramic (additional $1500 bounty)

Ceramic in a new language
Lots of developers would use Python, Swift, Kotlin or other implementations

New data models for common patterns
New IDX definitions for common data types or Ceramic CIPs, or even import schema.org into Ceramic!

Reference implementations of common needs
Many developers are looking to use Ceramic for token watch lists, contact lists, private chat, comment threads, and more. If you build these, create a demo or tutorial to help others follow your footsteps!  

Import of Web2 social data to use on Web3
Many large Web2 apps now have APIs to export user data (with users' permission). Build a mass-export tool that puts valuable data into users' hands by adding it to their identity index. Create definitions for each export so it is easily discoverable and usable across Web3, and store the data encrypted in Ceramic.

Form Filling Tool
Create a reusable component for dapps to simply capture the information they need about a user (for example, during onboarding) and store it in IDX. And if a user already has a certain field completed, there's no need to ever ask for it again!

Use with other dweb sponsors (additional $1000 bounty)

A few ideas to get you started:

Create an address book for streaming payments with Superfluid
Use Ceramic with Fluence for computation on Ceramic streams
Use IDX to manage user data that is stored in Textile threads
Extend ENS with IDX, which lets you associate arbitrary structured data to an address

Getting started on Ceramic

Watch Ceramic CTO Joel Thorstensson's HackFS workshop on Building with Sovereign Data on Ceramic
Ceramic Documentation: developers.ceramic.network
IDX (identity) Documentation: developers.idx.xyz
Chat with us in the Ceramic sponsor channel or chat.ceramic.network
Tutorials and video workshops

Good luck to all the hackers, and don't forget to join us on the Ceramic Discord to share ideas, meet community members and get help from our team!

Website | Twitter | Discord | GitHub | Documentation | Blog | IDX Identity


SelfKey Foundation

EIP-1559 – All You Need to Know


EIP-1559 is a core proposal to improve the efficiency of transactions on the Ethereum network. Once implemented, the proposal will change the ETH transaction process to a two-fee mechanism: a base fee, adjusted up and down by the Ethereum protocol based on network congestion, and a priority fee, used to compensate miners.

The post EIP-1559 – All You Need to Know appeared first on SelfKey.


Own Your Data Weekly Digest

MyData Weekly Digest for July 30th, 2021

Read in this week's digest about: 10 posts, 2 Tools

Thursday, 29. July 2021

Digital ID for Canadians

Spotlight on Trust Science


1. What is the mission and vision of Trust Science?

Trust Science ® is a FinTech SaaS that delivers Credit Bureau 2.0 ® / Credit Bureau + ™. Its mission is to help deserving people get the loans they deserve. The service gives lenders highly accurate scores about underbanked and financially stressed borrowers in a fair, ethical and compliant way. This solution repairs the on-ramp to the modern credit economy, improving financial inclusivity while simultaneously boosting lender profitability.

2. Why is trustworthy digital identity critical for existing and emerging markets?

Even in multi-billion dollar credit bureaus, data often crosses individuals, leading to frequent mix-ups and inaccurate credit reporting (i.e. lost opportunities for deserving borrowers or fraud committed by un-deserving people). With a trustworthy digital identity and interoperable digital credentials, lenders can ensure that individual data is correct and correctly attributed. As well, given the trends in Consumer Protection and Privacy laws, trustworthy identity is the first step toward returning control and consent to the use of personal data back to its rightful owners. Furthermore, in the rapidly emerging market of alternative credit underwriting, where Trust Science is a leader, digital identity enables new application processes, giving new-age credit bureaus fast and secure access to more data points for more accurate scoring. Trust Science believes that digital identity solutions will form the backbone of financially inclusive credit scoring and will be instrumental in a variety of other contexts.

3. How will digital identity transform the Canadian and global economy? How does your organization address challenges associated with this transformation?

Interoperable digital credentials will transform all economies by significantly reducing cost and time for all forms of applications and sharing sensitive, verified information. Trust Science is designed to be a compliant service in all jurisdictions so that it can improve borrowing outcomes for global citizens and lenders. This is evidenced, in part, by a massive patent and trademark estate (comprising over 4 dozen patents, trademarks and pending patents across 18 different countries, and counting.) To support this transformation, Trust Science is pioneering the application of digital identity in the credit reporting industry. It is a Founding Steward of the Sovrin.org Identity network alongside industry heavyweights and it has now proudly joined DIACC. It will also provide services/consulting and its Smart Consent™ software to lenders and their borrowers to enable a transition to the use of digital credentials in loan applications, as digital wallets proliferate among consumers. As well, the company is working with regulatory bodies in both Canada (OSFI) and in the U.S. (RFI submission re: Explainable AI to 5 federal agencies in June 2021) to ensure that modern technology and its capabilities are fully understood and appreciated by all stakeholders, especially the topmost banking regulators.

4. What role does Canada have to play as a leader in this space?

Canada undisputedly has the technological capabilities and knowledge needed to be a global leader in digital identity and trust services. A strong supporter of digital identity standards, Trust Science is excited to be a part of that leadership, especially with the Canadian government’s “User-Centric Verifiable Digital Credentials” initiatives.

At a practical and real-world level, Canada shares the world’s longest border with the world’s biggest and most advanced economic and technological actor.  It is incumbent on a partner in such a relationship to meet or beat best practices in all matters of identity and services (like money transfer) that rely on trust and KYC. Turning to the global space, Canada must continue to support domestically-driven innovations to maintain its position of leadership and continue to set the global pace of digital identity standards in the customary Canadian way: fair, ethical, and inclusive.

5. Why did your organization join the DIACC?

Trust Science joined DIACC to facilitate partnerships with other members that will further the adoption of emerging digital credential standards. The company believes that these partnerships will form the foundation of a strong Canadian ecosystem of digital identity companies and that all the different players’ collective experiences working on the cutting-edge of the tightly regulated lending industry will provide mutually valuable insights among each other.  Trust Science looks forward to networking and collaborating with other DIACC members in a way that helps Canada maintain its reputation for having a very strong banking/lending sector and healthy financial (including Insurance industry) institutions.

6. What else should we know about your organization?

Trust Science provides automated loan underwriting solutions leveraging traditional credit bureaus’ data, alternate data, and user-permissioned data. Given the risk associated with centrally stored personal data accessed during the loan application process, Trust Science aims to enable migration to decentralized digital credentials that will increase security and privacy while also significantly simplifying the application process. Put simply, Trust Science offers lending leaders a fair and ethical way of scoring financially stressed or under-banked customers, right now.

Tuesday, 27. July 2021

Commercio

The ultimate guide for the use of COMMERCIO WALLET APP has been published.


You will find all the useful information to download and use the Commercio Wallet app.

THE SEED PHRASE
THE DIFFERENCE IN ACCESS BETWEEN NEW WALLET AND RESTORE WALLET
MAIN-NET and TEST-NET
RECEIVE TOKEN
SEND TOKEN
MAKE STAKING
UNBONDING
RECEIVE REWARDS

Important Notes :

The COMMERCIO WALLET is a FREE APP, still under development, and is released AS IS without any warranty.

Anyone using this FREE APP does so at their own risk; commerc.io cannot be held responsible for any loss of tokens.

All content in this guide is for educational purposes only; it is not financial advice.

 

The post The ultimate guide for the use of COMMERCIO WALLET APP has been published appeared first on commercio.network.


Energy Web

PJM-EIS UPDATE: Modernizing a legacy U.S. REC tracking system with blockchain-based technology

With an initial pilot complete, we look ahead to what's next for one of the world's largest renewable energy markets

Photo: ActionVance | Unsplash

In the world of power grids, PJM Interconnection is a household name. It is a regional transmission operator (RTO) whose service territory spans a 13-state region in the Mid-Atlantic, Great Lakes, and Northeast region of the United States. It’s not huge by land area: as measured in square miles, all of PJM’s service territory could fit within the borders of France. But it is a massive electricity and renewable energy market.

PJM was the world’s largest competitive wholesale electricity market until the development of the European Integrated Energy Market in the 2000s. And it remains one of the world’s largest grid operators as a founding member of the GO15. On the renewable energy front, in 2020 PJM generated more renewable energy certificates (RECs) than the whole of Australia.

And so when we, Energy Web, kicked off a collaboration with PJM Environmental Information Services, Inc. (PJM-EIS) — a subsidiary of PJM Interconnection — in late 2018, we were excited. That’s because PJM-EIS administers the Generation Attribute Tracking System (GATS), a platform for tracking and trading RECs in the United States.

The focus of the collaboration was to build a blockchain-based pilot alongside PJM’s existing GATS system. Together, PJM-EIS and Energy Web wanted to explore the potential for bringing new functionality and benefits to GATS and evaluate how blockchain technology could be integrated into existing IT systems seamlessly and create value for PJM stakeholders.

Evaluating opportunities to modernize an existing U.S. REC platform

We set out to leverage the Energy Web Decentralized Operating System (EW-DOS), an open-source tech stack of decentralized software and standards for the energy sector. More specifically, the pilot used Energy Web Origin, an open-source software development toolkit (SDK) for REC (and similar) tracking and trading platforms running on the Energy Web Chain, to create new functionality for the GATS Bulletin Board and assess the potential for decentralized technologies to support wider improvements to GATS beyond the Bulletin Board.

The Bulletin Board is a place on GATS where REC buyers and sellers can, respectively, post their bids and asks for specific REC volumes. Actually selling or buying those RECs remains the responsibility of the counterparties in bilateral agreements. We thought a blockchain-based solution could improve that experience and increase use of the Bulletin Board.

Historically, the Bulletin Board has been an underutilized feature available on GATS to facilitate trading of voluntary RECs. The central aim for this pilot was to test new functionality that could enhance the Bulletin Board as a means to grow the REC market in the PJM footprint while also improving security, increasing transparency, and reducing transaction costs for PJM stakeholders. In other words, this pilot assessed how improving the technical functionality and user experience of the Bulletin Board could remove market barriers and grow the local REC market — and possibly even attract greater REC imports from other U.S. REC markets.

The result: a pilot for the PJM-EIS version of GATS Bulletin Board with blockchain-based REC marketplace functionality that integrated with the existing GATS

This spring, five PJM subscribers tested the pilot GATS Bulletin Board. Seller participants posted asks on the Bulletin Board based on actual RECs in their respective GATS accounts and buyer participants posted their REC bids based on their procurement needs. Any matches made between bids and asks then triggered notifications for participants to confirm if they wanted to complete any REC transfers, where any completed trades would link back to the legacy GATS to update the buyer and seller’s respective accounts. For this pilot, the transactions were purely test transactions and so there was no financial settlement.

Here are some screenshots from the pilot GATS Bulletin Board to give a flavor of the updated user experience (shown from the perspective of a renewable energy buyer):

1. Getting a view of the overall Bulletin Board market activity

2. Starting the renewable energy procurement journey with various search filters to place bids

3. Identifying verified procurement options that match with bids

4. Viewing completed transactions and blockchain-based proof of each transaction

“We have learned a lot about what it would take to add new technical functionalities to GATS, how to integrate blockchain-based technologies ‘under the hood’ of GATS, and how PJM stakeholders support these improvements,” said Ken Schuyler, President of PJM-EIS. “We look forward to further understanding how we can use EW-DOS to support our overall digitization efforts.”

Energy Web Origin provided the back-end infrastructure for the marketplace functionality and use of a public blockchain — the Energy Web Chain — to digitize the RECs in GATS and anchor the proof of any REC transactions that occurred on the pilot Bulletin Board system.

Pilot feedback and next steps

In post-pilot interviews, pilot participants shared overall positive feedback about their experience, especially around the level of integration with GATS, and shared how they could successfully post bids and asks for RECs from their GATS accounts.

In the end, given how energy sector market participants typically adopt new technologies in phases (rather than in one fell swoop), this pilot illustrates that it is possible to integrate blockchain-based functionalities into existing IT systems to support wider digitization and tech modernization efforts.

Nevertheless, PJM-EIS and Energy Web experienced practical implementation challenges with the pilot. The core challenge revolved around how — while it’s possible to integrate new, blockchain-based tech with legacy IT systems — this integration process takes more time than designing and implementing an entirely new IT system from scratch. This is the main reason why the pilot took more time than initially expected to implement.

The pilot also raised bigger strategic questions that PJM-EIS will continue to evaluate as it explores next steps for technical upgrades to GATS in general and the Bulletin Board in particular. For example, this pilot showed that it is technically possible to integrate new technologies on a legacy system, but cannot answer the central question as to whether or not PJM-EIS wants to make the Bulletin Board an exchange where real REC purchases (and associated financial transactions) happen. In addition, the pilot did not assess the role of using new decentralized identifier technologies that make it possible to assign every energy device a unique digital identity anchored on a public blockchain so that a given device can plug into different PJM markets.

PJM-EIS UPDATE: Modernizing a legacy U.S. REC tracking system with blockchain-based technology was originally published in Energy Web Insights on Medium, where people are continuing the conversation by highlighting and responding to this story.


Alastria

Blockchain to lay the foundations of European digital identity


Bárbara Villuendas. Digital Content Assistant at Cuatroochenta

Alastria ID is one of the leading models of self-sovereign digital identity, developed with the collaboration of companies such as Cuatroochenta through the firm 4TIC, which it acquired in 2020. The project aims for citizens and companies to own their personal data and to control it with all the security guarantees that blockchain technology provides.

How the system works, based on self-sovereign digital identity. Photo: Cuatroochenta

A business that wants to open a current account with a bank must provide the branch with a copy of its legal representative's ID card (DNI). If the firm has accounts with other institutions, it must provide that copy to each one, multiplying paperwork and security risks, since that documentation is exposed many more times to potential information theft. That is what happens today, but let us imagine a completely different scenario: one in which a company does not have to hand over any physical or electronic document, but simply allows the bank to access its ID number through its digital wallet. This system would also work, for example, for the paperwork required to become a supplier to a business or to bid in public-sector tenders. We are talking about self-sovereign identity (SSI).

The European Commission recently presented the European digital identity, built on SSI and on blockchain technology, which makes it possible to distribute control of information across a network and validate it without intermediaries.

With this system, citizens and businesses become the owners of their digital identity and, therefore, of the credentials that make it up. And what counts as a credential? Any piece of data that identifies a person or company: for example, a DNI or NIF number, a driving licence, a university degree or a professional qualification. Imagine someone who must prove they are of legal age. With this system they do not need to present their ID card with all the personal data it contains, but rather a verifiable credential (VC) in which an authority confirms that they are over 18 while preserving the privacy of their data.
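The age-verification example above is a case of selective disclosure: the issuer attests a derived claim ("over 18") without revealing the underlying birth date. A toy sketch of the idea, with field names loosely following the W3C Verifiable Credentials data model (no real signatures or DID resolution, and the issuer DID is hypothetical):

```python
import json

# What the holder actually has: full personal data, never shown to the verifier.
full_identity = {"name": "Ana", "dni": "12345678Z", "birthDate": "1990-04-02"}

# What the issuer signs and the holder presents: only the derived claim.
credential = {
    "type": ["VerifiableCredential", "AgeOver18Credential"],
    "issuer": "did:example:public-authority",   # hypothetical issuer DID
    "credentialSubject": {"ageOver": 18},        # derived claim only
}

# The verifier learns the claim, not the underlying personal data.
assert "birthDate" not in json.dumps(credential)
assert "dni" not in json.dumps(credential)
```

The privacy gain comes from the credential carrying a predicate about the data rather than the data itself, so the verifier's trust rests on the issuer's signature, not on inspecting the ID card.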

It is a paradigm shift toward a decentralized model. With its digital identity project, the European Union wants to prevent people from unknowingly giving away their data to big tech companies. In recent years, various initiatives focused on SSI use cases have been implemented across Europe. In Spain, one of these projects is Alastria ID, developed collaboratively by the more than 200 members of the multidisciplinary consortium Alastria, of which Cuatroochenta is a member: a technical model for digital identity based on international standards that aims to harness the capabilities of blockchain and SSI.

If you want to learn how digital identity is built, the benefits of SSI and the features of the Alastria ID model, you can read the full article.


Blockchain Commons

2021 Q2 Blockchain Commons Report


It was another busy quarter for Blockchain Commons, with a focus on work on our Gordian reference apps, which demonstrate our architectural models and specifications. However, we had numerous other releases as well, including a lot of documentation to let everyone know what we’re doing.

Our major work included:

Overviews

Releasing a video overview of our specifications and technologies; Publishing our list of Gordian Principles;

Reference Apps

Releasing Gordian QR Tool and Seed Tool through the Apple App Store; Debuting our Sweeptool command-line tool; Experimenting with Timelocks for the next generation of #SmartCustody;

Coding Processes

Increasing our focus on the Rust programming language; Working with a sponsor on our first security review;

New Docs

Publishing docs on our UR and SSKR specifications; Kicking off two translations of Learning Bitcoin from the Command Line;

Other Priorities

Beginning work with our summer interns; Continuing to testify for the Wyoming legislature; Celebrating the fifth anniversary of self-sovereign identity; and Talking about the future of the BTCR DID.

(Also see our previous Q1, 2021 report.)

Some Overviews

Blockchain Commons is getting big, so we produced some overviews of our work!

Video Overview. Over the last few years, Blockchain Commons has produced a number of specifications, including our fundamental work on Uniform Resources (URs), our other innovations such as Lifehashes and Object Identity Blocks, and our updates of sharding technology with SSKR. If you’re trying to get a handle on the technologies we’re using and the specifications that we’re creating with our Airgapped community, we invite you to take a look at our technology overview video, which runs through our core conceptual and specification work one element at a time.


The Gordian Principles. We’ve also begun publishing some descriptions of what the Gordian architecture means to us. It’s focused on four principles that we believe are crucial to digital-asset management: independence, privacy, resilience, and openness. They’re all about ensuring that you’re the controller of your digital assets. Besides writing a short overview, we’ve also begun adding descriptions to each of our reference apps, discussing how they embody those principles.

And, those reference apps were our biggest news for the quarter …

Reference App Work

Our reference apps show off how Blockchain Commons’ Gordian Principles work in real life. We made great progress on them this quarter.

Gordian Releases. Where we had three reference apps available for beta testing in Q1, in Q2 we advanced to having two available for release from Apple.

Gordian Seed Tool is the app previously known as Gordian Guardian. It allows for the management of your cryptographic seeds in a way that’s secure and resilient. You store your seeds in Seed Tool, ensuring that the data is encrypted and that it’s redundantly backed up to iCloud, and then you derive and export keys as they’re needed. As with all of the Gordian reference apps, this one demonstrates the usage of a number of Blockchain Commons’ specifications, including SSKR, UR, and even our new request/response airgap methodology that we just debuted this February.

Gordian QR Tool is a simpler tool that similarly allows for the storage of QR codes in a secure and resilient way. Thus, if you had a largely unused seed or key, you could export it as a QR and store it here. QR Tool can also be used to store other private information such as 2FA seeds or the brand-new Smart Health Cards. QR Tool recognizes many categories of QR codes, including the UR types that Blockchain Commons has defined, allowing easy categorization and sorting of your cryptographic data. Our QR Tool was slightly held back by inadequacies we found in Apple’s QR creation functions, which can produce overly large QRs. We’ve already begun work translating a better QR library into Swift and expect to integrate that into v1.1 of QR Tool in the future.

QR Tool and Seed Tool also offer the first demonstration of the interactions possible in a Gordian architecture. One of the most powerful examples involves Blockchain Commons’ SSKR specification. Seed Tool allows a user to not only shard a seed using SSKR, but also to encode those shares as QR codes. The encoded shares can then be given to friends and family who have QR Tool, ensuring the safe storage of all the shares required to restore your seed!
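The threshold math behind sharding a seed into shares, any quorum of which can restore it, is Shamir's secret sharing. A toy 2-of-3 split over a prime field illustrates the idea; note that SSKR itself is a specific byte-level specification (GF(256) arithmetic, share envelopes), so this sketch shows only the principle, not the real wire format.

```python
# Toy 2-of-3 Shamir split: any two shares recover the seed, one reveals nothing.
import secrets

P = 2**127 - 1  # a Mersenne prime comfortably larger than a 126-bit secret

def split_2_of_3(secret):
    # Degree-1 polynomial f(x) = secret + a*x; shares are points on the line.
    a = secrets.randbelow(P)
    return [(x, (secret + a * x) % P) for x in (1, 2, 3)]

def recover(share_a, share_b):
    # Lagrange interpolation at x = 0 from two points.
    (x1, y1), (x2, y2) = share_a, share_b
    inv = pow(x2 - x1, -1, P)
    return (y1 * x2 - y2 * x1) * inv % P

seed = secrets.randbelow(P)
shares = split_2_of_3(seed)
assert recover(shares[0], shares[2]) == seed   # any two shares suffice
```

This is why handing single shares to friends and family is safe: a lone point on the line is consistent with every possible secret, so only a quorum learns anything.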

We have more reference app releases planned for the future. Gordian Cosigner remains available as a Testflight beta release, and is our reference app for demonstrating how to conduct airgapped signing. We are also beginning work on the Gordian Recovery app, which we announced last quarter; it’ll demonstrate methodologies for recovering assets held by third-party wallets.

Sweeptool Debut. Recovering funds held in a third-party HD wallet can be tricky since you don’t always know which addresses were actually used. We’ve also been attacking that problem with a command-line tool called sweeptool, which sweeps funds anywhere in the hierarchy defined by a descriptor. Sweeptool is the work of one of our intern graduates, and is already available as an alpha release.
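The core difficulty sweeptool addresses is not knowing how deep into the descriptor’s hierarchy funds might sit. A common heuristic is the BIP-44 “gap limit”: keep deriving child addresses until a long enough run of consecutive unused ones suggests the chain is exhausted. A simplified sketch of that scan (the `derive` and `is_used` callbacks are hypothetical stand-ins for descriptor derivation and a chain lookup; this is not sweeptool’s actual code):

```python
GAP_LIMIT = 20  # the standard BIP-44 gap limit

def scan_chain(derive, is_used):
    """Walk one descriptor chain (e.g. .../0/*) until GAP_LIMIT consecutive
    unused addresses are seen; return the indices that held funds."""
    used, gap, index = [], 0, 0
    while gap < GAP_LIMIT:
        address = derive(index)
        if is_used(address):
            used.append(index)
            gap = 0  # reset the run of unused addresses
        else:
            gap += 1
        index += 1
    return used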

Timelock Experiments. Our current architecture for #SmartCustody depends on multisigs to allow for the partitioning of keys and the creation of resilience. However, we’re already thinking about the next generation of #SmartCustody, which will allow the use of timelocks to recover funds in the case of the incapacitation of a principal. We did some deeper investigation this quarter into miniscript, which allows for easier integration of timelocks into Bitcoin addresses, and also mapped out some new architectures for progressively sweeping funds forward to keep ahead of timelocks. However, we discovered that at this point miniscript isn’t yet integrated with the descriptor wallets that are another of the core elements of the Gordian architecture. This prevents integration with our Gordian apps.

We’ve published a preliminary paper on the usage of Timelocks, but it isn’t finalized yet because of these issues. Meanwhile, one of our interns has begun work on a Rust-based Timelock project called mori-cli. This work is all meant to ensure we’re ready when miniscript is added to Bitcoin Core, or when timelocks are integrated into descriptors in some other manner.
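One architecture we mapped out works roughly like this: funds are locked so that a recovery key can only spend after an absolute locktime, and the wallet periodically sweeps funds forward to a fresh output with a later locktime, keeping the recovery path dormant while the principal remains active. A back-of-the-envelope sketch of that scheduling logic (heights, delays, and margins are illustrative assumptions, not a finalized design):

```python
BLOCKS_PER_DAY = 144  # ~10-minute Bitcoin blocks

def recovery_lock_height(current_height: int, delay_days: int = 90) -> int:
    """Absolute CLTV height after which the recovery key could spend."""
    return current_height + delay_days * BLOCKS_PER_DAY

def should_sweep(current_height: int, lock_height: int,
                 margin_days: int = 7) -> bool:
    """True once the principal should sweep funds forward to a fresh
    output with a new, later timelock, before the recovery path matures."""
    return current_height >= lock_height - margin_days * BLOCKS_PER_DAY
```

So a 90-day lock created at height 700,000 matures at 712,960, and the wallet would prompt a sweep a week (1,008 blocks) before that.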

Coding Processes

Releasing apps is just one step of a larger coding process, which requires careful consideration of what languages (and libraries) to use and careful reviewing of their security.

Rust Focus. Sweeptool is written in Rust, which is an increasing focus at Blockchain Commons. We’ve also been using it for our musign-cli work and some of our torgap work, and have touched upon it in Learning Bitcoin from the Command Line. We think that Rust is an important language for the future of Bitcoin development, in large part because of its safety guarantees: its memory-safety properties address the class of bugs that Microsoft and Google have each estimated accounts for roughly 70% of serious security vulnerabilities. Our main barrier to wider use of Rust has been the fact that C and C++ are the main languages used by Bitcoin Core, iOS, and Android, but we’ve already seen the start of changes there, and hope to be able to integrate Rust even more fully in the future. The fact that the Rust-based Bitcoin Dev Kit is actually ahead of Bitcoin Core in some features is very encouraging.

Security Review. Bitmark, one of our Sustaining Sponsors, has hired Radically Open Security to conduct a security review of our SSKR libraries, bc-shamir and bc-sskr. This is a crucial element in making Blockchain Commons’ specification work widely available, since it would be improper for us to review our own security code. As our first partner planning to use one of our libraries (as opposed to just our specifications) in a shipping project, Bitmark is stepping up to the plate to fund the review of our core SSKR libraries. We expect this pattern to repeat with future partners and libraries. It shows the power of our open libraries: future companies will be able to depend on our SSKR libraries thanks to Bitmark’s contribution, and the contributions they then make of their own will cost much less than if they’d had to security-review an entire suite themselves.

New Docs

Blockchain infrastructure is only as strong as the users and developers who know how to use it properly. Teaching them is the goal of our documentation projects.

UR & SSKR Docs. The ultimate purpose of our reference applications is to demonstrate how our new specifications can be used in actual applications. Hand-in-hand with that, we’re also producing documentation that lays out in more detail how those specifications work. We’ve begun collecting many of those docs in the documents area of our crypto-commons repo. This quarter we added to our docs repo with several documents about our Sharded Secret Key Reconstruction system, including SSKR for Users and SSKR for Developers. We also released a series of articles on Uniform Resources (URs) intended to show developers how they’re constructed and used.
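One topic those UR articles cover is how a payload too large for a single QR code is carried as a multi-part UR (e.g. “ur:bytes/1-3/…”, where 1-3 is the part’s sequence number and count). The toy sketch below shows only that sequence-number framing; real multi-part URs layer fountain codes on top so a receiver can finish from any sufficiently large subset of parts, in any order:

```python
def chunk_payload(data: bytes, fragment_len: int):
    """Split a payload into sequenced fragments tagged (seqnum, total),
    loosely mirroring the 1-3, 2-3, 3-3 framing of multi-part URs."""
    fragments = [data[i:i + fragment_len]
                 for i in range(0, len(data), fragment_len)]
    total = len(fragments)
    return [(seq, total, frag) for seq, frag in enumerate(fragments, start=1)]

def reassemble(parts):
    """Concatenate fragments in sequence order; unlike a fountain-coded UR,
    this naive scheme requires every part to be present."""
    total = parts[0][1]
    ordered = sorted(parts)  # tuples sort by seqnum first
    received = [seq for seq, _, _ in ordered]
    assert received == list(range(1, total + 1)), "missing fragment"
    return b"".join(frag for _, _, frag in ordered)
```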

We are also planning further support for our interoperable specifications such as UR and SSKR at a virtual interoperable wallet specification workshop. We’re currently working on locking down a date.

Learning Bitcoin Translations. Our best-known tutorial at Blockchain Commons is our Learning Bitcoin from the Command Line course, which we pushed to v2.0 last year and which has 1,700 stars and 100 watchers on GitHub. We’re thrilled that translations into both Portuguese and Spanish are now ongoing, thanks to a half-dozen core volunteers. Some of Blockchain Commons’ own programmers and interns entered the blockchain industry thanks to this course, so we’re looking forward to these translations opening the doors even wider.

Other Priorities

Finally, we have a lot of more varied projects that all saw some progress.

Our Summer Interns Have Begun Work. Thanks to a grant from the Human Rights Foundation, we have a class of a dozen interns this summer. We’re engaging them with weekly calls to introduce them to crucial use cases and to let them talk with Bitcoin and Lightning experts from the industry, as well as human-rights experts, most recently including Jon Callas from the EFF and Alex Gladstein from HRF.

Some of our human-rights-focused intern projects include a self-sovereign donation app, scripts to automate the setup of privacy and/or bitcoin services, documents on managing pseudonymity, and tracking of blockchain-based legislation.

Some of our more general intern projects include deployment of a full Esplora node for Blockchain Commons, the creation of scripts to make it easier for others to do so, documentation on bitcoin fee-estimation use cases, improvements to our Spotbit server (including expansion of the Spotbit API and client app), and the aforementioned work on mori-cli and on the Learning Bitcoin translations.

Our first completed intern ’21 work is a short I2P chapter for Learning Bitcoin from the Command Line, covering an alternative (or supplement) to traditional Tor privacy.

Wyoming Testimony. Christopher has continued to testify in Wyoming on DAOs and identity, to improve and extend both sets of laws. The most recent testimony, on May 28, continued to work through the ramifications of the definition of principal authority for self-sovereign identity.


Self-Sovereign Identity. Speaking of which, it was the fifth anniversary of the concept of self-sovereign identity, which first suggested that we should control our identities on the internet, not be beholden to centralized agencies. Christopher wrote an article for Coindesk celebrating that anniversary, talking about how far we’ve come and reflecting on how things could be improved.

BTCR. Finally, we’ve also been talking about another of our blockchain projects that originated in Rebooting the Web of Trust: BTCR, a self-sovereign decentralized identifier that uses the Bitcoin blockchain. Recently Christopher and the other creators of BTCR talked with The Rubric about the past and future of the DID.

As you can see, we’ve got lots of ongoing work that continues to expand the ideas of responsible key management and self-determination on the internet. If you’d like to support our work at Blockchain Commons, so that we can continue to design new specifications, architectures, reference applications, and reference libraries to be used by the whole community, please become a sponsor. You can alternatively make a one-time bitcoin donation at our BTCPay.

Thanks to our sustaining sponsors Avanti Bank, Bitmark, Blockchainbird and Unchained Capital, and our new project sponsor Human Rights Foundation(@HRF), as well as our GitHub monthly sponsors, who include Flip Abignale (@flip-btcmag), Dario (@mytwocentimes), Foundation Devices (@Foundation-Devices), Adrian Gropper (@agropper), Eric Kuhn (@erickuhn19), Trent McConaghy (@trentmc), @modl21, Jesse Posner (@jesseposner), Protocol Labs (@protocol), Dan Trevino (@dantrevino), and Glenn Willen (@gwillen).

Christopher Allen, Executive Director, Blockchain Commons

Monday, 26. July 2021

Velocity Network

Open Assembly’s Center for the Transformation of Work Meet-Up

On July 15, our CEO Dror Gurevich joined Open Assembly's Center for the Transformation of Work Meet-Up to talk about the Velocity Network. The post Open Assembly’s Center for the Transformation of Work Meet-Up appeared first on Velocity.

Friday, 23. July 2021

OpenID

Notice of Vote for Proposed Final OpenID Connect Client-Initiated Backchannel Authentication (CIBA) Core Specification


The official voting period will be between Saturday, August 7, 2021 and Saturday, August 14, 2021, once the 60-day review of the specification has been completed. For the convenience of members, voting will actually begin on Saturday, July 31, 2021 for members who have completed their reviews by then, with the voting period ending on Saturday, August 14, 2021.

The MODRNA working group page is https://openid.net/wg/mobile/. If you’re not already a member, or if your membership has expired, please consider joining to participate in the approval vote. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration.

The vote will be conducted at https://openid.net/foundation/members/polls/241.

– Michael B. Jones, OpenID Foundation Secretary

The post Notice of Vote for Proposed Final OpenID Connect Client-Initiated Backchannel Authentication (CIBA) Core Specification first appeared on OpenID.

Berkman Klein Center

Five Years of Assembly

Five Years of Assembly: Interdisciplinary fellowship offers paths forward in public interest technology

By Zenzele Best

Assembly Program logo, Berkman Klein Center for Internet & Society

Between 2017 and 2021, the Assembly Program at the Berkman Klein Center for Internet & Society (BKC) at Harvard University brought together almost 150 professionals, experts, and students to better understand and develop solutions to some of the most intractable issues in technology policy. Led by Professor Jonathan Zittrain and supported by faculty, staff, and experts from across the University and the Center, the program explored topics in digital security, artificial intelligence (AI), and disinformation; produced a variety of prototypes and projects; and fostered a community that spanned across sectors and disciplines.

“Being academic isn’t, at its core, about credentials or paper-writing for its own sake,” said Zittrain. “It’s about approaching problems with humility, energy, and an open mind, attacking those problems rigorously, and reviewing whether one is even asking the right questions. Assembly was begun on the theory that people outside academia, from across disciplines and sectors, would be eager to take on problems that way, and with the overall public interest in mind, rather than the interests of one player in a larger ecosystem. We found an extraordinary number of people willing to engage in that spirit.”

Over five years, Assembly grew from a pilot project that brought together professionals from across sectors to a multi-pronged, interdisciplinary fellowship program that developed innovative approaches and cross-sectoral collaborations focused on the public interest. In 2019, Assembly expanded to include two additional tracks, the Forum and Student Fellowship, which convened senior leaders and Harvard students, respectively, to discuss the spread of disinformation on online platforms. In 2021 — the program’s fifth and final year — the Assembly Fellowship invited ongoing alumni project teams to return (virtually) to BKC to support their work and celebrate their continued success.

“Our initial goal with Assembly was to combine the real-world expertise of people in industry with the socially-motivated nature of academia,” said Jordi Weinstock, one of the founders of the Assembly Program. “From day one of the pilot year, we had no idea whether any of the projects would develop beyond a nascent stage, or even if that was a necessity. Instead, we worked to create a lasting community within the cohort, one that would continue to bring positive change for the world long after their program ended. Through the years, both the thematic focus and the people involved would change, but consistent throughout has been the strength of the Assembly community. To me, it is both a bonus and a testament to our Assemblers and staff that many projects did, in fact, flourish.”

Over twenty prototypes explore solutions to complex tech policy questions

How can we build AI systems to serve the public interest and not perpetuate or exacerbate existing inequalities? How can we improve security vulnerabilities among “Internet of Things” (IoT) devices? Who should be held accountable for the spread of disinformation on social media platforms? How do we govern cyberspace across international borders?

Five years of Assembly projects have sought to make progress on complex, whole-of-society questions like these. Created as a model to help solve some of society’s most pressing issues around technologies, the program brought together a community of professionals from around the world to leverage their expertise across disciplines and actively collaborate on provocations and prototypes in the public interest.

Incoming fellowship cohorts convened at Harvard to meet each other, identify problem spaces and form project teams, and participate in programming in-residence before working asynchronously on their projects for the remainder of the four-month development period. Each cohort was supported by an expert Advisory Board of faculty and practitioners with expertise across the range of topics fellows explored. Between 2017 and 2020, four Assembly Fellowship cohorts developed twenty-one projects that aim to offer paths forward on complex challenges in the digital public sphere. As Hilary Ross, Assembly’s program manager, put it, “Assembly created a space for people from across disciplines to come together around complex technology and policy problems, problems that at times feel intractable and really require effort from across all sectors. Through programming and projects, fellows got to both better understand those problems from varied perspectives, and collectively find paths forward and demonstrate possibilities.”

2017

Assembly’s pilot year in 2017 focused on the challenge of digital security and sought to answer a central question: how do we move beyond a world where virtually every computing device and network is insecure? Over four months, the sixteen professionals who comprised the cohort explored the complex interaction between Internet governance organizations and sovereign states; the tension between the ease of disseminating information online and the interests of copyright holders, privacy advocates, and other stakeholders; and the roles of intermediaries and platforms in shaping what people can and cannot do online. The first year of Assembly produced five projects aimed at addressing these challenges, including Clean Insights, a privacy-oriented analytics tool, and Information Fiduciaries and Data Transparency, a prototype of a visualization tool that would enable companies to better document their collection and use of consumer data.

“[Assembly] provided an opportunity for participants to pop out of their usual bubbles of collaborators, to connect with committed thinkers, scholars, experts, advocates and hackers from a diverse set of contexts and experiences. Whether participating on a project or reviewing the teams’ work as [an advisor], I benefited deeply from the exposure to smart people, working collectively, to find novel approaches to difficult challenges.” — Nathan Freitas, 2017 and 2021 Assembly Fellow, 2020 Assembly Advisor

2018–2019

The 2018 and 2019 Assembly cohorts focused on the ethics and governance of artificial intelligence. AI technology has become increasingly sophisticated over the past decade, driving innovations in industries from healthcare to travel to manufacturing and becoming a feature in many homes, smartphones, and vehicles. However, as AI becomes even more ubiquitous, these advances have raised new questions: how do we prevent, detect, and mitigate bias in AI systems? What factors should municipal governments consider before deciding whether to implement AI technology in a city space? What are the risks of pursuing surveillance-related AI work, and how do we mitigate them? Between 2018 and 2019, the Assembly Fellowship developed ten frameworks, tools, and other projects aimed at addressing these challenges, including AI Blindspot, a framework for identifying and mitigating bias in AI systems and the Data Nutrition Project, a tool that improves the accuracy and fairness of algorithms by helping practitioners better assess datasets.

“The [Assembly Fellowship]…gave me an opportunity to experiment: I wore many hats, performed many roles, and ultimately learned more about myself and what I am good at and what I want to do. In a world that pushes us to excel rather than to explore, I really appreciated the space to try things I never thought I’d do — from writing a TV pilot to building interactive data visualizations to reading hundreds of military RFPs — and being given the support to try.” — B Cavello, 2019 Assembly Fellow

2020

In 2019, BKC launched Assembly: Disinformation, which built on the success of both the previous three iterations of the Assembly Fellowship and other programs at the Center. The 2016 United States general election vaulted online disinformation — and its real world ramifications — into the national discourse. In the aftermath of the election, the program sought to make progress on important questions about the United States’ vulnerability for foreign influence operations, the importance of digital literacy, and the role of social media companies in disseminating and amplifying false information. The 2020 cohort developed projects that included Semaphore, a prototype of a tool to help users publicly flag false information on platforms, Into the Voids, a framework for evaluating data voids and the harms they pose, and Disinfodex, a database that indexes publicly available information about disinformation campaigns.

“Outside of academia, it is so rare to be able to explore a complicated societal challenge like disinformation in a completely unconstrained way, and even rarer to actually produce a final project addressing that issue. The long-lasting relationships formed with my teammates and cohort from working on these projects will be something I take away long after the conclusion of Assembly.” — Jenny Fan, 2020 and 2021 Assembly Fellow

2021

In its capstone year in 2021, Assembly invited back five independently continuing alumni projects: AI Blindspot, Clean Insights, the Data Nutrition Project, Disinfodex, and Cloak & Pixel (an evolution of the equalAIs project, developed by Assembly fellows in 2018). The projects address the variety of challenges that the Fellowship explored since 2017: how can companies better balance user privacy and product development? How can we improve the accuracy and fairness of algorithms to help practitioners better assess the viability of datasets? How can we better track and understand influence operations on online platforms? Through this final year of the program, the cohort learned from each other, developed new iterations of their tools, and reached new audiences. Watch the cohort’s final showcase and learn more about the teams’ work here.

Alumni community continues to integrate new approaches to responsible technology

“Assembly was the most interesting collection of people I’ve ever met working on some of the hardest, most important problems on the internet. One unexpected gift I got from the program was a good look at the lots of different ways to change things: there are techniques and career paths I’d never have thought about [on my own]. Through the program, I met so many tremendous folks, and I now see so much more potential.” — John Hess, 2017 and 2020 Assembly Fellow

The Assembly Fellowship, one of three of the program’s core tracks, offered a model in which professionals from across sectors could learn from, challenge, and collaborate with each other. Its impact was twofold: while Assembly produced a number of public interest-focused prototypes, frameworks, and other projects, the Fellowship’s most lasting impact might be the people that comprise its community. The Assembly Fellowship helped to shape the way its alumni think about the intersection of technology and society: since their time in the program, alumni have pursued new career paths, continued to develop their Assembly projects, and used the tools and perspectives they gained as fellows to integrate new approaches to responsible technology.

Below are some stories from former Assembly Fellows about their experience in the program and its impact:

“As a practitioner working at the crux of democracy and technology for over a decade, Assembly was an opportunity to connect with the latest thinking about the challenges and solutions to how technology impacts people’s lives. The program offered a space to both reflect and learn, while at the same time contributing to developing practical tools and approaches that address some of the toughest challenges we face in data-driven societies… Being part of an interdisciplinary team expanded my interest in providing a useful bridge between advancing the intellectual frontiers of how data-centric technologies impact society and translating that knowledge into action by policymakers, technologists, and civil society. This has led to an exciting new path in my career as Managing Director at Data & Society, working with our research and engagement teams to shift the focus onto the people most impacted by technological change.” — Ania Calderon, 2019 and 2021 Assembly Fellow

“I was grateful that Assembly brought together people who might not have otherwise found each other…[my cohort was comprised of] a bunch of brilliant thinkers and doers who may have shared values, but certainly had different approaches and backgrounds that ultimately informed our work together. Working with activists and academics, corporate professionals and service members, and creatives and educators broadened my worldview and helped me realize that I have allies in more places than I would have thought. The fellowship also gave me an opportunity to experiment: I wore many hats, performed many roles, and ultimately learned more about myself and what I am good at and what I want to do. In a world that pushes us to excel rather than to explore, I really appreciated the space to try things I never thought I’d do — from writing a TV pilot to building interactive data visualizations to reading hundreds of military RFPs — and being given the support to try.” — B Cavello, 2019 Assembly Fellow

“As someone working in industry, it’s almost impossible to get dedicated time to think through really challenging issues. I applied to Assembly because the fellowship gave me exactly that: a chunk of time to dig into the complexities around the ethics and governance of AI. Through Assembly, I met a truly inspirational community of people who were interested in the same questions I was asking, but coming from all sorts of backgrounds. Together, we launched a research group that continues to this day — the Data Nutrition Project (DNP), which builds ‘nutritional labels’ for datasets meant to increase overall awareness and health of datasets being used to build algorithmic systems. DNP, and the amazing support from the Assembly program and community since, has really changed the trajectory of my own career. I am much more aware of and focused on addressing issues of inequality in algorithmic systems, and I now bring that to everything I do during my daily work — from building COVID analytics frameworks to assessing the quality of humanitarian datasets.” — Kasia Chmielinski, 2018 and 2021 Assembly Fellow

“Assembly, for me, achieved the delicate balance between the often slow, deeply considered pace of academia and the less thoughtful ‘move fast and break things’ mentality common in the tech industry. It provided an opportunity for participants to pop out of their usual bubbles of collaborators, to connect with committed thinkers, scholars, experts, advocates and hackers from a diverse set of contexts and experiences. Whether participating on a project or reviewing the teams’ work as [an advisor], I benefited deeply from the exposure to smart people, working collectively, to find novel approaches to difficult challenges. In addition, the access to and feedback of accomplished mentors and advisors was a critical aspect of the program, one that helped [project teams] reconsider, rethink, or more fully commit to a particular direction…In my case, I was fortunate to have Assembly be a place where a seed of an idea was germinated into a fully blossoming independent, grant-funded endeavor. Through our project [Clean Insights], the Assembly Fellowship will have a lasting impact on data privacy, security and sovereignty for real people around the world.” — Nathan Freitas, 2020 Assembly Advisor, 2017 and 2021 Assembly Fellow

“As a fellow, a member of the advisory board, and a member of the staff team, Assembly has been the most meaningful, impactful, and inspiring program I have been involved with in my more than seven years at Harvard: the support, the community, the ideas and their impact in the world are all things I expect to benefit and learn from throughout the remainder of my career. Launching and continuing to work with the Data Nutrition Project has been a consistent source of inspiration for me. So impressed by (and grateful for!) the reach and breadth of this program.” — Sarah Newman, 2020 Assembly Advisor, 2018 and 2021 Assembly Fellow, and 2019–21 Assembly staff

Over five years, the Assembly Fellowship brought together nearly 80 journalists, engineers, policymakers, designers, and other practitioners all deeply committed to the public interest. As the program draws to a close, BKC will continue to carry forward Assembly’s models, lessons, and approaches through the Center’s Rebooting Social Media Institute and other programs and initiatives.

Five Years of Assembly was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.


Own Your Data Weekly Digest

MyData Weekly Digest for July 23rd, 2021

Read in this week's digest about: 11 posts, 1 question, 2 Tools

Thursday, 22. July 2021

Berkman Klein Center

Why all data governance needs to consider children’s rights

Photo by zhenzhong liu on Unsplash

Last month, UNICEF published a Manifesto on Good Data Governance for Children, an initiative that was the result of a year of collaboration between a working group of 17 experts, many of them affiliated with the Berkman Klein Center for Internet & Society and UNICEF.

Why a focus on children?

We know that massive amounts of data are increasingly being collected about all of us virtually everywhere we go, as our lives become more entwined with technology. So what is special about children’s data, and why did we choose to focus on children in particular?

Children’s rights are afforded extra protections under international human rights laws such as the UN Convention on the Rights of the Child, and data processing impacts on virtually all areas of children’s rights to some degree. States, companies and guardians have a duty under existing international human rights laws to prevent children’s personal information and data from being used to exploit them or violate their freedoms. The main difference between general data governance and children’s data governance is the presumption that children cannot effectively advance and advocate on behalf of their own interests because of their age and capacity. That is why we believe that children’s data merit special protection and a distinct consideration in international, regional and national governance regimes.

Privacy and protection of children’s information and identities are particularly important as they grow and experiment by making different choices and exploring different preferences, and as they develop their personalities. Children need to be afforded the agency to define who they are for themselves, without having their future pathways predetermined or their learning styles unduly narrowed down by algorithms.

Governance of children’s data also presents some challenging questions in relation to consent and children’s agency over their own data. Children, depending on their age, may be less suited than adults to provide meaningful consent for their data collection and use. This is because children are often less able than adults to make mature decisions on data use that may impact their future in ways that are difficult even for adults to understand. This problem is compounded by a general lack of transparency in relation to how data is used by technology companies — it is rare to see terms and conditions of data collection explained in child-friendly language or translated into minority languages. Generally there is already a power imbalance between the public and governments, companies, or other institutions that process their data, and children are in an especially vulnerable position within these relationships.

On the flip side, we also wanted to focus on children’s data because we are excited about the potential for its use for good. We know that children’s data can support research, development and provisions of services, and poor data governance can lead to a loss of potential benefits for children. In an increasingly data-driven global economy good data governance for children is essential for children, for development, and for business.

Festusminja, via Wikimedia Commons

What does good data governance for children look like, and how do we get there?

The working group members wrote a series of background papers that informed the manifesto, covering state surveillance and the implications for children, young people in the commercialized digital environment, data governance gaps in light of Covid-19, responsible group data for children, children’s rights by design, governance of student data, and exploration of a fiduciary approach to child data governance.

As part of tackling the question of what good data governance for children looks like, the working group took the manifesto through a series of regional workshops in the US, Europe, Asia, and Africa, which helped to flesh out ten key action points:

1. PROTECT children and their rights through child-centered data governance. Such data governance should adhere to internationally agreed standards that minimize the use of surveillance and algorithms for profiling children’s behavior.

2. PRIORITIZE children’s best interests in all decisions about children’s data. Governments and companies should give priority to children’s rights in their data collection, processing, and storage practices.

3. CONSIDER children’s unique identities, evolving capacities and circumstances in data governance frameworks. Every child is different, and children mature as they get older, so data governance regulations must be flexible. Marginalized children must never be left behind.

4. SHIFT responsibility for data protection from children to companies and governments. Extend the protection measures to all children below the age of 18, regardless of the age of consent.

5. COLLABORATE with children and their communities in policy building and management of their data. Through distributed models of data governance, children and their communities should have more say in how data is processed, by whom it can be processed, and with whom it can be shared.

6. REPRESENT children’s interests within administrative and judicial processes, as well as redress mechanisms. It is imperative that children’s rights are integrated into existing mechanisms, such as the work of data protection authorities.

7. PROVIDE adequate resources to implement child-inclusive data governance frameworks. Data protection authorities and technology companies must employ staff who understand children’s rights, and governments should allocate funding for regulatory oversight.

8. USE policy innovation in data governance to solve complex problems and accelerate results for children. Policy innovation can help public authorities to make the most of data, while at the same time safeguarding children’s rights.

9. BRIDGE knowledge gaps in the realm of data governance for children. There are some urgent knowledge gaps that need further research to ensure that data governance regulations are evidence-based.

10. STRENGTHEN international collaboration for children’s data governance and promote knowledge and policy transfer among countries. This Manifesto calls for greater global coordination on law and policy. Uncoordinated national-level data governance laws can lead to competing assertions of jurisdiction and conflict.

At the time of writing, increased efforts are being made to regulate the technology sector in all regions. At the same time, promising local and national initiatives are exploring ways to manage data in the public interest. A key takeaway from this manifesto is that children’s rights need to be front and centre of all of these new data governance initiatives, rather than being side-lined. This is increasingly important in a world in which children’s data cannot always easily be distinguished from adults’ data, and children are often impacted by data processing even when they are not the primary intended users of a digital platform or service.

You can read the full manifesto here, which contains rich information about many of the child rights issues impacted by data processing and elaborates on the ten action points listed above.

Watch Urs Gasser, former Executive Director of the Berkman Klein Center for Internet & Society — Harvard University; Jasmina Byrne, Chief of Policy, Office of Global Insight and Policy — UNICEF; and Emma Day, UNICEF Consultant, discussing the manifesto here with a panel of experts:

Julie Brill, Chief Privacy Officer, Corporate Vice President, and Deputy General Counsel of Global Privacy and Regulatory Affairs — Microsoft; Dorothy Gordon, Chair of the Inter-Governmental Council of the UNESCO Information For All Programme; and Riitta Vänskä, Specialist in the IHAN (Human-Driven Data Economy) Project — SITRA.

For further information contact Jasmina Byrne: jbyrne@unicef.org

Why all data governance needs to consider children’s rights was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.


DIF Blog

DIF Grant #1: JWS Test Suite

DIF announces its first community microgrant, sponsored by Microsoft and rewarding the timely creation of a comprehensive test suite for detached-JWS signatures on Verifiable Credentials

The Decentralized Identity Foundation recently announced the DIF Grants Program, a new mechanism for rewarding and funding work in the common interest of its membership. The Steering Committee ratified an addendum defining these collaborations between a specific DIF working group and a grant sponsor, adding a new tool to the community's toolkit for transparently funding work that supports DIF's mission and benefits the membership as a whole.

The first such grant was initiated by Microsoft, which has a long history of collaboration within DIF. Of the various modalities described by the grant program, this one pairs a sponsor with a working group: the Claims and Credentials Working Group will oversee a new work item, open to all DIF members, to create and harden a JWS test suite, with the grant funding a lead editor to drive the work and keep it to a pre-determined timeline, paid upon a stable and complete release.

Photo by Natasya Chen

History

Pamela Dingle, co-chair of the Interoperability WG, identified a recurring theme in conversations about interoperability across the great "JSON"/"JSON-LD" divide: while considerable cross-vendor, cross-stack alignment had happened over recent years on many other aspects of credential exchange and identifier resolution, "translation issues" and varying interpretations of the JSON sections of the VC data model specification led to divergent ways of structuring, defining, signing, and parsing VCs in JWT form.  The kinds of signature suite definitions that define Linked Data Proofs made strange bedfellows with the in-built mechanisms of JWT, which were hardened and commoditized earlier.  The result is a somewhat "balkanized" landscape of VC-JWTs that make different concessions to the expectations of JSON-LD-native parsers and systems.

Initially, the idea of iterating one or more foundational specifications seemed a natural solution: a few key sections of any specification could be made more explicit, more normative, or less ambiguous. But just as the plural of anecdote is not data, the plural of specifications is not disambiguation. Another tack was chosen: implementers of the existing specification would come together and compare their running code to identify every difference and incompatibility, defining a test suite that interprets the current specification.  They might still produce a list of changes they would like to see in a future specification, and that list might contain contentious items that take a long time to align on.  But in the meantime, they would have a practical roadmap to alignment and a benchmark for conformance, driving interoperability among themselves and clarity for external implementers.

A signature suite that spans two worlds

The specification under test in this new work item is JSON Web Signature 2020 ("JWS2020" for short), a CCG signature suite that defines how the signature mechanisms and data structures native to "detached signature" JWS tokens can be parsed as Linked Data Proofs. The signature mechanisms of the JWT world and the LDP world are quite different; bridging them requires a deep understanding of both: differing security models, canonicalization and serialization complexities, explicit reliance on IANA registries, URDNA dataset normalization, and the like.  Advanced topics, even for this blog!

Understanding such a specification on a deep intellectual level, and understanding its design decisions, requires a firm grasp of all these complexities spanning two very different engineering traditions. Implementing the specification, however, should not; to be blunt, our community cannot afford to let that level of understanding gate successful implementation, particularly if industrial adoption of decentralized identity is a shared goal.  Before the specification gets iterated, it needs to be testable, and even more implementable than it already is.
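For readers new to "detached" JWS: the payload is signed but omitted from the serialized token, and RFC 7797's `b64: false` option keeps the payload out of the base64url encoding entirely. The sketch below illustrates only that mechanic, using a symmetric HMAC key for brevity; JWS2020 itself uses asymmetric keys and Linked Data canonicalization, so treat the header values and helper names here as illustrative assumptions, not anything the test suite normatively defines.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # Base64url without padding, as JWS requires.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_detached(payload: bytes, secret: bytes) -> str:
    # b64=False means the payload is NOT base64url-encoded into the
    # signing input; "b64" must then appear in "crit" so verifiers
    # that don't understand it are forced to reject the token.
    header = {"alg": "HS256", "b64": False, "crit": ["b64"]}
    protected = b64url(json.dumps(header, separators=(",", ":")).encode())
    signing_input = protected.encode() + b"." + payload
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    # Detached form: the middle (payload) segment is left empty.
    return f"{protected}..{b64url(sig)}"

def verify_detached(jws: str, payload: bytes, secret: bytes) -> bool:
    # The verifier re-attaches the payload it obtained out of band.
    protected, _, sig = jws.split(".")
    signing_input = protected.encode() + b"." + payload
    expected = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)

credential = json.dumps({"vc": "example"}).encode()
token = sign_detached(credential, b"shared-secret")
assert verify_detached(token, credential, b"shared-secret")
```

A test suite in this space essentially pins down exactly these choices: which header parameters are required, how the signing input is assembled, and what a verifier must reject.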

This is where a test suite offers a pedagogical tool for first-time implementers, as well as a negotiation tool for seasoned veterans and large corporations managing vast roadmaps and production development pipelines. While many DIF members do not encode VCs in JWT form, this grant is clearly in the common interest: mainstream adoption will indisputably benefit from a clearer, more explicitly defined, and definitively testable "normative VC-JWT".

Selection Process

From 26 July 2021 to 9 August 2021, the chairs fielded applications through a Google Form and submitted all valid applications to GitHub for posterity.  The selection criteria for the work item lead were as follows:

familiarity with JWS mechanics and common JOSE libraries/best practices

experience with automated testing, vector design, and test scripts/suites

familiarity with LD Proofs and signature suites generally

familiarity with JSON/JSON-LD interop problems, at the semantic, signature-verification, and representation/parsing levels

Grantee Announcement

On 12 August 2021, the Claims & Credentials (C&C) Working Group at DIF gladly announced Transmute Industries as the recipient of the first DIF Grant, to provide a JWS Test Suite for Verifiable Credentials. One of the chairs, Martin Riedel, summarized the chairs' decision thusly:

Transmute has been a thought leader in the SSI space for years and uniquely fits the requirement profile laid out in the Grant announcement.

Orie Steele, the CTO of Transmute, is the initiating author, lead editor, and main contributor of the (LD) JSON Web Signature 2020 suite. Transmute has a strong background in TDD and is a strong proponent of delivering test vectors with specifications in order to achieve greater implementation-level interoperability; examples include the test vectors in Sidetree.js and did-key.js. The company already provides integrated libraries that support (LDS)-VC and JWT-VC side by side, demonstrating their familiarity with both representations (see vc.js). Lastly, Transmute has a strong track record of making specifications and libraries accessible to the general public in a variety of deployed web projects (https://did.key.transmute.industries/, https://wallet.interop.transmute.world/ and others…)

Therefore we regard Transmute Industries as the ideal candidate to provide a comprehensive JWS test suite for LDS-VC and JWT-VC and support the community interactions around this project within DIF’s C&C Group.

Next Steps

The new work item will be announced and regular meetings initiated at the 23 August meeting of C&C. If you are interested in participating, please attend and/or drop a note in the C&C Slack channel.

Wednesday, 21. July 2021

OpenID

Notice of Vote for Proposed Implementer’s Drafts of Two SSE Specifications


The voting period will be between Friday, July 23, 2021 and Friday, August 6, 2021, once the 45-day review of the specifications has been completed.

The Shared Signals and Events (SSE) working group page is https://openid.net/wg/sse/. If you’re not already a member, or if your membership has expired, please consider joining to participate in the approval vote. Information on joining the OpenID Foundation can be found at https://openid.net/foundation/members/registration.

The vote will be conducted at https://openid.net/foundation/members/polls/236.

– Michael B. Jones, OpenID Foundation Secretary

The post Notice of Vote for Proposed Implementer’s Drafts of Two SSE Specifications first appeared on OpenID.

Implementer’s Drafts of Two FAPI 2.0 Specifications Approved


The OpenID Foundation membership has approved the following Financial-grade API (FAPI) specifications as OpenID Implementer’s Drafts:

FAPI 2.0 Baseline Profile
FAPI 2.0 Attacker Model

An Implementer’s Draft is a stable version of a specification providing intellectual property protections to implementers of the specification. These are the first FAPI 2.0 Implementer’s Drafts.

The Implementer’s Drafts are available at:

https://openid.net/specs/fapi-2_0-baseline-ID1.html
https://openid.net/specs/fapi-2_0-attacker-model-ID1.html

The voting results were:

Approve – 52 votes
Object – 2 votes
Abstain – 3 votes

Total votes: 57 (out of 273 members = 20.9% > 20% quorum requirement)

— Michael B. Jones – OpenID Foundation Board Secretary

The post Implementer’s Drafts of Two FAPI 2.0 Specifications Approved first appeared on OpenID.

Elastos Foundation

New Release: Elastos Essentials 2.1 Lands on Android and iOS

...

Nyheder fra WAYF

LUMI will require assured identities from institutions

Researchers from countries participating in the LUMI consortium will, via eduGAIN, gain access to the LUMI supercomputer using the user account they hold at their institution. But only if the institution meets the required maturity level in its user management, and asserts this in its users' login tokens.


Digital Identity NZ

The end of the Privacy Paradox?

All the latest news from the Digital Identity New Zealand community The post The end of the Privacy Paradox? appeared first on Digital Identity New Zealand.

As many people working in the digital identity, security, and privacy spaces will know, the Privacy Paradox is the discrepancy between an individual's intentions to protect their privacy and how they actually behave in the online world.  An individual's stated intentions around disclosing personal information and their actual disclosure behaviour can often be very different.

Many surveys have shown that privacy is a primary concern for people across the globe in our digital age.  On the other hand, large numbers of people freely divulge personal information in exchange for services and convenience, often for relatively small rewards such as interacting with others within a social network.

This inconsistency of privacy attitudes and privacy behaviour is often referred to as the Privacy Paradox – people expressing privacy concerns often fail to act in accordance with them.

There have been many explanations for the Privacy Paradox.  For example, people may:

find it difficult to associate a specific value to their privacy and therefore, the value of protecting it

do not consider their personal information to be their own and might not appreciate the need to secure it

lack awareness of their right to privacy or privacy issues 

believe their desire for a fully personalised experience outweighs the potential risks from big tech companies using their data for profiling.

Others may have chosen to take it as a sign that, despite what people say, they really just don’t care about their privacy and have used it as an excuse to look at increasingly invasive methods for monetising personal data.

Well, hopefully, as Richard Bird, Ping Identity's Chief Customer Officer, pointed out at the Identiverse conference in Denver a few weeks back, this latter explanation ran into a seemingly solid dead end earlier this year, when, with great fanfare (and strong disagreement from the likes of Facebook), Apple released iOS 14.5 and began enforcing App Tracking Transparency.

All being well, most people should by now be aware that iPhone, iPad, and Apple TV apps are required to request a user's permission to track their activity across other apps for data collection and ad-targeting purposes.

The early data is not looking good for app developers and advertisers who rely on targeted mobile advertising for revenue.  While the worldwide opt-in rate has been creeping up, as of 28 June (two months after release) it stood at only 17%.  The US-only figure is even lower, at 9%.

So ultimately if people are now showing they do care about privacy when they are given the choice, why has the Privacy Paradox been so persistent over the last ten years or so?

Is it simply that to date people have really not had a choice?  Or perhaps more starkly, the only choice they have had was to not use a service at all?  Is the greatest example of “an offer you can’t refuse” the ACCEPT button on the Terms of Use page?

Hopefully we are now heading into a time when the Privacy Paradox will be no more. A time when people are offered a genuine choice about sharing their personal information. 

Ngā Mihi,

Michael Murphy

Executive Director

To receive our full newsletter including additional industry updates and information, subscribe now


Tuesday, 20. July 2021

Ceramic Network

Tech talk: Primitives for mutable content on Web3

Joel, Co-Founder of 3Box Labs, discusses how Ceramic enables mutable cryptodata objects. Recorded at EthCC [4] on July 20, 2021.


Digital ID for Canadians

Spotlight on Accenture


DIACC is very pleased to welcome Accenture as a member. In this spotlight interview, DIACC President Joni Brennan connects with Iliana Oris Valiente, Managing Director, Canada Innovation Strategy and Blockchain Lead and Christine Leong, Global Lead Blockchain and Biometrics, to hear their thoughts on digital ID and the great work that Accenture is doing in this space.

DIACC Spotlight on Accenture – YouTube


Ceramic Network

Boardroom is bringing context to Web3 governance

The Boardroom governance platform is using Ceramic to bring rich reputations and conversations to DAOs.

Boardroom, the governance middleware platform that powers some of Web3's biggest DAOs, has gone live on Ceramic mainnet. Among the many new features built on Ceramic, Boardroom is debuting an Ideation feature that lets protocol contributors suggest, discuss, and pressure test ideas in a collaborative, censorship-resistant, and fully decentralized way.

Want to try Boardroom’s new Ideation feature? Visit the ShapeShift DAO.

What is Boardroom?

Boardroom is addressing the fragmented governance landscape with a simple and transparent platform for stakeholder management over DAOs and protocols.

As DeFi and DAOs (Decentralized Autonomous Organizations) have exploded in popularity, so has experimentation in new governance methods and tooling. This has led to massive market fragmentation, with no unified experience that serves all aspects of governance together. Today, proposal information, discussion, voting, and execution are splintered across many platforms.

Boardroom intends to fix this. They are building a powerful governance platform that allows many different governance tools to plug into a single seamless framework that can serve an organization throughout all stages of its governance lifecycle.

Governance is more than voting

Voting is often considered a critical part of governance for DAOs, but voting is only the tip of the governance iceberg. Long before any votes are cast, ideas are generated, discussed, signaled, and revised - then they come to a vote and potentially move to on-chain execution. Today, this frequently happens across many platforms and is often done implicitly through back-channels rather than in a clear and explicit way. This process is full of friction and often results in lackluster participation and suboptimal decision-making.

Good governance requires context around all the moving parts, and that was Boardroom's founding motivation: to bring context to governance by unifying the disparate pieces and making sure the relevant information can be surfaced in an easy and clear way to participants.

The portal pulls in profiles, project information, discussion forums, Snapshot votes, ideation tools, and more.

The Boardroom SDK normalizes data and governance-related actions from across multiple governance frameworks into a single standardized platform, and lets projects build their own integrations. Built on the Boardroom SDK, Boardroom's governance portal provides a consistent, low-friction interface for community members to interact and engage with distributed governance.
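To illustrate what "normalizing across frameworks" means in practice, here is a toy adapter pattern. None of these class or field names come from the actual Boardroom SDK (which is a JavaScript library); they are hypothetical, and the point is only the shape of the design: one common schema, one adapter per framework.

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    # One common schema, regardless of the source framework.
    framework: str
    proposal_id: str
    title: str
    state: str  # e.g. "active", "closed"

class SnapshotAdapter:
    def normalize(self, raw: dict) -> Proposal:
        # Off-chain Snapshot-style payloads already carry a state string.
        return Proposal("snapshot", raw["id"], raw["title"], raw["state"])

class GovernorAlphaAdapter:
    def normalize(self, raw: dict) -> Proposal:
        # On-chain proposals use numeric ids and boolean/enum states,
        # with the title embedded as the first line of the description.
        state = "active" if raw["open"] else "closed"
        return Proposal("governor-alpha", str(raw["proposalId"]),
                        raw["description"].splitlines()[0], state)

def aggregate(adapters_and_payloads):
    # A portal can render this single list with one UI component.
    return [adapter.normalize(raw) for adapter, raw in adapters_and_payloads]

proposals = aggregate([
    (SnapshotAdapter(), {"id": "qm1", "title": "Fund grants", "state": "active"}),
    (GovernorAlphaAdapter(), {"proposalId": 7, "open": False,
                              "description": "Raise quorum\nFull details..."}),
])
assert [p.framework for p in proposals] == ["snapshot", "governor-alpha"]
```

The value of the pattern is that everything downstream (voting UI, notifications, analytics) only ever sees `Proposal`, so adding a new governance framework means adding one adapter, not touching the portal.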

As they were building Boardroom, the team recognized that two critical aspects were missing from most governance tools:

The social layer: Contextualized discussion and debate

The people layer: Rich, pseudonymous reputations

The social layer: Contextualized discussion and debate

While much emphasis is put on formal proposals and votes, the Boardroom team found that ideation and discussion, while softer, are more influential in governing the direction of projects. Kevin Nielsen, Boardroom co-founder:

The vote is extremely important to actually anchor a specific decision, but the most important part of the actual process is coordinating the human and social layer.

To elevate this part of the governance process and connect it to decision-making, Boardroom is building a set of ideation and forum tools.

On the Boardroom app, each governance proposal will include its own forum for discussion on the same page, presented in a unified experience - with participation restricted to token holders in the network.

Additionally, a new tool designed for ideation allows the community to suggest, show support for, and discuss new ideas before moving to a formal vote. Ideation is a way to get more people involved in the process in a clear, contextualized, and collaborative way. Since the Ideation feature debuted last week with the ShapeShift DAO, its community has already generated and discussed several new ideas. Here's a preview:

Ceramic Streams: Web3-native content for DAOs

In governance, forum posts and discussions must be trustless and censorship-resistant; otherwise protocol governance could be shaped by any single party with the power to warp this crucial part of the process, including Boardroom itself. Storing discussion threads on a server would leave the data susceptible to corruption, and posts couldn't simply be stored on vanilla IPFS without a management layer to link posts, reactions, and responses together into a consistent thread.

Ceramic provided a perfect solution for this user-generated content, with signed streams of content persisted on a decentralized network. Because Ceramic content is authenticated with the same decentralized identifiers (DIDs) that sit at the heart of IDX and Boardroom's user profiles, users can begin posting to forums without the need to authenticate in a new way. Each piece of content is signed, stored, and controlled by users, then aggregated into the complete thread by the application.

This approach provides a lightweight way to manage rich user content and enables full portability and composability of data. Other governance applications could easily build on top of the same ideation threads outside of the Boardroom UI by referencing the same Ceramic streams or discovering them through a user's IDX. This paves the way for a consistent ideation and discussion layer that can be used everywhere across the governance ecosystem.
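Conceptually, a stream of this kind is an append-only log of signed commits, each linking to a hash of its predecessor, so any reader can check both authorship and ordering without trusting whichever server aggregated the thread. The sketch below illustrates only that idea; Ceramic's real commit format, DID-based signing, and IPLD linking are more involved, and the HMAC key here merely stands in for a user's signing key.

```python
import hashlib
import hmac
import json

def commit(prev_hash: str, content: dict, user_key: bytes) -> dict:
    # Canonical serialization so signer and verifier hash identical bytes.
    body = json.dumps({"prev": prev_hash, "content": content},
                      sort_keys=True).encode()
    return {
        "prev": prev_hash,
        "content": content,
        "sig": hmac.new(user_key, body, hashlib.sha256).hexdigest(),
        "hash": hashlib.sha256(body).hexdigest(),
    }

def verify_stream(commits, user_key: bytes) -> bool:
    prev = ""  # the genesis commit has no parent
    for c in commits:
        body = json.dumps({"prev": c["prev"], "content": c["content"]},
                          sort_keys=True).encode()
        if c["prev"] != prev:  # broken or reordered linkage
            return False
        if c["sig"] != hmac.new(user_key, body, hashlib.sha256).hexdigest():
            return False  # tampered or forged commit
        prev = hashlib.sha256(body).hexdigest()
    return True

key = b"user-signing-key"
c1 = commit("", {"post": "gm"}, key)
c2 = commit(c1["hash"], {"post": "this proposal looks good"}, key)
assert verify_stream([c1, c2], key)
```

Because each commit carries its own signature and parent link, any application can fetch the same stream and independently reconstruct the thread, which is what makes the data portable beyond a single UI.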

The people layer: Rich, pseudonymous reputations

To be maximally effective, governance delegates and participants must be able to build and leverage a reputation. The anonymity provided by obscure hex addresses might be sufficient for one-off financial transactions; governance, however, is a long-term game and requires continuity across repeated social interactions. For example, others might want to know: "Who are the delegates that are consistently making good suggestions, proving their intent, and putting skin in the game?" Says Nielsen:

What we want to highlight with voter profiles is not about real-world identity. It's about mapping stakeholders in the network to their on-chain activity, their previous decisions, and their role. It's contextualizing how they fit into that specific DAO, or ecosystem, or decision.

The fact that many users maintain multiple on-chain accounts and wallets adds complexity to this goal. While this practice may be good for opsec and privacy, it makes it more difficult to construct a unified governance experience for users across all of their accounts. To solve this, Boardroom needed a way to let users link many accounts (and their interactions) together into a unified identity without compromising their desire for privacy.

Ceramic IDX: Aggregated identities for users

This is why Boardroom is moving their identity system and user profiles to IDX - the identity protocol built on top of Ceramic. In addition to storing profiles and other arbitrary information, IDX lets users link multiple on-chain accounts to the same identity, as well as prove ownership of various Web2 accounts such as Twitter, Github, and Discord.

As more projects like Rabbithole, Sourcecred, and Gitcoin begin to also use IDX for identity and user data storage, a user's history of on-chain and off-chain transactions can aggregate into a unified, cross-platform reputation which can be used everywhere across the Web3 ecosystem.

All of this information stored in IDX results in a rich source of user context which doesn't just aid the voting process, but rather adds value to every stage of engagement and governance across the Web3 ecosystem — all while staying pseudonymous.
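The linking mechanism described above can be pictured as a set of self-signed proofs, each binding one account to the same identity identifier, which a resolver then aggregates. The sketch below is a simplification for illustration only; the record and message formats are invented here, not IDX's actual account-link format.

```python
import hashlib
import hmac

def link_proof(did: str, account: str, account_key: bytes) -> dict:
    # The account's own key signs the DID, proving
    # "this account claims to belong to this identity".
    msg = f"{account} -> {did}".encode()
    return {"did": did, "account": account,
            "sig": hmac.new(account_key, msg, hashlib.sha256).hexdigest()}

def resolve_identity(proofs, keys: dict) -> dict:
    # Aggregate every account whose proof checks out under one DID;
    # `keys` maps each account to its verification key.
    identity = {}
    for p in proofs:
        msg = f"{p['account']} -> {p['did']}".encode()
        expected = hmac.new(keys[p["account"]], msg, hashlib.sha256).hexdigest()
        if expected == p["sig"]:
            identity.setdefault(p["did"], []).append(p["account"])
    return identity

keys = {"eth:0xabc": b"k1", "twitter:@alice": b"k2"}
proofs = [link_proof("did:3:alice", acct, k) for acct, k in keys.items()]
assert resolve_identity(proofs, keys) == {
    "did:3:alice": ["eth:0xabc", "twitter:@alice"]}
```

The privacy property follows from the structure: a user publishes only the links they choose to, so different personas can stay partitioned while the linked accounts still aggregate into one pseudonymous reputation.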

Connecting the governance ecosystem

With Ceramic and IDX, Boardroom is building a DAO governance platform that brings together the on-chain, social, and human layers in a way that is lightweight and ready to scale. Said YJ Kim, engineer and co-founder of Boardroom:

We initially thought we'd need to use Discourse forums for ideation. But data is not easy to fetch from the Discourse API, we can’t make a PR to suit our needs since it’s closed source software, and each integration required the forum admins to manually whitelist our IP.

IDX allows Boardroom to build a focused, web3 version such that future applications both within Boardroom and in the broader governance ecosystem don’t have to face those hurdles. Everything is just...there. That not only makes future iterations easier, but also if a user begins using some other application built on IDX, we’ll be able to easily plug that data right into our portal if we wanted to - and vice versa.

What's next for Boardroom?

Boardroom recently announced the addition of projects using Governor Alpha, Governor Bravo, and AAVE-based governance frameworks to their platform, adding to existing integrations such as Snapshot. Any projects using these frameworks can easily onboard to Boardroom's platform and use their governance portal interface to vote, delegate, ideate, track treasury distributions, and more.

For projects using modified versions of these frameworks, Boardroom will work with teams to build adapters as needed. Start using Boardroom via their website, follow them on Twitter, or join the discussion in their discord community.

Website | Twitter | Discord | GitHub | Documentation | Blog | IDX Identity


Velocity Network

Interview with Rosie Rivel, Vice President and Chief Information Officer at Kelly Services

We sat down with Rosie Rivel, Vice President and Chief Information Officer (Interim) at Kelly Services to learn why Kelly Services and Rosie personally have decided to be a part of the Velocity Network Foundation. The post Interview with Rosie Rivel, Vice President and Chief Information Officer at Kelly Services appeared first on Velocity.

Monday, 19. July 2021

GLEIF

Q2 2021 in review: The LEI in Numbers


The Global LEI Foundation (GLEIF) is proud of its ongoing transparency initiatives. Namely its open approach to providing unrestricted access to the latest LEI data from around the world with the Quarterly LEI System Business Reports, which are made publicly available free of charge. Through this ‘LEI in Numbers’ blog series, GLEIF aims to highlight key data from the latest report, explaining trends and profiling successes from the global LEI rollout.

The latest report, covering Q2 2021, shows that 58,000 new LEIs were issued globally during that period. Issuance was slightly lower than in the previous quarter (68,000 issued in Q1), but this dip is quite common in the second quarter of the year, as the world moves into the slower summer months. The overall picture is very encouraging: in Q2 2021, total LEI issuance grew by 3.3% and the total number of active LEIs now stands at 1.82 million.
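As a quick sanity check, the quarter's headline figures are internally consistent; the numbers below are taken from the report summary above.

```python
# Figures from GLEIF's Q2 2021 report summary.
total_active = 1_820_000   # active LEIs at the end of Q2 2021
growth_rate = 0.033        # 3.3% quarterly growth

prior_total = total_active / (1 + growth_rate)
net_increase = total_active - prior_total

# Net increase of roughly 58,000 LEIs, matching the ~58,000 new LEIs
# issued during the quarter (i.e. very few LEIs were retired).
assert 57_000 < net_increase < 59_000
```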

For a further summary of the past quarter’s data, the below infographic contains the key statistics from Q2 2021.

For the first time, Turkey emerges as the largest growth market in Q2, with an impressive growth rate of 22.9%. This is one of the highest growth rates seen this year. This was driven largely by a regional deadline to map LEIs to legacy international securities identification numbers, in line with the joint initiative from the Association of National Numbering Agencies and GLEIF. In addition, Iceland has doubled its quarterly growth rate when compared to the previous quarter (19% in Q2, up from 9.2% in Q1).

While remaining high overall, renewal rates saw a nominal decline across both EU and non-EU jurisdictions over the past quarter (64.3%, down from 66%). In line with fluctuations in issuance and growth, it is normal for renewal rates to vary from quarter to quarter. GLEIF advocates for LEI renewal to be made mandatory wherever LEI issuance is mandatory. Preventing lapsed credentials through proactive annual renewal of LEIs ensures that the Global LEI Index continues to provide the most accurate and up-to-date data possible. This benefits the entire ecosystem and all users of LEI data, who rely on its accuracy to make informed business decisions with confidence.

For the full report which includes further detail on the status of LEI issuance and growth potential, the level of competition between LEI issuing organizations in the Global LEI System and Level 1 and 2 reference data, please visit the Global LEI System Business Reports page.

If you are interested in reviewing the latest daily LEI data, our Global LEI System Statistics Dashboard contains daily statistics on the total and active number of LEIs issued. This feature now enables any user to review historical data by geography, increasing transparency on the overall progress of the LEI.

For further detail, or to access historical data, please visit the Global LEI System Business Report Archive. We look forward to sharing our progress each quarter as we continue to drive LEI adoption in 2021.

Friday, 16. July 2021

Elastos Foundation

Elastos Bi-Weekly Update – 16 July 2021

...

SelfKey Foundation

SelfKey Wallets Latest Versions Released and KEY & KEYFI Airdrop Distribution Completed 🚀


SelfKey Weekly Newsletter

Date – 14 July, 2021

The latest versions of the SelfKey Wallet released, and the completion of the KEY & KEYFI airdrop for Binance Hodlers.

The post SelfKey Wallets Latest Versions Released and KEY & KEYFI Airdrop Distribution Completed 🚀 appeared first on SelfKey.


Own Your Data Weekly Digest

MyData Weekly Digest for July 16th, 2021

Read in this week's digest about: 14 posts, 1 question

Thursday, 15. July 2021

DIF Blog

🚀DIF Monthly #20 (July, 2021)

Table of contents: 1. Group Updates; 2. Member Updates; 3. Funding; 4. DIF Media; 5. Members; 6. Events; 7. Jobs; 8. Metrics; 9. Join DIF 🚀

Foundation News

DIF "Frequently Asked Questions" Knowledgebase

DIF has launched a massive knowledgebase, structured as a long series of frequently-asked questions and answers. This synthesizes a year of educational efforts in the interop WG, blog posts, newsletters, and many other DIF inputs in a format we hope will be useful as a reference and onboarding document throughout the decentralized identity space. Please peruse it, particularly the sections about your personal research focus and/or your company's specialty and products, opening issues or PRs on GitHub wherever you feel a correction or addition is needed. This is intended as a community resource, so PRs are open to your input!

Interop WG election

The Interoperability WG is seeking nominations for new chairs. The group is hoping to pivot to a greater focus on testing, so candidates with experience scripting tests or making specs testable (whether in decentralized identity contexts or otherwise) are particularly welcome. For details about the ballot, visit the mailing list or slack channel.

🛠️ Group Updates

☂️ InterOp WG (cross-community)
* IDUnion, BMWi Schaufenster, and new interop targets w/ Hakan Yildiz.
* Why do you need DIDs for SSI, with guest David Chadwick (Verifiable Credentials Ltd., UK, W3C VC-WG spec editor).
* CCG thread on VCs with non-DID identifiers: a recurring issue in this group. x509 as issuer; "DID" for holder to express a key (did:key-like encoding from IETF RFC). "Protocols for VCs should not presume DIDs."
* Good Health Pass Blueprint by Drummond Reed; Global Covid Certificate Network by Lucy Yang. Good Health Pass Report and Interop Profile out now; draft requested before 17 June. GCCN announcement.
* WACI-PEx update.
* Identiverse (conference) update.

💡 Identifiers & Discovery
* "controller" property in DID documents and verification methods. "controller" in the "verificationMethod" is less clear: a "verificationMethod" can only have a single "controller", but verification methods can in fact be "controlled" by multiple entities. See issue. In practice there is always did_document.id === did_document.verificationMethod[N].controller.
* Updates from Verifiable Conditions:
  * Type property value.
  * Preference for the second option ("Manu's proposal").
* Review of transform-keys DID parameter. Needs to be updated to match the latest DID Core.
* Discussion on DHS request for comments on mDL: Minimum Standards for Driver's Licenses and Identification Cards Acceptable by Federal Agencies for Official Purposes; Mobile Driver's Licenses Proposed Rules.
* Update Universal Resolver. DID Registration.

🛡️ Claims & Credentials
* Work item status: WACI-PEx. Restructure PR (going in). Work continues apace; please open issues if anything is unclear and attend Monday meetings!
* Work item status: PE (Maintenance) + Credential Manifest. PE issue triage/discussion with OIDF on potential updates to PE v1. Conversations about aligning PE and OIDF (under DIF IPR) are ongoing: notes from the first two meetings here.
* Work item status: VC Marketplace. Plugathon and interop profile for participating companies. Blog post (series) explaining the plugathon project forthcoming.

🔓 DID Auth
* The A/B Connect WG is the hub of all this work; DIF members can join, and joint meetings are in the DIF Calendar.
* David Waite (Ping Identity) & Kristina Yasuda (OIDF-DIF liaison) will present an overview of the OIDF-DIF joint work at the Interop WG next Wednesday.
* SIOP V2: progress on SIOP properties.
* OIDC4VP draft spec.

📻 DID Comm
* Discussion topics: Q3 Done Plan: get the DIDComm v2 ecosystem "mature enough" to be adoptable by end of Q3 2021.
* Encoding keys in envelopes by value or by reference/ID (issue 191).
* Sender key/ID encryption questions.
* Alternative transport discussions: Bluetooth & NFC.
* DIDComm v2 library interface comparison (pack/unpack interface comparison):
  * DIDComm-rs - Rust - Jolocom
  * did-jwt/veramo - TypeScript
  * Go - SecureKey
* PRs: 161 (attachments); 172 (fix inconsistencies with to/next attributes in a forward message); 185 (kid and skid headers: keys vs key refs); 195 (anoncrypt warning, updated); 198 (typ/cty language); 200 (threading); 202 (refactor attachments); 206 (advanced sequencing); 205 (APU/APV values).

📦 Secure Data Storage
* 27th EDV dedicated call: discussed Derek's batch API PR.
* Issues reviewed: deletion semantics, vault deletion, JWE/CWE formats.
* Issues discussed: data vault configuration, URL to a data vault vs privacy, max structured document size, etc.
* Issues opened: correlation of server-visible information, default root zcaps for EDVs.
* Identity Hubs: JOSE and DAG-CBOR agreement/interop. Presentations by:
  * 3Box/Ceramic on DAG-JOSE.
  * FISSION on Cryptree.

🔧 KERI
* Py implementation's use case and driving design decisions.
* Design process for TELs (txn logs/microledgers for VCs, smart contracts, or other verifiable data objects).
* Discussion of a new, more detailed proposal: a did:indy:xxx:keri "tunnel" (re-registering Indy DIDs as KERI AIDs and vice versa).

⚙️ Product Managers
* Keith Kowal proposed as a new chair for the group.
* Credential payloads discussion.
* Invitation (out of date) to review the Global Health Pass Blueprint document.

💸 Finance & Banking SIG
* Short updates on SIG processes.
* Chris Kamier (Sustany) proposed as new chair.
* Collaboration with the H&T SIG, paving the way for shared interests.

✈️ Travel & Hospitality
* Four weekly sub-group meetings (selected use cases to be taken to PoC given appropriate interest):
  * Verifiable Credentials & Offers
  * Travel Change & Disruption
  * KYC / Customer Profile / Loyalty
  * Government-Sanctioned Credentials
* Investigating adoption and/or development of H&T schemas to support documented use cases.
* Good Health Pass Collaborative: extending the interoperability blueprint focus beyond international air travel.

🌏 DIF APAC/ASEAN
* DIF is hosting the APAC/ASEAN call to support the wider ecosystem of decentralized identity across the world. During the last meeting, the topics focused on general community and technology updates and discussions.

🌍 DIF Africa
* DIF is hosting the Africa call to support the wider ecosystem of decentralized identity across the world. During the July call, we had an overview of the YOMA Foundation from Lohan Spies, founder of DIDx and CTO of Yoma.

🦄 Member Updates

DIF Associate members are encouraged to share tech-related news in this section. For more info, get in touch with operations.

💰 Funding

NGI Open Calls (EU)

Funding is allocated to projects using short research cycles targeting the most promising ideas. Each of the selected projects pursues its own objectives, while the NGI RIAs provide the program logic and vision, technical support, coaching and mentoring, to ensure that projects contribute towards a significant advancement of research and innovation in the NGI initiative. The focus is on advanced concepts and technologies that link to relevant use cases, and that can have an impact on the market and society overall. Applications and services that innovate without a research component are not covered by this model. Varying amounts of funding.

Learn more here.

🖋️ DIF Media

Bloom Donates WACI to C&C WG
TLDR; WACI isn’t being donated as a snapshot or as a reference implementation of a finished protocol; it’s being donated as a starting point for an ongoing work item in C&C WG that will round out the spec to be more robust and flexible, after the WACI-PEx profile has been finished and there is feedback to consider from implementers of it, and/or of other applications.

Setting Interoperability Targets - Part 2

[DIF focuses on three distinct] “layers”, each of which has its own interoperability challenges that seem most urgent. It is [the Interop WG's] contention that each of these could be worked on in parallel and independently of the other two to help arrive at a more interoperable community-- and we will be trying to book presentation and discussion guests in the coming months to advance all three.

🎈 Events & Promotions

DIF F2F 2021/2
TBD

User Experience in SSI : An IIW Special Topic
July 22, 2021 | Virtual Event

This IIW Special Topic event creates the space for User Experience Professionals, Product Managers, Interface Designers, and those in related roles working on decentralized identity or self-sovereign identity applications and tools to discuss, share and collaborate together.

The Business of SSI: An IIW Special Topic
Aug 04, 2021 | Virtual Event

Thought leaders, researchers, educators, and more will come together for intensive discussion and thought-provoking dialogue on Opening Up the Learning-Earning Ecosystem, and what that means for your leadership, community, and students. Participants will hear from leaders and drivers of change and be able to engage with panels of experts as well as participate in open discussions on the latest developments within skills-based learning and hiring. Conference attendees will also benefit from learning cutting edge tools and hearing from ongoing pilot projects across the United States.

Internet Identity Workshop XXIII
October 12 - 14, 2021 | Virtual Event

You belong at IIW this Fall! You’ll acquire the real-time pulse of genuinely disruptive technologies that are the foundation of today's important Internet movements. Every IIW moves topics, code, and projects downfield. Name an identity topic and it's likely that more substantial discussion and work has been done at IIW than any other conference!

2021 Digital Trust Summit
July 15 - 21, 2021 | Virtual Event

Learner agency and trust are at the center of our universe. The virtual Digital Trust Summit will convene global Digital Trust leaders, changemakers, students, faculty and technologists to learn about and collaborate around the latest DT projects and programs, connect directly with learners and educators, and advance a collective agenda towards cultivating Digital Trust in our education and education-to-workforce systems.

Books

Manning - 37% off on the book "Self Sovereign Identity"!

Manning is an independent publisher of computer books and video courses for software developers, engineers, architects, system administrators, managers and all who are professionally involved with the computer business. Use the code ssidif37 for the exclusive discount on all products for DIF members.

💼 Jobs

Members of the Decentralized Identity Foundation are looking for:

Communications Director / Manager (Remote)

Check out the available positions here.

🔢 Metrics

Newsletter: 4.6k subscribers | 31% opening rate
Twitter: 4,597 followers | 9k impressions | 2.5k profile visits
Website: 18k unique visitors

In the last 30 days.

🆔 Join DIF!

If you would like to get involved with DIF's work, please join us and start contributing.

Can't get enough of DIF?
follow us on Twitter
join us on GitHub
subscribe on YouTube
read us on our blog
or read the archives

Got any feedback regarding the newsletter?
Please let us know - we are eager to improve.


aNewGovernance

OKP4 to join the Board of aNewGovernance AISBL

OKP4 to join the Board of aNewGovernance AISBL:

Because human-centric data infrastructure has to be the new global model across all sectors, moving away from the current platform-centric and state-centric situations, we are delighted to announce that Toulouse-based OKP4 is joining our Brussels-based International Association.

As the Data Strategy and the Data Spaces are being put in place in Europe, the other main priority of the EU27 is the Green Deal, probably with even more impact. We are therefore delighted to welcome OKP4 which is bridging both priorities:

“In today’s data-driven society, the success of any organization relies on knowledge extracted from relevant data. Paradoxically, 70% of data produced by companies is not exploited. Worse, data is almost never shared between stakeholders due to lack of standards, trust and/or common interests.

Because data is the only asset whose value increases when it is shared, OKP4 (Open Knowledge Platform for) proposes an infrastructure where stakeholders can share their data without exchanging it with the others. Our infrastructure indexes heterogeneous data and services (AI) to produce on-demand knowledge without disclosing the data. Another particularity of OKP4 is to fairly reward knowledge contributors: it aligns interests between parties, within a framework of security and mutual consent allowing trustworthy collaboration. We believe the current data economy helped by data marketplaces is just a transition towards a more efficient and fairer knowledge economy helped by actors such as OKP4.

OKP4 focuses primarily on the field of agriculture, but is a toolbox that could and will be applied to any data-based usecase. Today, we help organisations extract value from their data and help them build multiparty data-sharing ecosystems for maximum mutual benefits.”

In recognition of its valuable contribution, OKP4 will be represented on the aNewGovernance Board by its President, Emmanuel Aldeguer.

Brussels, Toulouse, 15 July 2021


Blockchain Commons

Gordian Seed Tool Reveals the Foundations of Cryptography


Blockchain Commons has released Gordian Seed Tool, a new iOS app for the creation, storage, backup, and transformation of cryptographic seeds, now available in the Apple App Store. It is an independent, private, and resilient vault, which can protect the most important underlying secret used by most cryptocurrencies: the cryptographic seed.

Read More

Though most wallets focus on the private keys used to unlock their cryptocurrency transactions, Gordian Seed Tool takes a step back and allows you to manage the fundamentals upon which private keys are built: entropy and seeds. You can use coin flips, die rolls, card draws, or iOS randomness as the entropy to generate your seeds, or you can import them from other applications. Using those seeds, Gordian Seed Tool can then derive unique, non-correlatable public and private keys as you need them.

The focus of Gordian Seed Tool is to protect your seeds. It does so by supporting new approaches to security, such as QR-based airgaps, where you store your secrets on a closely held device that isn't fully networked, such as an offline device in airplane mode, or a strongly protected device like the Secure Enclave in Apple's iPhones and more modern Macintoshes. You then communicate with that device primarily through QR codes or text that can easily be transmitted across that gap of air. Your seeds are thus protected by modern mobile-device security such as data encryption, trusted hardware, and biometric access protection.

That data encryption comes about through Apple's trusted encryption routines. Seed Tool then adds resilience through integration with iCloud. If you choose for your mobile iOS device to have a network connection, its seeds will be stored in iCloud using end-to-end encryption. If you lose your device, you can retrieve those seeds with a replacement device logged into the same Apple account. (If you prefer, you can instead use a non-cellular device, such as an iPod Touch or wifi-only iPad, kept in airplane mode, to entirely ensure that your seeds never leave your device.)

You’ll be able to easily identify your seeds using the Blockchain Commons object identity block, which includes a visual lifehash, a human-readable name, an icon, and an abbreviated digest. Put them together and you should be able to recognize each seed at a glance.

Besides generating seeds, storing them, and backing them up, Gordian Seed Tool can also transform your seeds. It can derive popular Bitcoin and Ethereum public and private keys from your seeds, answer requests for other derivations, encode seeds as bytewords, BIP39 words, or hex, or shard them into SSKR shares and save them to different locations or as social recovery with your family, friends and close colleagues. Your seed is the basis for a whole tree of secure data, and Gordian Seed Tool gives you access to all of it.

Gordian Seed Tool is just one of several Blockchain Commons apps that demonstrate the Gordian principles of independence, privacy, resilience, and openness. The Gordian QR Tool offers a way to store confidential QRs such as 2FA seeds, SSKR shares, and (for that matter) cryptoseeds. Our forthcoming Gordian Cosigner will demonstrate how to easily manage Bitcoin transactions across an Airgap.

If you’d like to learn more about the libraries, specifications, and references being created by Blockchain Commons in coordination with our airgap wallet community, please visit our discussion forums, our YouTube channel and our research repo. You can also support our work as a patron.

Wednesday, 14. July 2021

SelfKey Foundation

KEY & KEYFI Airdrop for KEY Hodlers on Binance Completed


The airdrop distribution of the KEY & KEYFI tokens to eligible KEY token hodlers on Binance is now complete.

The post KEY & KEYFI Airdrop for KEY Hodlers on Binance Completed appeared first on SelfKey.

Tuesday, 13. July 2021

Ceramic Network

How to migrate from 3Box to IDX for profile queries

A guide and code snippets for migrating your application from 3Box to IDX for profile queries.

3Box Labs is sunsetting 3Box products in favor of the new and more powerful combination of Ceramic Network and the Identity Index (IDX) protocol. This post provides a short tutorial on how to update your application to load user profiles using the Self.ID SDK instead of the 3Box profiles API. Making this upgrade ensures your application will always be loading the most current profiles for your users.

Why use Self.ID instead of 3Box?

Migrating your application from using 3Box to Self.ID for fetching user profiles will ensure you are always querying up-to-date profiles for your users, whether a given user has migrated from 3Box to Ceramic/IDX or not.

As Ceramic mainnet goes live, existing 3Box users will have their profiles migrated over to IDX. Updating a user's IDX requires the user to be authenticated and sign the updates, so this will not happen all at once. Instead, the first time a user interacts with a Ceramic-enabled app that uses 3ID Connect for authentication they will be prompted to migrate their account from 3Box to IDX and the data will be transferred in the background for them.

To make sure you are accurately pulling profiles for all users, we recommend upgrading to IDX ASAP. The 3Box API will keep working for the immediate future. However, if you do not upgrade you will only load outdated profiles from the 3Box API for users who have migrated. The Self.ID API solves this by returning an IDX profile for migrated users and falling back to 3Box profiles for users who have not yet migrated.

How to query profiles with Self.ID

This tutorial will show you how to fetch profiles in order to display them in your application's UI. The code below allows your app to display an IDX profile if one exists, and if not, will fall back to displaying a legacy 3Box profile.

First, we need to install the following packages of the Self.ID SDK:

npm install @self.id/core @self.id/3box-legacy

Now let's import our dependencies:

import { Core } from '@self.id/core'
import { getLegacy3BoxProfileAsBasicProfile } from '@self.id/3box-legacy'

Now let's load a profile for a user. Notice below that we first try to load the profile using the core.get method. This attempts to retrieve the user's basicProfile from IDX; however, if one is not present because the user has not yet migrated to Ceramic/IDX, we fall back to the getLegacy3BoxProfileAsBasicProfile function to get the user's profile from the 3Box API.

const core = new Core({ ceramic: 'https://gateway.ceramic.network' })
const ethAddress = '0xabc123...'

let profile = await core.get('basicProfile', ethAddress + '@eip155:1')
if (!profile) {
  profile = await getLegacy3BoxProfileAsBasicProfile(ethAddress)
}

console.log(profile)

You may have noticed that the Ethereum address is concatenated with @eip155:1. This is because the core.get method can be used to lookup data for accounts on any blockchain. eip155:1 is simply the blockchain namespace for Ethereum mainnet. See the CAIP-10 spec for more details.
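If you build that account string in several places, it can be handy to wrap the convention in a small helper. This is an illustrative sketch, not part of the Self.ID API; the function name and the address validation are our own additions:

```javascript
// Hypothetical helper: build the account identifier expected by core.get
// in this tutorial, i.e. <address>@eip155:<chainId> (eip155:1 = Ethereum
// mainnet per CAIP-10 chain namespaces). Not part of the Self.ID SDK.
function toAccountId(ethAddress, chainId = 1) {
  if (!/^0x[0-9a-fA-F]{40}$/.test(ethAddress)) {
    throw new Error('expected a 0x-prefixed 20-byte Ethereum address')
  }
  return `${ethAddress.toLowerCase()}@eip155:${chainId}`
}

console.log(toAccountId('0x8ba1f109551bD432803012645Ac136ddd64DBA72'))
// prints: 0x8ba1f109551bd432803012645ac136ddd64dba72@eip155:1
```

Lower-casing the address is a simplification here; if your app compares addresses for equality, normalizing the case avoids mismatches between checksummed and plain forms.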

Questions or support?

We're always available to answer any questions and assist you through this transition period. Reach out in the Ceramic Discord for help.

Website | Twitter | Discord | GitHub | Documentation | Blog | IDX


ID2020

Brazilian Privacy Advocate Danilo Doneda Joins ID2020 Board of Directors


We are delighted to announce today that Brazilian privacy advocate, Danilo Doneda, has been named as the newest member of the ID2020 board of directors.

Doneda, a practicing attorney and professor, currently serves as director of CEDIS/IDP (Center for Law, Internet, and Society). He is also a member of the National Data Protection and Privacy Council, serving on behalf of Brazil’s House of Representatives; the Board of Directors of the International Association of Privacy Professionals (IAPP); and the advisory boards of the United Nations Global Pulse Privacy Group and the Project Children and Consumption (Instituto Alana).

“Identity and privacy are both fundamental and complementary aspects of citizenship and human rights in an increasingly digitized world,” said Doneda. “Identification frameworks can be as powerful and trustable as possible, as long as they accomplish to provide effective and meaningful privacy features and controls.”

“We are delighted to welcome Danilo to the ID2020 board of directors,” said board chair, Kim Gagné. “ID2020 has always been committed to the idea that individuals must be able to control how their personal data is collected, used, and shared. Privacy — and the trust that derives from it — is essential for digital identity to achieve its potential for empowering and protecting individuals, especially the most vulnerable members of society. Danilo’s extensive experience and commitment to digital privacy and data protection made him an obvious addition to the board and we are grateful for his commitment to our work.”

Prior to his current role, Doneda served as the General Coordinator at the Department of Consumer Protection and Defense at the Brazilian Ministry of Justice. He worked as a professor and visiting researcher at numerous universities in Brazil and Europe and has authored books, papers, and articles on topics related to civil law, privacy, and data protection.

Danilo Doneda holds a Ph.D. in civil law from the State University of Rio de Janeiro and an L.L.B. from the Federal University of Paraná (Brazil).

About ID2020

ID2020 is a global public-private partnership that harnesses the collective power of nonprofits, corporations, and governments to promote the adoption and implementation of user-managed, privacy-protecting and portable digital ID solutions.

By developing and applying rigorous technical standards to certify identity solutions, providing advisory services, implementing programs, and advocating for the ethical implementation of digital ID, ID2020 is strengthening social and economic development globally. ID2020 Alliance partners are committed to a future in which all of the world’s seven billion people can fully exercise their basic human rights while ensuring that data remains private and in the hands of the individual.

www.id2020.org


WomenInIdentity

Women in Identity launch Code of Conduct project, challenging digital identity teams to create more inclusive solutions


Press Release
London, UK,  July 13, 2021

Not-for-profit champions of diversity in the identity sector, Women in Identity, have signed up a number of high profile sponsors including RBC (Royal Bank of Canada), GBG and the Omidyar Network for the Code of Conduct research project that was announced earlier this year.

The Code of Conduct will provide a set of Guiding Principles and an Implementation Framework to organisations that are designing, developing and/or deploying digital identity systems within the financial services sector.

The first phase of work is now underway with a comprehensive review of existing research to provide evidence of non-diverse thinking across all aspects of the identity ecosystem. This will be followed up by interviews with citizens across different economic landscapes, as well as organisations that implement identity systems for financial services use cases.  With a clear problem statement and evidence of the global social and economic impact of exclusion, there will be a clear focus for the detailed framework development that will follow later in the Autumn.

Women in Identity research lead, Dr Louise Maynard-Atem commented: “More than one billion people worldwide lack a basic, verifiable identity — they are not on the ‘identity grid’. And in the UK, it is estimated that nearly 1 in 4 people don’t have traditional identity documentation. Without a recognizable proof of identity, there can be no financial or health inclusion, citizen inclusion or digital inclusion. By focussing on building out inclusive product teams, we believe our Code of Conduct offers a unique opportunity to ensure that identity products of the future can genuinely work for everyone. Working with our research and sponsorship partners, we’re committed to designing a practical and pragmatic approach to becoming a truly inclusive sector.”

Savita Bailur of the project’s research partner, Caribou Digital, stated: “Around the world, people are often excluded from identity products or pathways which impact on their financial inclusion and independence.  We’re excited to be working with Women in Identity on the Code of Conduct. Through our research across mature and emerging markets, and working with multiple stakeholders, we aim to design actionable guiding principles for building more inclusive identification for financial products.”

Mick Hegarty of sponsors, GBG, added: “We are in business to build trust in a digital world, so that everyone can transact online with confidence. That’s why we are delighted to work with Women in Identity and sponsor this Code of Conduct, which will help ensure that digital identity works for everyone.”

Women in Identity will be delivering this research in an iterative fashion and organisations wishing to participate in the research or sponsor further workstreams of the Code of Conduct project should contact Louise Maynard-Atem.

 

About Women in Identity

We are a not-for-profit, volunteer-led network of predominantly (but not exclusively) women working in the identity sector. Through engagement and outreach we support nearly 2000 members worldwide to bring their own value to the world of identity.

It’s our belief that developers of identity systems must always consider how their products will work for people outside the majority group. We champion the inclusion of people of all races, genders, abilities, cultures and ages within the teams responsible for designing, building and testing ID verification systems.

Diversity doesn’t happen by chance. Organisations – and their leaders – have to make a conscious effort to recruit and develop individuals who don’t look like them. Women in Identity offers events and support materials to promote best practice around hiring and developing diverse teams in our sector.

Research lead:   Louise Maynard-Atem | Women in Identity

PR & Communications lead :  Karyn Bright | Women in Identity

Twitter: @WomeninID

Web: www.womeninidentity.org

The post Women in Identity launch Code of Conduct project, challenging digital identity teams to create more inclusive solutions appeared first on Women in Identity.

Monday, 12. July 2021

We Are Open co-op

4 benefits of Open Recognition Pathways

Internal benefits, external benefits, training, and communication

In our last post, we talked about some strategies for implementing badges. In this post, we’ll talk about specific benefits in using badges for digital transformation and related strategic work.

Open Recognition Pathways support strategic goals like staff development and retention, community outreach and engagement, as well as promotional or marketing initiatives.

The More You See by Bryan Mathers is licenced under CC-BY-ND

What are open recognition pathways?

Open Recognition Pathways are constructed from Open Badges. These badges are verifiable, digital artifacts that show a skill or achievement, and are shareable across the web. They appear as a digital image, which once clicked, presents data about the badge, including the criteria for earning it. They can be awarded, issued and earned by anyone and can be connected to form pathways that signpost additional opportunities.
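For a sense of what that badge data can look like under the hood, here is a minimal sketch loosely following the Open Badges 2.0 vocabulary. All names and values below are placeholders; consult the Open Badges specification for the authoritative schema:

```javascript
// Sketch of an Open Badges 2.0-style assertion with an embedded badge
// class. Placeholder values throughout; not a validated, issuable badge.
const badgeClass = {
  type: 'BadgeClass',
  name: 'Community Contributor',
  description: 'Made an accepted open contribution to the project.',
  criteria: { narrative: 'Submit a pull request that is merged.' },
  issuer: { type: 'Issuer', name: 'Example Co-op' },
}

const assertion = {
  '@context': 'https://w3id.org/openbadges/v2',
  type: 'Assertion',
  recipient: { type: 'email', hashed: false, identity: 'earner@example.org' },
  badge: badgeClass, // the badge class may be embedded or referenced by URL
  verification: { type: 'hosted' },
  issuedOn: '2021-07-12T00:00:00Z',
}

console.log(JSON.stringify(assertion, null, 2))
```

The criteria field is what a viewer sees when they click the badge image, and chaining badges into a pathway amounts to criteria in one badge pointing at the next opportunity.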

Where are the credential sized holes by Bryan Mathers is licensed under CC-BY-ND

How could pathways benefit an organisation?

1. Internal benefits

Train up your staff. Badges can package up learning criteria and show a staff member's motivation to learn. They can be used to help validate promotions and show a staff member's commitment to professional development.

Retention. Since badges provide recognition of an individual's skills, they can help employees feel valued and seen, which promotes retention.

Skills search and skills gap identification. Badge portfolios provide an easily searchable directory of organisational skills, making it easy to identify staff with relevant skills for new projects, or skills gaps that should be plugged.

2. External benefits

The Four Stages of Engagement by Bryan Mathers is licensed under CC-BY-ND

Open engagement. Badges make it easy for the earner to demonstrate their skills and values. Contributor badges earned from a particular organisation, for example, can help a contributor show the skills applied during their open contribution. They also highlight the social values and the donation of time they have given to an organisation. These benefits can increase volunteerism and help motivate further engagement with the organisation.

Outreach. Badges can be shared across the web, and recipients are often proud to share their achievement. Issued badges can carry branding and link back to the organisation. These activities can deliver meaningful marketing wins.

Pipeline. Badges can form pathways towards additional badges and opportunities. Low-level badges can be the start of routes towards more meaningful engagement, potentially providing a pipeline of interested and talented people who can contribute to, and further, the work of an organisation.

The Story of a Journey by Bryan Mathers is licensed under CC-BY-ND

Let's get more specific

Let’s explore in more detail how badges impact specific departments and initiatives:

3. Badges for internal training

Provide recognition of learning — people who have taken a training can show that they have done so. They can also use a badge to share what they’ve learned and (humble) brag about their accomplishments amongst their colleagues.

Internal/external marketing — people are generally proud to share their achievements. This can drum up support for your programme internally, and it can help position an organisation as a ‘learning organisation’ that invests in its staff.

4. Amplifying communications

Ask for contributions — we know that people contribute as a way to learn new skills, build reputation and belong to a community with similar values. Contributors who are recognised are likely to share that recognition, and as an added bonus, they’ll link back to the organisation that recognised them.

Provide pathways to further engagement — since badges can be connected to form pathways, you can send your audience on a journey that helps them get to know your community and become further engaged in the work that you do.

How do WAO approach this work?

Disco (very) by Bryan Mathers is licensed under CC-BY-ND

We generally see three phases of work:

1. A discovery phase into how badges could support your strategic goals, leading to a badge strategy. We usually recommend having both internal badges to help staff development and retention, as well as more contributor-focused badges to promote community outreach and engagement.
2. Next we explode that strategy into a project plan for badges. We figure out how to scaffold the development of the badges and their promotion within your organisation.
3. Implementation! Rolling out new badges is pretty easy once they exist. If learning opportunities already exist, we can add badges for those opportunities, and if they don’t, we’ll develop them.

Interested? Get in touch!

Special thanks to WAO members past and present who worked on this summary.

4 benefits of Open Recognition Pathways was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 09. July 2021

Ceramic Network

Community Call: 07-09-2021

Ceramic core devs share progress since launching Mainnet, FungyProof gives a demo, and the community discusses potential integrations with Dfinity.
Join the next community call

We're an inclusive community and all calls are 100% open. Just add the Ceramic calendar and show up for the next meeting.

Want to present your work?

If you would like to share what you're working on with the Ceramic community in a future call, join the Ceramic Discord and leave the moderators a note in the #shareyourwork channel.


Commercio

What is Digital Transformation and why it is essential to have a blockchain like Commercio.network to implement it.

What is digital transformation?

In 2014, the EU issued the eIDAS Regulation, governing trust services for creating electronic identities and for signing and delivering electronic documents with legal value. This has made it possible to dematerialize entire business processes. According to an EU estimate, digital transformation is expected to increase gross domestic product by 2.2 trillion euros over the next 10 years.

 

Why is a blockchain essential?

The blockchain is the technology made famous by Bitcoin and Ethereum. It is a network for exchanging things that have value and must remain original to keep that value. On the internet you can exchange copies of images, but you can’t transfer originals; if you could, we would all send each other pictures of money via WhatsApp instead of exchanging banknotes. The blockchain solves the problem of exchanging value by exchanging originals. That’s why the founders of Commerc.io thought of using it to exchange documents with legal value.

The Digital Transformation blockchain Commercio.network is an open-source project that gives companies the ability to sign and exchange documents using blockchain technology, guaranteeing:

Paternity (authorship): the document was created by the declared sender.
Non-repudiation: the sender cannot deny having signed the document.
Integrity: the document has not been altered in transit.
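As a rough illustration of the integrity guarantee, a document's SHA-256 digest changes if even one byte is altered in transit. Authorship and non-repudiation additionally require a cryptographic signature over this digest; this sketch shows only the hashing step and says nothing about Commercio.network's actual implementation:

```python
import hashlib

def document_fingerprint(data: bytes) -> str:
    """SHA-256 digest anchoring a document's integrity."""
    return hashlib.sha256(data).hexdigest()

original = b"Purchase order #1042: 500 units"
fingerprint = document_fingerprint(original)

# Any alteration in transit changes the fingerprint, so a digest
# recorded on-chain lets the receiver detect tampering.
tampered = b"Purchase order #1042: 900 units"
print(fingerprint == document_fingerprint(original))  # True
print(fingerprint == document_fingerprint(tampered))  # False
```

Recording the digest on a blockchain adds a timestamped, tamper-evident proof that the document existed in exactly that form.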

There are two main groups of business documents: 

Structured: mostly XML (invoices, orders, order confirmations, etc.).
Unstructured: mostly PDF (contracts, forms, letters, etc.).

The blockchain is operated by an international network of 100 independent companies. Any company, especially yours, is welcome to join the consortium and take advantage of this next-generation technology.

 

The Commercio.network is called The Documents blockchain because it focuses on solving one core business problem: legally exchanging and signing business documents. 

Mission 

Our mission is to bring blockchain technology to companies worldwide with a bold go-to-market plan:

 

100 validator-node companies to onboard.
1,000 distributor companies to onboard.
10,000 IT companies to onboard.
100,000 end-customer companies on the blockchain.

 

We want to provide every IT company in the world with a simple, fast, profitable way to create value for themselves and their end customers, supplying:

the vision
the skills
the core technology
the network of validators
the developer tools

We have removed the underlying complexity of implementing a blockchain project by defining a three-layer approach.

APPLICATION LAYER: where IT companies build apps that solve users’ problems.
NETWORKING LAYER: where companies run the blockchain network nodes.
PROTOCOL LAYER: where the 8 Core Functions are implemented on the node.

 

The 8 Core Functions

Our core node software has a group of “smart contracts” focused on the world of documents:

CommercioWallet (Done): enables user accounts and access.
CommercioID (Done): enables users’ self-sovereign identity.
CommercioDocs (Done): enables users to exchange business documents.
CommercioSign (Done): enables users to sign business documents.
CommercioMint (Done): enables users to create blockchain tokens.
CommercioKYC (Next): enables TSPs to issue verifiable credentials.
CommercioDex (Next): enables users to exchange blockchain tokens.
CommercioPay (Next): enables users to receive SEPA payments.

The post What is Digital Transformation and why it is essential to have a blockchain like Commercio.network to implement it appeared first on commercio.network.


Own Your Data Weekly Digest

MyData Weekly Digest for July 9th, 2021

Read in this week's digest about: 14 posts, 2 questions, 1 Tool

Thursday, 08. July 2021

We Are Open co-op

Badges for digital transformation

Trojan mice, paved cow paths, and constellation-creation

As anyone familiar with the work of WAO’s members will be aware, we’re big fans of Open Badges. We’ve been involved with the movement from the beginning and, in this post, want to reflect on our experience of how they can be used for the digital transformation of organisations.

Image CC BY-ND Bryan Mathers

A quick note, as ever, on terminology. We’re going to use the shorthand ‘badges’ to refer to what people call variously Open Badges, digital credentials, digital badges, or microcredentials. The important thing is that there’s a standard behind what’s being issued so that the data hard-coded into the visual image can be recognised by different systems.

Credentials vs recognition

There are broadly two streams in the badges movement. The larger, and more obvious stream is focused on credentialing. In other words, issuing badges is about proving things like who you are, what you know, what you can do, or which group(s) you belong to. These badges tend to be issued by well-known, trusted authorities.

We’re all familiar with credentials. It could be a degree certificate, a driver’s license, or a gym membership card. To date, badges tend to have been used to demonstrate skill acquisition, but the recently-renamed W3C Verifiable Credentials working group seeks to expand that in both human and machine-readable ways.
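A credential in that data model can also be sketched as a small JSON document. The sketch below follows the W3C Verifiable Credentials v1 vocabulary; the DIDs and the degree claim are hypothetical, and the cryptographic proof a real credential carries is omitted:

```python
import json

# Minimal W3C Verifiable Credential (illustrative; no proof section).
# Issuer and subject identifiers are hypothetical DIDs.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:university",
    "issuanceDate": "2021-07-08T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:alice",
        "degree": "BSc Computer Science",
    },
}

print(json.dumps(credential, indent=2))
```

The same structure works for a degree, a driver's license, or a gym membership: the machine-readable claims live in `credentialSubject`, which is what lets different systems verify them.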

The other stream in the badges movement is around recognition. This is perhaps best explained by The Bologna Open Recognition Declaration (BORD) from the Open Recognition Alliance:

Promoting the recognition of learning achievements to support identity construction, citizenship, career development, learning organisations and territories, trust and empowerment.

As we explored in a previous post, these are the kind of badges that might include those that are self-issued and then endorsed by others, or that are issued by peers as part of a Community of Practice.

Both credentialing and recognition are important parts of any organisational badge strategy. Perhaps the easiest way of thinking about them is that the former is a ‘top-down’ approach, while the latter is ‘bottom-up’.

3 badge strategies for digital transformation

We’ve seen time and again badges be introduced to organisations like this:

1. Management creates a badge as a pilot.
2. The pilot is successful because the badge is issued by management.
3. Someone new to badges is tasked with creating a system.
4. They create Gold / Silver / Bronze badges for XYZ skills.
5. The programme fails through lack of engagement or confusion.

It really doesn’t have to be like this! Instead of imposing badges, they should be grown as part of an ecosystem of organisational development and transformation.

Here are three of our favourite ways to embed badges in organisations. Your mileage may vary, but using elements of one or more of these approaches is likely to be more successful than trying to mandate badges from on high.

1. Trojan mice

Illustration by Elizabeth Beier

Many people will know the story of the Trojan horse, a wooden horse used by the Greeks during the Trojan War. They tricked their way into the city of Troy by presenting a ‘gift’ which was actually full of elite members of the Greek army. The ‘Trojan horse’ has come to serve as a metaphor for something that looks like one thing (i.e. harmless, innocuous) but is actually another underneath (i.e. threatens the status quo).

These days, people are savvy enough to see Trojan horses coming a mile off, which is why WAO recommends setting off Trojan mice instead. Instead of talking about implementing a badging system, you could try badging participation in a workshop, or issuing a badge along with a paper certificate in an existing awards ceremony.

There are many advantages of this experimental approach, including:

Avoiding blockers — by seeing what works and what doesn’t, you can focus on what gains traction.
Moving faster — finding something that gains traction and then iterating on it is quicker than trying to design everything upfront.
Improving responsiveness — Trojan mice-sized projects are nimble and can adapt to a changing landscape.

This metaphor is an easy way to introduce Agile development practices to organisations which may be new to this approach. This helpful post outlines five different types of Trojan mice that you might want to try in your organisation, from the obvious to the oblique.

2. Paved cow paths

Image via Gris Anik

Anyone who has been walking in hills and fields where there are cows will be familiar with cow paths. As cows are creatures of habit, when one cow begins to walk across a field, a second one follows, and then the whole herd. After a while, the cows have created a well-worn route from point A to point B. This is often the path of least resistance.

As you can see in the above photos, humans do the same. It’s unrealistic to expect humans to walk in one direction, turn 90 degrees, and then walk in another direction. They just cut the corner. Of course they do. We all do.

Image via Gris Anik

So one solution is not to create any paths at all, but instead see where people actually walk. Then create a path over the top. In other words, paving the cow paths.

If we now apply this to learning design, and in particular, badge design, we can see how this can be beneficial. Rather than planning everything out in advance in a way that looks good on paper or on a presentation slide deck, set a direction and some design constraints, and then co-design badges with early adopters!

3. Constellation-creation

Image CC BY-ND Bryan Mathers

When we escape the light pollution of modern cities and look up at the night sky, we see countless stars. Some of these stars, however, are brighter than others, which led our ancestors to group them into the constellations that we know today.

All of those stars are entities in their own right. And, given the scale of the universe, how they’re grouped together is somewhat arbitrary. It depends on the observer. There’s nothing stopping you or me making up our own constellation tonight by connecting the dots in new and novel ways.

Skill tree from the game ‘Path of Exile’

In the world of gaming, there is an established notion of a ‘skill tree’. The above example is from a game called Path of Exile, and it looks very constellation-like. As you gain ‘skill points’ you can choose to level up your character in different ways. As you can see from the above screenshot, there are potentially hundreds, if not thousands, of different ways of doing this.

Skill tree from the game ‘Shadow of the Tomb Raider’

The above screenshot is from the game Shadow of the Tomb Raider and presents a visually-different way of doing something quite similar.

Next step… Negentropy!

Once you’ve got your badge system up and running, the key thing is to keep feeding it. What does this mean? Many people will know that the term ‘entropy’ describes a system tending towards disorder. Your job is the opposite: the seldom-used term ‘negentropy’ (yes, that is a real word!) describes a system tending towards order.

If WAO can help you with any of these steps, please do get in touch so we can have a chat. We’re quite good at untangling people’s organisational spaghetti!

Drop us a note here: https://weareopen.coop/contact

Badges for digital transformation was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 07. July 2021

Ceramic Network

Building a more diverse and inclusive team

Discussions from 3Box Labs on building a more diverse recruiting pipeline, team, and community.

3Box Labs is about to open up a lot of new roles. As we grow into our next phase, we spent a lot of time last month digging into a critical question: how do we improve the diversity of our team?

Our current team is far from representative of the global population we aim to build for. Our team of 12 is 9 men and 3 women. 9 of us are white. We are geographically spread out - but only in North America and Europe. And our pipeline of incoming applicants is more homogenous than this.

Last month we had a series of full-team conversations about our approach to building a more diverse, inclusive, and equitable team. We had plenty of differing opinions and useful debates, and did not end with 100% agreement on everything. We did end with a deeper shared appreciation for how much this matters, an approach that we believe in, and some steps we are putting into action.

We'd love your support, input, and feedback as we put this into action and so are sharing thinking as well as our plans.

Why diversity is critical to us

Our team first dug into why it is important to us. There are many reasons building a diverse team matters. Bringing focus to those that hold the most meaning to our team helped us make sure we arrived at a plan of action that would resonate. It also let us get extremely specific about how this goal relates to our overall mission and the other goals — and other constraints — that we have as a team. This helped us ensure our plan of action would be practical.

After several iterations, we arrived at this summary of why this topic is so important to us, which now lives on our About Us page.

🤝🏽 Commitment to diversity, equity and inclusion

Building a diverse team is a top priority because nothing will impact our success more deeply. Our ambitions for change are global and our team must represent perspectives from across the world we aim to improve.

The challenges we face are incredibly complex. We must tackle these with ideas and ingenuities from a wide variety of backgrounds and contexts. Our technology impacts how information and identities are controlled online, with deeply personal and widespread effects. We each have blind spots and need many viewpoints to anticipate the consequences of our work. We want to face the many unforeseen changes and challenges we have ahead with a balanced and robust team, drawing on a diversity of strengths to adapt and improve through adversity.

We have work to do here, as a team and as an industry. It's our responsibility to model that only with a diverse team working together can we succeed in building a more sustainable, fair and equitable web.

One thing absent from this is anything about our role in helping close the opportunity gap for underrepresented groups in crypto, tech, and knowledge economy jobs. We debated this internally at length, as we do feel a responsibility to advance equity more generally in the greater ecosystem in which we operate. However, we feared diluting the messaging that building a diverse and inclusive team is central to our core mission. Aiming to impact social justice in tech felt like a big undertaking and adjacent to our core goals. We have a deep conviction that building a diverse team is critical to our success as an organization, so we decided to keep our statement focused on reinforcing that rather than adding additional diversity goals related to social justice more broadly.

We talked with several DEI advisors and heard mixed reactions. One felt it absurd to make a 'business case' for an equitable team; it's simply the right and necessary thing to do. Others felt that grounding our commitment to diversity deeply in our team's mission would help us feel the priority more concretely and urgently. We ended, for now, keeping our commitment tied explicitly to our mission. We do hope that through that commitment we will be able to model equity in a space where lack of representation is too much the norm.

Our approach to building a diverse team

To live up to our commitment, we need a principled approach to hiring that values and emphasizes diversity while still holding to other values, including evaluating every candidate as an individual. We established two particularly important principles:

Invest heavily in proactively building a representative pipeline for every role Out of that pipeline, hire the candidate who adds the most to our team's mission

At the end of our hiring process, we always want to hire the person that will add the most value to our team. This is not purely about their ability to excel in the role; it also includes character that reinforces our values, and capacity to add new dimensions to our team. This step cannot function in isolation, though. If we hire the best candidates from a pipeline of white male candidates, we will always be hiring white males.

We must build a broad and representative pipeline of quality candidates for every role. As noted earlier, this does not happen organically right now. We have to put significant proactive effort towards it, then pair it with a thoughtful evaluation and offer process. This is an investment: it will take more time and mean slower hires for some roles.

At a startup with limited resources and an urgent need to hire, this is hard. The two priorities of hiring fast and hiring deliberately - including to add diversity - are directly at odds. By explicitly tying our commitment to diversity and success in our mission together, and adding more concrete tactics, we hope to turn diversity into a top-of-mind priority for every hire.

Concrete changes we're making Make it very clear that all candidates are welcomed on our team

Adapt our job descriptions, incorporating the knowledge that different candidates react differently to certain language and elements (e.g., men are more likely to apply to jobs where they don't meet all listed requirements). Three changes we've made:

Replace "Requirements" with a section called "You're likely to succeed if," which focuses on capabilities and outcomes and avoids past experiences that would narrow the candidate pool Explicitly encourage candidates from underrepresented groups to apply in every job description Ensure inclusive language, using a tool to detect wording that may appeal more to men than women

Share our commitment publicly. We added a commitment to diversity to our About Us page. We will also begin sharing more from inside our team about this topic and related ones, starting with this post.

Invest in proactive sourcing

We will spend time, money and creativity to bring more candidates from a diverse set of backgrounds into our pipeline. A few actions we are taking towards this:

Tech talks: our team will start doing these far and wide in the tech world to build bridges in new communities (if you'd like to set one up, reach out!)

Be wary of opportunistic hires: these tend to be from similar backgrounds, and they deprive us of an opportunity to build a more intentional pipeline. This is hard because when a great candidate from our network emerges, it's likely a very high signal on a very strong candidate. We haven't ruled this out for certain specialized roles, but we cannot let it become a habit.

Refine our interview process to value difference and remove bias

A few changes we are making on this front:

Early rounds will focus on skills relevant to the role. Later rounds, with more of the team involved, will evaluate value alignment and teamwork so there are more perspectives to balance out potential biases.

Removed sources of bias from our scorecard. For example, we used to score candidates on if they "would bring energy to our team." The goal was to find people who would energize us to work with, but it ended up creating a tilt towards high-energy extroverts which was not the intent.

Add "Would bring a new dimension to the team" to our scorecard, to make sure we consider this for every candidate.

Ensure we operate as an inclusive team

Most of our current efforts are focused on hiring. But equally essential is maintaining an inclusive team culture that is welcoming, inspiring, and comfortable to all. This is critical to retaining candidates and to make sure our outreach to new candidates is authentic. It also lets us all be happier and more productive, and comfortable being ourselves.

Seek help from groups and people doing great work in this arena

We have a lot to learn and a long way to go. There are great teams and individuals that we can learn from, support, and work with. We have a list of those we've already been inspired or educated by. Over time, we'll aim to partner with organizations driving diverse hiring in Web3, amplify the voices of those helping make Web3 a beacon of equity in tech, and make long-term commitments to support the most aligned communities.

We also continue to do more of this work in public. We hope this will allow others to learn from our experience and open opportunities to work together to improve the practices and norms throughout the ecosystem. We also hope it creates an opportunity for the community to provide feedback and advice back to us. We have a long way to go and value any support, critiques, and help.

3Box Labs Site | Ceramic Website | Twitter | Discord Community


Digital ID for Canadians

Spotlight on ValidCert


1. What is the mission and vision of ValidCert?

Our Mission is built on a foundation of Protection, Validation and Empowerment. The ValidCert Vision creates an Eco-System of Trust. Our Eco-System of Trust (1) Protects Digital Identity and Micro-Certificates, (2) Provides Validation of Credentials by Government Approved Institutions, and (3) Empowers Life-Long Learning.

2. Why is trustworthy digital identity critical for existing and emerging markets?

There is a global need for upskilling and reskilling which is creating a focus on shorter courses and micro-credentials. The approach to education and learning is changing and the technology has to focus on protecting and providing a forum for Digital Identity and Credentials to help bridge the skills gap. If individuals, globally, can safely and efficiently claim who they are and share validated micro-credentials in a protected environment, this will have a positive impact on streamlining the hiring and recruiting process which directly adds economic value. Reducing fraud, protecting rights and increasing transparency can increase efficiency for existing and emerging markets.

3. How will digital identity transform the Canadian and global economy? How does your organization address challenges associated with this transformation?

For digital identity to be successful, governments globally need to collaborate on interoperability to ensure individuals are protected when information is shared. Key risks, including cybersecurity threats and data breaches, must be addressed. Digital identity can be directly connected to economic value through increased use of financial services and increased access to employment opportunities, resulting in reduced costs and time savings. According to McKinsey & Company, key focus areas for Canada and governments globally include cost savings, reduced fraud, and improved labour productivity, all of which impact the economy. ValidCert addresses the challenges associated with this transformation by focusing on those areas. Our low-cost platform enhances the processes around assigning micro-credentials, providing a platform for students and job seekers that is supported by the learning institutions and that third parties, such as recruiters, are invited to review. We protect the identity of the Issuer, Assignee and Viewer, who are all major players in our Trust Eco-System. This will assist in reducing fraud and ensure that individuals’ identities and credentials are validated as a “fit” for the job they are seeking, saving time and resources for hiring companies, whether private or public.

4. What role does Canada have to play as a leader in this space?

Canada has a very important role to play as a leader in digital identity and trust services. This includes developing, in collaboration with organizations and other countries, policies and frameworks to ensure acceptance of digital identities. Similar to a currency, Digital Identities need to have cross-border acceptance.

5. Why did your organization join the DIACC?

We joined DIACC because we trust the objectives and vision of DIACC and its members. DIACC is creating a collaborative community to share knowledge and information to create sustainable solutions for Canada and for other countries. As part of DIACC, we are fortunate to be part of the SIG (special interest group) for Digital Credentials, which is validating the ValidCert Mission and Vision. We are not only learning but also sharing our technology, research and experiences in order to move the Digital ID dial forward.

6. What else should we know about your organization?

We are utilizing the best of private and public technology environments so institutions and individuals can decide what information they want to share. We are protecting the accomplishments of all existing and future users of the ValidCert Eco-System!


CU Ledger

CUFX Brings Additional Credit Union Voice to Financial Data Exchange (FDX) to Help Accelerate Digital Future


Denver, CO (JULY 7, 2021) – Open banking is proving to be a game changer for financial services. Open banking is the structured sharing of data using application programming interfaces (APIs) between financial institutions and third-party providers. It helps financial institutions more efficiently partner with fintechs, and it helps consumers more easily and securely leverage their data across solutions.

CUFX, or the Credit Union Financial Exchange, has been the credit union industry’s open integration standard allowing credit unions to efficiently share data between banking systems, reduce redundancy, reduce technical complexity, drive down costs and improve speed to market. Since its inception, the credit union industry has greatly benefited from CUFX by supporting credit unions’ own API gateways or through easy integrations with third-party providers.

CUFX is expanding its support of open banking and interoperability by joining the Financial Data Exchange (FDX) consortium. FDX is a non-profit organization operating in the U.S. and Canada that is dedicated to unifying the financial industry around a common, interoperable and royalty-free standard for the secure access of user permissioned financial data, aptly named the FDX API. By collaborating with FDX, CUFX will enable credit unions and interested organizations to bridge both standards to leverage a broader, more interoperable transaction set and market reach.

“With CUFX as a protocol for the exchange of data, this membership will give credit unions a seat at the table with a best-in-class standards organization and help CUFX further evolve in support of digital identities, open banking and API exchanges,” says John Ainsworth, president/CEO of Bonifii.

“We are thrilled to welcome CUFX to the Financial Data Exchange family,” said Don Cardinal, Managing Director of FDX. “CUFX’s participation in our consortium will help FDX continue to connect credit unions of all sizes so that they can offer their customers API-based data sharing with third-party fintechs via a common industry standard,” added Cardinal.

With the pandemic causing credit unions to accelerate their digital strategies, now is the right time for credit unions to begin leveraging open banking to help them grow and compete as they serve their members.

To learn more about CUFX and how your credit union can participate, visit www.cufxstandards.com.

About Bonifii

Denver-based Bonifii is the financial industry’s first verifiable exchange network designed to enable trusted digital transactions using open standards and best-of-breed security technologies. Bonifii empowers credit unions to change the way they interact with their members by enabling a seamless user experience in every financial transaction through a secure, private, trusted and transparent resolution of the entities’ identity. To learn more about Bonifii, visit www.bonifii.com, email us at sales@memberpass.com or follow the company on the Bonifii blog, LinkedIn or Twitter.

About FDX

Financial Data Exchange, LLC (FDX) is a non-profit organization dedicated to unifying the financial industry around a common, interoperable, royalty-free standard for secure and convenient consumer and business access to their financial data. FDX empowers consumers through its commitment to the development, growth and industry-wide adoption of the FDX API, according to the principles of control, access, transparency, traceability and security. Membership is open to financial institutions, fintech companies, consumer advocacy groups, and other industry participants. FDX is an independent subsidiary of FS-ISAC. For more information and to join, visit www.financialdataexchange.org.

Bonifii Contact:
Julie Esser, SVP Client Engagement
jesser@memberpass.com

608.217.0678

The post CUFX Brings Additional Credit Union Voice to Financial Data Exchange (FDX) to Help Accelerate Digital Future appeared first on Bonifii.


Commercio

MiCA Potential Solution – Will MiCA affect my project ?

MiCA Potential Solution

MiCA COULD provide a PAN EUROPEAN legal Framework

MiCA COULD stop CASP ostracism by banks

MiCA COULD provide a huge PAN EUROPEAN market opportunity, like eIDAS did for Trust Service Providers

MiCA COULD give market integrity and provide consumers and investors with appropriate levels of protection and a clear understanding of their rights

MiCA WILL for sure try to maintain financial stability by keeping STABLE COINS under STRICT regulation.

Will MiCA affect my project ?

MiCA imposes an obligation on issuers of crypto-assets to publish an information document (called a whitepaper) with mandatory disclosure requirements.

Small and medium-sized enterprises (SMEs) will be exempted where the total consideration of the offering of crypto-assets is less than €1,000,000 over a period of 12 months. 

Issuers of ‘stablecoins’ will not be subject to authorisation by a national competent authority (NCA) if the outstanding amount of ’stablecoins’ is below €5,000,000. 
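The two exemption thresholds above can be sketched as simple predicates. This is an illustrative sketch only; the function and constant names are ours, not terms from the Regulation, and real compliance checks would of course involve far more than two numbers:

```python
# Hypothetical helpers illustrating the two MiCA thresholds described above.
# Names and structure are illustrative, not drawn from the Regulation itself.

EUR_OFFER_EXEMPTION = 1_000_000   # SME offering cap over a 12-month period
EUR_STABLECOIN_CAP = 5_000_000    # outstanding 'stablecoin' amount triggering NCA authorisation

def sme_whitepaper_exempt(total_offering_eur: float) -> bool:
    """SMEs are exempt from the whitepaper obligation when the total
    consideration of the offering is below EUR 1,000,000 over 12 months."""
    return total_offering_eur < EUR_OFFER_EXEMPTION

def stablecoin_authorisation_required(outstanding_eur: float) -> bool:
    """NCA authorisation applies once the outstanding amount of
    'stablecoins' reaches EUR 5,000,000."""
    return outstanding_eur >= EUR_STABLECOIN_CAP
```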

 

MiCA is organized in eight titles (chapters):

Title I sets out the subject matter, the scope and the definitions.
Title II regulates the offering and marketing of crypto-assets to the public.
Title III sets requirements for issuers of asset-backed stablecoins (asset-referenced tokens).
Title IV sets requirements and procedures for issuers of fiat-backed stablecoins (e-money tokens).
Title V contains provisions on the authorisation and operating conditions of CASPs.
Title VI sets prohibitions and requirements to prevent crypto-asset market abuse.
Title VII confers powers on the national competent authorities, ESMA and EBA.
Title VIII deals with the delegated acts implementing the Regulation.

 

Title I subject, the scope and the definitions.

 

Article 1 Subject Matter

 

This Regulation establishes uniform requirements for the following:

transparency and disclosure requirements for the issuance and admission to trading of crypto-assets;
the authorisation and supervision of crypto-asset service providers and issuers of asset-referenced tokens and electronic money tokens;
the operation, organisation and governance of issuers of asset-referenced tokens, issuers of electronic money tokens and crypto-asset service providers;
consumer protection rules for the issuance, trading, exchange and custody of crypto-assets;
measures to prevent market abuse to ensure the integrity of crypto-asset markets.

 

Article 2 Scope and exemptions

 

This Regulation applies to entities engaged in the issuance of crypto-assets and in services related to crypto-assets in the Union.

This Regulation shall not apply to crypto-assets that qualify as:
financial instruments;
electronic money;
deposits and structured deposits;
securitisation.

This Regulation shall not apply to the following entities and persons:
the European Central Bank and national central banks acting as monetary public authorities;
insurance undertakings and undertakings carrying out reinsurance and retrocession activities;
a liquidator or an administrator acting in the course of an insolvency procedure;
persons who provide crypto-asset services exclusively for their parent companies, for their subsidiaries or for other subsidiaries of their parent companies;
the European Investment Bank;
the European Financial Stability Facility and the European Stability Mechanism;
public international organisations.

This Regulation shall not apply to the following entities:
credit institutions issuing asset-referenced tokens;
credit institutions providing crypto-asset services;
investment firms where they provide one or several crypto-asset services equivalent to the investment services and activities for which they are authorised;
payment institutions providing crypto-asset services.

 

The post MiCA Potential Solution – Will MiCA affect my project? appeared first on commercio.network.

Tuesday, 06. July 2021

Berkman Klein Center

Meditation and Metaphorical Pancakes: Exploring the philosophy and physics of time with metaLAB

Creative workshop encourages hands-on experience to create questions about time

Olivia Tai and Sarah Newman hosted the workshop.

by Grace McFadden

The philosophy of time can be introduced quite simply, according to Harvard Physicist Jacob Barandes: all it takes is a stack of pancakes.

Barandes was a guest speaker at “The Mysteries of Time in Physics and Philosophy,” the second in the Creative Workshop series that metaLAB is offering this summer designed and led by Sarah Newman, Director of Art & Education at metaLAB at Harvard, and supported by Olivia Tai, a student at Harvard College and research assistant at metaLAB. The workshop brought together sixty attendees from seventeen different countries.

The event began with Barandes discussing presentism and eternalism, two different theories of how time works. This discussion formed the basis of a hands-on creative exercise led by Newman and Tai to explore the “mysteries of time.”

Barandes framed time as follows: “Think of each layer in the stack as a successive moment in time. Roughly speaking, if you’re standing still, you’re at the same place in each successive pancake. You’re at the same place in the bottom pancake, the next pancake, the next pancake.”

Presentism, explained Barandes, can be thought of as the idea that we all experience the same pancake at the same time. This universal present, which Barandes likened to a hot pancake, is the only one in which we exist. This also means that neither the future nor the past is real.

Barandes then explained how Einstein’s theory of special relativity serves as a challenge to presentism, which he also referred to as the A Model of time.

“The special theory of relativity says we do not all experience pancakes the same way,” Barandes said.

As explained by Barandes, Einstein’s theory of special relativity says that because the speed of light is constant regardless of the observer, there are repercussions for how time happens to each person. He said that the familiar, everyday laws of physics dictate that if someone threw a baseball on an airplane, a person outside the airplane would perceive the baseball to be moving at the combined speed of the throw and the airplane.

“But if you shine a light with the flashlight on the airplane, what Einstein said was, you on the airplane will perceive light travelling at the speed of light. Someone on the ground looking up will also see light going at the same constant speed. Light doesn’t combine. This is a very weird thing, it means that somehow time and space have to change,” Barandes said.
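Barandes's contrast between everyday velocity addition and light can be made precise. The relativistic composition law below is not quoted in the talk, but it is the standard result his example points at:

```latex
% Everyday (Galilean) addition of collinear velocities:
w = u + v
% Special relativity replaces this with the composition law:
w = \frac{u + v}{1 + uv/c^{2}}
% Substituting v = c gives
w = \frac{u + c}{1 + u/c} = c,
% so light is observed at speed c regardless of the speed u of its source:
% "light doesn't combine."
```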

Space and time being relative and interconnected serves as a challenge to the idea that the present is experienced universally.

“Maybe the most profound weirdness here is that if you’re moving relative to someone else, then your pancake slices are not horizontal. Your pancake slices are slightly diagonal. So you don’t slice the pancakes horizontally anymore. If you’re moving, you slice the pancakes a little bit diagonally,” Barandes said. “And if different observers in different states of motion slice the stack of pancakes differently, that’s a challenge to the notion that one pancake slice is the ‘hot’ one for everybody. So that’s a challenge to presentism.”

Eternalism, which, Barandes explained, is known as the B Model, holds that the past, present, and future are all equally real.

The talk by Barandes primed the audience to form questions about time. Newman then guided participants through exercises to explore their own questions about time.

Tai guided the audience through a short meditation. The object, explained Tai, was to experience time in a different way than we usually do in our busy lives.

“The experience of time can feel very different when you are turning your intention inwards rather than outwards,” Tai said.

As part of the pre-work to the event, participants were encouraged to spend 90 seconds creating a timeline of their lives, which they had the opportunity to share with others during the workshop. Newman then prompted the group to write down things they “knew”, “believed”, and “didn’t know”. Participants were then asked to do the same exercise, except focusing on their knowledge of time.

From there, participants used these thought experiments to generate two of their own questions about time, which formed the foundations of the final steps of the workshop: exploring these questions visually by drawing them, and then creating physical sculptures to represent their ideas.

In preparation for the event, participants had been asked to spend ten minutes gathering materials ranging from “something that changes in time” to “two things that rhyme with each other.” After choosing one of their questions about time, participants created a sculpture to represent their question using the objects they gathered prior to the event.

A sculpture created by participant Dashiel Carrera.

Sculptures created by participants sought to explore questions such as “Could we have a functioning society without a supposedly objective passage of time, measured by clocks?” and “What would it look like if we perceived space as time and vice versa?” The resulting prototypes from the event were made of materials including lichen, incense, dice, and a bedspring.

Newman’s methodology for her creative workshops emphasizes the importance of asking questions, the value of interdisciplinary perspectives on hard topics, and then using found materials to physically wrestle with these abstract ideas. This allows workshop attendees to free their thinking when it comes to concepts that can be difficult to conceptualize. The event received positive feedback from attendees, one of whom said that “time passed at light-speed” during the workshop.

The final event in the Creative Workshop series, entitled “Telegrams to Telepathic Memos: The Future of Trust and Truth Online,” was held on June 25. The metaLAB team will be displaying the creations from this workshop series here.

Meditation and Metaphorical Pancakes: Exploring the philosophy and physics of time with metaLAB was originally published in Berkman Klein Center Collection on Medium, where people are continuing the conversation by highlighting and responding to this story.


Velocity Network

Interview with Manoj Kutty, CEO of GreenLight Credentials LLC

We sat down with Manoj Kutty, CEO of GreenLight Credentials LLC, to learn why he believes academic institutions should join the Velocity Network.  The post Interview with Manoj Kutty, CEO of GreenLight Credentials LLC appeared first on Velocity.

milch & zucker has joined the Velocity Network Foundation

We are excited to announce that milch & zucker has joined the Velocity Network Foundation as its 35th member. The post milch & zucker has joined the Velocity Network Foundation appeared first on Velocity.

We Are Open co-op

Where does “work” come from?

Updates from our June Co-op Day

As is our custom, we held another monthly Co-op half day last week — a block of time scheduled and set aside to dive into some of the stickier issues and ideas of running a cooperative.

These are things that we can’t get to in our regular catch up meeting, and Co-op days are days when we sit down together, tackle the topics that are more long winded and build solutions for ourselves. It’s also a time for us to hang out, bond and generally team-build.

🦋 We are open to evolving…

some of the folks we’ve worked with

We Are Open is ticking along in the slow summer months with a general can-do attitude and some work from some lovely repeat clients. We are, however, not at capacity for the rest of the year, and so we talked about how to drum up more business. We’re lucky that we have folks knocking on our door quite a bit, but maybe we’ve gotten too comfortable with waiting for work to come in.

Of course we want to continue working with the good clients we have, but we also want new and exciting projects. We’re eager to learn about all the people doing good in the world, and we want to support them in their endeavours. We don’t want to be comfortable, we want to be inspired. So we set ourselves a goal of having 5 completely new clients each year.

To achieve this, we scheduled ourselves an “Hour of Power” each month — we’ll use this hour to actively look for interesting projects and people. We’re also going to, like, try this thing called “marketing”.

🚀 Speaking of marketing…

image from the PDF version of our website

We worked on a PDF ‘information pack’ version of our website and a “WAO in 3 Slides” deck. This might seem like a strange thing to do for digital activists like us, but there are a lot of people who are more likely to spend most of the day in their email inbox. We also put the finishing touches on our new Collaborator Contract to make it easy for us to contract talented people to fill holes in projects, or because we’re eager to work with others.

The latter proves tricky because, while we committed to all things open, we also have to sign Non-Disclosure Agreements (“NDAs”) with certain clients. So there’s a tension between the two which we have to spell out in the contract. On the one hand we want collaborators (a.k.a. sub-contractors) to be open, on the other we want them to abide by NDAs.

Excitingly, we’ve come to the end of our first season of the Tao of WAO podcast. We’ll be back for another season at the end of the summer, so we gave a brief update of how that went in preparation for a full Season 1 Retrospective, which we worked on this week.

🧦 Co-op IRL…

it has been a while since we saw each other’s socks (not featured: Doug’s socks)

Our last order of business for our Co-op half day was to discuss the fact that we haven’t been in the same place in a year and a half. Covid halted travel for everyone, but we know how important in-person bonding time is. So we’re eager to find a time to meet up. WAO members don’t want to fly (see why in the Spirit of Us), so we’ll likely meet in the Netherlands, travelling there by train and ferry. It’s still hard to commit to travel, but we need to start talking about it. We’re thinking the first week in October, all things being equal.

🌞 Summer days

We’re privileged that we can use the summer lull to plan for the fall, and we’re excited that we have some interesting work coming up. If you have projects that you’d like to talk to us about, now is a good time to schedule a 30 minute chat!

Where does “work” come from? was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


SelfKey Foundation

Bridging Fiat and DeFi with the SelfKey Wallet

The latest version of the SelfKey Wallet allows users to purchase KEY tokens using their credit/debit cards. The integration with MoonPay thus allows SelfKey users to easily convert their fiat currencies to crypto, resolving a primary obstacle for an average user seeking to enter the DeFi space.

The post Bridging Fiat and DeFi with the SelfKey Wallet appeared first on SelfKey.

Monday, 05. July 2021

DIF Medium

Setting Interoperability Targets Part 1 of 2

Conformance Testing for Measurable Goals

[Written in consultation with the Interoperability Working Group chairs]

A recurring topic in the Interoperability Working Group is that of defining short-, medium- and long-term goals or “targets” for interoperability. The topic comes up again every few months, and each time it does, the chairs try to tackle it head-on for a full meeting or two, some emails, and various off-line discussions, but doing so never seems to arrive at a satisfactory or definitive answer. Each time, what feels like a Herculean outlay of effort only gets us a tentative resolution, as if we’d deflected a debt collector with a minimum payment. Dear reader, we would like to refinance, or at least restructure, this debt to our community!

In this two-part overview of our goals for 2021, we would like to survey the landscape of “provable,” testable interoperability targets and give some guidance on what we would like to see happen next in the testable interop world. Then, in a companion article, we will lay out a clear proposal for parallel, distributed work on multiple fronts, such that progress can be distributed across various subcommunities and hubs of open cooperation, each reasonably confident that they are helping the big picture by “zooming in” on one set of problems.

Photo by Christopher Paul High

A seemingly uncontroversial target state

“Interoperation” has a deceptively transparent etymology: any two things that perform a function are inter-operating if they can perform their operation together and mutually. Two fax machines interoperate if one sends a fax and the other receives it, which is a far more impressive feat if the two machines in question were designed and built by different companies on different continents, fifteen years apart. This is the kind of target state people usually have in mind when they imagine interoperability for an emerging technology.

This everyday example glosses over a lot of complexity and history, however. Standards bodies had already specified exact definitions of how the “handshake” is established between two unfamiliar fax machines before either model of fax machine was a twinkle in a design team’s eye. In addition to all the plodding, iterative standardization, there has also been a lot of economic trial and error and dead-ends we never heard about. Even acceptable margins of error and variance from the standards have been prototyped, refined, socialized, and normalized, such that generations of fax machines have taken them to heart, while entire factories have been set up to produce standard components like white-label fax chips and fax boards that can be used by many competing brands. On a software level, both design teams probably recycled some of the same libraries and low-level code, whether open-source or licensed.

Decomposing this example into its requirements at various levels shows how many interdependent and complex subsystems have to achieve internal maturity and sustainable relationships. A time-honored strategy is to treat each of these subsystems in a distinct, parallel maturation process and combine them gradually over time. The goal, after all, is not a working whole, but a working ecosystem. Architectural alternatives, for example, have to be debated carefully, particularly for disruptive and emerging technologies that redistribute power within business processes or ecosystems. Sharing (or better yet, responsibly open-sourcing and governing) common software libraries and low-level hardware specifications is often a “stitch in time” that saves nine later stitches, since it gives coordinators a stable baseline to work from, sooner.

Naturally, from even fifty or a hundred organizations committed to this target state, a thousand different (mostly sensible) strategies can arise. “How to proceed?” becomes a far-from-trivial question, even when all parties share many common goals and incentives. For instance, almost everyone:

Wants to grow the pie of adoption and market interest
Holds privacy and decentralization as paramount commitments
Strives to avoid repeating the mistakes and assumptions of previous generations of internet technology

And yet, all the same… both strategic and tactical differences arise, threatening to entrench themselves into camps or even schools of thought. How to achieve the same functional consensus on strategy as we have on principles? Our thesis here is that our short-term roadmaps need testable, provable alignment goals that we can all agree on for our little communities and networks of technological thinking to converge gradually. Simply put, we need a few checkpoints and short-term goals, towards which we can all work together.

Today’s test suites and interoperability profiles

Perhaps the biggest differences turn out not to be about target state or principles, but about what exactly “conformance” means relative to what has already been specified and standardized. Namely, the core specifications for DIDs and VCs are both data models written in the Worldwide Web Consortium (W3C), which put protocols out of scope. This stems partly from the decisions of the groups convened in the W3C, and partly from a classic division of labor in the internet standards world between W3C, which traditionally governs the data models of web browser and servers, and a distinct group, the Internet Engineering Task Force (IETF).

VCs were specified first, with a preference (but not a requirement) for DIDs, with exact parameters for DIDs and DID systems deferred to a separate data model. Then, to accommodate entrenched and seemingly irreconcilable ideas about how DIDs could best be expressed, the DID data model was made less representationally-explicit and turned into an abstract data model. This shift to a representation-agnostic definition of DIDs, combined with the still-tentative and somewhat representation-specific communication and signing protocols defined to date, makes truly agnostic and cross-community data model conformance somewhat difficult to test. This holds back interoperability (and objective claims to conformance)!
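To make the abstract-versus-concrete distinction tangible, here is a minimal DID document in its JSON representation, which is one concrete serialization of the abstract data model. The shape follows the W3C DID specification, but `did:example` is a placeholder method and the key material below is purely illustrative:

```python
import json

# A minimal DID document in its JSON representation. The DID-core data model
# is abstract; this is one concrete serialization of it. The "did:example"
# method and the key below are placeholders, not a real identifier.
did_document = {
    "@context": ["https://www.w3.org/ns/did/v1"],
    "id": "did:example:123456",
    "verificationMethod": [{
        "id": "did:example:123456#key-1",
        "type": "Ed25519VerificationKey2018",   # one of several key types in use
        "controller": "did:example:123456",
        "publicKeyBase58": "H3C2AVvLMv6gmMNam3uVAjZpfkcJCwDwnZn6z3wXmqPV",
    }],
    # Verification relationships reference keys declared above.
    "authentication": ["did:example:123456#key-1"],
}

serialized = json.dumps(did_document)
```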

W3C: Testing the core specifications

The only test suites associated with the core W3C specifications are the VC-HTTP-API test suite for VC data model conformance and the fledgling DID-core test suite, both worked on in the W3C-CCG. The former tests implementations of VC-handling systems against some pre-established sample data and test scripts through a deliberately generalized API interface that strives to be minimally opinionated with respect to context- and implementation-specific questions like API authentication. The latter is still taking shape, given that the DID spec has been very unstable in the home stretch of its editorial process, arriving at CR this last week.

The VC-HTTP-API interface, specified collectively by the first SVIP funding cycle cohort for use in real-world VC systems, has been used in some contexts as a de facto general-purpose architecture profile, even though it is very open-ended on many architectural details traditionally specified in government or compliance profiles. Its authors and editors did not intend it to be the general-purpose profile for the VC data model, but in the absence of comparable alternatives it is sometimes taken as one, assuming a more definitive role than originally intended.

Following the second iteration of the SVIP program and the expansion of the cohort, the API and its test suite are poised to accrue features and coverage to make it more useful outside of its original context. Core participants have established a weekly public call at the CCG and a lively re-scoping/documentation process is currently underway to match the documentation and rationale documents to the diversity of contexts deploying the API.
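As a rough illustration of the kind of minimally-opinionated interface the suite exercises, an issuer endpoint accepts an unsigned credential plus issuance options and returns the signed result. The endpoint path and payload shape below follow public drafts of the VC-HTTP-API, but treat the exact field names as an assumption; authentication is deliberately out of scope, as noted above:

```python
# Sketch of a request body for the VC-HTTP-API issuance endpoint
# (POST /credentials/issue). Field names follow public drafts of the API;
# exact shapes may differ between deployments, and authentication is
# intentionally out of scope here.

def build_issue_request(issuer_did: str, subject_id: str) -> dict:
    """Assemble an unsigned VC and issuance options for POST /credentials/issue."""
    return {
        "credential": {
            "@context": ["https://www.w3.org/2018/credentials/v1"],
            "type": ["VerifiableCredential"],
            "issuer": issuer_did,
            "issuanceDate": "2021-07-05T00:00:00Z",
            "credentialSubject": {"id": subject_id},
        },
        # Options direct the issuer's proof generation; values here are examples.
        "options": {"proofPurpose": "assertionMethod"},
    }

request_body = build_issue_request("did:example:issuer", "did:example:subject")
```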

Aries: End-to-end conformance testing

Other profiles, like the Aries interoperability profile, serve an important role, but it would be misleading to call it a VC data model test suite — it is more like an end-to-end test harness for showing successful implementation of the Aries protocols and architecture. Here “interoperability” means interoperability with other Aries systems, and conformance with the shared Aries interpretation of the standard VC data model and the protocols this community has defined on the basis of that interpretation.

Many matters specified by the W3C data model are abstracted out or addressed by shared libraries in Ursa, so its scope is not exactly coterminous with the W3C data model. Instead, the Aries interoperability profile has its own infrastructural focus, which focuses on scaling the privacy guarantees of blockchain-based ZKP systems. In many ways, this focus complements rather than supplants that of the W3C test suites.

Many of the trickiest questions on which SSI systems differ are rooted in what the Aries and Trust-over-IP communities conceptualize as “layer 2,” the connective infrastructural building-blocks connecting end-users to VCs and DIDs. As more and more features get added to be testable across language implementations, and as feature-parity is achieved with other systems (such as support for LD VCs), the case for productive complementarity and deep interoperability gets easier and easier to make.

The first wave of local profiles and guidelines

Other specifications for decentralized-identity APIs and wallets, modelled on that W3C CCG and/or extending the work of Aries-based infrastructures, are starting to crop up around the world. So far these have all arisen out of ambitious government-funded programs to build infrastructure, often with an eye to local governance or healthy competition. Canada and the European Commission are the most high-profile ones to date, building on earlier work in Spain, the UK, and elsewhere; Germany and other countries funding next-generation trust frameworks may soon follow suit.

It is important, however, to avoid framing these tentative, sometimes cautious attempts at bridging status quo and new models as universal standards. If anything, these frameworks tend to come with major caveats and maturity disclaimers, on top of having carefully narrowed scopes. After all, they are generally the work of experienced regulators, inheriting decades of work exploring identity and infrastructural technologies through a patchwork of requirements and local profiles that tend to align over time. If they are designed with enough circumspection and dialogue, conformance with one should never make conformance with another impossible. (The authors would here like to extend heartfelt sympathy to all DIF members currently trying to conform to multiple of these at once!)

These profiles test concrete and specific interpretations of a shared data model that provide a testing benchmark for regulatory “green lighting” of specific implementations and perhaps even whole frameworks. Particularly when they specify best practices or requirements for security and API design, they create testable standardization by making explicit their opinions and assumptions about:

approved cryptography, auditing capabilities, privacy requirements, and API access/authentication

These will probably always differ and make a universal abstraction impossible; and that’s not a bad thing! These requirements are always going to be specific to each regulatory context, and without them, innovation (and large-scale investment) are endangered by regulatory uncertainty. Navigating these multiple profiles is going to be a challenge in the coming years, as more of them come online and their differences come into relief as a stumbling block for widely-interoperable protocols with potentially global reach.

The Interoperability working group will be tracking them and providing guidance and documentation where possible. Importantly, though, there is a new DIF Working Group coming soon, the Wallet Security WG, which will dive deeper into these profiles and requirements, benefiting from a narrow scope and IPR protection, allowing them to speak more bluntly about the above-mentioned details.

Setting Interoperability Targets Part 1 of 2 was originally published in Decentralized Identity Foundation on Medium, where people are continuing the conversation by highlighting and responding to this story.


DIF Blog

Setting Interoperability Targets Part 2 of 2

Having shown in our last piece how interoperability "profiles" are designed, we now tackle some key technical problem areas ripe for this kind of profile-first interoperability work across stacks.

Medium-term interoperability challenges

[Written in consultation with the Interoperability Working Group chairs]

In our last essay, we explored the means and ends of interoperability targets and roadmapping across stacks and markets. “Interoperability,” like “standardization,” can be a general-purpose tool or an umbrella of concepts, but rarely works from the top down. Instead, specific use-cases, consortia, contexts, and industries have to take the lead and prototype something more humble and specific, like an “interoperability profile”-- over time, these propagate, get extended, get generalized, and congeal into a more universal and stable standard. Now, we’ll move on to some technical problem areas ripe for this kind of profile-first interoperability work across stacks.

What makes sense to start aligning now? What are some sensible scopes for interoperating today, or yesterday, to get our fledgling market to maturity and stability as soon as safely possible?
Ponte Vecchio, Firenze, by Ray Harrington

From testable goals to discrete scopes

The last few months have seen a shift in terminology and approach, as many groups turn their attention from broad “interoperability testing” to more focused “profiles” that test one subset of the optionalities and capabilities in a larger test suite or protocol definition. This decoupling of test suites from the multiple profiles each suite can test keeps any one profile from ossifying into a “universal” definition of decentralized identity’s core featureset.

As with any other emerging software field, every use case and context has its own interoperability priorities and constraints that narrow the technological solution space down to a manageable set of tradeoffs and decisions. For instance, end-user identification at various levels of assurance is often the most important implementation detail for, say, a retail bank, while DID-interoperability (which is not always the same thing!) might be a hardware manufacturer’s primary concern in securing hardware supply chains.

Every industry has its unique set of minimum security guarantees, and VC-interoperability is obviously front-of-mind for credentialing use-cases. For example, in the medical data space, “semantics” (data interpretation and metadata) might be a harder problem (or a more political one) than the mechanics of identity assurance, since high standards of end-user privacy and identity assurance have already made for a relatively interoperable starting point. Which exact subset of the many possible interoperability roadmaps is safest or most mission-critical for a given organization depends on many factors: the regulatory context, the culture of the relevant sectors, its incentive-structures, and its business models.

Cross-cutting industry verticals, however, are structural issues with how decentralized identity stacks and architectures vary, which can already be seen today. By applying “first principles” (or, in our case, the “functions” of a decentralized identity system and its many moving parts) across use-cases and industrial contexts, certain shared problems arise. As is our default approach in DIF Interop WG, we applied DIF’s in-house mental model of the “5 layers” of decentralized identity systems, sometimes called “the 4+1 layers”. (See also the more detailed version).

We call these the “4+1” layers because our group agreed that a strict layering was not possible, and that all the architectures we compared for using verifiable credentials and decentralized identifiers had to make major architectural decisions with consequences across all four of the more properly “layered” categories. This fifth category we named “transversal considerations,” since they traverse the layers and often come from architectural constraints imposed by regulation, industrial context, etc. Foremost among these are storage and authorization, the two most vexing and cross-cutting problems in software generally; these would justify an entirely separate article.

In a sense, none of the topics from this transversal category are good candidates for specification in the medium-term across verticals and communities-- they are simply too big as problems, rarely specific to identity issues, and being addressed elsewhere. These include “storage” (subject of our newest working group), “authorization” (debatably the core problem of all computer science!), “cryptographic primitives”, and “compliance.” (These last two are each the subject of a new working group, Applied Cryptography and Wallet Security!). These interoperability scopes are quite difficult to tackle quickly, or without a strong standards background. Indeed, these kinds of foundational changes require incremental advances and broad cooperation with large community organizations. This is slow, foundational work that needs to connect parallel work across data governance, authentication/authorization, and storage in software more generally.

Similarly, the fourth layer, where ecosystem-, platform-, and industry-specific considerations constrain application design and business models, is unlikely to crystallize into a problem space calling out for a single specification or prototype in the medium-term future. Here, markets are splintered and it is unclear what can be repurposed or recycled outside of its original context. Even if there were cases where specification at this layer would be timely, DIF members might well choose to discuss those kinds of governance issues at our sister-organizations in the space that more centrally address data governance at industry-, ecosystem-, or national- scale: Trust over IP was chartered to design large-scale governance processes, and older organizations like MyData.org and the Kantara Initiative also have working groups and publications on vertical-specific and jurisdiction-specific best practices for picking interoperable protocols and data formats.

That still leaves three “layers”, each of which has its own urgent interoperability challenges. It is our contention that each of these could be worked on in parallel and independently of the other two to help arrive at a more interoperable community-- and we will be trying to book presentation and discussion guests in the coming months to advance all three.

Scope #1: Verifiable Credential Exchange

The clearest, most consensus-building, even most urgent way forward is to bring clarity to Verifiable Credential exchange across stacks. This has been the primary focus of our WG for the last year. Given that most of the early ecosystem-scale interest in SSI revolves around credentialing (educational credentials, employment credentials, health records), it is highly strategic to get translation and unified protocols into place soon for the interoperable verification and cross-issuance of credentials.

In fact, there has been a lot of good progress since Daniel Hardman wrote an essay on the Evernym blog making a pragmatic case for sidestepping differences in architecture and approach to exchange VCs sooner. This aligned with much of our group’s work in recent months, which has included a survey of VC formats among organizations producing “wallets” for verifiable credentials (be they “edge” wallets or otherwise). Our group has also sought to assist educational efforts at the CCG, in the Claims and Credentials working group, and elsewhere to make wallet-producing organizations aware of the relevant specifications and other references needed to make their wallets multi-format sooner and less painfully. Much of this work was crystallized into an article by co-chair Kaliya Young and crowd-edited by the whole group; this work was a major guiding structure for the Good Health Pass effort that sought to make a common health record format (exported from FHIR systems) equally holdable and presentable across all of today’s VC systems.

One outgrowth of this effort and other alignments that have taken place since Hardman’s and Young’s articles is the work of prototyping a multi-community exchange protocol that allows a subset of each stack’s capabilities and modes to interoperate. This tentative, “minimum viable profile” is called WACI-PEx and is currently a work item of the Claims and Credentials working group. Work is ongoing on v0.1, and an ambitious, more fully-featured v1 is planned after that. This profile acts as an “extension” of the broader Presentation Exchange protocol, giving a handy “cheat sheet” for cross-stack wallet-to-issuer/verifier handshakes so that developers not familiar with all the stacks and protocols being spanned have a starting point for VC exchanges. Crucially, the results of this collaborative prototype will be taken as inputs to future versions of the DIDComm protocols and the WACI specification for Presentation Exchange.
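As a rough illustration of what such a handshake negotiates, the sketch below builds a Presentation Exchange “presentation_definition”, the request object a verifier sends to a wallet. The top-level field names follow the DIF Presentation Exchange data model, but the credential type, the JSONPaths, and the helper function are hypothetical examples for illustration only, not part of the WACI-PEx profile itself.

```python
import json

# Hypothetical example: a verifier asks for a vaccination credential.
# Structure (id, input_descriptors, constraints.fields) follows the DIF
# Presentation Exchange data model; the paths and types are invented.
PRESENTATION_DEFINITION = json.loads("""
{
  "id": "vaccination-check",
  "input_descriptors": [
    {
      "id": "vaccination_credential",
      "name": "COVID-19 Vaccination Credential",
      "constraints": {
        "fields": [
          {
            "path": ["$.type"],
            "filter": {"type": "array", "contains": {"const": "VaccinationCertificate"}}
          },
          {"path": ["$.credentialSubject.recipient.name"]}
        ]
      }
    }
  ]
}
""")

def required_paths(definition: dict) -> list:
    """Collect every JSONPath a conforming wallet must try to satisfy."""
    paths = []
    for descriptor in definition["input_descriptors"]:
        for field in descriptor["constraints"]["fields"]:
            paths.extend(field["path"])
    return paths
```

The point of the profile is precisely that a wallet built on any of the stacks can parse this one shared shape, whatever its native credential format.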

Note: There has been some discussion of an OIDC-Presentation Exchange profile at some point in the future, but given that the alignment of DIDComm and Presentation Exchange started over a year ago, the most likely outcome is that work on this would not start until v1 of the “WACI-PEx” profile for DIDComm has been released.

Scope #2: Anchoring layer

Of course, other forms of alignment are possible as well in the “bottom 3” layers of the traditional stack, while we wait on the ambitious transversal alignment specifications and the ongoing work to align and simplify cross-format support for VCs (and perhaps even multi-format VCs, as Hardman points out in the essay above).

The Identifiers and Discovery WG at DIF has long housed many work items to align on the lowest level of the stack, including general-purpose common libraries and recovery mechanisms. It has also received many donations that contribute to method-level alignment, including a recent DID-Key implementation and a linked-data document loader donated by Transmute. The group has also served as a friendly gathering point for discussing calls for input from the DID-core working group at W3C and for proposing W3C-CCG work items.

One particularly noteworthy initiative of the group has been the Universal Resolver project, which offers a kind of “trusted middleware” approach to DID resolution across methods. This prototype of a general-use server allows any SSI system (or non-SSI system) to submit a DID and get back a trustworthy DID Document, without needing any knowledge of or access to (much less current knowledge of or authenticated access to) the “black box” of the participating DID methods. While this project only extends “passive interoperability” to DIDs, i.e., only allowing DID document querying, a more ambitious sister project, the Universal Registrar, strives to bring a core set of CRUD capabilities to DID documents for DID methods willing to contribute drivers. Both projects have dedicated weekly calls on the DIF calendar, for people looking to submit drivers or get otherwise involved.
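For a sense of how little a caller needs to know, a minimal resolver client can be sketched in a few lines. The `/1.0/identifiers/{did}` path follows the Universal Resolver project’s documented HTTP interface; the public development endpoint used as the base URL here is an assumption for illustration (production systems would typically self-host their own instance).

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

# Base URL of a Universal Resolver instance. This points at the public
# development deployment and is an assumption -- substitute your own.
RESOLVER_BASE = "https://dev.uniresolver.io/1.0/identifiers/"

def resolution_url(did: str) -> str:
    """Build the method-agnostic resolution URL for any DID."""
    return RESOLVER_BASE + quote(did, safe=":")

def resolve(did: str) -> dict:
    """Fetch the resolution result; its 'didDocument' key holds the DID Document."""
    with urlopen(resolution_url(did)) as resp:
        return json.load(resp)
```

Note that the caller never needs a driver for `did:key`, `did:web`, or any other method -- the server’s pluggable drivers do that work, which is exactly the “trusted middleware” trade-off described above.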

Scope #3: “Agent”/Infrastructure Layer

There is another layer, however, in between DIDs and VCs, about which we haven’t spoken yet: the crucial “agent layer” in the ToIP/Aries mental model, which encompasses trusted infrastructure whether inside or outside of conventional clouds. The Aries Project has scaled up an impressively mature ecosystem of companies and experimenters, largely thanks to the robust infrastructure layer it built (and abstracted away from application-layer developers and experimenters).

Until now, differences of strategy with respect to conventional clouds and infrastructures have prevented large-scale cooperation and standardization at this layer outside of Aries and the DID-Comm project. Partly, this has been a natural outgrowth of the drastically different infrastructural needs and assumptions of non-human, enterprise, and consumer-facing/individual use cases, which differ more at this level than above or below. Partly, this is a function of the economics of our sector’s short history, largely influenced by the infrastructure strategies of cloud providers and telecommunication concerns.

This is starting to change, however, now that agent frameworks inspired by the precedent set by the Aries frameworks have come into maturity. Mattr’s VII Platform, the Affinidi framework SDK (including open-source components by DIF members Bloom, Transmute, and Jolocom), ConsenSys’ own modular, highly extensible and interoperable Veramo platform, and most recently Spruce ID’s very DID-method-agnostic DIDKit/Credible SDK all offer open-source, extensible, and scalable infrastructure layers that are driving the space towards greater modularity and alignment at this layer.

As these platforms and “end-to-end stacks” evolve into frameworks extensible and capacious enough to house ecosystems, DIF expects alignment and harmonization to develop. This could mean standardization of specific components at this layer, for example:

• The DIDComm protocol could expand into new envelopes and transports
• Control recovery mechanisms could be specified across implementations or even standardized on a technical and/or UX level
• Auditing or historical-query requirements could be specified to form a cross-framework protocol or primitive
• Common usages of foreign function interfaces, remote procedure calls like gRPC and JSON-RPC, or other forms of “glue” allowing elements to be reused or mixed and matched across languages and architectures could be specified as a community
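To make the last item concrete, here is a minimal sketch of the JSON-RPC 2.0 “glue” such a community effort might standardize between agent components. The envelope shape (`jsonrpc`, `id`, `method`, `params`) comes from the JSON-RPC 2.0 specification; the method name and parameters are hypothetical and not drawn from any published agent framework.

```python
import json
from itertools import count

# Monotonic request ids, as JSON-RPC correlates responses by id.
_ids = count(1)

def jsonrpc_request(method: str, params: dict) -> str:
    """Serialize a JSON-RPC 2.0 request envelope as a wire-ready string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": method,
        "params": params,
    })

# Hypothetical framework-neutral wallet call; the method name is invented.
request = jsonrpc_request("wallet.issueCredential", {"subjectDid": "did:example:123"})
```

Agreeing on even this much -- envelope, error shape, and a shared method namespace -- would let components from different frameworks be mixed and matched without bespoke adapters.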

We are very much in early days, but some see on the horizon a day when frameworks that don’t cooperate with one another can’t compete with the ones that join forces. After all, adoption brings growing pains, particularly for the labor market-- aligning on architectures and frameworks makes onboarding developers and transferring experience that much easier to do!

Next Steps

I would encourage anyone who has read this far to pick at least one of the three scopes mentioned above and ask themselves how they are helping along this alignment process in their day-to-day work, and if they truly understand what major players at that level are doing. Often large for-profit companies pay the most attention to what their competitors are doing, but here it is important to think outside of competition and look instead at non-profit organizations, regulators, and coalitions of various kinds to really see where the puck is heading. Sometimes consensus on one level is blocking compromise somewhere else.  It can be pretty hard to follow!

In recent articles, DIF has encouraged its members to think about an open-source strategy as comparably important to a business plan, a living document and guiding philosophy. I would like to suggest that the subset of DIF companies working with VCs and DIDs should also think of interoperability strategy and conformance testing as the most crucial pillar of that strategy-- if you cannot demonstrate interoperability, you might be asking people to take that strategy on faith!

Friday, 02. July 2021

aNewGovernance

AfroLeadership NGO to join the Board of aNewGovernance AISBL

Because human-centric data infrastructure has to be THE new global model, moving away from today’s platform-centric and state-centric situations, we are delighted to announce that AfroLeadership NGO is joining our Brussels-based International Association.

As the Data Strategy and the Data Spaces are being put in place in Europe, and as the new US Administration is questioning the operating practices of global platforms, it is critical that our approach to Personal Data Sharing is global. This is why we are thankful and honored that AfroLeadership NGO, which has chapters in 12 countries, is joining us in our work.

In recognition of its valuable contribution, AfroLeadership NGO will be represented on the aNewGovernance Board by its President, Charlie Martial NGOUNOU.

Brussels, Yaounde, 2 July 2021


Elastos Foundation

Elastos Bi-Weekly Update – 02 July 2021

...

Own Your Data Weekly Digest

MyData Weekly Digest for July 2nd, 2021

Read in this week's digest about: 18 posts, 1 question

Thursday, 01. July 2021

Energy Web

Announcing the Berlin Upgrade on the Energy Web Chain

Florian Wehde | Unsplash

TL/DR:

• The Berlin upgrade is coming to the Energy Web Chain (EWC) on July 6th. This update will improve EWC performance and security, and keep the EWC aligned with the latest developments in the Ethereum ecosystem.
• If you are running an EWC node, you must update it to a Berlin-compatible client and update the chainspec as soon as possible. EWC validators have already completed all the necessary updates.
• If you are an EWC user or EWT holder, no action is required.

In April 2021 the Ethereum mainnet implemented the Berlin upgrade, which introduced four improvement proposals (i.e., EIP-2565, EIP-2718, EIP-2929, and EIP-2930) that modified gas prices and created new transaction types. Combined, these updates improve the overall performance and security of EVM blockchains (including the Energy Web Chain).

In May 2021, after observing the successful Ethereum upgrade, the EWC validators voted to adopt Berlin on the EWC as well. Since then the EWC validators have successfully implemented the upgrade on the Volta test network and are now preparing for the upgrade to occur on the main EWC.

The EWC Berlin upgrade will occur at block 12,649,625, which will occur on or around Tuesday, 6 July (the precise time will depend on block time variation). All full node operators must update their node to a Berlin-compatible client and implement the Berlin chainspec prior to the hard fork transition block. Due to subtle variations in the EWC block time, it is recommended to perform the update several days prior to the expected transition date.
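As a back-of-the-envelope illustration of why the transition date is approximate, one can project the fork timestamp from the current block height and an assumed average block time. The roughly 5-second block time used below is an assumption for illustration; real block times drift, which is exactly why updating several days early is advised.

```python
from datetime import datetime, timedelta, timezone

# Fork block from the announcement; block time is an assumed average.
FORK_BLOCK = 12_649_625
AVG_BLOCK_TIME = timedelta(seconds=5)

def estimated_fork_time(current_block: int, now: datetime) -> datetime:
    """Project the fork timestamp by extrapolating the average block time."""
    remaining = FORK_BLOCK - current_block
    return now + remaining * AVG_BLOCK_TIME
```

For example, 17,280 blocks short of the fork at 5-second blocks is exactly one day out; a few percent of block-time drift over that window already moves the arrival by the better part of an hour.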

FAQ

Who does this announcement impact?

This announcement only impacts people or organizations that are operating a full node on the EWC. All EWC node operators should update the client and chainspec as soon as possible. As of the time of publication, all EWC validators have already completed the necessary updates to successfully implement the Berlin upgrades.

If I’m an EWT holder but I’m not operating my own full node, is there anything I need to do?

No, if you connect to the EWC via the public RPC and Metamask, an exchange (including Kucoin, Kraken, Liquid, Bitmart, or Hotbit), a web wallet (e.g., MyCrypto), or hardware wallet (e.g., Trezor, Ledger) then you do not need to take any action.

Why is the Berlin upgrade happening on the EWC?

The EWC is a public, proof-of-authority EVM blockchain and it benefits from adopting the latest upgrades and improvements from the wider Ethereum community. The EWC validators voted to adopt the upgrades contained in the Berlin fork in May 2021.

What will happen before, during, and after the upgrade?

As mentioned above, the decision to proceed with the upgrade is made by the EWC validators via the established governance process. After the decision is approved and before the upgrade occurs, all EWC validators and other node operators will update their node clients and chainspecs. During the upgrade (i.e., at the specified transition block), the new rules defined within the improvement proposals will take effect, creating a “new and improved” version of the EWC. There are no expected operational or performance impacts to the EWC during the upgrade. Following the upgrade, users and node operators can continue to interact with the EWC as normal.

If you have any questions or encounter any issues, please get in touch via our Telegram or Discord channels.

Announcing the Berlin Upgrade on the Energy Web Chain was originally published in Energy Web Insights on Medium, where people are continuing the conversation by highlighting and responding to this story.


omidiyar Network

The Intersection: Who’s in the Driver’s Seat — Us or Our Technology?

Still frame from The Intersection

By Julia Solano, Eshanthi Ranasinghe, and Nicole Allred, Exploration & Future Sensing, Omidyar Network

The Intersection, a short film in collaboration with Superflux, creates a hopeful future that reimagines extractive, hyperconnected technology to serve community, support nature, and value human relationships. Watch here.

My low balance of -$1.50 glows red above my wrist as I cross through the turnstile and enter the station. Options to top up via crypto or credit are greyed out on my digital wallet; those accounts are empty. Guess I’m watching the ads today, just like they’re watching me.

As I sit down on the train, images — shoes I’ve always wanted, modeled by my favorite influencers, with payment plans to shrug off my empty bank account — flash before my eyes. There’s no hiding from them — the algorithms track my eye-movement, my heartbeat, my temperature. They know what I want before I do…

Technology is causing our perception of reality — and of each other — to splinter. The same stories and facts can be twisted and contorted to fit a desired narrative depending on which slice of the web we are tuned into. Social media content is engineered for our reaction, “taking our hopes and fears, our pain and suffering and using it as content between ads” (The Intersection); pitting us against each other in a race to the bottom line. We live in fundamentally different worlds — each with its own set of facts — depending on the media we consume.

The average person touches their phone 2,617 times every day, and cellphone addiction (nomophobia: the fear of going without your phone) is rising. This begs the question: Are we driving technology, or is our technology driving us?
Still frame from The Intersection

What if this reality wasn’t limited to our screens?

Ingestibles, wearables, embeddables; smart homes and peer-to-peer surveillance; the Internet of Things (IoT) and Industrial IoT; 5G; smart cities; facial recognition, temperature checking pandemic drones, and satellite imagery — a new layer of ambient (ever-present, always-on, hyper-connected) technology is seamlessly enveloping the world around us. Ambient data collection and algorithmic decision-making allow our physical environments, like our devices, to be infiltrated with information and metadata constantly gleaned from our search histories, our purchases, our locations, our sleep patterns, our menstrual cycles.

Our physical and digital worlds are developing to become almost indistinguishable. This web of ambient technology is shaping our new reality. Question is, is that what we want?

With the encompassing and interconnected nature of ambient technology, we are barreling towards a society in which each interaction with the physical and digital world is fair game for extraction and exploitation.

This world isn’t science fiction — its creation is in progress, atop a complex web of existing technologies and systems already leveraging the personal data of users.

Still frame from The Intersection

Omidyar Network’s Exploration and Future Sensing team has been tracking this evolving trend since 2018. Our initial investigations provoked many questions: What could this world look like with a new layer of ambient technology? What currently drives its development and the course it takes? What is the experience of those already commonly marginalized, excluded, and/or discriminated against in current tech and surveillance systems? What might be the experience of immigrants, minorities, elderly, special needs, rural, urban, etc in this world? How could our interaction with technology, government, companies, society, each other change? Will these technology developments only be extractive, or is there a world where they could be supportive, or even regenerative? Is it possible for this world to be designed from a place that acknowledges our interconnectedness and centers diverse perspectives?

Making of “The Intersection”

To provoke the possibilities of this hyper-connected world, we partnered with Superflux, a speculative design and foresight agency. We hoped to provoke imagination about futures that could evolve what we, collectively, would like our relationship with technology to be, as we build the infrastructure, protocols, norms, social structures, and policies for a technology ecosystem that can be empowering for us all.

“I don’t want the tech we build to consume the world, but to augment and support it. We’re starting to build something new. A new way of networking and connecting. The tech was built from the ground up. To support people, instead of mining them.” — The Intersection

Together we produced “The Intersection”, a short film that explores the future of ambient technology through the lens of four protagonists whose lives have been shaped by the dissonance of extractive technology norms, misinformation, surveillance capitalism, and context collapse:

• Ericka, a young activist whose movement for justice is derailed by a never-ending feed of misinformation and conspiracies concocted to extract data.
• Jake, a journalist whose work is diluted and corrupted by AI systems that helped him prioritize clicks over the truth, taking news out of context and pandering to readers’ “intensely monitored, precisely categorized fears.”
• Amp, a hardware engineer who set out to change the world using technology, only to have her utopian dream shattered by the realization that the only problem she was solving was finding new ways for technology companies to make money.
• Tammi, a climate migrant who became a refugee in her own country because her family refused to evacuate in the face of an impending storm, citing “fake news” reports.

Still frame from The Intersection

As the film begins, the future of ambient tech is all-encompassing and ever present in every aspect of life. Individually, then together, the protagonists find ways to re-evaluate their relationships with technology — to repurpose and redesign ambient tech to serve their communities, support nature, and value human relationships. They come together to reconcile their differences — illustrating how people can be united by a common purpose, regardless of how different their backgrounds and world views may be. The film ends at a new beginning, open to audiences to shape — where could we head from here? What would we do today, if we knew this is where we were heading?

“We wanted to show, in an experiential way, the interconnected nature of the issues present in our world. Whether environmental, social or technological, these domains don’t reside within neat categories; they overlap and intermingle, giving birth to new hybrid forms that are as tricky to define as they are to address.” – Anab Jain & Jon Ardern, Superflux

We hope this film makes the future of ambient technology more tangible, prompts us to investigate our relationship with technology, and encourages us to build towards a more just, pluralistic world that centers the experiences of those most marginalized among us.

Watch the Intersection here. Read our report capturing ambient tech trends, the product of several weeks of ethnographic research and expert interviews, here.

The Intersection: Who’s in the Driver’s Seat — Us or Our Technology? was originally published in Omidyar Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Commercio

CREATE BLOCKCHAIN SOLUTIONS VIA COMMERCIO.APP – WORKSHOP


Registration for the July workshops is now open!

Don’t forget to sign up (Commercio Consortium members only).

https://docs.google.com/forms/d/e/1FAIpQLScR8mAx9u-um3SNVsZWDlJAuzdjirEyolYV-O_TuamR0Ur4BA/viewform

The workshop is intended for

CEO or Commercial Manager

CTO or Development Manager

WHEN :

Friday 16 July at 14:30
Friday 23 July at 14:30
Friday 30 July at 14:30

WHAT :

INTRODUCTION:

Update on the Commercio.network project
Video Seminar: TOKEN ECONOMY and LISTING TOKEN

HOW TO DEVELOP A SOLUTION ON BLOCKCHAIN

Onboarding and Training on Commercio.app
Questions and answers on validator node management

HOW TO PROPOSE A SOLUTION ON BLOCKCHAIN

Video Seminar on 36 use cases of problems solvable with blockchain
ABR rewards system and referral membership for new members

 

The article CREATE BLOCKCHAIN SOLUTIONS VIA COMMERCIO.APP – WORKSHOP appeared first on commercio.network.


We Are Open co-op

Spirit of WAO

AKA Tao of WAO (but not the podcast)

As the state of the world has…evolved/deteriorated…We Are Open members have dealt with these changes and challenges in various ways. The co-op has grown up a lot over the last five years, and with growing up comes random conversations in which someone says “Do we need a policy on…?” This is pure adulting.

In April, we had an existential chat about the climate emergency, racial justice, lost traditions and more, and we decided that we wanted to write a page that summarises the points we agree on.

CC-BY-ND Bryan Mathers

We’ve published our new page on our wiki and are republishing here:

Spirit of WAO

In our April 2021 co-op day, we spoke about climate, justice, and more, and we would like to have a page that shows our spirit.

WAO is a collective of individuals who broadly agree on many things. Like any group of people, there are nuanced differences in our positions on the issues of the day, so instead of corporate pronouncements, this page exists to share what we believe in. It’s the product of discussion and debate during our monthly co-op days.

We believe in:
• Placing ourselves and our work in historical and social contexts so that we can make thoughtful decisions about our behaviours and mindsets.
• Seeing ourselves as part of nature not the rulers of it and acknowledging that there is a climate emergency. We are conscious of the lost lessons and spirit of the indigenous and strive for climate justice.
• Sharing resources to help combat prejudice wherever we see it (including, but not limited to: racism, sexism, ageism, ableism, homophobia, transphobia, xenophobia, and hostility relating to education or socio-economic status).

Instead of policies, we believe in dialogue that leads to action. Our members think carefully about charitable giving, and we have decided to make this an individual matter of conscience rather than a collective decision. As a cooperative, we consciously review the amount of travel we do for work, especially air travel.

One tool we’ve found useful in thinking about this is the Venn diagram (shown below) shared by the How to Save a Planet podcast. At the centre of what we’re good at, what needs doing, and what brings us joy is the work that makes a difference.

So to summarise, while you absolutely will see WAO members out on the street doing in-person activism, that’s not everyone’s style. Some of us run websites dedicated to informing people about important issues. Others are members of hyperlocal groups helping make the world a better place.

However we do it, WAO members do it intentionally and not because we have a policy for it.

See our wiki.

Instead of virtue signalling, get to work

Over the last year and a half, our members did courses (some of which the co-op paid for), went to demos, read a bunch of stuff and donated to various causes. They reflected on their own and together. We are not perfect and haven’t necessarily had an easy time of it, but we keep doing the work we believe in, and we keep thinking about how our actions create impact.

Spirit of WAO was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 30. June 2021

We Are Open co-op

On Common Ground

WAO’s participation in the Catalyst Network Engagement WG

WAO is in the midst of our latest Catalyst-funded project, having spent the majority of the pandemic supporting charities through digital transformation.

Catalyst itself is undergoing a transformation, as explained by Ellie in this blog post. In a nutshell, Catalyst responded well during the emergency situation caused by the pandemic and is now looking for how it can be more decentralised, equitable, and partnership-driven.

To help with that, they created several “temporary, experimental working groups” — one of which WAO got involved with. This was the Network Engagement Working Group, which met four times, facilitated mostly by Tess Cooper, although Doug did facilitate the third session while Tess was away.

Outputs

Skipping straight to what the group achieved, we spent a lot of time in Miro, but also one week in Etherpad (given WAO’s love for it, no prizes for guessing which week that was!)

Miro board · Etherpad

There was a mix of organisations represented, from small charities to larger network organisations, as well as digital agencies and individual freelancers. The Slack channel has 19 people, with about 12–15 of them on any one call.

Next steps

Zoomable version of this image available as Frame 8 on the Miro board

We took the opportunity as a group to plot our ideas on two axes: useful and feasible. More useful ideas go towards the top of the green rectangle, while more feasible ones go further to the right.

Once we had gone around discussing and moving these based on everyone’s input, we colour-coded them based on our original brief. Are these to do with core Catalyst business, reach and engagement, or both?

The list, sorted by colour from the top right, is below.

Core
• Design-hops and other practice training approaches
• Come up with an open standard of what good digital practice/infrastructure looks like within the charity sector
• An evolving open standard that can fit on a slide (process & product)
• Define what Catalyst areas need focusing on most, e.g. services, resources, how orgs are engaged, reaching a diversity of orgs, e.g. through their networks

Reach / engagement
• Explore and articulate what people can get out of engaging beyond money
• Connecting the UK Tech for Good organisers network with the local (generic and specialist) infrastructure organisations in their areas
• Test assumptions about when, how and why people would engage and contribute their time and energy to influencing Catalyst’s work, network activity and direction of growth and money required
• Train-the-trainers programme for tooling up infrastructure organisations so that they can actively support their members in digital transformation

Both
• No-code expert to build relationships with suppliers and share how charities can easily apply these tools within their organisations
• Deepen understanding of the diversity requirement — is it diversity of voices / reaching and supporting contribution from organisations that serve under-recognised communities / social sector organisations that face disadvantages themselves (e.g. regional barriers, small team, niche issue)?
• Identify patterns of need, and connect that with digital best practice — build on the rich research that’s been done over the last year, keep adding and deepening and SHARE it across the network

Image CC BY-ND Bryan Mathers

Stars and wishes

We closed the final session talking about ‘stars’ and ‘wishes’ — i.e. highlights of the process as well as what we’d like to improve in future. The following were noted down by Ellie and shared in the Slack channel.

Stars
• openness and feel of this group
• so many of us care about this!
• hearing people’s thoughts
• being part of this conversation as a smaller, local org
• learning from others in the room and being prompted to think about things differently
• interesting collage of different peoples’ day jobs and experiences
• great facilitation from Tess and Doug
• grateful to connect in to this group, gives great energy to see people wanting to collaborate and find solutions
• watching people think through a gnarly problem
• contributions and insights of this group have been amazing

Wishes
• to take what we’ve created on Miro and put it into a form that others can understand/input into
• have something more tangible/manageable to engage with for next steps
• something more tangible
• cross referencing things across each working group will help shape the next bits (maybe it’s because we’re in a bubble that it’s confusing for us)
• we remain connected in some way and see what comes out of this and other groups
• keep this going in some way, shape or form
• that Ellie gets all the support she needs
• meeting in-person! getting to know people better outside of this convo
• apply digital best practice to the way we’ve approached this group — user research, prototype then learn
• tangible actions from this to take forwards

Next steps

Catalyst is hiring a new team over the summer to reconfigure its working practices and take forward some of the ideas of each of the working groups, where appropriate.

We Are Open would love to continue to be part of this exciting network, doing the hard yards of helping charities in their digital transformation journey! 🙌

If your charity, non-profit, or for-good organisation needs some help, why not get in touch? https://weareopen.coop/contact

On Common Ground was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.


Commercio

What is MiCA? Markets in Crypto Assets


What is MiCA?

 

MiCA is the acronym for Markets in Crypto Assets.

The EU will create a SINGLE regulation for ALL 27 member countries regarding crypto.

A Regulation is, essentially, a law imposed upon all member countries immediately, superseding any national legislation.

The MiCA Regulation is therefore a single set of rules, immediately applicable throughout the Single Market.

Is MiCA good or bad?

To answer this question, let’s focus on why the EU is doing it. The EU is drafting a new digital finance strategy for the European financial sector. With MiCA, the EU aims to ensure that it embraces the blockchain revolution and becomes a leader, with innovative European firms at the forefront. On one side, it wants to be innovation-friendly and make digital finance available to European consumers and businesses. On the other side, it wants to mitigate the risks to consumers (and banks).

MiCA Objectives

 

The first objective is to give LEGAL CERTAINTY to crypto-asset markets. The EU thinks there is a need for a sound legal framework, clearly defining the regulatory treatment of all crypto-assets that are not covered by existing EU legislation.

The second objective is to SUPPORT INNOVATION. The EU wants to promote the development of crypto-assets and new use cases while putting in place a safe and proportionate framework to support innovation and fair competition.

The third objective is CONSUMER PROTECTION. The EU thinks crypto-assets present many of the same risks as traditional financial instruments.

The fourth objective is to ensure FINANCIAL STABILITY. The EU is literally terrified of ‘stablecoins’, because unlike other cryptocurrencies they have the potential to become widely accepted and potentially systemically disruptive to the status quo.

The post What is MiCA? Markets in Crypto Assets appeared first on commercio.network.


SelfKey Foundation

The Bullish Dip and there’s Still Time to Join the Airdrop


SelfKey Weekly Newsletter

Date – 23rd June, 2021

The recent dip in the Crypto Markets, Puell Multiple, and the double airdrop for KEY hodlers on Binance.

The post The Bullish Dip and there’s Still Time to Join the Airdrop appeared first on SelfKey.

Tuesday, 29. June 2021

WomenInIdentity

We’re hiring – new role of Executive Director


We were established to build a more diverse and inclusive community that will help shape the products and solutions developed by the identity industry. 

Our Vision is to ensure that identity solutions intended FOR everyone are built BY everyone.  And our Mission is to inspire, elevate and support a more diverse workforce across a variety of roles and sectors within the identity ecosystem.

Given the increasing importance that an individual’s identity plays in unlocking services that are essential for participation in the global economy, it is critical that identity solutions represent the needs of ALL communities.   As part of our mission, we support grassroots development opportunities such as networking events, internships, educational research and personal development. All made possible by funding from our sponsors.

On the back of sustained growth since our formal launch in 2019, we are conducting a strategy review as to how we will best support the organisation for the next 18–24 months. To help with this, we are now seeking to recruit a part-time Executive Director to help lead the next phase of our growth. This is a paid position, and the successful candidate will have strategic and operational responsibility for the execution of our vision and mission, supporting colleagues and our volunteer program as well as managing the infrastructure and budget of Women in Identity.

Responsibilities of the Executive Director

Reporting to the Board Chair, the Executive Director will have overall strategic and operational responsibility for Women in Identity’s staff, programs, expansion, infrastructure, budget as well as the execution of its mission.  Women in Identity, established in 2019, is a newly formed not-for-profit with over 2,000 global members.    Knowledge of the digital identity industry is preferred but not required.  This is initially a part-time job (up to three days per week) but is expected to become full-time.  This role is hands-on and the ideal candidate will have experience designing and building out not-for profit teams and operations.

Leadership and Management

• Ensure effective management of all programs including rigorous program evaluation, consistent quality of finance and administration, fundraising, communications, technology and operational systems as well as staffing.
• Active engagement of volunteers, Board members, committees, alumnae and other partner organisations and donors. Ensure a high level of commitment and energy is achieved through this engagement.
• Develop and maintain a strong relationship with the Board Chair as well as other Board members. Serve as ex-officio on each committee.
• Seek and build board involvement in setting the strategic direction for Women in Identity.
• Lead, develop and coach staff to ensure high engagement and performance in roles.

Fundraising and Communications

• Support and expand all fundraising and revenue-generating activities needed to sustain the existing programs and operations, including any global expansion
• Manage all external events, working closely with corporate sponsors to ensure events achieve intended participation and exposure
• Ensure all communications promote a strong brand and reputation and achieve the desired engagement with major constituencies
• Serve as primary spokesperson for Women in Identity
• Use their own network to continue to build and expand Women in Identity’s reputation and visibility in the identity industry as well as attract new opportunities

Business Operations

• Hire and retain competent and qualified staff.
• Ensure budget is adequate to support day-to-day operations as well as any growth expansion plans.
• Keep Board apprised of finances and budgetary needs.
• Establish all necessary policies and procedures for effective management of operations.
• Review and approve contracts for services.
• Any other duties as assigned by the Board of Directors.

Ideal candidate experience and capabilities:

• Senior-level management expertise in strategic planning, finance, human resources, technology, legal, risk management, fundraising, business management and/or other industry-specific knowledge
• Strong diplomatic skills and a proven track record of cultivating diverse relationships, facilitating and building consensus in a collaborative manner
• Exceptional integrity and credibility. Holds self to a high standard of accountability and ethics.
• High emotional intelligence – well-developed self-awareness of strengths and limits, able to understand others’ strengths and limits and use this to deepen relationships and build trust and empathy with others
• Strong work ethic; a self-starter with high energy and a passion for Women in Identity’s mission
• Effective communication skills – able to clearly communicate complex issues in a manner tailored to diverse groups
• Leadership – able to inspire and motivate others, manage conflicting and competing ideologies, and demonstrate flexibility and adaptability to achieve compromise when necessary
• Exceptional written and oral communication skills. Able to tailor messages to the board, donors, as well as in-market volunteers and other staff.
• Highly organised with strong attention to detail. Able to navigate both strategic and tactical issues with a pragmatic focus on delivering meaningful results to the board and volunteers.
• Committed to the mission and vision of Women in Identity.

Interested? Get in touch with info@womeninidentity.org in the first instance.

The post We’re hiring – new role of Executive Director appeared first on Women in Identity.


Velocity Network

The Future of Healthcare Background Screening

PreCheck, along with their parent company Cisive, are proud to be leading the way in leveraging the Velocity Network Foundation’s framework to create valuable solutions for multiple use cases across U.S. healthcare systems and hospitals. The post The Future of Healthcare Background Screening appeared first on Velocity.

omidiyar Network

Reimagining digital public infrastructure is no longer just a development agenda


By CV Madhukar, Responsible Technology, Omidyar Network

In a world where digitization is ubiquitous, we now have an opportunity to shape digital ID programs and other components of the digital public infrastructure in a direction that serves the public interest and powers the digital economy, rather than only serving private corporate interests.

Back in 2015, the UN Sustainable Development Goals called for “legal identity for all” by 2030. At the time, the World Bank released data which revealed that 1.5 billion people in the world lacked formal identification. Those working on financial inclusion challenges were among the first to articulate the societal cost of this gap: if a person is unable to identify themself, then they will be unable to engage with the formal financial system, and economic growth will stall.

Yet conversations about digital identity innovation were mostly treated as a developing country issue, with ardent advocates in the West unwilling to acknowledge that digital identity could add value to developed economies. Now, as the world emerges from the pandemic, attitudes seem to have taken a 180-degree turn.

The recognition that digital identification is increasingly central to the digital public infrastructure (that is, digital spaces for the public good) is gaining currency worldwide. Earlier this month, the European Commission proposed the creation of trusted and secure “Digital Identity for all Europeans”. This is expected to go into effect in late 2022. US Treasury Secretary Janet Yellen also spoke in February about the need for Digital IDs, saying, “The same digital ID technology that protects against money laundering can also help us reach more people with relief.”

Every country will need to reimagine its technology infrastructure and make it fit-for-purpose for a digital century. While developing countries suffer from a lack of technology infrastructure, developed economies have legacy systems that need urgent reimagining. The Clinton-era effort of “Reinventing Government” will need a whole new makeover right now in the US with regard to reimagining technology infrastructure. But to really reimagine this, key stakeholders will need to overcome a deep chasm caused by compartmentalized thinking about the roles of the public sector and the private sector.

In her important work on the Entrepreneurial State, noted economist Mariana Mazzucato argues that the state as a fixer of market failures offers a limited view. The state as a market shaper and market maker is less recognized in the US, even though the state (through DARPA, NIH, etc.) has had a deep impact on the course of innovation in the private sector over several decades.

The very nature of infrastructure is such that it is an enabler of private sector economic activity. In the digital era, such infrastructure also has the ability to ”see” every transaction that takes place in these digital spaces. That is why, when a website that requires some form of identification offers the option to log in with Facebook or Gmail credentials, the big tech platforms get to see much of the activity that the individual does on this third-party website. Similarly, when a payment platform uses the infrastructure provided by Visa or Mastercard, these companies gain insights into the financial activity of the individuals that use such a provider.

At a time where data will be a “factor of production” in the new digital economy, those providing such infrastructure will naturally become massive monopolies or oligopolies. Using antitrust laws to remedy this without recognizing what caused data monopolies in the first place, is like trying to pick up water with a sieve. We need viable and scalable public alternatives for the existing digital infrastructure to truly curb such monopolies. This will provide a level playing field for all private market players, create the basis for interoperability, prevent data monopolies, and make services more accessible for individuals.

This is, according to a recent paper by Ethan Zuckerman, the strongest case for policymakers to take urgent and concrete measures for the development and adoption of digital public infrastructure as a way of shaping the trajectory of the future digital economy.

We need governments to own this agenda so that any new conception of digital public infrastructure is built on a solid foundation of safeguards. Even as we think about such large-scale deployment of technology, we must deliberately think about unintended consequences of such efforts and mitigate them. Safeguards in the form of technology, law, and institutions must drive the norms and behavior of both the designers of technology and its users, and the government must guarantee these protections. The financial sector or the pharmaceutical industry can offer valuable lessons in this regard.

Reimagining digital public infrastructure could have a positive, transformative impact on the global economy and global development. But it’s going to be important for the US and Europe to install safe and inclusive, digital public infrastructure (like federated digital ID and payment platforms) to see real change. If they make this kind of investment, we can begin to build a world where gaps in technology won’t limit the development programs governments can implement, the progress they can make, or support they can give people to improve their own lives.

Reimagining digital public infrastructure is no longer just a development agenda was originally published in Omidyar Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ceramic Network

Ceramic Mainnet is live


We're excited to announce that the Ceramic Mainnet is live. For the first time, developers can now create and deploy data streams to a peer-to-peer network of production-ready Ceramic nodes. As an open source platform for decentralized data streams, Ceramic provides a flexible, secure, and scalable infrastructure for managing data and identities in Web3 applications.

Leading up to today’s launch, Ceramic’s Clay Testnet has had over 2,000 active developers, and more than 200 projects have signed up for the Mainnet Early Launch Program (ELP).

If you're unfamiliar with Ceramic or just getting up to speed on the technology, learn more about the core features and use cases they enable.

Who is using Mainnet?

A collection of applications are already live on Mainnet or will be going live in the coming weeks including: Boardroom, Zerion, GeoWeb, Sourcecred, DNS.XYZ, Self.ID, and RabbitHole – to name a few. These projects are using Ceramic Mainnet for functionality ranging from decentralized identity and user data storage (via IDX SDK), to multi-account and cross-chain reputation systems, social networks and community forums, NFT-owned content streams, and much more.

Want to see content on Mainnet? Check out Tiles, a Ceramic browser which displays all streams created on Mainnet – built by members of the Ceramic community.

Advancing what's possible on Web3

Ceramic brings the Web3 benefits of permissionless innovation and composability to all information on the web, unlocking a universe of content that can be openly queried, forked and remixed, or frictionlessly shared across applications and organizational boundaries – without relying on a single data center or database server.

Molly Mackinlay, Project Lead at IPFS, said:

Ceramic and IDX are a key piece of the Web3 puzzle. Placing their bets on IPFS, libp2p, blockchain, and DID-authenticated data, Ceramic gives developers the ability to build completely serverless applications using dynamic, verifiable, decentralized data streams.

As the world moves closer to realizing the full potential of Web3, open data and identity infrastructure such as Ceramic will complement blockchain’s open value infrastructure, playing a key role for developers looking to build full-stack Web3 applications.

Get started with Mainnet

Starting today, Mainnet will be progressively rolled out in phases over the next several months beginning with a controlled Early Launch Program (ELP) and ending with a fully permissionless network. This helps ensure a safe and reliable experience for all developers.

• ELP Phase 1: Your app must use mainnet nodes hosted by the 3Box Labs organization. This is available today. To join, sign up for the waitlist.
• ELP Phase 2: You can run your own mainnet nodes for your app and no longer need to rely on 3Box Labs infrastructure, but you still need to be onboarded manually. This is available today. To join, sign up for the waitlist.
• Open Network Phase 3: You can openly run your own mainnet nodes or use a third-party node hosting service. No signup or waitlist needed. We aim to release Phase 3 within the next few months.

To learn more about how you can launch your Web3 application on Ceramic's Mainnet, visit developers.ceramic.network. If you have any questions, join the Ceramic Discord.

About Ceramic

Built on IPFS, Ceramic is a decentralized, open source platform for creating, hosting, and sharing data. With Ceramic's permissionless data streaming network, developers can store streams of information and ever-changing files directly on the decentralized web – and share updates with anyone in the world – all without trusted servers or intermediaries.

Looking for your next challenge?

Want to contribute to the Ceramic protocol, network, and ecosystem? 3Box Labs is hiring for a variety of positions including Blockchain Engineer, Technical Product Manager, Community Lead, and Software Engineers.


Digital ID for Canadians

Facial Biometrics: Liveness and Anti-Spoofing


Most of us understand how fingerprinting works, where we compare a captured fingerprint, from a crime scene for example, to a live person’s fingerprint to determine if they match. We can also use a fingerprint to ensure that the true owner, and only the true owner, can unlock a smartphone or laptop. But could a fake fingerprint be used to fool the fingerprint sensor in the phone? The simplest answer is yes unless we can determine if the fingerprint actually came from a living and physically present person, who might be trying to unlock the phone. 

In biometrics, there are two important measurements: Biometric Matching and Biometric Liveness. Biometric matching is the process of identifying or authenticating a person by comparing their physiological attributes to information that has already been collected. For example, when that fingerprint matches a fingerprint on file, that’s matching. Liveness Detection is a computerized process to determine if the computer is interfacing with a live human and not an impostor like a photo, a deep-fake video, or a replica. For example, one measure to determine Liveness includes determining whether the presentation occurred in real-time. Without Liveness, biometric matching would be increasingly vulnerable to fraud attacks that are continuously growing in their ability to fool biometric matching systems with imitation and fake biometric attributes. Attacks such as “Presentation Attack”, “spoof”, or “bypass” attempts would endanger a user without proper liveness detection. It is important to have strong Presentation Attack Detection (PAD) as well as the ability to detect injection attacks (where imagery bypasses the camera), as these are ways to spoof the user’s biometrics. Liveness determines if it’s a real person, while matching determines if it’s the correct, real person.

With today’s increasingly powerful computer systems have come increasingly sophisticated hacking strategies, such as Presentation and Bypass attacks. There are many varieties of Presentation attacks, including high-resolution paper & digital photos, high-definition challenge/response videos, and paper masks. Commercially available lifelike dolls, human-worn resin, latex & silicone 3D masks, and custom-made ultra-realistic 3D masks and wax heads are also used. These methods might seem right out of a bank heist movie, but they are used in the real world, successfully too.

There are other ways to defeat a biometric system, called Bypass attacks. These include intercepting, editing, and replacing legitimate biometric data with synthetic data not collected from the person’s biometric verification check. Other Bypass attacks might include intercepting and replacing legitimate camera feed data with previously captured video frames or with what’s known as a “deep-fake puppet”, a realistic-looking computer animation of the user. This video is a simple but good example of biometric vulnerabilities, lacking any regard for Liveness.

The COVID-19 pandemic provides significant examples of Presentation and Bypass attacks and the resulting fraud. Pandemic stay-at-home orders, along with economic hardships, have increased citizens’ dependence on the electronic distribution of government pandemic stimulus and unemployment assistance funds, creating easy targets for fraudsters. Cybercriminals frequently utilize Presentation and Bypass attacks to defeat citizen enrolment and user authentication systems on government websites, stealing from governments across the globe, with losses amounting to hundreds of billions in taxpayer money.

Properly designed biometric liveness and matching could have mitigated much of the trouble Nevadans are experiencing. There are various forms of biometric liveness testing:

• Active Liveness commands the user to successfully perform a movement or action like blinking, smiling, tilting the head, and track-following a bouncing image on the device screen. Importantly, instructions must be randomized and the camera/system must observe the user perform the required action.
• Passive Liveness relies on involuntary user cues like pupil dilation, reducing user friction and session abandonment. Passive liveness can be undisclosed, randomizing attack vector approaches. Alone, it can determine if captured image data is first-generation and not a replica presentation attack. Significantly higher Liveness and biometric match confidence can be gained if device camera data is captured securely with a verified camera feed, and the image data is verified to be captured in real-time by a device Software Development Kit (SDK). Under these circumstances both Liveness and Match confidence can be determined concurrently from the same data, mitigating vulnerabilities.
• Multimodal Liveness utilizes numerous Liveness modalities, like 2-dimensional face matching in combination with instructions to blink on command, to establish user choice and increase the number of devices supported. This often requires the user to “jump through hoops” of numerous Active Liveness tests and increases friction.
• Liveness and 3-dimensionality. A human must be 3D to be alive, while a mask-style artifact may be 3D without being alive. Thus, while 3D face depth measurements alone do not prove the subject is a live human, verifying 2-dimensionality proves the subject is not alive. Regardless of camera resolution or specialist hardware, 3-dimensionality provides substantially more usable and consistent data than 2D, dramatically increasing accuracy and highlighting the importance of 3D depth detection as a component of stronger Liveness Detection.

Biometric Liveness is a critical component in any biometric authentication system. Properly designed systems require the use of liveness tests before moving on to biometric matching. After all, if it’s determined the subject is not alive, there’s little reason to perform biometric matching and further authentication procedures. A well-designed system that is easy to use allows only the right people access and denies anybody else.  
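The liveness-before-matching ordering described above can be sketched in a few lines. This is an illustrative sketch only: the function names, score scales, and thresholds are hypothetical placeholders, not part of any particular vendor's SDK.

```python
def authenticate(capture, enrolled_template, detect_liveness, match_face,
                 liveness_threshold=0.98, match_threshold=0.95):
    """Gate biometric matching behind a liveness check.

    `detect_liveness` and `match_face` are caller-supplied callables
    (e.g. wrappers around a vendor SDK) returning scores in [0, 1].
    """
    # Step 1: liveness. If the subject is not live, skip matching entirely.
    if detect_liveness(capture) < liveness_threshold:
        return {"authenticated": False, "reason": "liveness_failed"}

    # Step 2: biometric match against the enrolled template.
    if match_face(capture, enrolled_template) < match_threshold:
        return {"authenticated": False, "reason": "match_failed"}

    return {"authenticated": True, "reason": "ok"}
```

A spoof that fails liveness is rejected before the (potentially more expensive) matching step runs, mirroring the ordering a properly designed system requires.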

Care to learn more about Facial Biometrics? Be sure to read our previous releases Exploring Facial Biometrics. What is it? and Facial Biometrics – Voluntary vs Involuntary.

About the authors:

Jay Meier is a subject matter expert in biometrics & IAM, and an author, tech executive, and securities analyst. Jay currently serves as Senior Vice President of North American Operations at FaceTec, Inc. and is also President & CEO of Sage Capital Advisors, LLC., providing strategic and capital management advisory services to early-stage companies in biometrics and identity management. 

Meyer Mechanic is a recognized expert in KYC and digital identity. He is the Founder and CEO of Vaultie, which uses digital identities to create highly fraud-resistant digital signatures and trace the provenance of legal and financial documents. He sits on DIACC’s Innovation Expert Committee and has been a voice of alignment in advancing the use of digital identity in Canada.

Additional contributions made by members of the DIACC’s Outreach Expert Committee including Joe Palmer, President of iProov Inc.


We Are Open co-op

“Participation. That’s what’s gonna save the human race.” (Pete Seeger)

Documenting WAO’s collaboration with Participate

We Are Open is once again collaborating with Participate, which is always a good time for our members. The folks at Participate are awesome, and have previously sponsored our work on Badge Wiki among other community endeavours.

Images via https://www.openpeeps.com

This time around, we began our conversation from a starting point of our several year old Badge Bootcamp email course. WAO was looking to update it, while Participate was looking to add some online sessions. However, as we chatted, our conversation quickly led us to something even more interesting: a social learning approach using Communities of Practice (“CoP”) + Open Badges (“badges”) to:

Upskill Participate team members
Educate people who are CoP-curious about value cycles
Provide real-world experience of badges to those new to them

It’s always best to start at the beginning, so when we realised our scope and desire had changed, we sketched out personas and a project timeline.

Screenshot of Jamboard from a session between WAO and Participate.

In one of our workshops, Mark Otter (Participate’s CEO) used the phrase “Crawl, Walk, Run”, which is a perfect metaphor for a scaffolded release of a product.

This phrasing gave us ideas around how to structure the project — namely:

Crawl — we’d start by upskilling staff users and gathering their feedback. Integrating their ideas into the structure and content would make it better for our “Walk” group of users.
Walk — people (external users) who are curious about CoP and want to learn more.
Run — after we’ve piloted this work with the CoP-curious folks, we engage with Badge Champions. These are people who are motivated and influential in getting an organisation to start thinking about upskilling in a non-traditional way.

Learning Journey Design

Image CC BY-ND Bryan Mathers

Too often learning design starts from the premise that there is a ‘body of knowledge’ for those taking the course to work through. The structure is therefore a bit like stepping stones across a river — any deviation and you ‘fall in’!

As experienced learning designers, and working with Participate who are experts in this, we decided on a different approach.

Learning journey design

The above structure ensures that there is a core learning experience which situates the participant and helps them to not only learn but *experience* what they are learning. For example, as you can see in the ‘Community as Course’ section, participants not only learn about CoP but get involved in (more than) one.

At any point, and not just when they’ve finished the above, participants can dive into the ‘Earning is Learning’ courses which help widen and deepen their knowledge and experience around CoP and badges. It might be that some participants never touch this material, which is fine, but the chances are that most will at least be curious to know a little more about either or both at some point.

Badge System Design

The great thing about badges is that they can be issued to anyone for anything. That means not only can Participate issue badges, but volunteer community moderators can too — and even participants can peer award!

Image CC BY-ND Bryan Mathers

In this initial structure, we’ve designed badges that recognise the ‘value cycles’ that are central to Communities of Practice. So the badges for ‘Community as Course’ encourage participation, creating value for others, and reflecting on that experience.

Likewise, with the ‘Earning is Learning’ courses, badges involve not only knowledge acquisition and reflection, but also the *creation* of badges. This ensures that participants have both the theory *and* the practice to use badges in the future.

Good badge design has three pillars:

A striking and relevant visual image
Clear criteria on how the badge is earned/awarded
A description explaining the value of the experiences recognised by the badge

In addition, and where relevant, it’s really useful to have *evidence* of what the participant did to earn the badge. In our example above, the ‘Value Creator’ badge involves participants joining other communities, situating themselves, and adding value. Capturing that value in the badge evidence, either as a link or a screenshot, helps explain what the participant did, and the (potential) impact it had.
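The three pillars plus evidence map directly onto the fields of an Open Badges 2.0 assertion. The sketch below builds one for the ‘Value Creator’ example; all URLs and identifiers are placeholders, not a real issued badge.

```python
import json

# Sketch of an Open Badges 2.0 assertion. Every URL here is a placeholder.
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.org/assertions/123",
    "recipient": {"type": "email", "hashed": False,
                  "identity": "learner@example.org"},
    "badge": {  # the BadgeClass carries the three pillars
        "type": "BadgeClass",
        "id": "https://example.org/badges/value-creator",
        "name": "Value Creator",
        # Pillar 1: a striking and relevant visual image
        "image": "https://example.org/badges/value-creator.png",
        # Pillar 2: clear criteria on how the badge is earned
        "criteria": {"narrative": "Join another community, situate "
                                  "yourself, and add value."},
        # Pillar 3: a description explaining the value recognised
        "description": "Recognises creating value for others in a "
                       "Community of Practice.",
        "issuer": "https://example.org/issuer",
    },
    # Optional evidence: a link or screenshot of what the participant did
    "evidence": [{"id": "https://example.org/evidence/forum-post-456",
                  "narrative": "Answered onboarding questions in a "
                               "partner community."}],
    "issuedOn": "2021-09-19T00:00:00Z",
    "verification": {"type": "hosted"},
}

print(json.dumps(assertion, indent=2))
```

Because the evidence field accepts a narrative alongside a link, it can capture both what the participant did and the (potential) impact it had.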

Next steps

We’re about to embark on a series of internal community calls for Participate staff to build momentum around the project. This is an approach we’ve used with Mozilla, Greenpeace, and other organisations to help obtain buy-in. It’s a key part of a healthy architecture of participation.

Separately, but related, we’re considering adding an introductory course to our existing free, email based ones at learnwith.weareopen.coop.

Written by Doug Belshaw & Laura Hilliger

“Participation. That’s what’s gonna save the human race.” (Pete Seeger) was originally published in We Are Open Co-op on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 28. June 2021

Ceramic Network

Community Call (June 25, 2021)

The core team and community discuss upcoming protocol improvements, mainnet status, and recent community projects.

Commercio

THE FUTURE OF EQUITY CROWDFUNDING RUNS ON BLOCKCHAIN


(Milan 28 June ) 2Meet2Biz.com, equity crowdfunding and debt platform, is the first among the platforms authorized by Consob to use blockchain technology to manage all internal document processes, thanks to its presence, as a validator node, on the Commercio.network blockchain.

With equity crowdfunding, startups and SMEs can raise risk capital to finance the launch or development of their business project. Italy has been among the first countries to regulate this kind of investment, creating a real ecosystem and an official registry maintained by Consob. In recent years the sector has evolved in terms of the number of platforms, capital raised, and transactions managed, but few steps have been taken from a technological point of view.

The desire to use blockchain technology in 2meet2biz stems from a precise need: to apply the potential of a blockchain to the processes that normally characterize equity crowdfunding campaigns and that, by their nature, are essentially replicable and can be standardized for all offering companies.

That’s why 2meet2biz, developed and managed by Migliora Srl, has embarked on a truly innovative path: guaranteeing its corporate clients, as well as all investors who join the campaigns, transparency, security, and immutability of every process related to capital raising, from receipt of the initial documentation to the conclusion of the campaign, thanks to blockchain technology that is, moreover, compliant with the eIDAS regulation.

For the same reason – to integrate third-party technologies with the aim of improving efficiency and security in economic and financial transactions – 2meet2biz has chosen to use an open-banking software system, fully integrated into 2meet2biz, which securely manages financial flows through APIs, automating all phases of bank reconciliations and transactions.

2Meet2Biz.com is now part of Commercio.network, a blockchain that since 2018 has enabled all companies around the world to manage the three indispensable fiduciary processes of digital transformation: Creating a digital identity with eID, signing a document with eSignature and certifying the exchange of documents with eDelivery. Thanks to this the customer experience of the platform is ZERO PAPER and the processes from onboarding to eKYC (electronic know your customer) are totally digital.

2Meet2Biz.com has become the 45th of the 100 validator nodes of commercio.network, joining the most important European players already present, such as Namirial, Infocert, Var Group, and Zucchetti. Being a Validator Node means participating in the most innovative, sustainable, and advanced European blockchain project, also proposed as a technology for the EBSI call for proposals financed through PCP by the European Commission.

This step is, on one side, part of a very precise strategy: to integrate 2meet2biz with third-party technologies that allow it to progressively improve the customer experience of all its customers. On the other hand, it is the first step of a wider project to exploit all the potential that this business model intends to offer, from a procedural point of view and beyond, ensuring a degree of transparency never before seen in its sector, in full compliance with the EU MiCA regulation.

2meet2biz intends to guide crowd-campaign processes through digital transformation, developing and using SaaS technologies that make the investment process itself innovative, for both principals and investors, maximizing the value chain of the campaigns presented, from their origin to access to the target market. It is a new international strategic approach, with a path that will soon be applicable to all investors, involving partners whose platforms comply with the standards required by the European regulatory bodies.

Info and contacts:

Migliora Srl

Serena Auletta

sauletta@miglioradv.it

347.44.98.409

 


Friday, 25. June 2021

Good ID

Authenticate Virtual Summit: Focus on Europe Recap


By: FIDO Alliance Staff

The digital security, privacy, and authentication landscape is evolving quickly in the European Union, with new regulations that could have a broad-ranging impact on its citizens, as well as on companies around the world. 

At the Authenticate Virtual Summit: Focus on Europe, which was held on June 17, experts on the authentication market in Europe provided insight into the latest developments including PSD2 SCA (Payment Services Directive Strong Customer Authentication), delegated authentication, eIDAS (electronic IDentification, Authentication and trust Services) and the EU Digital Wallet among other efforts.  

Kicking off the virtual summit, Andrew Shikiar, executive director and CMO of the FIDO Alliance outlined how the FIDO specifications work and why strong authentication is essential for multiple use cases including ecommerce, Internet of Things (IoT) and identity verification. 

“FIDO’s goal from day one was to certainly reduce reliance on passwords, but in some ways that was just a means to an end, really trying to address the data breach problem, as the vast majority of data breaches are caused by weak credentials,” Shikiar said.

As FIDO is moving forward, there has been a need to strengthen identity verification assurance to support better and safer account recovery. As part of that, Shikiar noted that the FIDO Alliance launched the Identity Verification & Binding Working Group (IDWG) which is driving that work forward.

“We’re seeking to establish best practices for possession based identity verification,” Shikiar said. “That will not only enable safer, easier and stronger account recovery, but doing so will also stop hackers from using the account recovery process as an opening for social engineering account takeovers.”

Helping to Limit Cart Abandonment

There is a tangible connection between ecommerce success and strong authentication, according to Rolf Lindemann, VP products at NokNok.

Lindemann noted that during the pandemic, ecommerce grew faster than ever before. But with 13% of online credit card payments not being completed, it’s clear that cart abandonment is still impacting business in a significant manner.

“We learned that authentication friction in general is a major factor for cart abandonment,” Lindemann said. “This becomes obvious given that online authentication is at the core of all online transactions. Authentication is the front door to