Last Update 10:56 AM April 15, 2021 (UTC)

Company Feeds | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Thursday, 15. April 2021

Ontology

Ontology Weekly Report (April 7th–13th, 2021)


Highlights

It has been another busy week at Ontology HQ. Marking a significant milestone for DeFi at Ontology, Wing Finance, Ontology’s cross-chain DeFi lending platform, is now live on Ethereum. We also cooperated with MDEX to launch a global campaign.

Latest Developments

Development Progress

We have completed 75% of the Ontology EVM-integrated design, which will make Ontology fully compatible with the Ethereum smart contract ecosystem once complete. We have also completed 60% of our latest Layer 2 work, exploring the integration of Ethereum Layer 2 on the Ontology MainNet.

Product Development

This week, ONTO x MDEX (BSC) jointly launched an AMA broadcast and a two-week liquidity mining campaign to spread the word.

dApps

To date, 113 dApps have launched on MainNet, and 6,541,382 dApp transactions have been completed on MainNet. 31,614 dApp-related transactions took place this week.

Community Growth

We are delighted to see the Ontology community continuing to grow at a rapid pace. This week we onboarded over 514 new members across our global communities.

Follow us on Twitter or Telegram to keep up with our latest developments and community updates. As always, we encourage anyone who is interested in Ontology to join us.

Global News

Wing Finance, Ontology’s cross-chain DeFi lending platform, is now live on Ethereum, providing opportunities to a much wider audience.

Ontology’s Chief of Technology, Ning Hu, was invited to speak at the 2021 ECUG Con. He delivered a speech on how to build a trusted, safe, and efficient one-stop platform that integrates on-chain and off-chain data using a decentralized network.

We are growing our talented team at Ontology. 7 technical positions and 10 non-technical positions are now open. For more details, see our job openings here.

In the Media

CNN — Bitcoin is back above $60,000 as Coinbase gets ready for public debut

Ontology’s Founder, Li Jun was featured in CNN this week.

“Bitcoin and other cryptocurrencies are like digital gold as more and more assets are becoming digitalized and tokenized,” said Li Jun, founder of Ontology, a blockchain firm. “Cryptocurrencies are going to become even more important.”

Coin Post — Ontology’s DeFi lending platform goes live on Ethereum

Cointelegraph — DeFi’s critical missing piece: Credit scores

The solution to bridging the gap between requiring assets and managing uncollateralized loan risk is simple. Ideally, the credit model is robust enough to support active lending rather than purely serving as a theoretical framework.

OScore is exactly as the article mentions: reputational. Credit on-chain can’t be based solely on assets; it also has to be based on the person utilizing the asset.

Want more Ontology?

You can find more details on our website about all of our decentralized solutions across identity and data, or keep up with us on Twitter. Our Telegram is for discussion, whereas our Telegram Announcement channel carries news and updates in case you missed them on Twitter!

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

Ontology Weekly Report (April 7th–13th, 2021) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Affinidi

25 Real-world Use Cases of Verifiable Credentials


Verifiable credentials are the next frontier in the world of digital identity and security. They address many of the concerns we face today with regards to our documents and identities.

But since it is a relatively new technology, there are gaps in connecting it to real-world scenarios, and this article aims to bridge this gap. The use-cases that we have come up with are some potential areas where VCs can be used.

It aims to give insight into the vast potential of VCs across every sphere of our lives. Specifically, we hope that business leaders, entrepreneurs, and developers will use this article as a guide to create breakthrough solutions that will benefit society at large.

Before we jump into the use-cases, here is a brief introduction to the three parties involved in VC-based transactions.

Issuer

An issuer is an entity that is authorized to issue a credential. These issuers are typically government organizations, healthcare centers, banks and financial institutions, schools and universities, and possibly even organizations that provide proof of employment.

These entities use a combination of methods such as digital signatures and custom schemas to prove that they are competent to issue a credential.

Holder

A holder is the owner of the credential and has complete control over how it is managed, with whom it is shared, and when it is revoked. Holders are typically individuals or organizations.

Since the holder is the owner of the credential, the onus is on this entity to create a verifiable presentation, which is the compilation of data sent by one or more issuers in a machine-verifiable format that adheres to the existing standards.

Verifier

A verifier is an entity that verifies a credential and ensures that it comes from a competent issuer, is tamper-proof, and is still relevant (not expired or revoked). A verifier takes the verifiable presentation from the holder to determine its authenticity.
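To make the three roles concrete, here is a minimal, illustrative sketch of the issue/present/verify flow in Python. It is a toy model, not the W3C data model: real VCs use DIDs and asymmetric signature proofs (e.g. Ed25519 JWS), whereas this sketch substitutes an HMAC with a made-up demo key so the example stays self-contained.

```python
import hashlib
import hmac
import json

# Toy stand-in for the issuer's signing key. A real issuer would hold
# an asymmetric private key and publish the public key via its DID.
ISSUER_KEY = b"issuer-demo-key"  # hypothetical, for illustration only


def issue_credential(subject: str, claims: dict) -> dict:
    """Issuer: attach a proof over the subject and claims."""
    payload = json.dumps({"subject": subject, "claims": claims}, sort_keys=True)
    proof = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"subject": subject, "claims": claims, "proof": proof}


def make_presentation(credentials: list) -> dict:
    """Holder: bundle one or more credentials into a presentation."""
    return {"type": "VerifiablePresentation", "credentials": credentials}


def verify_presentation(presentation: dict) -> bool:
    """Verifier: recompute and check every credential's proof."""
    for cred in presentation["credentials"]:
        payload = json.dumps(
            {"subject": cred["subject"], "claims": cred["claims"]}, sort_keys=True
        )
        expected = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, cred["proof"]):
            return False
    return True


vc = issue_credential("did:example:alice", {"degree": "BSc Computer Science"})
vp = make_presentation([vc])
print(verify_presentation(vp))  # True: the untampered credential verifies
```

Any alteration to the subject or claims after issuance changes the recomputed proof, so the verifier rejects it; that tamper-evidence is what the checks in the use-cases below rely on.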

VC Use-Case#1: Employment Endorsements

VCs can be used to verify current and past employment and are a safe way to share this information with prospective employers and other entities who need it.

Issuer — Employers, job search sites, payroll providers, and startups like GoodWorker that verify a user’s work experience.
Holder — An individual who claims a certain work experience.
Verifier — Prospective employers who need this information to decide if the applicant is a right fit for the job.

VC Use-Case#2: Protecting your Driver’s License

Let’s take a peek into how you can apply, share, and safeguard your driver’s license through verifiable credentials.

Issuer — A startup company that issues a standard, interoperable driver’s license Verifiable Credential (VC) after validating the government-issued driver’s license.
Holder — An individual owning this interoperable driver’s license VC.
Verifier — A government authority or any other entity that wants to verify whether the license is valid and if it belongs to the holder.

VC Use-Case#3: Approving Loans

The cumbersome process of verifying and approving loan applications can be greatly simplified with VCs.

Issuer — A startup fintech company that checks the credit score and financial background of the applicant and, accordingly, issues a VC.
Holder — An individual applying for a loan.
Verifier — A financial company that verifies the holder’s background to decide whether the loan should be approved and the associated interest rate.

VC Use-Case#4: Verifying Educational Qualifications

Some educational institutions and even employers require applicants to possess certain educational qualifications such as a major in certain fields or a predefined qualifying percentage to become eligible for admissions into postgraduate or managerial courses.

Issuer — An educational institution or a startup that checks the educational background of the holder and issues a VC accordingly.
Holder — An individual claiming to have a certain educational qualification or percentage.
Verifier — An educational institution that checks if the holder meets the criteria for a course/degree.

VC Use-Case#5: Checking Eligibility for Vehicle Rentals

Many vehicle rental companies charge an hourly rental fee based on the creditworthiness of the renter, past rental history, and rate of accidents and offenses.

Issuer — A startup company that collates data about the holder’s rental history, rate of road accidents and offenses, and creditworthiness, and issues a VC that encompasses these details.
Holder — An individual wanting to rent a vehicle.
Verifier — The vehicle renting company.

VC Use-Case#6: Accessing Medical Records

When medical records are in an easily shareable format, they can come in handy during emergencies.

Issuer — A healthcare or startup company that collects all the medical records of an individual across different medical institutions and issues a VC containing all the necessary information.
Holder — An individual holding the medical record.
Verifier — A healthcare provider.

VC Use-Case#7: Eliminating Frauds in Elections

Voter fraud is rampant where people impersonate others or falsify their age to cast a vote. VCs have the potential to eliminate this fraud.

Issuer — A startup company or a government entity like Digilocker that issues a VC containing an individual’s date of birth, identifying birthmark, age, nationality, visual proof like a tamper-proof photo, and a unique government ID.
Holder — An individual who wishes to vote.
Verifier — The election commission or election officials who verify the holder’s eligibility to vote.

VC Use-Case#8: Examining Visa Validity

The airport authorities examine the validity of a visa before allowing a traveler to enter a country, and VCs can streamline the process.

Issuer — The embassy of a country or any other government entity that is responsible for issuing visas to travelers.
Holder — A traveler.
Verifier — The airport authorities, who check the credentials of the traveler and the validity of the visa.

VC Use-Case#9: Inspecting Tickets in Trains/Buses

Ticket inspectors check the validity of travelers’ tickets to ensure that they have paid for the travel and there is no fraud involved. These tickets can be digital passes or VCs, so the user doesn’t have to carry a physical ticket or produce any PII.

Issuer — The transportation company that is responsible for issuing tickets, like the Indian Railways, that establishes the validity/date of travel, name of the traveler, seat number, age, and a visual identification like a photo.
Holder — A traveler.
Verifier — Ticket inspectors who want to ensure that the right traveler is occupying the right seat and the ticket is valid for that trip.

VC Use-Case#10: Setting up Health Checks

VCs can come in handy to allow patients to take up the scheduled health checkups with any healthcare provider. These credentials can tell any healthcare provider the current health status of the patients and the tests that are due, so the entire experience is seamless.

Issuer — A startup company that consolidates the healthcare information, test schedules, and medical allergies and dependencies of individuals.
Holder — A patient who has to take tests periodically.
Verifier — The healthcare provider offering the tests.

VC Use-Case#11: Generating Boarding Pass

Airlines that generate boarding passes for travelers can use VCs to verify the identity of the traveler.

Issuer — The airline company or its travel partner sites that issued the ticket.
Holder — A traveler.
Verifier — The system or individual at the airport responsible for issuing a boarding pass after validating the name of the passenger, flight number, and date of travel.

VC Use-Case#12: COVID Tests for Safe Travel

COVID-19 tests have become mandatory for travelers to arrest the spread of the pandemic, and VCs can be used to safely examine these test results.

Issuer — A startup that consolidates the identity of the holder, his or her COVID test results, and the date of the test.
Holder — A traveler who has tested for COVID recently.
Verifier — The system or individual at the terminal responsible for ensuring that only passengers with a negative COVID test result can board the airplane/train/bus.

VC Use-Case#13: Application for Utility Connections

People moving to a new city/home have to share their PII including address and unique ID number to get a new utility connection or change the name on the existing one. VCs can come in handy for this.

Issuer — A startup that checks the rental/sale agreement to ensure it is in the name of the holder, along with the holder’s PII and date of move-in, and issues a VC containing a reference to these details.
Holder — An entity that has moved to a new place.
Verifier — The utility company.

VC Use-Case#14: Age-related Services

Sometimes, an individual has to be over 18 years of age to avail of certain services like sky-diving or entry to a pub, and VCs are a convenient way to check this criterion.

Issuer — A startup that checks the holder’s identity and age and issues a VC that validates that the holder is over 18 years of age.
Holder — An individual claiming that he or she is over 18 years of age.
Verifier — An adventure company, pub, or just about any service company that offers its services strictly to those over 18.

VC Use-Case#15: Application for Credit Cards

The process of issuing credit cards requires an extensive check of the applicant’s creditworthiness and financial background. Many times, the rate of interest depends on the credit score of the holder.

Issuer — A startup or a credit bureau service that collates an individual’s credit history and score.
Holder — An applicant who wishes to own a credit card.
Verifier — The credit card company.

VC Use-Case#16: Background Checks

Many companies require their prospective employees to go through a mandatory background check before they join the company, and VCs can streamline this process.

Issuer — A startup or an HR company that handles background verification of prospective employees.
Holder — A prospective employee.
Verifier — The company that has recruited the holder.

VC Use-Case#17: Rental Passports

Renters would want to know the rental history, credit score, income details, and other pertinent information before renting a place to a tenant, and VCs are a convenient way to share this information.

Issuer — A startup that issues a VC after consolidating all the required information for rental verification.
Holder — A prospective tenant.
Verifier — A renter.

VC Use-Case#18: Access to Premium Services

The creditworthiness of an individual is verified before giving him/her access to premium services like credit clubs.

Issuer — A startup or a credit bureau that checks the credit score and financial background to generate a VC.
Holder — An individual applying for these premium services.
Verifier — A company offering these services.

VC Use-Case#19: Supermarket Shopping

Supermarket shopping data can be credentialed so that customers can post-pay their supermarket bills monthly.

Issuer — A startup that collates the shopping bills of a customer.
Holder — A customer of a supermarket.
Verifier — The supermarket.

VC Use-Case#20: Computing the Insurance Premiums

Insurance companies compute the cost of premiums based on the creditworthiness of an individual, past behavior like accidents or claims, missed payments, and more.

Issuer — A startup that collates all this information.
Holder — An entity paying insurance premiums.
Verifier — Insurance companies that calculate the cost of insurance premiums.

VC Use-Case#21: Using Death Certificates

Death certificates are necessary to prove that an individual is deceased and to help the legal heirs to access the wealth that’s due to them.

Issuer — A startup or a government agency that checks the death certificate and determines the legal heirs of the deceased.
Holder — A deceased individual or the legal heirs of the deceased.
Verifier — Insurance, financial, and wealth management companies.

VC Use-Case#22: Making Disability Claims

Workers who were injured while carrying out the normal activities of their job are eligible for disability claims, but over the years there has been a lot of litigation in this area. VCs can bring down this processing time.

Issuer — A startup company that gathers details of the nature of an individual’s accident at work, hospital records that verify the same, the date of the accident, and the identity of the individual.
Holder — An injured worker who is making the disability claim.
Verifier — The employer of the injured worker, insurance companies, and lawyers.

VC Use-Case#23: Driving Offences

Drivers who commit repeat offenses are levied a heavy penalty while first-time offenders are allowed to get away with minor punishments or sometimes just a warning, depending on the nature of the road offense.

Issuer — A startup company that brings together an individual’s driver’s license, past driving offenses, tickets and penalties, date of occurrence, and other relevant information.
Holder — A driver.
Verifier — Police, the Department of Motor Vehicles, and any other government or legal entity responsible for handling driving offenses.

VC Use-Case#24: Loyalty Programs

VCs can make it easy to track an individual’s loyalty program memberships, points earned, balance available, eligibility of the individual, and more.

Issuer — A startup company that collates the loyalty points of an individual across a specific organization and ties them to the identity of the individual to eliminate fraud.
Holder — An individual who is a part of a loyalty program.
Verifier — A store, service, or partner company where the loyalty points can be used.

VC Use-Case#25: Selling/Buying a Property

Buyers and sellers don’t necessarily have to know each other and VCs can help to create mutual trust without revealing personal information.

Issuer — A startup or a real estate company that creates VCs for both buyers and sellers. Information for the buyer would include whether that entity owns the property in question and is at the location where it is claimed to be. In the case of a seller, the VCs can give information about the credibility of the seller and possible financial ability.
Holder — A prospective buyer/seller.
Verifier — A buyer or seller. It can also be a real estate agent or company.

We hope these real-world use cases give you an idea of the vast possibilities of VCs.

If you’d like to give any of these use-cases a shot or want to come up with something new, this is the right time. Start building your ideas with Affinidi’s resources and be a part of the PoCathon to win some cool prizes.

25 Real-world Use Cases of Verifiable Credentials was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


My Life Digital

Will GDPR affect advancements in Artificial Intelligence?


#ThrowbackThursday

Attending events was a good way to get the discussion around data privacy and ethics started. This one was particularly interesting when you consider Artificial Intelligence and the possibility for it to be a black box.  

Will GDPR affect advancements in Artificial Intelligence? 

The short answer is, yes! But that doesn’t make it a bad thing. 

Innovation in Artificial Intelligence (AI) is one of the greatest achievements in the modern digital age. It is the ability for computers and machines to perform human-like activities such as learning, problem solving and decision making.  


The potential of AI is something of great excitement to futurists and transhumanist believers, who predict that AI could be billions of times smarter than humans, with the possibility of individuals needing to merge with computers to survive. 
 

Popular technology integrated with AI includes wearable devices, for example an Apple Watch that can monitor your physical activity and certain health attributes. Such devices have a private benefit to the individual using them, but they also create a wide array of data sets that contribute to the Internet of Things.
 

Organisations can benefit from access to this data to continue innovating technologies and solutions through AI, or by using the data to produce sophisticated insights for both the consumer and the organisation. When put to good use, the potential these technologies create for society is enormous. However, when organisations take advantage and exploit the wide array of personal data created by such technologies, we could see the current global trust crisis deepen.


A colleague attended an event that discussed the potential of AI applications to be further integrated within the NHS. There, concerns were raised by several AI developers that GDPR could stifle innovation within the field. This was immediately challenged by the “GDPR experts” in the room. Their view was that GDPR is necessary to protect the collection, use and sharing of personal data. And those who worked within the NHS believed GDPR would actually reduce the number of opt-outs, stating that if citizens had greater control over how their personal data was used, for example in the development of applications to improve their healthcare management, they would be more likely to give permission for this purpose.
 

Another point raised was that fully anonymised data is not covered by GDPR. AI developers can continue to freely use anonymised data to develop AI applications. Those creating AI solutions that will benefit the citizen or the wider public could consider claiming either legitimate interest or public interest as the legal justification for collecting, using or sharing personal data. However, to use this compliantly would require transparency, and an organisation would need to demonstrate that it has considered the citizen’s rights. Organisations that do this and truly deliver value from using citizens’ personal data can genuinely build stronger trust between the organisation and citizens.
 

GDPR will ensure that organisations, whether using personal data to develop AI or capturing personal data through applications run by AI algorithms, are not crossing the moral line and exploiting citizens’ personal data. GDPR should not stifle innovation within this sector if data is used ethically and the organisation is fully transparent about the purposes of using individuals’ personal data.

The post Will GDPR affect advancements in Artificial Intelligence? appeared first on MyLife Digital.

Wednesday, 14. April 2021

Authenteq

Simple, straightforward ID verification with the click of a link

The post Simple, straightforward ID verification with the click of a link appeared first on Authenteq.

Anonym

Google Pushes Privacy, But Can a Leopard Really Change Its Spots?


At first glance it looks like Google is turning over a new leaf. The tech giant has announced it will no longer track third party cookies on Chrome and it won’t help build another way of tracking users’ browsing history for targeted advertising.  

Instead of third party cookies, Google wants to categorize users into groups known as Federated Learning of Cohorts (FLoCs), based on consumer type and habits, interests and preferences, and then let advertisers target those groups or FLoCs instead of individuals. It’s the centrepiece of Google’s new “Privacy Sandbox”. 
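As an illustration of how this kind of cohort assignment can work, here is a rough Python sketch in the spirit of FLoC, which derived cohort IDs from a locality-sensitive hash (SimHash) over browsing history. The hash width, domain encoding, and example inputs below are invented for illustration and are not Chrome’s actual parameters.

```python
import hashlib

def simhash(domains: list, bits: int = 16) -> int:
    """Derive a cohort-style ID: similar browsing histories tend to
    produce similar bit patterns, so similar users share cohorts."""
    counts = [0] * bits
    for d in domains:
        # Hash each visited domain to a 64-bit value.
        h = int.from_bytes(hashlib.sha256(d.encode()).digest()[:8], "big")
        # Vote each bit position up or down based on the domain hash.
        for i in range(bits):
            counts[i] += 1 if (h >> i) & 1 else -1
    # The sign of each vote tally becomes one bit of the cohort ID.
    return sum(1 << i for i in range(bits) if counts[i] > 0)

# Two users with overlapping histories; advertisers would target the
# resulting cohort IDs rather than the individuals.
alice = simhash(["news.example", "cooking.example", "shoes.example"])
bob = simhash(["news.example", "cooking.example", "travel.example"])
```

The design choice is that the cohort ID is a lossy summary: many users map to each ID, which is what lets Google claim the scheme targets groups rather than people.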

On the face of it, Google is prioritizing privacy and trying to claw back trust, while throwing the ad industry, which relies on data amassed via Google, under the proverbial bus. 

But can a leopard really change its spots? 

No.  

Google knows consumers are fed up with the data economy, and regulators are currently investigating three separate antitrust allegations against it. It certainly needs some positive optics. But this new play for privacy is nothing but a smokescreen concealing the fact the tech giant will remain largely unaffected by the new rules. As they note over at Recode: “A third party cookie ban won’t hurt the search giant’s healthy first party data ad business.” 

The biggest tell is that its FLoC proposal does not apply to mobile phones that use Google’s Android operating system, where Google maintains multiple identifiers for each user and gives those identifiers to advertisers. Since globally 71.93 percent of mobiles run Android and mobile internet use is now at 54.46 percent and growing, excluding Android mobile phones from the new setup drastically dilutes Google’s professed desire to abandon ad tracking. Of course, one bit of good news for Apple iPhone users is the iOS 14 update stops cross-app tracking, which will limit Google’s influence on their devices.

What’s more, Google will continue to let advertisers target users via the email addresses it stores, and it will continue to target ads using first party data from its own billions of users on YouTube, Gmail and Chrome. Remember, Chrome has 65 percent market share worldwide, and most of Google’s revenue comes from ads on Google Search.

As Ken Glueck, Executive VP of Oracle, points out in his article: “With dominance over both the browser and the mobile OS, Google no longer needs cookies.”

Further, the FLoC groupings are so dynamic and defining they will eventually reveal the individuals within them. Individuals will be tracked and have their data assigned to so many different FLoCs over time that eventually a simple generated list will distil down to reveal the actual individual. Same, same but different, right? 
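The re-identification concern above can be sketched in a few lines of Python: a party that records which cohort a browser reports in successive weeks can intersect those cohorts’ member sets, and the candidate pool collapses quickly. The cohort memberships below are made-up illustrative data.

```python
# Hypothetical member sets of the cohorts one browser reported,
# recorded by a tracker over three successive weeks.
weekly_cohorts = [
    {"alice", "bob", "carol", "dave"},  # week 1: users sharing cohort 17
    {"alice", "bob", "erin"},           # week 2: users sharing cohort 42
    {"alice", "frank"},                 # week 3: users sharing cohort 9
]

# Only users present in every week's cohort remain candidates.
candidates = set.intersection(*weekly_cohorts)
print(candidates)  # {'alice'}: three weeks of cohort IDs single her out
```

With thousands of users per real cohort the collapse takes more observations, but the mechanism is the same: each new cohort ID is another constraint on who the user can be.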

We like how Ken Glueck sums up Google’s smokescreen attempt at a privacy-first future: 

“What Google doesn’t really say is that effectively none of Google’s own privacy invasive practices are changing. Chrome will still monitor every web site and action a logged-in consumer takes on the web. Android will still collect your precise geolocation, your movements, and your app usage, while surreptitiously mapping every Wi-Fi base station and Bluetooth beacon on the planet. Google search will still catalogue every desire and query no matter how intimate, while the array of Google’s own first-party analytic and advertising cookies will collect more data than the now banned third-party cookies ever would have.  

The FLoCers must be ROFLing (I know, very uncool) all over Mountain View because what they have just done—unilaterally—is wiped out the competition for consumer data and any semblance of competition in online advertising, without actually enhancing privacy.  

Google’s sandbox is little more than an attempt at using privacy as a pretext to solidify its dominance.” 

The bottom line for us is that users still need to be proactive in protecting their own personal data, using Sudo profiles and all the privacy tools in MySudo, like private browsing that blocks ads and tracker cookies by default, and alternative private email addresses and phone numbers.

If you want to go deeper into the Google issue, Ken Glueck’s article is really worth a read.

The post Google Pushes Privacy, But Can a Leopard Really Change Its Spots? appeared first on Anonyome Labs.


Coinfirm

Enhanced Due Diligence: Cointelligence Fund Deploys Coinfirm’s AML Platform

14 April, LONDON, UK – Coinfirm, the industry-leading blockchain RegTech firm, and Cointelligence Fund, a digital asset management firm, have announced their collaboration to further regulatory compliance, apply effective risk management, and employ superior due diligence to crypto assets through the provision of the AML Platform. Through Coinfirm’s AML Platform, Cointelligence Fund will be able...

auth0

MFA With WebAuthn for FIDO Device Biometrics Now Available

WebAuthn Authenticators offer a streamlined user experience and enhanced security

IBM Blockchain

The rising NFT tide lifts all tokens: So what is an NFT?


How can businesses absorb disruptive impact and begin to integrate these new business models in existing and new streams of business opportunity? NFTs have taken the world by storm. Rejuvenating the blockchain movement started by Bitcoin, followed by smart contract platform Ethereum, NFTs seem to be a natural progression in the explosion of asset tokenization, […]

The post The rising NFT tide lifts all tokens: So what is an NFT? appeared first on Blockchain Pulse: IBM Blockchain Blog.


auth0

Auth0 Joins the Amazon Web Services Public Sector Partner Program

Global identity management provider recognized for commitment to government, education, and nonprofit customer success

KuppingerCole

Informatica is Moving Data Management to the Cloud


by Martin Kuppinger

Introducing Intelligent Data Management Cloud (IDMC) as a Comprehensive, Cloud-Native, and Cloud-First Approach to Data Management

Data is the new gold, the new oil, or whatever. In the cloud, you only own the data, but not the applications, systems, or networks anymore. Data is essential for Digital Transformation. There is so much data sprawl that it is hard to keep control of it. And, within SaaS and PaaS tenants, data is what belongs to the tenant. Many organizations fail to manage this data sprawl and to implement appropriate Data Governance.

Data Is Ubiquitous: Supporting the Reality of Multi-Cloud and Hybrid Environments

Data is everywhere. It resides in many different places. Protecting, governing, managing, and, last but not least, using data requires the ability to have comprehensive management across the entire breadth of tools, data lakes, databases, etc. involved. Informatica uses the claim of “cloud-first, multi-hybrid” for describing this scenario and their new solution, IDMC, the Informatica Intelligent Data Management Cloud. It is about supporting the cloud-first strategies, but also the fact that most organizations face both multi-cloud and hybrid environments. It is about having solutions that help to get a grip on data, regardless of where it resides, and about leveraging the potential and value of that data for the sake of the business.

Informatica IDMC is the successor of Informatica IDP (Intelligent Data Platform), delivered via the IICS (Informatica Intelligent Cloud Services). It is a cloud-native implementation of a very wide range of services for handling virtually any aspect of the broader Data Management field, from Data Discovery to delivering a 360-degree view of the data. Informatica has integrated more than 250 services into this new offering.

The Target: One Cloud to Manage all Data, Across all Clouds and Beyond

Informatica positions IDMC as one of the central clouds, alongside Infrastructure Clouds (such as AWS, Azure, GCP, and others), Application Clouds with their platform approach (such as Salesforce, ServiceNow, Oracle, SAP, and others), and other types of clouds delivering the foundation for critical business and IT capabilities.

Informatica places IDMC in the center, like a spider in the web – the cloud that helps to manage the data from all other clouds. Considering the essential role of data for the success of businesses, this is a valid perspective.

The Need for Data Management and Governance

There is, without any doubt, a need for reclaiming control of corporate data. Most organizations today don’t know (exactly) which data they have and where it resides. This is a challenge for both utilizing the data (“the new gold”) and for compliance and data security. Thus, an approach that helps in getting a grip on data is essential.

With the IDMC, Informatica positions itself as the one-stop-shop solution that delivers all the services around Data Management from a single source. And Informatica delivers these solutions in an integrated, cloud-first platform built on a modern microservices-based architecture. That makes Informatica IDMC a product that raises the bar in the broader Data Management market and is worth further analysis. While such complex, integrated solutions also come with their own challenges (does someone really need more than 250 different services?), IDMC’s modular approach allows customers to pick the parts they want to focus on first.

Independent of specific products, organizations must invest in the field of Data Management and Data Governance to leverage the value of the data they own and to mitigate the risk of failing in data management and security. Informatica’s IDMC is of specific interest for large organizations looking for a centralized, comprehensive solution covering a range of deployment models and integrations to existing data.


Ontology

DeFi Needs Reliable Credit Scoring System, but We Must Be Cautious

A byline by our Founder, Li Jun

Originally published in BeinCrypto.

Credit scores have been a cornerstone of risk evaluation in the global banking systems for decades.

They allow banks to determine who qualifies for a loan, at what interest rate, and what credit limits. Lenders use these scores to evaluate the likelihood that the borrower will fulfill their obligations and repay their loan.

The global lending and payments market reached $6.7 trillion in 2020. It will possibly reach $7.6 trillion in 2021.

If you’ve ever applied for a bank loan, you’ll know that these traditional credit scoring systems and in-depth identity verification processes work together. These include proof of address and a copy of a passport or official identification.

Now, Europe’s Open Banking initiative, PSD2, is set to bring credit scoring into the 21st century. It will make it possible for lenders and borrowers to access a full picture of an individual’s financial history in real-time.

Its introduction into financial services systems will revolutionize the loan process. It will increase speed, accuracy, and more importantly, financial inclusion.

DeFi lending still in its early stages

In comparison to this highly sophisticated system, lending and borrowing in the DeFi industry is still in its nascency.

However, as we know, over the past 12 months it’s grown at an incredible rate. Total Value Locked (TVL) in DeFi as of March 2021 stands at $39.7 billion, according to DeFi Pulse.

What’s more, it is lending that makes up the largest segment of that market. The DeFi lending market sits at $17.8 billion. Decentralized exchanges follow closely behind at $15.6 billion.

What the DeFi industry has been missing is a credit scoring system that provides a full picture of an individual’s varied crypto assets across different wallets and chains.

To increase trust and reputation when it comes to lending and borrowing through DeFi, we need a system that supports cross-chain interaction and verifiable credentials.

By connecting user identities with personal accounts, users can bind their digital assets and contact addresses making it easy for the correct due diligence to take place.

Crypto credit scores will allow lenders to assess a borrower’s eligibility. They will also help borrowers avoid over-collateralization when looking to borrow assets. Borrowers will be able to put their positive credit scores to use and access more rewarding opportunities.

A picture of assets needed for DeFi and traditional merger

As the DeFi industry progresses to merge with traditional financial systems, there will need to be an evaluation of on-chain and off-chain assets.

To create a trusted merger between these two worlds, a full picture of traditional and digital holdings and history needs to be made available.

This will further bolster the benefits of legislation like Europe’s PSD2. It will provide a more rounded, integrated look at asset holdings and histories, including crypto assets, in real-time.

However, as we’ve seen with any new technology that deals with highly sensitive data, we must be cautious.

Privacy and security must be put first

Any decentralized credit scoring system applied to DeFi lending and borrowing needs to put user privacy and security first. We cannot ask individuals to give up their data sovereignty in exchange for a well-working DeFi lending system.

Decentralized digital identity systems can help immensely here. By coupling decentralized credit scoring with a decentralized digital identity system, no one party will hold full control over an individual’s financial data.

The buck will stop with the individual. The World Economic Forum has been promoting these kinds of digital identity solutions for a long time. In addition, the UK Government endorsed their use universally as the cornerstone of future economies.

If DeFi is serious about going mainstream and further nurturing relationships with institutional players, a reliable means for evaluating risk in a timely, accurate manner while permitting the same level of due diligence is integral.

In addition, if the DeFi industry wants to win the trust of mainstream finance, it’s imperative that it stays humble to its users. It must also avoid the mistakes many disruptors in big tech have made in recent years.

Want more Ontology?

You can find more details on our website for all of our decentralized solutions across identity and data, or keep up with us on Twitter. Our Telegram is for discussion, while our Telegram Announcement channel carries news and updates in case you missed them on Twitter!

DeFi Needs Reliable Credit Scoring System, but We Must Be Cautious was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


UNISOT

How Product Provenance could benefit your company


Improving Supply Chain Management is key for the efficiency of resources, economics and time. We enhance all the advantages of ERP systems and IoT solutions, using the scalable, immutable and secure benefits of a global data ledger.

The Product Provenance module will enable you to create a digital representation of each item or batch in the supply chain that follows its whereabouts throughout its complete lifecycle; we call that a Digital Twin, and a smart one.

Now you can securely add powerful and immutable data and thus create your own customized Library. You choose which data you want to share, and who can access this data.

THE FOLLOWING TYPES OF DATA CAN BE ADDED:

Temperature
Volume
Weight
Dimensions
Location
Production methods
Energy consumption
Water consumption
CO2
Certificates
Files
Images
Videos

When moving items from one stakeholder to another, it is necessary to be able to transfer ownership of data. This feature is built into our system and will ensure you know exactly when and which items have left your responsibility and been handed over to the next actor in the global supply chain. All selected information and documents will automatically and securely be transferred to the new owner.

Now we come to the split phase. We told you it was a Smart Digital Twin, let’s elaborate on that. Digital Twins can be split into several Child Twins; they each inherit all important information and documentation from their Parent. Parent items can be split and packaged into thousands of separate items and continue their way in the supply chain, each accompanied by all relevant data.

When split, then merge. You can create new Twins by merging 2 or more Digital Twins. When you add other ingredients to a Twin you merge and create a new product that holds all the attached data and information of the parent and becomes a new Smart Digital Twin.

All the history and data for each of these components is now available for you to prove, view and analyze. We provide a user-friendly way for you to view the complete Product DNA Tree of the raw materials of this specific product.
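The split, merge, and Product DNA Tree behavior described above can be modeled in a few lines of Python. This is a hypothetical, illustrative sketch only, not UNISOT’s actual API; all class, field, and product names are invented:

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Illustrative model of a Smart Digital Twin (not UNISOT's actual API):
# Child Twins inherit the Parent's data on split, merges combine the
# data of all inputs, and the Product DNA Tree can be walked back
# through every ancestor.
@dataclass
class DigitalTwin:
    name: str
    data: dict                       # temperature, certificates, images, ...
    parents: list = field(default_factory=list)

    def split(self, n: int) -> list[DigitalTwin]:
        """Split into n Child Twins, each inheriting all parent data."""
        return [DigitalTwin(f"{self.name}/{i}", dict(self.data), [self])
                for i in range(n)]

    @staticmethod
    def merge(name: str, *twins: DigitalTwin) -> DigitalTwin:
        """Merge Twins (e.g. adding ingredients) into a new product."""
        combined = {}
        for twin in twins:
            combined.update(twin.data)
        return DigitalTwin(name, combined, list(twins))

    def dna_tree(self) -> list[str]:
        """Names of this Twin and every ancestor in its lineage."""
        names = [self.name]
        for parent in self.parents:
            names.extend(parent.dna_tree())
        return names

batch = DigitalTwin("salmon-batch-42", {"origin": "Norway", "temp": "2C"})
fillets = batch.split(1000)               # children keep full provenance
meal = DigitalTwin.merge("ready-meal-1", fillets[0],
                         DigitalTwin("sauce-7", {"allergens": "dairy"}))
assert meal.data["origin"] == "Norway"    # inherited through the merge
assert "salmon-batch-42" in meal.dna_tree()
```

The key design point the sketch illustrates is that provenance data travels with every Child Twin automatically, so no step in the supply chain has to re-collect it.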

You are now ready to share this in-depth information either as a private secure link, or as a public link for anyone to see. The public link can be scanned through a printed QR code or via an RFID that is incorporated in a product. When your customers or consumers scan the QR code or the RFID with their smartphone, you can show and prove the detailed history about the product they choose to purchase.

Another feature we added to this module is the ability to chat and communicate directly between all actors involved. No more excessive emails or text messages, not to mention the delays, costs and confusion they are causing. You can now exchange important queries, updates and urgent matters regarding specific assets, products or items all the way through a product’s life cycle. Imagine how easy and efficient this auditable trail will be for all actors involved.

The Product Provenance module can be used via a desktop or mobile application. Management can direct asset flows, manage ownership transfers and receive instant KPIs. Field personnel can easily and securely add and report important information that cannot be collected automatically via IoT sensors or API integrations.

Imagine what this could mean to you, as the creator and owner of this collected information, when you choose whether to sell this information, and at what price.

Let’s connect and set up a time for a personalized demo, where we can show why and how easily we can improve the supply chain management for your company.

Annemie Bergmans
Marketing Manager UNISOT

 

The post How Product Provenance could benefit your company appeared first on UNISOT.


KuppingerCole

Microsoft Acquires Nuance to Drive AI-Based Workplace Innovation


by Warwick Ashford

Cloud, AI software, and voice recognition firm Nuance is Microsoft’s latest strategic acquisition aimed at putting the tech giant in a strong position to shape healthcare and workplace applications of the future. Microsoft has agreed to acquire Nuance for around $19bn and the deal is expected to close before the end of 2021.

First Focus on Healthcare

The acquisition has the potential to put Microsoft at the forefront of workplace innovation due to Nuance’s AI software expertise focusing on voice recognition, stemming from core algorithms developed in the early 1980s by the founders of Dragon Systems.

It comes as no surprise that Microsoft plans to use the acquisition to deepen its support for the healthcare industry, given Nuance’s experience in this sector and the fact that Nuance’s Dragon speech to text systems have specialized in healthcare to cater for the scope and accuracy needed to cope with medical terminology. Nuance's clinical speech recognition offerings include the Dragon Ambient eXperience, Dragon Medical One, and PowerScribe One for radiology reporting.

Microsoft introduced Microsoft Cloud for Healthcare in 2020 as part of its industry-specific cloud strategy, and the acquisition of Nuance means that it can now add voice recognition technologies to support the growing electronic healthcare records (EHR) market.

The healthcare sector is ripe for the application of AI technologies because of its position in a complex landscape of highly innovative research on patient care and treatments, continually streamlined business operations, social and economic pressures, and current events.

AI in Healthcare

In recent research, fellow KuppingerCole analyst Anne Bailey notes that AI-based products can be found in four categories: Community Health Initiatives, Image Classification-Assisted Diagnosis, AI-Powered Analytics for Improving Clinical Workflows, and AI-Powered Analytics for Hospital Management.

“AI can be highly suitable in products that improve patient care and improve hospital workflows and can be a strong data processing tool to leverage data-driven insights across the sector,” she says.

The research also notes that trends in the healthcare industry that include precision healthcare, AI-powered diagnostics, and data interoperability will be disruptive to both patient care and to business operations.

Use cases for artificial intelligence in the healthcare industry include chatbots, assistance for adherence to treatment plans, image-classification for diagnosis, AI-powered analytics for improving hospital workflows, and voice recognition, of course, for improved patient experience and security.

According to Microsoft, the acquisition is expected to double its total addressable market (TAM) in healthcare, taking the company’s healthcare TAM to nearly $500bn.

Beyond Healthcare

The acquisition, however, has a much wider focus than healthcare alone, particularly in terms of combining cloud, voice recognition, and other AI technologies. Microsoft undoubtedly sees the potential of expanding Nuance’s capabilities in healthcare to other industry sectors.

Microsoft recognizes Nuance as a pioneer in the real-world application of enterprise AI, and plans to integrate Nuance voice recognition software with its enterprise and collaboration software, which has the potential to change the way people work.

Microsoft also has plans to integrate Nuance with its AI technologies, which again has the potential of changing the way people work by enabling conversations to be converted into machine-readable text, which can then be analyzed using AI, giving healthcare providers and other professionals more time to focus on more important things.

Microsoft also sees the potential for changing the way people work by combining Nuance’s interactive voice response (IVR) software with Microsoft cloud. This can be applied across any industry to augment Dynamics 365 enterprise software and Teams collaboration with new AI capabilities and voice biometrics features to improve security and reduce fraud.

For more information about adopting emerging technologies, see related KuppingerCole research focusing on the utilities and energy, and the finance industries.

Conversational AI

Microsoft has long recognized the importance of voice recognition, investing in speech recognition research since the early 2000s, later delivering its own Cortana voice assistant, and acquiring conversational AI startup Semantic Machines in 2018, but the acquisition of Nuance will take its voice recognition capabilities to an even higher level.

From its origins at Dragon Systems, Nuance has been a voice recognition expert. What sets it apart is the combination of mature voice recognition technology with conversational AI, which is very valuable with enormous potential for future enterprise applications. Microsoft hopes to capitalize on Nuance’s long legacy in conversational AI to compete with the likes of Amazon, Apple, IBM, and others developing consumer and business applications based on the technology.

Conversational AI technology is expected to grow, and could potentially revolutionize the way humans interact with technology and hence fundamentally alter the way they work. Conversational AI is yet another example of how AI technologies are able to augment human capabilities to boost efficiency, increase productivity, and free up time for more important and strategic activities.

For more information on Conversational AI, see this Market Compass on Conversational AI Building Platforms.

Given that some conversational AI technologies are already available in the market and the likelihood of a lot more appearing in the wake of Microsoft’s acquisition of Nuance, organizations should start thinking about how these technologies could best be used to benefit their businesses. As with all emerging technologies, the best and most logical place to start is to identify the most beneficial use cases for your business and work from there.


Transmute TechTalk

Takeaways from the Suez Canal Crisis

An Appeal for Supply Chain Agility — Powered by Verifiable Credentials Ever Given — Wikimedia Commons

The Suez Canal debacle had a massive impact on global supply chains — estimated at >$9B in financial hits each day the Ever Given was stuck, totaling nearly $54B in losses in stalled cargo shipments alone. And it’s no secret that the canal, which sees nearly 12% of global trade move through it annually, dealt an especially brutal blow to the oil and gas industry while blocked (given it represents the primary shipping channel for nearly 10% of oil and 8% of natural gas).

While the Ever Given itself was a container ship, likely loaded with finished goods versus raw materials or commodities, the situation has already — and will continue to — have a massive negative impact on totally unrelated industries…for months to come. Here’s an example of the resulting impact on steel and aluminum prices; this had related impact again to oil and gas (steel pipes flow oil) as well as infrastructure and…finished goods (like cars). And the costs continue to climb as the drama continues with port authorities and insurers battling over what’s owed to who.

Transmute is a software company — a verifiable credentials as a service company to be exact — and we’ve been focused specifically on the credentials involved in moving steel assets around the globe alongside our customers at DHS SVIP and CBP for the last couple of years now. Now, there’s no “silver bullet” for mitigating the fiscal impact of the Ever Given on global trade, and ships that arrived the day it got stuck or shortly after certainly faced a tough decision — sail around the Cape of Africa for up to ~$800K [fuel costs alone] plus ~26 days added to the trip, or wait it out at up to $30K per day in demurrage expense [without knowing it’d only be stuck for 6 days, or ~$180,000].
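The trade-off in those figures is easy to make explicit. A toy sketch, using only the rough numbers quoted above:

```python
# Toy decision sketch using only the rough figures quoted above:
# rerouting around the Cape costs ~$800K in fuel alone, while waiting
# at the canal accrues up to ~$30K per day in demurrage.
REROUTE_FUEL_COST = 800_000
DEMURRAGE_PER_DAY = 30_000

def cheaper_option(expected_delay_days: float) -> str:
    """Return the cheaper option for an expected blockage length."""
    wait_cost = DEMURRAGE_PER_DAY * expected_delay_days
    return "wait" if wait_cost <= REROUTE_FUEL_COST else "reroute"

# The ship was freed after 6 days, so waiting (~$180K) was the right
# call; rerouting only breaks even past ~26 days of expected delay.
assert cheaper_option(6) == "wait"
assert cheaper_option(30) == "reroute"
```

Of course, the arithmetic is the easy part; the hard part, and the point of what follows, is having trustworthy data with which to estimate the delay at all.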

So what if you’re a shipping manager and you can make this decision faster? Or, make the call before your ship arrives at the canal? [Some did make this decision, by the way]. What if your goods are stuck on the Ever Given — do you wait it out? Switching suppliers is costly, and you’ve likely got existing contracts in place for much of the cargo. Even if you could fulfill existing contracts and demand on time with a new supplier, what do you do with the delayed cargo expense? What if you’re unsure whether you can sell the duplicate and delayed goods when they reach their originally intended destination?

Well, verifiable credentials — a special kind of digital document that’s cryptographically provable, timestamped and anchored to an immutable ledger at the very moment in time it’s created — can give companies the kind of data needed to make these sorts of decisions. With use over time for trade data, verifiable credentials build a natural reputation for all the things the trade documents are about: suppliers, products, contracts, ports, regulations, tariffs, time between supply chain handoff points, etc.
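For readers unfamiliar with the format: a credential following the W3C Verifiable Credentials data model is a JSON document whose claims are bound to a cryptographic proof. A minimal sketch, with every identifier and value invented for illustration:

```python
# Hypothetical, minimal shape of a W3C Verifiable Credential for a
# trade document. All identifiers and values below are invented for
# illustration; a real credential carries a computed signature.
bill_of_lading_vc = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "BillOfLadingCredential"],
    "issuer": "did:example:carrier123",       # hypothetical issuer DID
    "issuanceDate": "2021-03-23T07:00:00Z",   # timestamped at creation
    "credentialSubject": {
        "id": "did:example:shipment456",      # hypothetical subject DID
        "commodity": "steel pipe",
        "originPort": "Port of Shanghai",
    },
    "proof": {                                # makes the claims provable
        "type": "Ed25519Signature2018",
        "verificationMethod": "did:example:carrier123#key-1",
        "jws": "...",                         # signature elided
    },
}

assert "proof" in bill_of_lading_vc
```

Because each such document is signed and timestamped at creation, repeated exchanges of credentials like this are what accumulate into the supplier and product reputation described above.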

This type of structured data is of such high integrity that supply chain operators can rely on it and feel empowered to make decisions based on it.

What I’m hoping comes from this global trade disaster is a change in the way supply chain operators make critical decisions. Supply chains of the future will be powered by verifiable credentials, which seamlessly bridge all the data silos that exist today — whether software-created silos or even the paper-based manual, offline silos.

Today, it’s possible to move from a static, critical chain style of management where we often find ourselves in a reactive position to supply chains that look more like an octopus. High integrity data about suppliers and products enables proactive, dynamic decision making in anticipation of and in real time response to shifts in the market — ultimately capturing more revenue opportunities and mitigating risk at the same time.

Takeaways from the Suez Canal Crisis was originally published in Transmute on Medium, where people are continuing the conversation by highlighting and responding to this story.


Blockchain Commons

2021 Q1 Blockchain Commons Report


In Q1 2021, Blockchain Commons largely focused on working with companies to integrate our Gordian architecture, best practices, specifications, reference libraries, and reference applications into their wallets and services. This included:

Releasing three Gordian reference apps for beta testing
Planning the creation of an independent Gordian Recovery app for use with third-party wallets
Updating keytool-cli for new UR usages and future specifications
Improving Lifehash for black & white usage
Testing new improvements on Lethekit
Supporting the adoption of Gordian best practices by a variety of manufacturers
Directly working with Bitmark to architect Autonomy

We also did work to support the wider community, including:

Producing a major design document on multisigs
Supporting DID’s advancement on its standards track
Working to develop the did:onion DID method
Developing packages to support activists
Testifying to legislatures in Nevada, North Dakota, and Wyoming

(Also see our previous, Q4 2020 report.)

Gordian Work

Our advances on Gordian were linked to a large-scale transition in the meaning of the Gordian system. Prior to 2021, we were offering a variety of applications, but we were simultaneously working with companies to create specifications, which created a real tension. Now that we’ve seen the beginnings of adoption, we’ve been able to change the focus of our applications from being consumer-focused to being exemplars for other companies to study. In Q2, we expect this to mature into a Gordian Seal program that denotes companies who are producing products that follow the Gordian principles and best practices. Like similar projects such as the FIDO Alliance, we expect the Gordian Seal to make it easier for everyone in the blockchain ecosystem to work together, creating business development and interoperability opportunities.

Gordian Testing. Blockchain Commons now has three iOS reference apps available for testing via TestFlight: Gordian Cosigner, Gordian Guardian, and Gordian Wallet. Wallet was our original app, recently updated to support the newest Lifehash; Cosigner is the companion signing app that we introduced last quarter; and Guardian is our newest release, a key-storage tool for iOS. As reference apps, these projects are mainly meant as exemplars, demonstrating the best practices and principles suggested by the Gordian system as well as exemplifying how multiple apps can interact through Airgaps and Universal Resources (URs). See our new video for a real-life example of Gordian Cosigner, Guardian, and Wallet working together to securely create a multisig account and sign a PSBT.

Independent Recovery. We are already working to split a new app, “Gordian Recovery”, off of Gordian Guardian. One of our best practices for the Gordian system demands that a user can recover funds from a Gordian-approved wallet even if the company producing it disappears. Gordian Recovery will provide one way to do so, allowing a user to independently use the app to recover funds held in a wallet, no matter the status of the wallet manufacturer. At its debut, we expect Gordian Recovery to support at least three different companies with their own hardware or software wallet systems. Generally, Gordian Recovery, like Gordian Cosigner, demonstrates how Blockchain Commons will interact with the wallet ecosystem by providing not just specifications, best practices, and references, but also complementary apps that can support third-party wallets.

Keytool Updates. Our keytool CLI (command-line interface) app also received major upgrades in Q1. We expanded it to support the new UR crypto-request and crypto-response features and added support of arbitrary requests. This is a first step in moving away from the m/48’ derivation used by current wallets for multisigs, which often results in master or cosigner xpub reuse. This means that we now have the infrastructure to demonstrate how to solve the problems of m/48’, but there’s still a lot of legacy usage that needs to be resolved. (We also reviewed one other solution for the m/48’ derivation problem and found it overly complex, so: onward.)

Lifehash Improvements. Most of the Gordian specifications that we’ve created while working with our Airgapped Wallet Community focus on interoperability, particularly moving data across airgaps. Lifehash is something else: it’s a user-interface specification meant to give users trust in the continuity of cryptocurrency accounts without having to depend on reading addresses, which aren’t user-intuitive. Instead, Lifehash creates unique, colored icons for each account that remain consistent over time. This quarter, we adjusted the colors used by Lifehash so that they look better in black & white printing, and simultaneously experimented with dithering on the 200x200 LetheKit display, to ensure that we could display meaningful information on a small black & white display.
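As a rough analogue of the Lifehash idea (the real LifeHash algorithm is considerably more involved), a deterministic, visually distinct icon can be derived from account data along these lines:

```python
import hashlib

# Toy analogue of the Lifehash idea, NOT the actual algorithm:
# derive a small, deterministic, mirror-symmetric black-and-white
# icon from account data, so the same account always looks the same.
def toy_icon(account: str, size: int = 8) -> list:
    digest = hashlib.sha256(account.encode()).digest()
    bits = "".join(f"{byte:08b}" for byte in digest)
    half_width = size // 2
    rows = []
    for y in range(size):
        half = bits[y * half_width:(y + 1) * half_width]
        cells = ["#" if b == "1" else "." for b in half]
        rows.append("".join(cells + cells[::-1]))  # mirror for symmetry
    return rows

# Same input always yields the same icon; users can check visual
# continuity of an account at a glance without reading raw addresses.
assert toy_icon("xpub-example") == toy_icon("xpub-example")
```

The design property that matters, both here and in the real specification, is continuity: any change to the underlying account data produces a visibly different icon, while the same account always renders identically.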

Lethekit Updates. The DIY LetheKit has been our prime reference for hardware implementations of the Gordian best practices and thus one of our testbeds. Not only did we test it out with dithered Lifehashes last quarter, but we also were able to use its display to test out animated QRs for transferring PSBTs. It worked! (Albeit, slowly.)

Gordian Adoption. Finally, as noted, we’re very happy to see continued adoption of Gordian. Last quarter, we talked about all of the software and library development being done by wallet companies. This quarter, we’re seeing more companies committing to including SSKRs, URs, and other Gordian features in their wallets. Sparrow Wallet has been expanding its capabilities, while Foundation Devices was the newest to announce their integration of some UR features, including animated PSBTs. Foundation and Sparrow have both been working with us for a while: we’re thrilled to see both hardware and software wallet companies incorporating Blockchain Commons specifications! We’re also aware of two more software-wallet companies who haven’t made announcements yet, and we’ve been extensively working with a third company to produce a Gordian-approved wallet that entirely matches our principles and best practices and adds on some great UI besides. We can now announce that third company is Bitmark …

Autonomy Architecting. Blockchain Commons has been working with Bitmark to design Autonomy, a crypto wallet and digital assets manager. Their goal is to make it easy for families to gain independence and preserve generational wealth. Autonomy combines a mobile app with your own private full node running in the cloud. We used a partitioned architecture (keys across app+node) and a personal recovery network (using SSKR) to remove all single points of failure. When you transact, everything is multisig between your app and full node. Either one can fail and you won’t lose your funds. Nothing else like this currently exists in the market.

Other Advances

Here’s some details on our other major projects:

Multisig Documents. This quarter, we released our first major expansion to our #SmartCustody tutorials since 2019: a 10,000-word document on Designing Multisig for Independence & Resilience. The state of multisig wasn’t sufficiently advanced to provide specific advice when we originally wrote #SmartCustody, so we’re now happy to include it by breaking down the design of multisig addresses into theoretical, transactional, operational, and functional elements and providing design patterns and examples to help you put those elements together. Want to know why Blockchain Commons is today focused on multisig technology and how to apply it to meet your own needs? It’s all here.

DID Recommendation. We’ve been working with the DID specification since it was incubated at early Rebooting-the-Web-of-Trust workshops. We’re thrilled that it’s now a Candidate Recommendation. There might still be revisions and new candidates, but this is the start of the last stage in creating a standard for decentralized identity on the internet.

DID Method Expansion. Meanwhile, Blockchain Commons is doing its own work on DIDs, with a new did:onion method implementation. We’re happy to say that Spruce Systems is already considering it for their DID resolver. We’re also considering returning to work on our own BTCR DID method, which was one of the first DID methods, but has gotten somewhat out of date with recent updates to Bitcoin.

Apt Packages. We have produced apt packages for seedtool and keytool for use with Debian and Tails OSes. This is a first step in providing support tools for human-rights activists, which we expect to get more emphasis in our work in Q3.

Legislative Work. Finally, Christopher Allen’s work with legislatures multiplied last quarter. In Wyoming, he led efforts that resulted in the Wyoming Digital Identity Act, which includes a specific definition of digital identity that works toward the principles of self-sovereign identity. Work on a private-key-protection bill was less successful, as it had to be withdrawn after being spoiled by an amendment. Christopher has also recently testified before legislatures in North Dakota and Nevada. This is all crucial work because actual engineering will ultimately be limited and directed by what states legislate, so we want to make sure the laws fit the technology and the needs of the individual.

Intern Search. We’re thrilled that our Summer 2021 internship program has been sponsored by the Human Rights Foundation. We’ve thus put out a call for interns, and we’re expanding our usual development work to also include direct support for activists, who need help to maintain their privacy in potentially hostile regimes. However, really supporting activists requires knowing what they need. So, we’re also considering working with interns to do research and conduct interviews to create user engagement models for activists, similar to the Amira model from RWOT. If you are interested in having your developers work with or mentor blockchain interns, want to suggest intern projects, or even have engineers who might be interested in working on Blockchain Commons projects briefly during the summer, please mail us.

Taproot & Schnorr First Steps. We may see a major upgrade for Bitcoin as soon as this fall, if a speedy trial is approved for Taproot and Schnorr. The first will increase privacy for #SmartCustody features such as timelocks and multisig, while the second will allow aggregate multisigs, which offer several advantages over traditional ECDSA signatures. Our brand-new musign CLI provides the first support for Schnorr signatures, with more to come as we ramp up to this expansion.
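For readers new to the scheme, the algebra of a Schnorr signature fits in a few lines. The sketch below uses a deliberately tiny modular group purely for illustration; Bitcoin's actual design (BIP-340) works over the secp256k1 curve with x-only keys and tagged hashes.

```python
import hashlib

# Toy Schnorr signatures over the multiplicative group mod p, where
# p = 2q + 1 (both prime) and g generates the order-q subgroup.
# Parameters are tiny on purpose: this shows the algebra only and is
# in no way a secure implementation.
q = 1019
p = 2 * q + 1  # 2039
g = 4          # a square mod p, so it generates the order-q subgroup

def H(*parts):
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def keygen(seed):
    x = int.from_bytes(hashlib.sha256(seed).digest(), "big") % q
    return x, pow(g, x, p)  # private key x, public key P = g^x

def sign(x, msg):
    k = H(x, msg)        # deterministic nonce (RFC 6979 flavor)
    R = pow(g, k, p)     # public nonce
    e = H(R, msg)        # challenge
    return R, (k + e * x) % q

def verify(P, msg, sig):
    R, s = sig
    e = H(R, msg)
    # g^s == R * P^e (mod p) exactly when s = k + e*x
    return pow(g, s, p) == (R * pow(P, e, p)) % p
```

Because signatures are just sums in the exponent, multiple signers' contributions can be aggregated into one signature, which is the property that makes Schnorr attractive for multisig.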

Supporting the Future

We’ve laid out our initial roadmap for the next two cycles of Blockchain Commons work, covering spring and summer. Important topics include finalizing our Gordian Seal program, solving problems with xpub reuse for multisigs, supporting and advancing SSKR, supporting and advancing URs, and architecting QuickConnect 2.0. If you’d like to know more about our roadmap, especially if you are considering becoming a patron, please contact us directly.

If you’d like to support our work at Blockchain Commons, so that we can create the next specifications, architectures, reference applications, and reference libraries to be used by the whole community, please become a sponsor. You can alternatively make a one-time bitcoin donation at our BTCPay.

Thanks to our sustaining sponsors, Bitmark and Blockchainbird, as well as our GitHub Sponsors, including Flip Abignale (@flip-btcmag), Dario (@mytwocentimes), Foundation Devices (@Foundation-Devices), Adrian Gropper (@agropper), Eric Kuhn (@erickuhn19), Trent McConaghy (@trentmc), Mark S. Miller (@erights), @modl21, Protocol Labs (@protocol), Dan Trevino (@dantrevino), Glenn Willen (@gwillen), and Jesse Posner (@jesseposner).

Christopher Allen, Executive Director, Blockchain Commons

Tuesday, 13. April 2021

KuppingerCole

Martin Kuppinger: Beyond SAP Security & SAP GRC: Reflecting the Changing Business Workloads


Defining strategies on governance, risk management, compliance, security, and identity beyond the SAP silo

Business applications are changing. While some remain on-premises in traditional architectures, others have shifted to the cloud, with several of these provided by specialist vendors such as Workday or Salesforce. Established vendors such as SAP are also changing their platforms, applications, and delivery models, while acquiring SaaS vendors such as SuccessFactors and Ariba. The days of homogeneous, vendor-focused, one-stop-shopping business applications are past: most organizations are dealing with a heterogeneous landscape of business applications, regarding both vendors and deployment models. This raises the fundamental question of whether IT organizations that still have a dedicated SAP unit reflect today’s reality or should undergo fundamental change. It also creates an ever more pressing need to deliver governance, risk management, compliance, security, and identity for all types of business applications, and beyond them to other IT services such as ESM/ITSM (Enterprise/IT Service Management) and newly born digital services.

Martin Kuppinger will look at this evolution and discuss what to change, and how to balance depth of capabilities for specific environments against the need for broad support of heterogeneous (business) applications.




Arndt Lingscheid: How to Build a Strong Security and Compliance Foundation for Your SAP Landscape


Cyber-attacks can have severe consequences when it comes to SAP S/4HANA applications.

These attacks increasingly focus on the company’s application layer and use privileged user accounts. Unfortunately, many security departments see the SAP application layer as a “black box,” and they view the security of SAP applications as the responsibility of their Basis or SAP application colleagues, leaving these applications at risk. Securing an SAP S/4HANA business application environment involves more than roles and authorizations.

The loss of sensitive data can lead to severe penalties, damage a company’s reputation, and endanger its overall business within minutes.

This session helps SAP decision makers (CIOs, CFOs, and CISOs) and IT operations managers successfully meet these challenges and secure their SAP landscapes.

The session first looks at how security frameworks can help lay the foundation for a strong security strategy. It then walks through SAP’s portfolio of security and compliance solutions through the lens of the Cybersecurity Framework provided by the National Institute of Standards and Technology (NIST) — a framework that is widely used for establishing standard security guidelines and best practices within organizations — to provide SAP customers with a toolkit for creating a comprehensive security strategy that meets their unique and varied needs. Lastly, it explains how to control the activities with a security infrastructure to meet compliance and business requirements and to provide insight that helps those at the C level make better decisions.


Britta Simms: Next Generation Cyber Resilient S/4HANA Transformations


Companies are under attack. More and more attacks result in costly and/or high-profile security breaches.

The world is currently experiencing a wave of digital transformation that brings with it not only new levels of complexity such as these, but also opportunities for organizations to strengthen their cyber resiliency. Accenture, together with strategic partner Onapsis, has developed an integrated approach to deliver security by design to clients at any phase of their digital SAP transformation journeys. This Accenture methodology embeds security concepts as an integral part of the overarching solution, enabling clients to better understand their respective security implications and opportunities in order to “transform” effectively.

In this keynote, Accenture leader Britta Simms, responsible for SAP Platform Security in Europe, will present this joint approach to achieving integrated security by design, as part of the S/4 transformation lifecycle.




Jochen Fischer: SAP Applications Under Attack! How to Enforce the Three Lines of Defense




Alex Gambill: The Tricky Business of Protecting Your Assets in SAP: A Holistic Perspective


With 77 percent of the world’s transaction revenue touching SAP ERP systems, these crown jewels have long been the prime target for cybercrime and internal threats due to Separation of Duties (SOD) risks, weak access controls and lack of identity management and governance. Today, a holistic approach to security in SAP—and other business systems—is not a nice-to-have but a must-have. This session will give attendees a deep understanding of the current threat landscape and a 360-degree perspective on what is needed for not only integrated security but also audit and compliance in the complex SAP environment.




Marco Hammel: How to Avoid Costly SAP Security Pitfalls. Why to Make Security Start With People and Not With Tools




Hernan Huwyler: Security and Governance Done Right




Interview with Hernan Huwyler




Panel: SAP Security in Context of a Corporate IT




Mastering Today’s SAP Threat Landscape - Joint Interview with Accenture & Onapsis


In order to effectively protect organizations, the constantly changing threat landscape needs to be understood. Threats could initiate from inside or outside of the organization, targeting the infrastructure, applications or users to obtain business critical data. Our panel discussion will focus on the most recent SAP threats, what’s different with the move to S/4, and valuable lessons learned on the importance of an integrated approach. We will talk with Dr. Rene Driessel – SAP Security Lead DACH at Accenture and Frederik Weidemann – Chief Technical Evangelist at Onapsis, to dive deep into today’s SAP security landscape.




Interview with Markus Weißensel




Northern Block

Patient-Centric Identity Management for Healthcare with Jim St-Clair [Podcast]

Listen to SSI Orbit Podcast Episode #7 with special guest Jim St-Clair from the Lumedic Exchange, as he discusses patient-centric identity management for healthcare using Self-Sovereign Identity (SSI) technology with Northern Block’s CEO, Mathieu Glaude (full podcast transcription available below).


Listen to this Episode about patient-centric identity management for healthcare on Spotify

 

Mathieu: Okay, we’re on. Hi Jim, how’s it going?

Jim: I’m well, sir. How are you today?

Mathieu: Doing well, thanks. Before getting into all of the cool stuff that you’re doing at Lumedic and the different communities that you’re working in, I’d like to start this conversation by taking a step back to understand how you got into healthcare. It doesn’t seem as if you got into it just yesterday, so would you mind giving some background of how you got into the healthcare space?

Jim: Sure, absolutely. I want to be clear that I characterize my healthcare involvement as around health IT. I don’t have the privilege of being in healthcare like many frontline workers and doctors and nurses but have come into it as a technologist, going back about 12 years ago.

I’ve been in technology and the public sector for a little over 20 years. During my time in the public sector and public sector consulting in Washington DC, I got more involved with federal agencies in healthcare, in federal health IT, and in data efforts. I used that opportunity to work for HIMSS, the Health Information and Management System Society. HIMSS is the world’s largest cause-based non-profit, specifically focused on healthcare technology and the use of technology for patient empowerment and patient engagement. I spent a year and a half with them as the senior director for interoperability and standards.

I left that job to go back into public sector consulting, supporting the State of Vermont with implementing systems as part of the Affordable Care Act. Then, I moved into another small company supporting a large enterprise software development initiative with the Department of Veterans Affairs (the VA). Following that, I worked for another small company in Maryland with a focus on the Center for Medicare and Medicaid Services, which is part of US HHS (Health and Human Services). I stayed involved with HL7 and several other standards groups and consortia. I began working with Lumedic in January of this year, specifically focused on digital identity standards, especially in self-sovereign identity, and also continuing to work in HL7. Lumedic is part of Providence Health Systems, and we play a very active role in several HL7 initiatives for health IT and data exchange. It’s very complementary to the work we’re doing with groups like Sovrin, the Trust over IP Foundation, and so on.

Mathieu: For those who aren’t familiar with HL7, like me, what is it?

Jim: Sure. HL7 is an international body; HL stands for Health Level Seven. I apologize for not remembering exactly when they got started, but they’ve been around for several decades. They have various international working groups and focus areas that are using standard schemas (very similar to W3C and others) to develop health data standards for interoperable health data in healthcare systems.

Mathieu: There are other organizations as well; FHIR, or Fast Healthcare Interoperability Resources, is another one that you’re a part of. Or, at least, when you were part of HIMSS, was that part of it too?

Jim: Oh, I’m glad you mentioned that. FHIR, Fast Healthcare Interoperability Resources, is, in fact, the latest JSON-based iteration of technology that HL7 has developed over the last six or seven years. The development of health standards has been iterative: the early data standards reflected the way data was organized and used in the client-server environment, and all of that has changed with web services, APIs, cloud services, the internet, etcetera. Health data standards have changed as well. FHIR is the latest iteration for capturing health data from electronic health care records and making it available through web services.
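To make that description concrete: a FHIR resource is just structured JSON exchanged over a web API. Below is a minimal, hypothetical Patient resource built in Python; all values are invented for illustration.

```python
import json

# A minimal, hypothetical FHIR "Patient" resource: plain JSON of the
# kind a patient-access API returns. All values are invented.
patient = {
    "resourceType": "Patient",
    "id": "example",
    "identifier": [
        {"system": "https://hospital.example.org/mrn", "value": "12345"}
    ],
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1980-04-01",
}

body = json.dumps(patient, indent=2)  # what goes over the wire
```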

Mathieu: Got it; thanks for the breakdown. There’s a newer act a little more recently, but I’ll just take a step back before getting to it. There was a nice representation that Michael Nash from your team put together, showing the progress of healthcare. As it started in its pre-digital era, you started to get different standards and legislation like HIPAA. Then, you started to get more digital standards and high-tech standards as the paper era moved into the digital era. Closer to today, with the 21st Century Cures Act coming into place, moving from the digital era towards more the patient era. How have you viewed that whole evolution in healthcare, and how has the 21st Century Cures Act aligned with what you guys are doing or what you’re thinking?

Jim: Mike Nash, our CEO, has espoused this vision, which I think is outstanding; about the healthcare in the US coming into the patient era, and how Lumedic is part of the patient era now. To walk through that evolution that you articulated; like so many other industries, we moved from paper to digital over the course of the last couple of decades.

Healthcare records had historically been big, thick binders of paper and faxes and notes, which still persist today. There was a change in regulatory focus to take into account the fact that healthcare records were increasingly transitioning into the world of digital records — electronic healthcare records systems. That prompted a law in 1996 called the Health Insurance Portability and Accountability Act (HIPAA). HIPAA was ground-breaking in its obviousness in saying that, number one, your healthcare data should be able to go wherever you want it to go; and number two, healthcare systems were responsible for how they manage, safeguard, and distribute electronic health information, which we call EHI or PHI (protected health information). Everyone appreciates the fact that the law helped to clarify and articulate that data should be available to you wherever you need it to be, that it could be transferable, and most importantly, that there are protections behind it.

However, it still had an organizational ‘silo’ feel to it, and that traditional client-server view. Since that time in 1996, more and more things are available through the internet, mobile apps, and electronic systems. Then, you have various middleware and patient engagement experience platforms, and use of third-party apps, and health and fitness apps. If anything, I think the term I use now is the ‘ubiquitousness’ of healthcare data; your healthcare data should be ubiquitous, in terms of your ability to get it and to combine it with other information that you’re collecting. It should be ubiquitous in its presentation and ability to be used at different places. That’s where we’re getting to with the patient era. That is: you, as the patient, have the technology, and we have the architectures to empower you to be able to collect your data and use it as you see fit. That is quite a cultural change.

Your ability to get access to that data, and be able to use that data as you see fit, has oftentimes been questionable just because of healthcare silos. But, we’re now at a point where the architecture and the technology no longer restrict that sort of silo of data, and you should be able to take advantage of it and use apps to do so. That is a thematic undercurrent to everything that we do with Lumedic, but in particular around the Lumedic Connect identity platform.

Mathieu: HIPAA provides guidance for health information exchange between organizations, which is still valid with the 21st Century Cures Act. It’s a component that fits within that act; the act makes it increasingly possible for more patient-driven, or patient-owned models, is that correct?

Jim: That’s a great point. The 21st Century Cures Act, passed in 2016, goes into effect in 2021. In many ways, it layers on top of HIPAA, and I think you bring up a good point that’s worth clarifying. HIPAA was passed in 1996. In 2010 you had HITECH, the Health Information Technology for Economic and Clinical Health Act. HITECH added another layer to HIPAA in terms of recognizing health information and promoting the use of electronic health care records. Now, you have the 21st Century Cures Act; there are many good summaries out there that your listeners can find. The most important aspect is how it carries a much broader message and regulatory guideline around the availability of health information, through something called patient access APIs. This brings subsequent modifications to things like the HIPAA Privacy Rule that help to bolster the underlying message that patients should access their own information.

Mathieu: So, if we look at a Patient-Centric ID or the patient era moving forward; why is that the solution to fix the problems? I’m assuming there’s a lot of problems that are related; you’ve described a few of them, such as the siloed view of data and the lack of interoperability and stuff like that. What are the major benefits of Patient ID that you guys are excited about?

Jim: I’d say that there are a couple of things; on its surface, one can be excited about the principles of privacy-preserving architectures, and building in the concepts of self-sovereign identity. You know, the things that you and I, and other folks in the identity standards world are excited about: to allow people to control their own information, and help control how their information is shared. Even more than that, if the information is controlled by the individual and by the patients, they’re now empowered to control where that information goes.

I often use an analogy that culturally, healthcare in some ways is still like going to see the tribal shaman 10,000 years ago: I show up at a healthcare environment; some doctor says that there’s some issue. Maybe I understand it, or maybe I don’t, which is a patient health literacy issue. Maybe I have access to the data, or maybe he or she magically presents a lab result and says, “Hey, this is the issue, and now you have to go and do this, this, and this.” These are three other things that you’re being told to do, and that changes when the individual is able to manage how their data is collected and how their data is disseminated — how they do things in the healthcare environment with a healthcare organization. I think it allows them to be empowered to make decisions: “Well, I’m not only going to see this doctor here, but I’m also going to see this other doctor. When I do see this doctor, these are the things that I think they need to know about me, but I shouldn’t have to consent to share my entire life history with them if I don’t feel it’s relevant. If I’m going to a back specialist, for example, do they need to know something about substance abuse that may or may not have anything to do with it?”

How do I, as a patient, get more involved with how my health information moves around, in a way that I’ve never been involved in before, whether it was paper or digital?

Mathieu: That’s another interesting aspect to that; you definitely have the privacy-preserving aspects to the whole story here, but I think we’re seeing in health care and health technology that we’re getting more and more opportunities to use different methods for health data inputs. Whether it’s wearables, whether it’s going for different specialized treatments or scans or activities like that; being at the center and having control of that, and being able to amalgamate all of that information for yourself at the center. That all goes along with the whole big data story, where there are tons of opportunities to use more and more data in healthcare today. It’s probably another benefit of having this patient-centric identity management model.

Jim: That’s a wonderful analogy, and I’d add to it as well. I’m sure you’ve read quite a bit about the use of AI in healthcare, and AI services and chatbots, and that sort of thing. The more that you have the ability to collect your own information, and use that in support of tools such as an AI-driven application, for instance, then the more things can be customized to you or built on what we call evidence-based medicine (EBM) as a framework for decision making.

There’s a proliferation of health apps and advancements in technologies that allow you to monitor your own heart rate at home, and that assist you with medication adherence and how you take medication—gathering that information in your own app, having that information available to you as part of your own data and as part of clinical decision making, and deciding where your data goes is really where the 21st Century is going. As we discussed, the 21st Century Cures Act is about your ability to get that information, to work with information from mobile apps and health applications, and then decide how that information is shared and controlled. The consent belongs to you.

Mathieu: Got it. We’ve been hearing more about the work you’re doing through different communities like the Trust over IP. Lumedic is an organization that was existing before, but was purchased by Providence Health Systems a few years ago. Would you mind making a distinction between those two organizations, and what both of them are up to?

Jim: Absolutely. Providence Health Systems is the ninth-largest healthcare system in the USA, and they are a not-for-profit Catholic hospital system located in Seattle. They support seven states under that banner. As part of Providence Health Systems, they have a for-profit portfolio of companies called Tegria. One of the companies under the Tegria portfolio is Lumedic. As you mentioned, Lumedic was acquired independently a couple of years ago, to be added to that portfolio to support new tech startups and health innovations within that portfolio, under the umbrella of a large major health care system. To add on top of that, Providence has its own digital innovation group, and they coordinate with a venture fund. It gives them a broad base of involvement for identifying healthcare technology solutions that obviously advance their mission for care delivery in the states that they support.

Mathieu: Within Lumedic, do you have the opportunity to leverage the ecosystem of different companies or different resources that Providence has assembled, or that Tegria has under their portfolio?

Jim: That’s a great point, and one I’m not afraid to advertise. As you know, there are lots and lots of health tech companies out there. Very few of them have the opportunity to daily participate in and interact directly with a health care system, beyond a traditional customer-vendor relationship; not only that, but a health care system of the size and scope of Providence. I consider myself to be very fortunate that I get to work alongside senior Providence healthcare professionals and leaders as part of what we’re doing in advancing the field. Specifically, we’ve created the Lumedic Exchange, which is a voluntary organization of stakeholders from other healthcare organizations, but including Providence. The Exchange focus is on developing use cases and workflows around this concept of verifiable credentials. That gives us the opportunity to tap into not only folks who are technically part of Providence Health Systems, but Providence itself has jurisdiction over the State of Oregon, and it’s got other systems in California. This gives a tremendous degree of variety and organizational diversity, each representing different healthcare organization perspectives as part of this development work. There are very few health tech companies that can offer that sort of background and diversity, in terms of their solution development and application.

Mathieu: What was the vision behind the Lumedic Exchange? It makes sense to have some sort of ecosystem or consortium, governing the health apps, or the data, or the different systems and what people are using. Is that the idea behind the Lumedic Exchange?

Jim: That’s a great question. I know we haven’t talked about it too much besides the references to the Trust over IP Foundation, but the Lumedic Connect product is based entirely on the Trust over IP Foundation framework for managing decentralized identity, and it uses the W3C (World Wide Web Consortium) verifiable credential model.

If you think about what that verifiable credential trust triangle model looks like; between the issuer, the credential holder, and the credential verifier — what type of business processes and use cases are there, for using that verifiable credential in healthcare use cases? Despite what we could bring to the table for technical knowledge and engineering to develop the Lumedic Connect product as a small company, or even in conjunction with Providence Health, we couldn’t necessarily come up with every use case by ourselves. Nor could we get industry buy-in for a use case of how a verifiable credential could be used in a healthcare scenario, without the participation of as many healthcare organizations as possible. So, we created the Lumedic Exchange to allow healthcare organizations to join for free and participate in that process to get educated about verifiable credentials, and contribute their perspective. For example, if I have a patient coming in for an x-ray, and they have their insurance through Blue Cross/Blue Shield, what needs to go into a verifiable credential? What’s the registration process? How do I tie that credential to the imaging system and to the image? How does it work with their insurance eligibility and registration? That sort of thing.

Mathieu: I always worry about the education gap with any new technologies, but is it clicking with the health providers that are coming into the Lumedic Exchange? Are they really seeing the value of portable digital verifiable credentials?

Jim: That’s a great question. I would like to say that they see the value of it, but it is a brand new effort. We launched around November of last year. Of course, launching anything last year in the pandemic obviously complicates some of the messaging. What I can say is that we have strong participation from several folks in the payer community, from Mastercard, identity services from other representatives in Providence, and from HireRight, the human resources information company. They are consistently participating in a couple of the workgroups we’ve stood up, plus some new ones that are addressing various issues for credentials. So, I’m very optimistic that the momentum is going to build this year. I think both membership contributions and value will all be built as the year goes on.

Mathieu: It seems like one of the good things to come out of COVID (if there are any) is that it is driving the push towards more and more digitization. The use of proof-of-vaccines seems to be one of the first use cases that you’re focused on.

Jim: Yes, and I like the way you paraphrase that; both fortunately and unfortunately. It’s terribly unfortunate that we have such a human toll and tragedy concerning the pandemic. On the other hand, it has prompted consideration and forced changes in business models in ways that we’ve talked about but never been able to achieve before. They are now a reality, whether it’s telework or the adoption of telehealth and virtual care. In many ways, the substance around a health pass or vaccination credential was there before; it’s something we’ve been working on as an underlying technology with Lumedic Connect and for Providence for a couple of years. However, fortunately or unfortunately, the vaccination credential now serves as a foundation for demonstrating a use case in one of the most positive and fast-moving ways.

Mathieu: People definitely understand that when they see it, and it has been interesting.

I remember early last year when we were in the early days of COVID, with the COVID credentials initiative starting and a lot of different efforts, trying to look at a way to use these new technologies for good, without sacrificing privacy, or any ethical considerations, and so on. Although this stuff could excite you, is there stuff that scares you about the COVID? There seem to be a lot of different COVID projects that are going on right now.

Jim: Absolutely. Of course, I’ll be quick to point out ours as being one of the best, but I do so in the context that the vaccination credential use case is but one of many use cases that we’re looking at, as part of the broader aspect of managing patient health information that we’ve been talking about so far, when you called out things like HIPAA and the 21st Century Cures Act.

You’re right, unfortunately. There are several prominent cases going on right now, or several prominent industry applications, where privacy seems like a real concern. It’s understandably very difficult for a consumer to look at these different things, and be told, “Oh, don’t worry, it’s on the blockchain.” That’s somehow supposed to mean something in terms of privacy, without underlying what a governance framework is, or what an inherent privacy-preserving architecture is. We were aiming for all of that first and foremost, even before dealing with vaccination credentials, and so that’s built on it too. In working with things like the Good Health Pass Collaborative, we’re trying to come up with that type of Good Housekeeping/UL seal of approval that shows that this credential application and mobile application have taken into account key considerations for privacy, security, interoperability, a trust framework, and so on. This would enable consumers to know that you can trust and appreciate that these apps are only working on your behalf and that they’re not representing either corporate surveillance of healthcare or some other government intrusion into your health information.

Mathieu: There’s a lot of good work happening, as you mentioned, at the Good Health Pass Collaborative. With the trust models or trust frameworks that you’ve been working on within the Lumedic Exchange, is there a lot of overlap there? Have you been able to contribute to that?

Jim: I think you make a great point; it’s entirely collaborative. If you think about the things that we’re working through in terms of privacy, and interoperability, and trust frameworks in the Good Health Pass Collaborative; while called into focus around the issue of vaccination credentials, they also pertain across the board to any other healthcare information factor. As you mentioned, there are lots of good companies in the COVID Credentials Initiative (CCI) and lots of work there; but, of course, as an organization, they’re focused on COVID credentials. Ours is a focus on health information credentials, patient-centric identity management first, with COVID credentials being a natural overlay to that.

Mathieu: There’s a need for trust models or trust frameworks across different industries or ecosystems. We see many developments taking place in the financial sector; that’s the area in which we work a little more. How do you describe a trust model or trust framework to someone? It’s a new concept, that wasn’t necessarily there with traditional architected systems.

Jim: You’re right, it is very new. When you’re advancing this model of trust, first and foremost and speaking as a Chief Trust Officer, I of course spend some time explaining what a Chief Trust Officer is. I emphasize to them that previous roles for chief this, and chief that, whether it was a Chief Information Officer, Chief Technology Officer, Chief Data Officer; all those roles highlighted how organizations were taking a new organizational leadership around that particular concept. That is, how important information was, or how important data was. Now, we feel we’re in the era where trust has to be called out as something that is an organizational value, an organizational effort. So, with that in mind, you need a trust framework.

When asked how I look at a trust framework, or a trust assessment framework: digital trust is the embodiment of all the ongoing activities in security, privacy, and data management or data governance. That activity manifests itself as some way that parties who may, or may not, know or like each other, can agree upon trusting each other. That generally represents a combination of things that we’re already familiar with, such as ISO standards, or GDPR (General Data Protection Regulation), or in financial services, GLBA (Gramm-Leach-Bliley Act) and other financial regulations. These combine to lay out some way that organizations can attest, or be certified, to say, “I’m following security standards; I have security in place; I’m following privacy standards.”

There is a right-to-privacy for the users; there’s a right to protect their data, and a right to be forgotten. All of that is serving as a framework to be able to say, “Can I now trust that the rules in this organization, combined with the rules in that organization, will allow me to use this credential as a model to identify me?” Despite the fact that you may not know everything about me, we both agree that we’re following these rules together, and you can trust that I can do this. Or, more importantly, that I can trust you, that you can have my information, or that we can have some sort of transaction or relationship together.

Which, when you think about it, is the heart of where blockchain, and Bitcoin, and everything started. Even in anonymous, almost adversarial, relationships, you could have rules for how transactions took place, so that two people could conduct a financial transaction without ever knowing each other. Everyone agreed to what that was, and that was, in fact, really a trust assurance framework. It was engineered into the Bitcoin code and the way in which Bitcoin operates. That serves as a foundation to then extend to verifiable credentials, and saying, “I present a credential that comes from someone you trust, or an organization that’s trusted. That serves as the basis for our transaction together, without having to collect and store a bunch of other information.”
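The issuer-trust check Jim describes can be pictured in a few lines. This is only an illustrative sketch: the issuer DIDs, field names, and registry below are invented for the example and are not drawn from any real trust framework.

```python
# Hypothetical trusted-issuer registry, standing in for the rules a
# published trust framework would encode (all names here are invented).
TRUSTED_ISSUERS = {"did:example:providence", "did:example:statehealth"}

def accept_presentation(credential: dict) -> bool:
    """Accept a credential only if its issuer is one both parties have
    agreed to trust under the shared framework's rules."""
    return credential.get("issuer") in TRUSTED_ISSUERS

vaccination_vc = {
    "type": ["VerifiableCredential", "VaccinationCredential"],
    "issuer": "did:example:providence",
    "credentialSubject": {"status": "fully-vaccinated"},
}

print(accept_presentation(vaccination_vc))  # True: issuer is in the agreed registry
```

The point of the sketch is that neither party needs to know the other; each only needs to trust the rules and the registry they have both agreed to.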

Mathieu: I remember some years ago, when blockchain was supposed to be used for everything. In the early days of decentralized identity management, a lot of the thinking was, “Hey, let’s just throw everything on-chain.” I think we realized pretty quickly, that definitely does not hold up; you can’t put personal information on-chain. I’m definitely happy with the evolution that we’ve seen, with off-chain secure communication with verifiable credentials, and using pretty cool cryptography and protocols to allow that to happen. Does the Lumedic ecosystem publish its own trust framework, and does it use a specific blockchain as a utility to write information that’s pertinent to issuers and certain credential information?

Jim: Yes, and first of all, let me say I think your summary was beautiful! As someone who has been in blockchain and healthcare for about four years, I’ll paraphrase a Dilbert cartoon: blockchain is like radioactivity; you can use it for good or evil, and you don’t want to get any of it on you. I believe a lot of people had a vision of simply using blockchain like Bitcoin, and somehow it’s all anonymous. When, in fact, blockchain does a beautiful job of eliminating anonymity, of being able to trace and associate transactions with a specific person, with specific information. However, when you abstract it out between on-chain and off-chain as you’re suggesting, then you have a new way to enforce privacy, especially with the principles of decentralized identifiers. We are built on a combination of the Hyperledger Indy/Aries framework and part of the Sovrin Network. So, you have the principles and trust framework associated with Sovrin and SSI, as part of the framework for the Hyperledger Indy/Aries network and software components that we’re using. To your question; yes, the Lumedic Exchange is developing its own trust framework and credential rules, which we intend to publish. They’re developed by the working groups, and published on our lumedicexchange.org website to be available as examples for anyone.

We see that the most important things to do first, are culturalization, education, and adoption. We won’t necessarily be the only solution for this, although we’d obviously like to help all the patients in America. However, truly understanding it and moving it forward as a concept for how patient information should be managed, is probably the most important thing first and foremost. Therefore, we’ll publicly share all of the trust framework and governance framework documentation we develop out of the exchange. That will allow people to build upon it, and when the next thing comes along — whether it’s KERI, or Hedera Hashgraph, or the next version of Hyperledger Indy/Aries — we’ll have a framework to help support that.

Mathieu: That’s awesome. Similar to the rest of the community, it’s great to see you guys participate. I know you’re passionate about participating in different organizations; whether it’s the Trust over IP, ID2020, or other ones. Going back to that nice illustration that Mike Nash had put together; that starts at the paper era. Well, definitely not the elimination of the paper era, as we’re talking more and more about today. This technology needs to be accessible to people, and it needs to fit the different cultures and the different ways of using it.

Jim: Absolutely. Your commentary drives me to another point too, which is this idea of patient-centric, and patient empowerment, and consumerism. I know that’s on Mike’s graphic as well, that we have to take into account the gradual but steady building of consumerism and healthcare, in a way that hasn’t existed before. By virtue of making this technology accessible, consumers can take advantage of it as part of doing their business of daily life.

You know, the use of faxes and paper records still persists. Nothing makes me want to stick a pencil in my eye faster than when I see an advertisement for a secure cloud-based fax platform. I’d say, “Fantastic. What a great job paving the cart path; I’m going to go buy another horse.” It’s not advancing where technology can go to help consumers and empower patients. There’s a great deal of that effort going on within HL7, of course, which is why I’m excited to make it a complementary effort. There are so many folks around the US and around the globe who are working in HL7 to advance more effective ways to exchange health information. We advance other standards from the global community around identity, to make it a patient-centric way of managing that.

It’s nothing short of revolutionary. “Revolutionary” means that there are winners and losers, and you hope that the most important winner, of course, is the patient.

Mathieu: Yes: with the understanding, as you said perfectly, that there’s not going to be one winner — you need to be able to play within the ecosystem and add the value that you’re good at adding. It’s not that you can simply replicate the siloed models that are there today; that’s not the evolution or revolution that we’re looking for here. I’ll post Mike’s illustration in the show notes for this podcast. We’ve referred to it a couple of times, and I think it’s a great overview of the different eras within healthcare.

Mike Nash, Lumedic

Within the Lumedic Exchange, there are different organizations that are part of this, and there are different working groups that are happening. What’s being worked on?

Jim: Let me highlight probably my most exciting working group, which is Health Equity. We’ve set out to try and get about seven working groups up and running. We have an Identity working group, who are looking at things like identity attributes and binding, things I’m sure you’re familiar with. These issues are the fundamental blocking-and-tackling for how verifiable credentials work. We have another working group around Registration and Eligibility, which starts to tie the actual workflow of a patient registration process in a healthcare system into the use of a verifiable credential. We’re also looking at some very specific healthcare opportunities, for things like skilled nursing facilities, and for imaging and pharmacy applications.

In our Health Equity and Population Health working group, I got excited because I’ve spent a lot of time working on another concept called “Social Determinants Of Health,” or SDOH. That concept generates a lot of conversation in healthcare. It deals with the non-clinical factors that make up your health and your lifestyle. As someone said, social determinants of health are our health factors, no matter what. We tend to look at them for things like housing security, economic security, and food security. These are social factors that weigh in for whoever is being considered as part of the clinical analysis. It’s not information that your doctor historically has gathered, or that you typically see in a clinical record. But we understand these factors more and more as a critical consideration. If you live in an area where it’s challenging to get affordable housing, that’s going to factor into clinical care and your ability to get to the doctor, and just have a place to live.

That translates into points of digital information about you that are separate from what you have in your healthcare record, but have to be considered. And where better to consider them than in a platform you control, where that information is available and then shared with your healthcare provider? Conversely, there are things about your healthcare information that may need to be taken into account when you’re being provided housing, or mental health services, or food security. However, that doesn’t mean that you’re giving consent to everyone you meet, to have total access to your healthcare record.

There are all of these various factors and determinants that are looked at for modern healthcare: those are sources of digital information that only the individual concerned can control. So, we’re looking at how these factors of health equity and social determinants of health get considered from a verifiable credential standpoint. How do I tie together some of the work being done in HL7, and things like the Gravity Project for social determinants of health, into the identity model that we have in using a verifiable credential? Probably most importantly, how do we ensure that using verifiable credentials doesn’t create a problem for digital health equity (the so-called digital divide), but in some way, maybe even helps to mitigate or improve health equity problems that we’re seeing in digital health right now?

Mathieu: Are these layers that one would see if they’re using a digital wallet? There could be an agent in your wallet, like your Lumedic Connect, that is able to process this the right way. I’ve seen it differently in different use cases, where we start talking less about credentials, but more about capabilities and skills. And so, you have this translation layer; is that something that would be similar to the health equity stuff?

Jim: Absolutely. That’s a wonderful observation, because, for us, this effort doesn’t stop with just the credential. The patient era is about patient empowerment. I mentioned before about AI and other tools you can use; I think that there’s nothing more important than continued advancement of the presentation layer, of that user experience. So much of this is just under the covers.

I joke around in the blockchain and healthcare community, to say, “The first rule of blockchain, is to stop talking about blockchain.” What you’re really talking about is, what is it in the decentralized application that improves things? We have to have applications that elderly Medicare recipients can understand how to use, as easily as a 25-year-old millennial Javascript developer. Specifically, in the world of healthcare: what can be done about healthcare terms and diagnosis and treatment plans that are not only easy to understand, but can translate into things from a user experience or an engagement platform, that work with the patient to keep them informed, to help them make decisions, and at the same time, help safeguard their data and the ability to share data and provide consent.

Mathieu: Are there new revenue opportunities, too? Are these conversations that come up with different people within the ecosystem? You mentioned that you have to talk about the value or the benefits, rather than just saying ‘blockchain’ or just talking about the technology. I’ve been seeing this for a while now, where this is great. Once I establish myself; let’s say I’m a healthcare provider, or an insurance company, or a lab; these are all participants that would fit within the Lumedic Exchange. Do you work with them to define what the revenue models are? Are there concerns about, “Hey, what if my credentials are reusable and shareable — do I potentially lose out on future revenue?” Are these issues and topics that you guys talk about?

Jim: You know, I can’t say that we’ve had some of those specific conversations. But, I think some of what you’re alluding to is really important; the general concern overall that so many healthcare applications are just another form of surveillance capitalism. And, saying, “Hey, what a great application to help you manage your health care,” only to find out that you’re putting all of your personal data in it, and they’re doing something with it behind the scenes. That’s clearly not something we want to have as a problem.

I think that in general, as you mentioned the concept of verifiable credentials and using your patient identity, we all need to be cognizant of ways that you don’t inadvertently create that surveillance capitalism, or find new ways to construe information that becomes restrictive. Nobody wants to be able to have their healthcare data present a situation for an insurance company to pass judgment on not covering them or to raise their rates. Are there ways that healthcare information is anonymous around decision-making, or anonymized, that allow you to protect your identity, limit bias or unethical approaches, and at the same time, advance how that information is used for effective quality measures or quality metrics?

Mathieu: That’s interesting. There’s something strong about having selective disclosure. But also, beyond that, being able to control your image or your persona, and not needing to divulge all the information about yourself, so that you’re not completely exposing yourself to everyone and everywhere, along the surveillance capitalism lines you just described. As a patient, if I’m now using Lumedic Connect, I have some credentials on there. I’m able to use the different services you’ll have on there. But, when I show up to a doctor’s office, and they’re still using their traditional systems; whether it’s an EHR or whatever it is, how does that work? Is it that I show up at my doctor’s office, I scan a QR code, I give them consent, and I transfer certain data to their system?

Jim: Yes, that is the long-term vision, for sure. You’re right; those are areas that our working groups are still sorting through to understand the specific mechanics. It is being approached with the idea that you would, basically, be presented with a QR code. A QR code reader is the fundamental interface point for everyone to consider. It could become Near Field Communication (NFC) or Bluetooth as we build upon it. However, that initial point of registration, leveraging that QR code, is where our conversations are starting, that is correct.
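Mechanically, that QR handshake usually amounts to encoding a connection invitation that the patient’s wallet scans. The sketch below is loosely in the style of an Aries out-of-band invitation; the field names, label, and endpoint are illustrative assumptions, not Lumedic’s actual payload.

```python
import base64
import json

def make_invitation_qr_payload(label: str, endpoint: str) -> str:
    """Build the URL a registration-desk QR code would carry: a small
    JSON invitation, base64url-encoded into a query parameter."""
    invitation = {
        "@type": "https://didcomm.org/out-of-band/1.0/invitation",
        "label": label,
        "services": [{"serviceEndpoint": endpoint}],
    }
    encoded = base64.urlsafe_b64encode(json.dumps(invitation).encode()).decode()
    return f"{endpoint}?oob={encoded}"

# Hypothetical clinic endpoint; a wallet scanning this QR would decode the
# invitation and open a secure channel for consent and data transfer.
url = make_invitation_qr_payload("Clinic Front Desk", "https://clinic.example/agent")
print(url.split("?oob=")[0])  # the endpoint portion of the QR payload
```

The same invitation could just as well be delivered over NFC or Bluetooth later; only the transport changes, not the payload.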

Mathieu: Got it. I saw something posted that it’s being tested internally within the Providence system. This is being piloted; it’s being tested, it’s being used, as we speak.

Jim: Yes, it’s being piloted and tested around the vaccination credential first, because that’s obviously the use case; the other ones are in development. I would mention again, the opportunity that we’re afforded with Providence Health Systems is a landscape of 85 million patients and 120,000 caregivers. So, we have a very broad and diverse landscape in healthcare to leverage “in our own backyard,” which will help, you know, demonstrate efficacy for healthcare overall. It’s not merely a single clinic in one single town someplace, which makes it very exciting to show the depth of development that’s possible.

Mathieu: Awesome. The other projects I’ve seen in the healthcare space have been more from the doctor, or the physician side of things. I know you collaborate with NHS, and I’ve heard conversations with them before. Is there collaboration with them, too?

Although, they’re touching the other side within the larger ecosystem; everyone needs to be plugged in.

Jim: That’s a great question. We don’t actually collaborate with NHS in the UK at the moment. I’ve had the pleasure to meet some colleagues there and talk with them a bit, but our focus is on the US. The US is so large, and so diverse in terms of what we’re trying to address; I’d welcome the chance to work with the NHS in the future, but we want to conquer US healthcare first.

Mathieu: That makes sense. In closing, Jim, what’s on the roadmap? What should people expect from Lumedic for the rest of 2021? How could people help out?

Jim: A couple of key points: our work with the Good Health Pass Collaborative, as well as internally with Lumedic, will continue to advance what a secure, privacy-preserving vaccination credential is like; if, in fact, people are called upon to need it for travel, and return to work, and so on.

We’re building upon that to help improve and revolutionize the registration workflow, which potentially saves patients a whole bunch of time and preserves their privacy as well; it can be tremendously efficient on the part of the healthcare systems, too. Also, continued growth and participation in the Lumedic Exchange; to define new use cases, new proofs-of-concept, and being able to advance this verifiable credential patient information platform concept that much further in 2021 and into 2022.

Mathieu: I love the vision towards the patient era. I think you guys are at the forefront, and I look forward to keeping a close eye on what’s happening, and the progress that’s going to be happening this year and over the next few years.

Jim, thank you very much for doing this with me today.

Jim: Thank you, Mathieu. It’s been a pleasure, and I look forward to you keeping me honest in our developments, my friend.

The post Patient-Centric Identity Management for Healthcare with Jim St-Clair [Podcast] appeared first on Northern Block | Self Sovereign Identity Solution Provider.


Forgerock Blog

Hybrid IAM and Cloud Steer Maersk Toward Improved Experiences and Cost Savings


You’ve seen those iconic blue containers filled with everything from cars to TVs and stamped with the Maersk logo countless times on ships docked at ports around the world. But have you ever thought about what it takes to orchestrate the movement of that cargo? 

Maersk operates in 134 countries, shipping 10 million containers to 76 global ports annually with the help of 88,000 employees. A complex network of people and systems are responsible for its safe passage. Take a car battery. You can’t put a battery into a shipping container and simply say goodbye. A battery is considered dangerous cargo. It goes into a specialized climate-controlled container, requires special customs clearance, and must follow specific logistics to safely transport the batteries over land. And Maersk must keep every vendor, partner and customer updated during each point of the journey. As Maersk’s Angel Donchev, vice president of Platform Tech Lead - Web, Mobile, API/EDI, Blockchain, says, “Maersk is a fascinating company.”

To reduce the complexity of these operations, streamline processes and keep its various constituents happy, Maersk is leveraging a hybrid identity and access management (IAM) and cloud strategy with identity serving as a “pivotal role,” according to Angel. In fact, he says, “The more adoption you have around the cloud, the more identity becomes critical for you because you need to authenticate services, users, partners, vendors and all kinds of different personas, as well as connected devices.”

Since embarking on this hybrid IAM and cloud strategy, Maersk is experiencing numerous benefits. The company has shortened authentication time by a factor of four, so customers, partners and vendors can quickly access essential information through any digital channel in less than a millisecond. The organization has also decreased onboarding time for new vendors from months to less than a week. Lastly, Maersk has reduced costs for Angel’s massive department by 45%, while at the same time increasing the engineering capacity by 45%.

Check out this video to hear more about how Maersk’s hybrid IAM and cloud strategy is helping the company achieve its goal of becoming the leading global integrator of containers and logistics.

 


IBM Blockchain

Blockchain and letters of guarantee


Paper-intensive financial instruments, especially those that require back-and-forth negotiations between parties, are ripe for digitization and blockchain. Add in the potential for fraud with paper processes, and the rationale for blockchain is even greater. The bank guarantee, or letter of guarantee, is just such an instrument. Bank guarantees facilitate doing business by adding trust to […]

The post Blockchain and letters of guarantee appeared first on Blockchain Pulse: IBM Blockchain Blog.


Coinfirm

Suspicious Activity Report (SAR) for Crypto Regulatory Compliance

What is a SAR? A SAR (Suspicious Activity Report) is a filing to a financial intelligence unit of suspected illicit activity. What triggers a SAR? Suspicious Activity Reports are triggered by suspected illicit activity being picked up by a financial institution, money services business, crypto exchange or other obliged entity. Examples include monetary thresholds, insider...

Global ID

The GiD Report#155 — Coinbase is basically a bank and Jamie Dimon is worried

The GiD Report#155 — Coinbase is basically a bank and Jamie Dimon is worried

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

This week:

Coinbase is basically a bank
Jamie Dimon is literally a bank boss (and he’s very worried)
PayPal’s super app also has banks worried
And Zuckerberg is worried the Winklevoss twins might get the last laugh
This week in fintech
America’s unprecedented transformation
This week in identity
Some Groups are really effective
Stuff happens

1. Coinbase is basically a bank

Photo: Stock Catalog

A couple weeks ago we talked about the rise of fintech and the age of the digital wallet. That time is now. On Wednesday, Coinbase will move forward with its direct listing.

Let’s just take a look at some of the crazy numbers. (Disclaimer: GlobaliD co-founder and CEO Greg Kidd is an early Coinbase investor through his investment firm Hard Yaka.)

~$140 billion (implied pre-IPO market cap)
$1.8 billion (total revenue for Q1 2021)
1 million (users added per week)*
$10 billion (digital assets added per week)*

*according to Michael Saylor (founder of MicroStrategy)

Like I said, crazy numbers.

But for Coinbase — and the space at large — these are still early innings. Coinbase has spent a lot of time building the basic infrastructure and plumbing.

For the most part, that means giving retail, pros, and institutions a place to buy, sell, and hold their crypto.

Longer term? It could mean almost anything. Coinbase will be positioned to offer a bevy of financial services. It will be holding a ton of assets. And it will have a huge number of customer accounts.

In other words, Coinbase is already basically a bank. The difference is that Coinbase could be a whole lot more.

Coinbase reports an estimated $1.8 billion in total revenue for Q1 2021

2. JPMorgan is literally a bank and Jamie Dimon, chief banker, is scared.

A couple weeks ago we talked about how digital wallets would eat banks’ lunches. JPMorgan CEO Jamie Dimon clearly got the memo.

And so he talked about it in his own memo.

Here’s Axios:

JPMorgan CEO Jamie Dimon today warned in his annual letter that the U.S. and European banking sectors are being surpassed in scale by shadow banks and fintech rivals.
Why it matters: Dimon, who has at least some pull in the Biden White House, is asking for a “level playing field.” Or, put another way, a loosening of capital requirements on banks and/or greater regulatory oversight of fintech.

And here’s CNBC:

“Banks … are facing extensive competition from Silicon Valley, both in the form of fintechs and Big Tech companies,” like Amazon, Apple, Facebook, Google and Walmart, Dimon wrote, and “that is here to stay.”
Fintech companies, in particular, “are making great strides in building both digital and physical banking products and services,” Dimon said. “From loans to payment systems to investing, they have done a great job in developing easy-to-use, intuitive, fast and smart products.”
This, in part, is why “banks are playing an increasingly smaller role in the financial system,” he said.

Jamie Dimon sees the writing on the wall.

JPMorgan’s Jamie Dimon calls on companies to be policymakers in annual letter

JPMorgan Chase CEO Jamie Dimon: Fintech is an ‘enormous competitive’ threat to banks

3. Check out this headline: “PayPal is building a ‘super app.’ Should banks be worried?”

From American Banker:

For years, bankers agonized over the day when Big Tech firms would finally set their sights squarely on financial services. Mainly they worried about four companies: Amazon, Apple, Facebook and Google.
Meanwhile, a fifth tech powerhouse, somewhat smaller but growing fast, was adding products traditionally offered by retail banks. This company built a huge customer base, but it didn’t position itself as a head-on competitor to the nation’s largest banks. Instead, it sought to partner with insured depositories. By early this year, it had a bigger market capitalization than all but two American banks.
The company in question, PayPal Holdings, recently sketched out strategic plans that summon the industry’s long-held fears about the tech giants. At the firm’s investor day in February, PayPal executives promised to build a mobile app that will allow consumers to shop at millions of merchants, while also accomplishing most of what they currently do at banks. Already, the app’s users can transact with debit cards, borrow to make purchases, pay their bills, get paid by their employers, cash checks, make investments, send money to relatives overseas and more.

Now imagine Coinbase, except in 10 years.

4. The Winklevii lost out on Facebook. But Zuckerberg missed the boat on crypto.

Here’s a cool Forbes feature on The Twins via /gregkidd:

“The idea of a centralized social network is just not going to exist five or ten years in the future,” Tyler predicts, when asked about Facebook. “There’s a chasm between the old world and this new crypto-native universe.”

Greg’s take:

Ah the irony if the Winklevoss boys really did get the last laugh with Mark and Facebook. Ironically, fate makes us strange bedfellows with the twins through Gemini, NFTs and Protocol Labs/Filecoin. Small world as I departed Harvard’s MPP program the year Zuck was accepted to college there. If you haven’t seen the movie The Social Network yet, here is the clip of the “Right and Wrong” scene.

Via Greg — Revenge Of The Winklevii

5. This week in fintech

Plaid is super glad it’s not owned by Visa:

Plaid valuation tops $13 billion in first funding after a scrapped $5.3 billion merger with Visa

Signal wants in on digital wallets (via /toddjcollins + /alessusnik):

Signal is testing a payments feature that lets you send cryptocurrency to friends
Signal Adds Payments — With a Privacy-Focused Cryptocurrency

Ben Kaufman’s snarky take (via /anej):

Facebook is still trying to figure out payments:

Facebook confirms ‘test’ of Venmo-like QR codes for person-to-person payments in US — TechCrunch

And Ripple scored a major win in its ongoing fight with the SEC:

Ripple Labs Wins Access To SEC Internal Crypto Discussions — Law360
In The Ripple Case, The SEC Is Now On Trial — And Knows It

6. America’s unprecedented transformation

According to Doug Sosnik, senior adviser to the Brunswick Group and former political director for President Bill Clinton, digital disruption is a hinge moment in American history unlike any since the transition from the Agrarian Age to the Industrial Age in the late 1800s.

Via Mike Allen:

Sosnik, who translates big-think political analysis into colorful PowerPoint decks that are eagerly awaited by Washington insiders, gave Axios AM readers a sneak peek at a new presentation that isolates these massive trends.
7. This week in identity

Via /jvs — Cool demonstration vid from MIRACL (2FA provider using ZKPs): MIRACL Trust® Authentication
Via /matej-kokosinek — Top reddit comment: “Let’s start with digital ids in addition to physical ones and then we can talk about replacing them” Apple presses ahead with aim to replace paper passports and ID with iPhone
Via /jamesstlouis — Personalization and biometrics: a future for consumers, if they want it | The Rosie Report
Via /antoine — ACLU warns ‘a lot can go wrong’ with digital vaccine passports
Via /vs — Onfido Collaborates with Microsoft to bring the Future of Reusable Identity One Step Closer

8. Some Groups are really effective

r/WSB apparently provides good stock tips:

We examine the market consequences of due diligence (DD) reports on Reddit’s Wallstreetbets (WSB) platform. We find average ‘buy’ recommendations result in two-day announcement returns of 1.1%. Further, the returns drift upwards by 2% over the subsequent month and nearly 5% over the subsequent quarter. Retail trading increases sharply in the intraday window following publication, and retail investors are more likely to be net buyers following reports that earn larger returns. Thus, in sharp contrast to regulators’ concerns that WSB investment advice is harming retail traders, our findings suggest that both WSB posters and users are skilled.
9. Stuff happens

Kid Debit Card Startup Greenlight Valued at $2 Billion in Andreessen-Led Round
Dave Schwartz on NFTs: What’s the Deal with NFTs?
China’s rulers want more control of big tech
Patreon now valued at $4 billion as VCs plow money into creator economy companies
China Creates Its Own Digital Currency, a First for Major Economy
Briefing: Apple, Facebook Spar Over Social Network’s Involvement in Epic Games’ Antitrust Case
Via /m — Twitter Held Discussions for $4 Billion Takeover of Clubhouse
Via /m — Facebook’s Hotline app combos Clubhouse and Instagram Live — 9to5Mac
California bill would create free banking services for state’s residents
Via /m — Sendbird raises $100M at a $1B+ valuation, says 150M+ users now interact using its chat and video APIs — TechCrunch

The GiD Report#155 — Coinbase is basically a bank and Jamie Dimon is worried was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Peer Ledger

What’s the catch? Traceability for responsible seafood supply chains

The new Netflix release — Seaspiracy — has taken the internet by storm as it tackles the many problems plaguing our oceans. While some have criticized the documentary for generalizing the issues and using outdated studies, Seaspiracy highlights the very real harm unethical fishing practices cause for people and our planet.

A report from the Environmental Justice Foundation found that there were “cases of slavery, debt bondage, insufficient food and water, filthy living conditions, physical and sexual assault and even murder aboard fishing vessels from 13 countries operating across three oceans.” These human rights abuses can often be hard to track, as the vessels are far out in the ocean and rarely come to shore.

An article from Future of Fish discusses the need for greater supply chain transparency in the seafood industry, noting that while many large industry players are aware of the human rights issues, they did not believe such abuses could be happening within their own supply chains.

Companies like Wal-Mart, Costco, and Whole Foods have been found selling seafood caught using forced labour, and more companies are realizing the need to take immediate action to protect their supply chains from human rights violations. As governments, corporations and nonprofits fight to fix these issues, there is no doubt that technology will play an important role.

Peer Ledger’s MIMOSI Connect traceability platform supports the shift towards greater transparency in the seafood industry. Our blockchain-enabled technology gives companies a trusted, immutable record of transactions and metrics across their entire supply chain to support responsible supply chain management and due diligence. MIMOSI Connect lets companies capture and track transactions and important metrics to instantly map and monitor their supply chains. For companies serious about traceability within their supply chain, MIMOSI Connect provides the proof to ensure that practices and values stay aligned.
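The tamper-evidence property described above can be illustrated with a toy hash chain. This is a generic sketch of the blockchain idea, not MIMOSI Connect's actual design (which the article does not detail), and every record field below is invented:

```python
import hashlib
import json

class HashChainLedger:
    """Minimal in-memory sketch of a tamper-evident supply chain record.
    Each entry commits to the previous entry's hash, so silently editing
    any past record invalidates every later link in the chain."""
    def __init__(self):
        self.entries = []

    def append(self, record: dict) -> None:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = json.dumps(record, sort_keys=True)
        entry_hash = hashlib.sha256((prev_hash + body).encode()).hexdigest()
        self.entries.append({"record": record, "prev": prev_hash, "hash": entry_hash})

    def verify(self) -> bool:
        prev = "0" * 64
        for entry in self.entries:
            body = json.dumps(entry["record"], sort_keys=True)
            expected = hashlib.sha256((prev + body).encode()).hexdigest()
            if entry["prev"] != prev or entry["hash"] != expected:
                return False
            prev = entry["hash"]
        return True

ledger = HashChainLedger()
ledger.append({"vessel": "FV-Example", "catch_kg": 1200, "port": "Halifax"})
ledger.append({"processor": "Plant-A", "lot": "L-7"})
assert ledger.verify()

ledger.entries[0]["record"]["catch_kg"] = 900  # retroactively "fix" the catch weight
assert not ledger.verify()                     # the tampering is detected
```

A real deployment distributes the chain across many parties so no single actor can rewrite it, which is what makes the record trustworthy for due diligence.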

References

Josh Silberg (April 5, 2021), Seaspiracy Harms More Than It Educates, Hakai Magazine. Retrieved from https://www.hakaimagazine.com/article-short/seaspiracy-harms-more-than-it-educates/
ReliefWeb (June 2019), Blood and Water: Human rights abuse in the global seafood industry. Retrieved from https://reliefweb.int/report/world/blood-and-water-human-rights-abuse-global-seafood-industry
Future of Fish, Human Rights and Seafood: Sustainability Means Ending Slavery, Too. Retrieved from https://www.futureoffish.org/blog/human-rights-and-seafood-sustainability-means-ending-slavery-too
Virginia Gewin (February 25, 2021), How new technology is helping to identify human rights abuses in the seafood industry. Retrieved from https://thecounter.org/new-technology-helping-identify-human-rights-abuses-seafood-industry-forced-labor/

IDnow

Why Does Germany Place So Many Restrictions on Online Gambling?

Over the past several years, Germany has become more open to innovative remote identity verification methods within the online gambling sector. These are welcome developments, and IDnow actively supports operators to meet their Know Your Customer (KYC) obligations with seamless and secure options.

Protecting minors from adult media content and adult entertainment has always been a top priority in Germany through the Protection of Young Persons Act (Jugendschutzgesetz – JuSchG) and the German Interstate Treaty on Gambling (ISTG), the latest being the ISTG 2021 effective July 1, 2021.

Licensing requirements for online gambling operators in Germany are stringent. For example, advertisements for online gambling offerings are allowed, but with specific restrictions. In particular, advertising cannot be aimed at minors; it cannot present gambling as a way to solve financial problems; and strict limitations ban broadcast or online advertising between 9 am and 6 pm, when minors might be exposed. This is where strict identity verification requirements for players (Age Verification, or AV) kick in, helping keep minors from accessing such content and entering online gambling platforms.

For everything you need to know about the new German regulation, read our new guide!

Convenient solutions

Some operators enforce AV requirements by following the treaty guidelines, whereas others have chosen not to follow the regulatory requirements and instead offer simplistic AV methods in which players pick games according to their stated age and play accordingly. Still others have chosen simply not to offer certain adult games in strictly regulated markets like Germany. Compliant solutions exist and are simple: service providers like IDnow offer products that are both compliant and easy to use.

To date, identity verification requirements for gambling operators fall under the German Anti-Money Laundering Act (AML Act) as well as the Interstate Gambling Treaty. The guidelines on “permissible” remote AV methods come from the Kommission für Jugendmedienschutz (KJM), the Commission for the Protection of Minors. The KJM legally permits online gambling operators a choice of innovative remote methods by evaluating whether such systems meet technical and security requirements. Once a method has been evaluated as “satisfactorily” performing an AV, an operator can select the appropriate service. The remote method should enable the player’s ID document to be validated and must perform a comparison of the user’s biometric data.

IDnow, which began by offering remote identity verification according to the strictest regulatory requirements under German AML law, was one of the first remote identity providers to receive a positive evaluation confirming that both its Videoident and Autoident products meet AV requirements securely. Document verification combines several checks, including an Optical Character Recognition (OCR) check of the Machine-Readable Zone (MRZ, the two lines at the bottom of the passport) with selfie comparison and liveness detection (a selfie of the player).
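The MRZ check mentioned above is well standardized: machine-readable travel documents carry check digits computed per ICAO Doc 9303 (digits keep their value, letters map to A=10 through Z=35, the filler ‘<’ counts as 0, and the values are weighted 7, 3, 1 repeating and summed mod 10). The sketch below illustrates that standard algorithm; it is not IDnow's implementation, and the function name is our own:

```python
def mrz_check_digit(field: str) -> int:
    """Compute an ICAO 9303 check digit for one MRZ field.
    Digits keep their value, letters map A=10..Z=35, '<' counts as 0;
    each value is multiplied by the repeating weights 7, 3, 1 and the
    sum is taken mod 10."""
    weights = (7, 3, 1)
    total = 0
    for i, ch in enumerate(field):
        if ch.isdigit():
            value = int(ch)
        elif ch.isalpha():
            value = ord(ch.upper()) - ord("A") + 10
        elif ch == "<":
            value = 0
        else:
            raise ValueError(f"invalid MRZ character: {ch!r}")
        total += value * weights[i % 3]
    return total % 10

# Fields from the ICAO 9303 specimen passport MRZ:
assert mrz_check_digit("L898902C3") == 6  # document number
assert mrz_check_digit("740812") == 2     # date of birth
assert mrz_check_digit("120415") == 9     # date of expiry
```

Validating these check digits after OCR is a cheap first-pass consistency test before the more expensive selfie comparison and liveness checks run.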

All players entering an online gambling site must be age verified, and operators who choose remote KYC methods should ensure those methods have been evaluated and approved by the KJM, as IDnow’s product solutions (available 24/7) have been. Some of the biggest operators in the gambling world rely on compliant AV methods accepted in Germany to enforce responsible gambling. A concern that comes up for operators time and again is the ability to offer a seamless user experience, verifying and onboarding players without friction, while ensuring that safe and responsible players are converted into regular customers. A top priority is to create a safe space for players and operators alike. IDnow is committed to helping operators meet legal AV requirements while ensuring high conversion rates.

 

By

Razia Ali
Senior Account Manager Global Gambling
Connect with Razia on LinkedIn


Infocert (IT)

Confcommercio, InfoCert and Sixtema together for SME innovation

With the agreement signed with the two Tinexta Group companies, the Confederation aims to facilitate the digital transformation of its member businesses in support of their competitiveness, with concrete benefits for its territorial offices as well.

Rome, April 13, 2021 – InfoCert (Tinexta Group), the largest Certification Authority in Europe, Sixtema (Tinexta Group), technology partner of SMEs, trade associations, financial intermediaries, professional firms and public bodies, and Confcommercio, the Italian General Confederation of Enterprises, Professional Activities and Self-Employment, have signed a framework agreement to foster greater competitiveness for SMEs through the promotion and diffusion of Digital Trust services (such as digital identity, certified email (PEC), digital signatures and electronic invoicing) and innovative digital solutions.

The partnership will allow the 700,000 businesses associated with Confcommercio to access, on favourable terms, highly specialized digital services and solutions through the territorial Associations, which in turn will be able to adopt the same tools, expanding and improving their own offering.

“Today it is unthinkable to operate in the market services sector without adopting the most innovative digital solutions, which are an additional opportunity for businesses to withstand the impact of the economic crisis triggered by the pandemic,” says Fabio Fulvio, head of the Marketing, Innovation and Internationalization department at Confcommercio. “For years we have been on the front line offering the best technological solutions and the most effective training to our members, so that they can integrate digital solutions into their businesses. InfoCert and Sixtema are ideal partners for our territorial and trade Associations, precisely because they offer cutting-edge digital solutions with all the necessary support and reliability.”

Until a few months ago, tools such as SPID, certified email (PEC) and digital signatures were all too often perceived as things to adopt merely to satisfy regulatory obligations. With the outbreak of the pandemic, their use became ever more common and frequent. Today, businesses see them as enablers of efficiency and facilitators of business.

“Italy, with a full 19 million SPID identities issued, holds the European record for digital identities. This is just one example of how much digital transformation is changing the daily lives of citizens and companies and, even more, will determine their near future,” comments Carmine Auletta, Chief Innovation & Strategy Officer at InfoCert, Tinexta Group. “Digital Trust is by now an indispensable strategic factor in business competitiveness. In a scenario as complex as the current one, and in an ever more global market, associations like Confcommercio play a decisive role in accelerating this cultural change.”

With this in mind, Confcommercio is also planning specific training sessions for its local representatives, delivered in collaboration with Sixtema, which will moreover be their point of reference for all second-level support.

“We will put at the service of Confcommercio, its territorial branches and its member businesses the long experience we have gained in designing and developing ad hoc technological solutions and in specialized consulting for SMEs and trade associations. Having the opportunity to contribute to the innovation of an entrepreneurial and associative audience as vast as Confcommercio’s is at once a great challenge and a source of genuine pride,” concludes Claudio Scaramelli, CEO of Sixtema, Tinexta Group.

InfoCert SpA

InfoCert, Tinexta Group, is the largest European Certification Authority, active in over twenty countries. The company provides digitalization, eDelivery, digital signature and digital document preservation services, and is an AgID-accredited manager of digital identity within SPID (the Italian Public Digital Identity System). InfoCert invests significantly in research and development and in quality: it holds a substantial number of patents, while its ISO 9001, 27001 and 20000 certifications attest to its commitment to the highest standards in service delivery and security management. InfoCert’s Information Security Management System is ISO/IEC 27001:2013 certified for EA:33-35 activities. InfoCert is a European leader in Digital Trust services fully compliant with the requirements of the eIDAS Regulation (EU Regulation 910/2014) and the ETSI EN 319 401 standards, and aims to keep growing internationally, including through acquisitions: it holds 51% of Camerfirma, one of the leading Spanish certification authorities, and 16.7% of Authada, a cutting-edge German identity provider. Finally, InfoCert owns 80% of the shares of Sixtema SpA, the technology partner of trade associations.

Sixtema SpA

Sixtema is the Digital Enabler for associations, Small Finance and SMEs, promoting the digital transformation of Italy’s productive and entrepreneurial fabric.

Sixtema offers vertical platforms for the end-to-end management of members of the country’s most important representative organizations, fostering the development of local economies.

Through its solutions, it makes innovation accessible for the digitalization of processes, guaranteeing security and full regulatory compliance.

With its Consulting services, it supports companies with advisory work, guiding them operationally through digital transformation and regulatory change.

Its 130 employees work from offices in Modena, Florence, Ancona and Rome.

Tinexta Group

Tinexta, listed on the STAR segment of the Milan Stock Exchange, reported the following consolidated results as of December 31, 2020: revenues of EUR 269.1 million, EBITDA of EUR 77.9 million and net profit of EUR 37.9 million. Tinexta Group is among Italy’s leading operators in four business areas: Digital Trust, Cybersecurity, Credit Information & Management, and Innovation & Marketing Services. The Digital Trust Business Unit provides products and solutions for digitalization through InfoCert S.p.A., Visura S.p.A., Sixtema S.p.A. and the Spanish company Camerfirma S.A.: digital signatures, digital identity, customer onboarding, electronic invoicing and certified email (PEC) for large companies, banks, insurance and financial companies, SMEs, associations and professionals. The Cybersecurity Business Unit operates through Yoroi, Swascan and Corvallis and is one of the national hubs for research into and delivery of the most advanced data protection and security solutions. In the Credit Information & Management Business Unit, Innolva S.p.A. and its subsidiaries offer services supporting decision-making processes (business registry and real estate information, aggregated reports, synthetic ratings, decision models, credit assessment and recovery), while RE Valuta S.p.A. offers real estate services (appraisals and valuations). In the Innovation & Marketing Services Business Unit, Warrant Hub S.p.A. is a leader in subsidized finance consultancy and industrial innovation, while Co.Mark S.p.A. provides Temporary Export Management consultancy to SMEs to support their commercial expansion. As of December 31, 2020, the Group had 1,403 employees.

Confcommercio

Confcommercio-Imprese per l’Italia represents over 700,000 businesses in commerce, tourism, services, transport and the professions, making it Italy’s largest business representation. Through its extensive and widespread associative system (territorial, trade and sector), Confcommercio protects and represents its member businesses before national and international institutions, promoting the role of the market services sector and the service economy. The Confederation’s institutional activities include negotiating national labour contracts (for commerce and for tourism) and collective agreements; promoting entrepreneurial training; and promoting affiliated structures, bodies, associations and institutes aimed at developing the sectors and businesses it represents.

* * *

For further information:

InfoCert
Press Relations Advisor: BMP Comunicazione per InfoCert, team.infocert@bmpcomunicazione.it
Pietro Barrile +393207008732 – Michela Mantegazza +393281225838 – Francesco Petrella +393452731667
www.infocert.it

Tinexta S.p.A.
Chief External Relations & Communication Officer: Alessandra Ruzzu
Press Office Manager: Carla Piro Mander, Tel. +39 06 42 01 26 31, carla.piro@tinexta.com

Media Advisor
Barabino & Partners S.p.A., Foro Buonaparte, 22 – 20121 Milano, Tel. +39 02 7202 3535
Stefania Bassi: +39 335 6282 667, s.bassi@barabino.it

Specialist
Intermonte SIM S.p.A., Corso V. Emanuele II, 9 – 20122 Milano, Tel. +39 02 771151

Confcommercio
Central Communication and Image Department, Press Office
Francesco Ragaini +39 335 1253 794, f.ragaini@confcommercio.it
www.confcommercio.it

The post Confcommercio, InfoCert e Sixtema insieme per l’innovazione delle PMI appeared first on InfoCert.digital.


Stati Generali della Conservazione digitale 2021

Once again public administration, institutions and universities, under the patronage of AgID, come together to highlight the strategic importance of digitalization processes.

On April 16, Assintel and the Documentation Laboratory of the University of Calabria, in collaboration with AssoConservatori Accreditati and under the patronage of AgID, are organizing the Stati Generali della Conservazione. Given the extension of anti-Covid restrictions, the event will take place online, with numerous prominent guests taking turns in an event entitled “Recovery Fund and the quality of digital preservation for the country’s competitiveness”.

Recovery Fund and the quality of digital preservation for the country’s competitiveness

Once again this year, the now well-established event brings together the various stakeholders on the topic of digital document preservation. The event, sponsored by AssoConservatori Accreditati (the Assintel section dedicated to digital preservation) and by the University of Calabria, co-organizer of the event, aims to gather all the actors of this complex ecosystem and foster discussion on how to achieve and complete digitalization, with a view to improving the country’s competitiveness.

The event can be attended online from this page: https://www.assintel.it/eventi/stati-generali-conservazione-2021/

InfoCert, a point of reference for compliant digital preservation

InfoCert was among the first Italian companies to be accredited by AgID as a preservation provider, a regulatory requirement for delivering digital preservation services. Since then it has been a point of reference for professionals, companies and public administrations that want to digitalize their preservation processes with its LegalDoc solutions.

For years InfoCert has also been part of the AssoConservatori Accreditati community, a network that brings together ICT companies in a section dedicated to private companies accredited by AgID for digital document preservation services. The reason for this specialization is simple: to protect the market from public tendering systems polluted by logic that does not safeguard quality and free competition.

The post Stati Generali della Conservazione digitale 2021 appeared first on InfoCert.digital.


Elliptic

Crypto Regulatory Affairs: Bank Bosses Look for Regulatory Clarity in Crypto

2021 has been a big year for the banking-crypto convergence. Major financial institutions such as JP Morgan and Goldman Sachs have announced crypto-related offerings for their clients - part of a broader trend that's seen banks looking to crypto to spur growth and innovation. 


GIMLY

NFTs and the need for Self-Sovereign Identity

NFTs have great potential for provable scarcity and ownership, yet the lack of a verifiable identity layer limits the fulfilment of this potential. This article presents a case of NFT fraud and copyright infringement, and describes how self-sovereign identity (SSI) can be the solution to verify the origin and legitimacy of an NFT and its linked object.

In this article, we first highlight the great potential that NFTs have in terms of provable scarcity and ownership, as well as an important shortcoming that is limiting the fulfillment of this potential: the lack of a verifiable identity layer. We describe how this lack of verifiable identity not only limits the use of NFTs for provable scarcity and ownership – it opens the gates for fraud and scams. Finally, we describe how self-sovereign identity (SSI) can be the solution to verify the origin and legitimacy of an NFT and its linked object, and we invite you to join Gimly and bitcoin artist Petek@RaydarRayne on our journey to fulfill the potential of NFTs for digital and physical artists alike.

Non-fungible tokens: the basics

Non-fungible tokens (NFTs) are provably unique digital assets. They cannot be duplicated or divided, creating digital scarcity. NFTs contain information linking them to a digital or physical object and are recorded in smart contracts on decentralized blockchain networks, which makes the record mathematically tamperproof. Ownership of an NFT is provably unique, which is why investors and collectors are excited about this new digital format allowing them to hold trustworthy collectables in the digital space. Under the hood, NFTs are software programs that exist on decentralised blockchains like Bitcoin and Ethereum, which give them their unique trust properties. Learn more about NFTs here.

This works very well for on-chain virtual objects that live on the blockchain, which can be provably and uniquely linked to the NFT. Good examples of this are digital cats on Cryptokitties, trading cards on Pepecard, virtual land in Decentraland and video game items on Ultra.

However, when we want to issue NFTs representing a physical or digital object that is not on the blockchain – such as a piece of art – we run into problems. Because the NFT and its object are not immutably linked, important questions arise about what it is that the NFT conveys ownership over. And is the creator of the NFT even authorized to convey such ownership? What is the legitimacy of the NFT and the creator of the NFT in the first place?
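The gap between a token and its off-chain object can be made concrete with a toy sketch. Everything here is illustrative (plain Python, not a real smart contract; the `did:example:` identifier is invented): hashing the asset's bytes at mint time at least binds a token to the exact object it claims to represent, though nothing yet proves the minter had any rights to that object.

```python
import hashlib
import uuid

class NFTRegistry:
    """Toy, in-memory sketch of NFT minting (not a real blockchain).
    Each token stores a SHA-256 digest of the underlying asset plus a
    creator identifier, so a buyer can at least check that a token matches
    the exact bytes it claims to represent. It does NOT prove the minter
    was authorized to sell the asset."""
    def __init__(self):
        self.tokens = {}

    def mint(self, creator: str, asset_bytes: bytes, owner: str) -> str:
        token_id = uuid.uuid4().hex
        self.tokens[token_id] = {
            "creator": creator,
            "asset_hash": hashlib.sha256(asset_bytes).hexdigest(),
            "owner": owner,
        }
        return token_id

    def matches_asset(self, token_id: str, asset_bytes: bytes) -> bool:
        token = self.tokens[token_id]
        return token["asset_hash"] == hashlib.sha256(asset_bytes).hexdigest()

registry = NFTRegistry()
artwork = b"...raw image bytes of the artwork..."
token = registry.mint(creator="did:example:artist", asset_bytes=artwork, owner="alice")
assert registry.matches_asset(token, artwork)            # the exact bytes match
assert not registry.matches_asset(token, b"a screenshot")  # different bytes fail
```

Note the limit of this binding: a scammer can just as easily mint a hash of a screenshot, which is precisely the identity problem this article goes on to describe.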


“Genesis” - Petek’s original S2F painting, commissioned by PlanB.
The fake “The Fourth Turning” NFT

The identity problem

Petek @RadarRayne, a Bitcoin artist well known for her artwork representing the stock-to-flow model, created in collaboration with PlanB, recently found herself asking exactly these questions. A few weeks ago, she discovered that rarible.com listed an NFT falsely representing one of her latest prints in the limited series “The Fourth Turning”. The NFT was created by a scammer and was not actually backed by any of her artwork. It had nothing to do with Petek’s art and was a serious infringement of her copyright. So she asked herself: how could this happen?

Most blockchains, such as Bitcoin and Ethereum, are pseudonymous. Accounts on the blockchain do not reveal any information about the identity that controls them. Anyone could take a screenshot from Petek’s website and mint an NFT of her artwork. There is no reliable way for a potential buyer to verify the origin and legitimacy of the NFT’s creator. A buyer would need to visit Petek’s website, and even reach out in person, to verify whether the NFT is legitimate.

Thankfully, in this case some of Petek's clients indeed reached out to her and Petek was able to have the NFT listing removed. But surely, there will be many more similar cases where such fraud is not detected in time.

This example deals with a physical piece of art, but the lack of a verifiable identity at the origin of an NFT is equally problematic for digital art, and it also leads to intellectual property and copyright infringements. Of course, it is possible to trace the chain of custody back to the creator’s public address and see whether similar artwork from the same artist was created using that address. But an instant and foolproof way to verify the legitimacy of an NFT’s creator is lacking nonetheless. Without such verification built into the NFT, an NFT proves ownership only over the NFT itself, and nothing more.

Self-sovereign identity: verifying the origin and legitimacy of an NFT and the linked object

This is exactly the problem that self-sovereign identity (SSI) solves! SSI is a new set of standards that guides a new identity architecture for the Internet. With a focus on privacy, security and interoperability, SSI applications use public key cryptography in concert with public blockchains to create persistent identities with private and selective disclosure of information for people (and, more generally, identity information for all kinds of entities: organizations, objects, IoT devices and more).

Because SSI is built for blockchain-based identities, it is the perfect solution to bring identity to NFTs! SSI applications enable the creator or artist to provably sign off that a digital or physical asset was created by them. Buyers can then verifiably check that they are indeed purchasing an NFT created by the artist.
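The sign-off flow above rests on ordinary public-key signatures: the artist signs a digest of the asset with a private key, and anyone can verify it against the published public key. The sketch below uses deliberately tiny, insecure textbook RSA numbers purely so it runs with the standard library; a real SSI stack would use proper key material (e.g. Ed25519) and W3C Verifiable Credentials, and the artwork string is invented.

```python
import hashlib

# Toy textbook RSA key pair (p=61, q=53): n public modulus, e public
# exponent, d private exponent. Utterly insecure; for illustration only.
N, E, D = 3233, 17, 2753

def digest(data: bytes) -> int:
    # Reduce a SHA-256 digest into the toy key's range.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % N

def sign(data: bytes, d: int = D) -> int:
    """Artist signs the asset digest with the private exponent."""
    return pow(digest(data), d, N)

def verify(data: bytes, signature: int, e: int = E) -> bool:
    """Anyone can check the signature using only the public exponent."""
    return pow(signature, e, N) == digest(data)

artwork = b"The Fourth Turning, print 7/50"
sig = sign(artwork)
assert verify(artwork, sig)                 # legitimate, artist-signed NFT metadata
assert not verify(artwork, (sig + 1) % N)   # a forged signature fails verification
```

In an SSI setting, the public key would be anchored to the artist's decentralized identifier, so "verified by the artist's key" becomes "verified by the artist" in one step.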

Without such verification, an NFT should not be considered legitimate, unless buyers don’t mind it being “backed by thin air”. SSI is what lends an NFT value beyond the token itself, immutably and cryptographically connecting it to the underlying asset in perpetuity.

Learn more about SSI and IoT with non-fungible tokens

We are currently consulting with Petek on how we can future-proof her artwork against fraud by offering SSI-backed NFTs whose value is connected to the actual artwork through “artist approved” verification methods and provable ownership with our self-sovereign identity NFC chips.

Do you want to work with us? Be sure to reach out via hello@gimly.io, or visit our contact page below.

Contact

PingTalk

The Temporary State of Impossible, with Mick Ebeling

In my work with companies, corporations and governments around the world, one of the themes I frequently encounter during these interactions is that something can’t be done because “it’s impossible.” This mindset discourages innovation, stalls organizational progress and limits business success—and once entrenched, is extremely difficult to overcome. It’s a mentality that’s pervasive in the identity space, where it’s impossible to know for sure that every person we interact with in the digital world is who they say they are; at least that is what we’ve led ourselves to believe.


Dark Matter Labs

Breaking the Silos Through Participatory Systems Mapping


Youth Employment and the Future of Work in Bhutan and Beyond

By Eunji Kang and Eunsoo Lee (Dark Matter Labs), Tshering Wangmo, Tshoki Zangmo and Bishnu Chettri (UNDP Bhutan)

Originally published by UNDP Bhutan, this blog documents the collaboration between UNDP Bhutan and Dark Matter Labs in a 3-month project investigating the systemic challenge of youth employment and the future of work in Bhutan. The intelligence report published by UNDP Bhutan can be found here.

*This blog includes a link to a glossary which defines the terms used.

Introduction
Bhutan in transition

Bhutan, a country of roughly 760,000 people and one of the very few carbon-negative countries in the world, faces unique challenges in designing a viable and sustainable employment market for its growing youth population. With a median age of just 27, it is experiencing an unprecedented demographic bulge that is projected to increase further in the next couple of decades. To make matters worse, youth unemployment rates have remained consistently high over the past decades, peaking at 15.7 percent in 2018. Now that COVID-19 has impeded major industries such as hospitality and tourism, the economic shocks have hit young people the hardest. According to the Rapid Socio-Economic Impact Assessment on Bhutan’s Tourism Sector conducted in April 2020, the pandemic has impacted over 50,000 young people working in the tourism and hospitality sector; they made up 50% of the welfare applications received in 2020.

These demographic and economic shifts prompt questions about how Bhutan’s economy and society may develop in the 21st century, and within this, what the future of work would look like. The pandemic, while posing several challenges, also presented opportunities to redefine the country’s employment scenario with many educated, experienced Bhutanese diasporas returning home from abroad. In the midst of huge technological leaps taking place globally, there is a growing demand for new markets and business models, new jobs, the reshaping of old industries and the re-definition of work itself in Bhutan as well. Central to this is the need to unleash human potential — creating capabilities in the system for innovation, purpose, shared value creation, and the realisation of new futures.

Systems approach to youth employment — a methodology for collective sensemaking
How can we think beyond job markets, and expand our understanding of the employment issue?

Too often, unemployment is viewed as a supply-demand problem, where solutions are designed to trigger job creation (increasing supply) or job training (increasing employability and fulfilling demand). But what if that is not enough? The persistence of the problem in Bhutan, and elsewhere, suggests that intervening in the job market alone does not provide a sustainable solution. The behaviour of the individual, and the social and economic environments that the labour market is embedded in, are shaped by numerous interconnected factors such as the availability of social protection, parental influence, changing expectations, the stigma attached to certain professions, access to digital infrastructure, and so on. We call the sum of these complex interdependencies a system (or systems). This is how we — UNDP Bhutan and Dark Matter Labs — approached Bhutan’s youth employment issue in this 3-month project. The challenge for us was to find a methodology that would allow us to build an awareness of the systemic nature of the issue at stake, and to broaden our horizons in order to imagine new solutions.

Figure 1. Layering and communicating complexity through systems mapping
How can we understand and embrace complexity?

Complexity begets complexity for wicked problems. An attempt at analysing the intricacies of systems — the pain points, underlying drivers, risks, and numerous interconnected factors — can easily become overwhelming for one person. Moreover, if we have ten people attempting to do the same, we would probably end up with ten different versions of the analysis. So, how do we prevent ourselves from getting lost in this vast web of interdependencies? How do we do this collectively? How do we build a shared comprehension that can steer us towards the coveted solution space?

Through extensive engagement with multiple stakeholders, UNDP Bhutan and Dark Matter Labs experimented with the methodology of participatory systems mapping as a process of engagement — creating a shared language and comprehension. More importantly, the systems map that we collectively developed, functioned — and still functions — as a critical storytelling tool that represents the diverse interests and experiences of the people involved in the process. Below, we share some key learnings from this process.

Complexity vs. Simplicity

Using data collected from prior research (which involved desk research and interviews) and ideas generated through stakeholder workshops, we mapped the connections between factors — each representing a problem — that influence individuals’ options and behaviors regarding work: from family background to market conditions, and global and regional forces. To organise and process the vast number of nodes (factors) and their connections, we created a classification system using the following domains.

Figure 2. Classification system for nodes, using contextual and direct domains

As we mapped the connections, we realised that the more connections there were, the more challenging it was to make sense of the data. Multiple layers of lines made it difficult not only to track connections visually, but also to see patterns and hierarchies. However, privileging simplicity might risk oversimplification: leaving out details and erasing the nuances that make the map authentic. So the challenge was twofold: we needed to create a visual solution that could communicate complexity without sacrificing legibility, and devise a narrative solution that could preserve, and allow us to peel back, the layers of complexity.

We did the latter by highlighting a series of micro-narratives that each represent a critical issue that contributes to the challenge of youth unemployment in Bhutan. Each narrative comprises a cluster of nodes (pain points) and takes into account the weight of the connections (which is determined by the number of connected nodes). Superimposed on the map itself, these micro-narratives provide an entry point into the critical issues, and allow zooming in at the numerous branching connections.
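The weighting described above can be sketched in a few lines, assuming (as a simplification) that a node's weight is just its degree, i.e. the number of connected nodes. The factor names below are illustrative, not taken from the actual Bhutan map.

```python
# Toy sketch: weight nodes in a systems map by number of connections.
from collections import defaultdict

edges = [
    ("stigma of certain professions", "career choice"),
    ("parental influence", "career choice"),
    ("access to digital infrastructure", "remote work options"),
    ("remote work options", "career choice"),
    ("social protection", "risk-taking"),
    ("risk-taking", "entrepreneurship"),
    ("career choice", "youth unemployment"),
    ("entrepreneurship", "youth unemployment"),
]

def node_weights(edges):
    """Weight each node by its degree (number of connected nodes)."""
    weight = defaultdict(int)
    for a, b in edges:
        weight[a] += 1
        weight[b] += 1
    return dict(weight)

weights = node_weights(edges)
# The heaviest node is a natural anchor for a micro-narrative cluster.
top = max(weights, key=weights.get)
```

In this toy map "career choice" carries the most connections, so it would anchor a micro-narrative, mirroring how clusters were chosen on the real map.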

Figure 3. Micro-narratives composed of multiple interconnected nodes
Navigating complex systems

A map is not merely a representation of a place (in this case, a system); it also serves a navigational function. Just as geographic coordinates give you a sense of location, and terrain data informs your direction, a systems map helps us orient ourselves, find direction, and make better decisions. The micro-narratives are one layer of information, the domain classifications another. These layers were designed not to simplify information, but to provide entry points from which the full complexity of the system can be explored. And, because this is an open-ended map that has been validated and iterated constantly through the stakeholder workshops, the narratives also provoke and invite further engagement and augmentation.

Perspectives from within the system

While drawing inspiration from the system dynamics model which sees the world as a complex system of feedback loops, our approach to systems mapping diverged in its method and purpose. The participatory — as opposed to the technical — approach to systems mapping invites multiple stakeholders, experts and non-experts alike, to map their knowledge, experience, and perspectives onto a collective map. Rather than the map itself, it is the process of co-creation, of collective reflection and discussion, that leads to a shared comprehension of the problem at hand. Often, it is easy to interpret the systems map as something factual, providing a holistic overview of the system from a distance. Systems maps crafted through the participatory process may not serve such an objective function. Instead, they can demonstrate the collective insights gathered from those within the system.

Figure 4. Identifying “opportunity domains”, highlighting possible interventions across the whole system.
New possibilities of sensemaking in the digital age

During the process of mapping and synthesising, we used digital tools such as Graph Commons and Kumu that played a vital role in supporting human synthesis and discovering insights. By interacting with machine-generated clusters and correlations, we became more aware of the human biases involved in creating connections and causal relationships. For example, due to the nuances of language: the way in which a node was described had an effect on how we drew connections, and thus required continuous crafting and validation. The digital tools and algorithms helped us in this process and contributed to the visualisation and compounding of our very human sensemaking efforts by highlighting and weighting the connections we had made.

Figure 5. Process of ‘human’ and ‘machine’ sensemaking

We are aware of the growing number of tools and technologies that support our human sensemaking capacity, such as sentiment analysis, factor analysis, network analysis and many more. Perhaps through these new methods of combining human and machine analysis, while acknowledging the limitations and possibilities of both, we could arrive at a more nuanced and plural understanding of the system.

The opportunity space — creating horizontal capacities for innovation
Building a portfolio of experiments across the system

Complex systems require a variety of different actors: citizens, civil society organisations, research organisations, entrepreneurs/local businesses and governments working in coalition to build transformative, long-term visions, and diverse solutions. The portfolio approach enables us to create a network of experiments — for collective learning, hypothesising, testing and validating across different sites. It can reduce information asymmetries and allow different actors to share and manage risks in the real world. The catalyst for change may not come from one organisation or one single intervention. Instead, it can emerge in the form of portfolios, and networks of portfolios, owned and developed by multiple organisations invested in and committed to the same mission.

Every actor in the system can create their solutions

The portfolio approach brings into perspective the possibilities of a dynamic innovation mechanism, where the number of experiments multiplies by the number of actors involved, and interventions can generate positive change across different parts of the system, simultaneously. It leads to a number of critical questions as well. How can we ensure that these experiments are not carried out in silos? How can we facilitate actors to organise around shared missions? How can ownership be distributed, equitably, regardless of the power dynamics within the system? How can we create the conditions for deep system awareness, and facilitate learning between actors? These questions prompt us to think about the capacities and capabilities that could be built into the system to enable the matters of the “how”.

The framework for the portfolio of experiments that we designed, partly reflects these concerns:

Figure 6. A framework for a portfolio of experiments developed in order to identify investment gaps. Ideas from participants from the systems map workshops were mapped on to the framework.

On the Y axis, we categorise types of experiments, ranging from governance and regulatory to financing and cultural experiments. On the X axis, capacities are defined according to scale, ranging from individual and collective to system capacities. Instead of seeing the experiments as direct solutions to tackle the problem (a form of product innovation), this framework prompts us to ask what the experiments can catalyse in terms of building the capacity to innovate from within the system. Do they contribute to building individual capacities: a sense of agency, awareness, the ability to imagine and collaborate? Do they contribute to our collective capacities for shared sensemaking, decision making, and generating political will and legitimacy? And/or do they build the infrastructure and mechanisms at the system level for shared accountability, shared data, knowledge, financial structures, and metrics that could help us prioritise and measure the success of our interventions?

These perspectives and questions provide a glimpse into the potentials of shifting the focus from a service/product centred innovation (emphasis on the what) to a capacity/capability centred approach (touching upon the how) where every actor in the system can innovate and build their solutions/portfolios towards long-term transitions. It also brings to attention the role of backbone organisations such as UNDP and its partners in facilitating and provoking systems change. Perhaps a first step towards this shift is cultivating and refining the methods to build deep system awareness, allowing us to explore freely, digest, validate, and socialise the complexity of the system.

Next steps

Moving forward, UNDP Bhutan and Gross National Happiness Commission (GNHC) will continue the process of co-creating a portfolio of experiments with relevant stakeholders in Bhutan and beyond. Experiments will be led by different actors, but with a common portfolio logic and a common framework to support collective intelligence; as well as the capacity to support continuous learning, testing and experimentation. The portfolio space aims to institute a framework where experiments, initiatives and interventions can over time coalesce, generating more intelligence, leveraging more connections, and accelerating learning and impact.

The portfolio logic design is a new adventure for the UNDP Bhutan team and we look forward to more learnings from the ground. If you are interested in learning more about our work and being a part of this journey with us, do shout out or reach out to us at innovation.bt@undp.org.

The team at Dark Matter Labs have started working with UNDP Bangkok Regional Innovation Centre and UNDP Philippines on another wicked problem of our time — food systems. We hope to continue our efforts in developing new methodologies and platforms that facilitate co-discovery and collaborative problem solving.

Note

If you want to find out more about how to engage with this mission, please contact UNDP Bhutan Innovation team, Youth Co:Lab and Dark Matter Labs: tshering.wangmo@undp.org, eleanor.horrocks@undp.org, eunji@darkmatterlabs.org

Acknowledgements

Special thanks to:
Eleanor Horrocks (UNDP Bangkok Regional Hub)
Hyojeong Lee (Dark Matter Labs), System visualisation

Breaking the Silos Through Participatory Systems Mapping was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 12. April 2021

Fission

Overview of LogSeq with Tienson Qin

An overview and demo of LogSeq, a privacy-first, open-source platform for knowledge sharing and management. Presented by project creator, Tienson Qin, at the April 8th Fission Tech Talk.

For our Thursday Tech Talk on April 8th, 2021, we were joined by Tienson Qin, creator of LogSeq. LogSeq is "a privacy-first, open-source platform for knowledge sharing and management".

The first fun surprise is that Tienson used LogSeq itself for the presentation. You can find the content on Tienson's page.

The long-term goal is to help children and more people to connect their thoughts with the past and the future

@TiensonQin

LogSeq was started in February 2020. Not as a "clone" of anything, but inspired by TiddlyWiki, Emacs Org mode, Roam Research, and more.

The logo is very basic – and the community is working on some upgraded designs – but it's meant to represent footprints in the sand: the marks you make in your life, gathering and sharing knowledge.

Supports both Markdown and Org-mode formats in the browser. You don't have to learn Emacs and you can still have all of the power of Org mode.

Doesn't quite have all of the features of Org mode implemented...yet! [1]

Tienson showed us the File System Access API: currently a Chrome-only browser API that allows reading from and writing to local file systems. This means being able to mix and match editing files in Emacs with LogSeq editing files from the browser, or any combination.

LogSeq is written in ClojureScript, plus OCaml and DataScript, because Tienson didn't have any JavaScript experience but did have lots of experience with Clojure:

From his background @tiensonqin didn't have JavaScript experience, but rather Clojure. So, built using ClojureScript for the front end, plus OCaml and DataScript pic.twitter.com/CxDqz4saG6

— FISSION (@FISSIONcodes) April 9, 2021

We got lots of demos of the query system. Uses DataScript for "raw" queries, as well as some built in LogSeq query syntax. Everything is customizable by the user, and editable from directly within LogSeq itself.

Tienson is working on this full time, and has a small team of other developers working on it too. There are 111 contributors on Open Collective, and LogSeq also supports two of their downstream dependencies:

excalidraw: a virtual whiteboard for sketching hand-drawn like diagrams
babashka: a native scripting environment for Clojure

It was very inspiring to hear from Tienson. Several times he referenced sharing knowledge and making it available to his daughter, and making LogSeq open source, privacy preserving, and available for years to come.

So awesome to hear from @tiensonqin about his wish for his daughter and others to keep building and sharing knowledge graphs and between all PKM tools. pic.twitter.com/EX93tMsv6f

— FISSION (@FISSIONcodes) April 9, 2021

As part of the final wrap-up, a LogSeq user, Hilary, mainly just wanted to say thanks to Tienson, which was awesome to hear:

No questions here, I just wanted to say that I’ve been using logseq for a while now and I love it - I’m not much of a developer, just a normal user keeping track of a lot of meetings, tasks, notes, and writing projects for work - and wanted to say thanks, Tienson, you are doing an awesome job.

We're definitely interested in seeing if we can add Fission webnative support for LogSeq, so that files can be synced and optionally encrypted, without having to have a Github account.

Resources

Use the app right in your browser: logseq.com
Visit the documentation: logseq.github.io, especially the New to LogSeq section
Source code on GitHub: logseq/logseq, including releases of the desktop Electron app
Feature requests and forum discussion: discuss.logseq.com
Sponsor on OpenCollective
Join the Discord chat
Follow the project on Twitter @LogSeq and Tienson @tiensonqin

[1] We had some discussion live around how long Emacs has been around. Emacs was created in 1976, Emacs Lisp came along in 1985, and Org mode started relatively recently, in 2003, "out of frustration over the user interface of the Emacs Outline mode".

Sign up for more Thursday Tech Talks like this one on the Fission events page »


KuppingerCole

Accelerate your Identity's Digital Transformation

Join Martin with Jackson and TJ from Clear Skye as they talk about how Clear Skye accelerates your Identity's digital transformation.





auth0

Auth0 Appoints Lucy McGrath as Vice President of Privacy

Former Thermo Fisher Scientific and NBCUniversal executive leads Auth0’s global privacy programs

Global ID

EPISODE 06 — Establishing trust and safety in tomorrow’s networks


Is the internet a healthy place? How do we ensure that the digital platforms and networks where we play and work are constructive environments?

In our latest episode of The GlobaliD Podcast, Greg Kidd talks trust and safety with Tiffany Xingyu Wang, Chief Strategy Officer of Spectrum Labs and co-founder of the Oasis Consortium.

Past episodes:

EPISODE 05 — How ZELF combines the power of payments and messaging
EPISODE 04 — The future of blockchain with the creator of Solana
EPISODE 03 — Should we trust Facebook?
EPISODE 02 — JP Thieriot on why Uphold isn’t delisting XRP
EPISODE 01 — The SEC’s crypto turf war and why XRP isn’t a security

Have a question for Greg? A topic you’d like covered? A guest you’d like to see? Let us know!

GlobaliD on Twitter
Greg on Twitter

For more of Tiffany:

Tiffany on Twitter and LinkedIn
Spectrum Labs on LinkedIn
OASIS Consortium on LinkedIn

EPISODE 06 — Establishing trust and safety in tomorrow’s networks was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Space Elephant \ Unikname

Webinar – 3 lessons to remember about securing web access

The article Webinar – 3 lessons to remember about securing web access appeared first on Unikname.
Webinar – 3 lessons to remember about securing web access
Charlène Demaret
Following a hack, the consequences can be dramatic... Securing your website effectively, without becoming "paranoid", is therefore essential. All the more so in the current health context, where the number of companies going digital grows by the day. This trend consequently brings a considerable increase in the risk of hacking.
Securing access to my websites in 3 lessons
Tuesday, April 27 – 10:30 AM

According to a study by OWASP, one of the most significant security flaws today (ranked second) is the poor protection of administrator and user access. In parallel, thanks to PT Security, we now know that hackers can take control of 9 out of 10 analysed sites through user access, and perhaps yours is one of them.

The goal of this webinar is first to make you aware of the risks incurred through a few missteps, and then to show you how to simply secure your administrator and user access so you can stay one step ahead of cyber attackers.

This webinar will cover:

Introduction – The website security market

The challenge – The risks of weak administrator and user account security

Lesson 1 – Human – Best practices to know

Lesson 2 – Technology – The solutions that address them

Lesson 3 – Management – Organisation & maintenance

Discussion – Questions & answers




www.bloki-chain.com

Automatic Draft


TRUST IN COLLECTION

BLOCKCHAIN FOR WASTE COLLECTION

WASTE TRACKING THROUGH THE BLOCKCHAIN

Blockchain technology can be used effectively to trace, in a certified manner, the collection, transport and disposal of waste.

This process covers the phases starting from work-order entry, notarizing its conformity and preventing tampering through a service that blockchain technology provides natively: smart contracts.

Through the use of smart contracts, the consistency of the data entered is verified at every step of the activity by running specific, case-by-case checks, guaranteeing incorruptibility and eliminating any risk of counterfeiting or anomaly.


TRANSPARENCY OF WASTE MANAGEMENT THROUGH THE BLOKI PLATFORM


WASTE MANAGEMENT 4.0

Digital technology has long provided new tools for businesses, where technology and the digitalization of processes have brought remarkable innovations to improve waste management.

Terms such as Blockchain, IoT, Machine Learning and Artificial Intelligence are increasingly used in the waste sector. These technologies will be ever more present in the future, guaranteeing better organizational performance and more effective results at every step of waste management, from the initial profiling of a new activity to pay-as-you-throw billing.
Relying on blockchain is a key step; integrated with management systems, it makes it possible to certify and make available, in a tamper-proof way, data, information, timelines and processes. It enables faster performance, lower processing costs, and greater security, with guarantees of sustainability and compliance.

Let us analyse a specific case. Legislative Decree no. 22/1997 (the so-called Ronchi Decree) and later Legislative Decree no. 152/2006 introduced three fundamental instruments:

 

· the Waste Identification Form (FIR)

· the Waste Loading and Unloading Register

· the Single Environmental Declaration Form (MUD)

The choice to use Bloki, integrated into the management platform, to certify data on internal waste-treatment processes is therefore motivated by the ability to archive, and provide certified access to, the documents listed above directly on the blockchain. In addition, every single operation carries a precise and immutable timestamp.

 

THE BLOKI PLATFORM

ADVANTAGES

The platform introduces several advantages for operators:

a backup of the information that is not tied to one's own storage infrastructure, but resides on Bloki, a ledger distributed across several nodes (servers)
easier verification by auditors or regulators, who can retrieve information about the company's treatment processes directly from the BLOKI blockchain, thanks to the immutability and censorship-resistance of the recorded information
an improved corporate image and reputation, conveying greater trust through the choice to share one's information in a fully transparent way


WASTE TYPES

Thanks to its flexibility, BLOKI can handle any type of waste and process. Certified Waste Tracking (Tracciamento Certificato dei Rifiuti) involves BLOKI supporting the operational phases, which can be outlined as follows:

Receipt of the activity request and opening of the task
Definition and application of the batch identifier
Certification of the various stages the task passes through
Closure of the task

The article Bozza automatica originally appeared on www.bloki-chain.com.


MyDEX

Hidden in Plain Sight — the Transformational Potential of Personal Data


This is the sixth and final in a series of blogs which provide edited extracts of key points made by Mydex CIC in its response to the UK Government consultation around a new National Data Strategy.

This blog focuses on the scale of the economic (and social) opportunity — and why it is often overlooked.

Previous blogs focused on how to unleash the full potential of personal data, why every citizen should be provided with their own personal data store, how to achieve these changes at scale, common misconceptions that derail progress and a review of the key components needed for the overall ecosystem to work.

To catch up on progress on our Macmillan My Data Store Pilot click here.

It’s odd to the point of bizarre, but it’s true. Today, there is endless hype about the enormous economic potential of data. That is why the UK Government is developing a National Data Strategy. Yet most debate (and therefore decision-making) about it demonstrates a deep misunderstanding of where this value lies and how it can be unleashed. For a National Data Strategy to be successful, it has to get its underlying economic logic right.

Cost plus versus cost out

Currently, nearly every Government proposal and policy relating to data (including personal data) treats data as a corporate asset that the organisation has invested in. The organisation, it is assumed, therefore needs to earn a return on this investment. Like any other product, data needs to be sold (or rented) for a margin. The entire focus is on measuring the potential size of ‘the market for data’, the supposed value of different bits of data in this market, and how to enable this market to work better. It’s all about the monetisation of data.

At first glance, this seems logical. After all, if organisations have invested a lot of time and money creating data assets, it’s only sensible that they should find a way to cover these costs. What this misses, however, is the opportunity to take cost out of the system as a whole — to move it to a different cost baseline — where new ways of sharing and using data pay for themselves many times over without anyone having the need to sell or ‘monetise’ anything.

Henry Ford’s mass production moving assembly line is a good example of the immense opportunities opened up by such system-wide ‘cost out’ approaches.

Before the moving assembly line, cars were extraordinarily expensive items, made painstakingly by craftspeople who made each component separately. Because each component was hand made and therefore slightly different, to assemble them into a working machine, they had to re-work every component to make it fit.

This required exquisite skill. The ability to do it well was a key source of the craftsperson’s added value. But it was also incredibly expensive. By relying on standardised components, Ford’s production lines eliminated the need for this rework. And by bringing each component to the worker when they needed it, his moving assembly line eliminated unnecessary time spent searching for, travelling to, or waiting for parts to arrive — thus reducing another layer of effort and speeding up outputs.

When Henry Ford first experimented with his ‘cost out’ moving assembly line, people were astonished by the efficiency gains it delivered. Before the experiment, it took 29 workers 15 minutes to make a magneto (which helps start ignition). After the experiment it took 14 workers 5 minutes: an 85% leap in productivity. Similar productivity gains followed as he applied the same methods to the rest of the car. Car output soared from 68,773 in 1912 to 735,020 in 1917. The price of his cars fell by 90%. Nobody had seen just how much waste was embedded into how the old system worked. The waste was previously invisible to them.
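The magneto figures above can be checked in a few lines; measured in worker-minutes per unit, the reduction comes out at roughly 84%, and the output per worker-minute multiplies about six-fold:

```python
# Labour input per magneto, before and after the moving assembly line.
before = 29 * 15  # 29 workers x 15 minutes = 435 worker-minutes
after = 14 * 5    # 14 workers x 5 minutes  =  70 worker-minutes

reduction = 1 - after / before  # share of labour time eliminated (~0.84)
speedup = before / after        # output per worker-minute multiplier (~6.2)

print(f"labour reduction: {reduction:.0%}")
print(f"productivity multiplier: {speedup:.1f}x")
```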

Given the extensive pollution, congestion and occasional carnage caused by the motor car, many people might say it ended up being more of a curse than a blessing. But we are not talking about the merits of the car itself here. We are talking about economic principles that powered an entire industrial revolution.

The moving assembly line eliminated huge amounts of waste that were embedded into every sinew of how the previous system worked: the waste caused by the need to rework every component and by poor logistics which meant that the right things could not get to the right people at the right time.

Personal data stores apply the same economic logic to transform the costs of producing data driven services. Verified attributes are the digital equivalents of Henry Ford’s standardised parts. By enabling one organisation to instantly re-use data verified by another organisation they eliminate the need for vast amounts of duplicated effort and rework (re-creating each data point from scratch or checking its details, provenance etc).

And by enabling individuals to hold these verified attributes in readiness, and to share them safely and efficiently with service providers when they need to be used, personal data stores act as data logistics engines bringing the right data components to the right places at the right time (the equivalent of Henry Ford’s moving assembly line).

Our early experiments are showing similar efficiency and productivity gains to those realised by Henry Ford. There is now an opportunity to create order of magnitude reductions in the costs of providing data-driven services (initially in the public sector). Just to be clear: by order of magnitude reductions in costs, we don’t mean five or even ten per cent here or there but a complete system that is five or ten times more efficient.

When dogs don’t bark

In one of his famous detective stories Sherlock Holmes notices something crucial that no one else had: he notices something that hadn’t happened.

Humans are hard-wired to notice things that happen. But it’s very difficult to become aware of things that are not happening — things that could or should be happening but aren’t.

This happened with the motor car. Before Henry Ford’s revolution, the idea that owning and using a motor car would become a mass market open to ordinary people was literally laughable — when Ford suggested it was possible, people laughed at him. They said there was no demand for it. They were right. But the reason why there was ‘no demand’ was because cars were prohibitively expensive. The dog wasn’t barking. People weren’t noticing something that wasn’t happening: making cars affordable.

When Ford made his breakthrough, the dog started barking. Instead of being an exclusive, privileged plaything of the very rich, motorised mobility was democratised. Result? Society was transformed as people started driving to work, to shops, to leisure destinations and to new homes in newly created suburbs. The 20th century mass ‘consumer / car economy’ was built on this breakthrough.

Online search is another example. Before Google, very few people conducted searches for information because doing so was so prohibitively expensive in terms of time and effort. Another non-barking dog. Then Google made search very easy, and now most people conduct searches dozens of times a day, spawning a vast new market apparently out of thin air.

Waking the data dog

Today, it’s commonly said that individuals don’t want to manage their data. “There is no demand for it,” we are told again and again. But that’s because, under the current system, it’s prohibitively expensive in terms of time and effort to do so. How many people want to invest precious time, effort (and sometimes money) finding, collecting and presenting the information they need to get stuff done: filling in forms; proving facts about themselves; trying to join the dots created by organisations that work in isolated silos? No wonder they do it as little as possible. Until the opportunity to do things differently is presented to them, helping people handle their data better remains another non-barking dog.

By applying those same principles of standardised parts and improved logistics — by waking that dog — personal data stores have the potential to democratise how society uses personal data, spreading the benefits to every citizen. Just like ‘the great car economy’, the way services are created and delivered will be transformed while making new ones possible — services that enable citizens to use their data to organise and manage their lives much better, to make and implement better decisions and to undertake associated administration across every aspect of their lives (money, home, education and career, travel, leisure etc) … all at a fraction of the previous cost and effort currently involved.

Everything we’ve written in our last five blogs has been focused on waking the 21st century dog of low cost, high quality, mass produced, privacy protecting, personalised, data driven services. Yet, time and time again we are told ‘there is no demand’ for a new personal data logistics infrastructure that empowers citizens with their data, just as Ford empowered citizens with mobility.

Why is this? Because they haven’t noticed that the dog is not barking. Their attention is focused on the current system as it is, not what it could be. They are simply not seeing the huge amounts of waste embedded in its workings (or if they do, they undertake ‘red tape reduction’ initiatives that reproduce the very causes of that waste) because they are not looking at the connections between the parts, only at how to improve the efficiency of each part in splendid isolation. They have not learned the lessons of mass production.

A positive feedback loop

All the really big service and quality of life breakthroughs of the past 100 years — including the provision of universal running water, sewerage, electricity, education and health services — have two things in common. First, they were first and foremost infrastructure projects, making something essential universally available at low cost. Second, they directly improved individuals’ lives and, at the same time, they also improved society’s health and resilience and the efficiency and productivity of the economy as a whole. They combined personal and public benefit.

Providing every citizen with their own personal data store — another case of universal service provision — follows this pattern. The way it achieves this combined benefit is through order-of-magnitude reductions in the friction, effort, risks and costs that individuals and service providers incur when attempting to collect, store, share and use personal data.

In a post-Covid world trying hard to ‘build back better’ the need for infrastructure that enables verified data about people to be shared while protecting their privacy has never been more apparent. Government today has a once in a generation opportunity to do something truly transformative and rebuild in a way that benefits everyone. This is what the National Strategy for Personal Data can and should be aiming for.

Hidden in Plain Sight — the Transformational Potential of Personal Data was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


www.bloki-chain.com

Bozza automatica

TRUST IN LOGISTICS      LOGISTICS AND SUPPLY CHAIN SECTOR

Efficiency is one of the most important factors of competitiveness in the market.

Companies' constant pursuit of agility, and their desire to reach the market with the right product, of the right quality and on schedule, collides with the absence of end-to-end visibility and with the uneven organisational and digitalisation levels of the companies involved.



Logistics is the set of organisational, managerial and operational activities that govern, within the company and the supply chain, the flows of materials and the related information across the network of firms, from their origins at suppliers through to the delivery of finished products to customers and after-sales service. As such, it is one of the key functions within the company. Viewing the company organisation as an open system interacting with its external environment, logistics represents the set of ways through which the firm receives inputs (of various kinds, both material and informational) and distributes its outputs towards its sales channels. It is fair to say that logistics defines the identity of the company and the way it is perceived both by those upstream in the production chain and by those downstream, right through to consumers.


BLOCKCHAIN ADVANTAGES


Blockchain answers many of the critical issues affecting this sector (poor product traceability, difficult attribution of responsibility, difficult operational integration, the possibility of fraud, and regulatory compliance that is hard to demonstrate), forming the basis for a value chain founded on the principles of transparency, security and traceability.

Blockchain, coordinated with sensor platforms and oracles to maximise the efficiency of the largest networks, stands as a foundation of the trust economy, and distribution chains are among its first fields of application.


USING BLOCKCHAIN

 

· addresses end-to-end visibility across all processes, offering maximum informational capacity

· guarantees the security of data and processes (production data, component traceability, material movements, logistics, stock, regulatory conformity and quality-standard compliance) for the benefit of all stakeholders

· produces resilient supply-chain management

· introduces simplification into every process step

The article Bozza automatica originally appeared on www.bloki-chain.com.


Bozza automatica


TRUST IN SIGNATURES

DIGITAL SIGNATURE RECORDED ON BLOCKCHAIN

 

You can choose to place your signature either on every page of the document or on a single page.

The "multi-signature" option lets several signatures be placed on the same document.
With BlokiSign, the 'signing round' among a document's different signatories is secure and tracked, with a timestamp on Blockchain.

In a few steps you add your signature and timestamp to the PDF document.
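The signing round among several signatories can be sketched as follows: every signer signs the same document hash, and each signature is logged with a timestamp. HMAC-SHA256 stands in for a real digital-signature scheme here, and all names are hypothetical, not BlokiSign's actual interface.

```python
import hashlib
import hmac
import time

def sign_round(doc_bytes, signers, ts=None):
    """Collect one signature per signer over the same document hash.
    `signers` maps a signer name to a secret key; HMAC-SHA256 is a
    stand-in for a real signature scheme in this sketch."""
    doc_hash = hashlib.sha256(doc_bytes).hexdigest()
    log = []
    for name, key in signers.items():
        sig = hmac.new(key, doc_hash.encode(), hashlib.sha256).hexdigest()
        log.append({
            "signer": name,
            "doc_hash": doc_hash,
            "signature": sig,
            "timestamp": ts if ts is not None else time.time(),
        })
    return log

def verify_entry(entry, key):
    """Recompute the signature and compare in constant time."""
    expected = hmac.new(key, entry["doc_hash"].encode(), hashlib.sha256)
    return hmac.compare_digest(expected.hexdigest(), entry["signature"])
```

Anchoring the resulting log on a blockchain, as the post describes, is what would make the order and timing of the signatures independently verifiable.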

Signature and Timestamp

 

 

 

 

The article Bozza automatica originally appeared on www.bloki-chain.com.


Bozza automatica


TRUST IN PEOPLE'S WORK

WHERE, HOW, WHAT AND WHEN

 

 

Consumers will be able to know in real time the species, the fishing method, the area of origin and the name of the vessel that caught what they are buying.

 

Aquaculture as a resource for guaranteeing food security: managing the life of the farm optimally thanks to sensors, satellite data on water quality, and Internet of Things systems able to regulate, for example, fish feeding according to environmental conditions; introducing advanced technologies for precision aquaculture.

Being able to choose consciously what to buy could make the difference, not only for buyers but also for sellers.

The article Bozza automatica originally appeared on www.bloki-chain.com.



TRUST IN SCIENCE


Ensuring the integrity of uploaded data


Through Bloki's blockchain network, data integrity is guaranteed by storing a single, immutable version of the truth. Bloki and Sanistory make it simple to manage sanitisation activities among network participants, who can collaborate with confidence as they exchange information while controlling access to the data.

Achieving full traceability

As information (bulletins, certificates, reports, analyses, photos, etc.) moves through the supply chain and is recorded in Bloki-Sanistory, an audit trail is created: the origins of an event can be traced back, ordered in sequence and accompanied by date and time. This helps reduce counterfeiting of that event, and participants can locate a piece of data in seconds so they can respond quickly.


Gaining new efficiency and operational capabilities

From dispute resolution to triggering the next stages of supply-chain transactions, Bloki's smart contracts can automate processes for greater speed and efficiency, acting automatically when the agreed conditions are met.

 
Technology to support your needs

Today, those who work in the sanitisation sector, following health protocols, play essential roles in our society. Eqr Lab is committed to providing operators in the sector with tools and services designed to draw more insight from data and to simplify sharing activities. We aim to help organisations reach their goals in terms of efficiency, resilience and reliability, so as to meet the needs of their communities.

The article originally appeared on www.bloki-chain.com.


TRUST IN WASTE

Certainty about the declared processes

DISPOSAL
Recording on the Bloki Blockchain the correct protocol for disposing of construction waste, from proper transport to the authorised collection and disposal centre through to the waste identification form.

DESIGN
Design, authorisation, construction, testing and usability processes, among others, can be certified in aggregate and shared according to the different roles involved (companies, designers, offices, inspectors) brilliantly by the Bloki Blockchain and its Smart Contracts.

MATERIALS
The origin and quality of the materials used: for each type of material, its specific discipline and qualification, and possession of approval for all technical conformity and suitability requirements. Everything is recorded on the Bloki Blockchain securely, timestamped, and easily and simply accessible. All materials can be tracked and traced from the moment of production through to final delivery on site. This means a tamper-proof data record is stored and managed for the entire life cycle of the material, providing deep transparency.

CONSTRUCTION
The construction phase too is characterised by this continuous, tamper-proof recording which, chained to all the other records, creates a single transaction and a value chain that, beyond the value of trust, also carries economic value. One example above all: tracking waste products, which in construction is both an economic and an environmental problem. Managing them via the Bloki blockchain, across the whole life cycle, would create a register of residues that could be reused. This efficiency is twinned with the personal accountability of those using the materials.

SALE
Finally, if trust understood as value is linked to the concept of a chain, creating a so-called chain of trust, the result is a reality (no longer a mere expectation), always under the control of the parties to a transaction, that goes by the name of value chain.

CONCLUSION
The technical and functional characteristics of Blockchains fit naturally into the construction sector and the maintenance of buildings and real-estate works. Design, authorisation, construction, testing and usability processes can be certified in aggregate and shared according to the different roles involved (companies, designers, offices, inspectors) brilliantly by the Bloki Blockchain and its Smart Contracts. From the moment a work is conceived to the moment it begins to be used, and afterwards throughout its useful life (corrective and preventive maintenance), the Bloki Blockchain can certify data, information and processes securely, with timestamps, simply and accessibly.

I.A.E.
CONSTRUCTION ARTIFICIAL INTELLIGENCE (Intelligenza Artificiale Edile)

ADVANTAGES
The advantages of Bloki for designers and builders, for the competent offices, and for owners and tenants are considerable:
• Cost savings
• Better quality of the works
• Optimisation of information and processes
• Verifiability and traceability
• Support for accountability, compliance and the tracking of the legal, financial and planning history of buildings, particularly those involved in judicial proceedings.

The article originally appeared on www.bloki-chain.com.


AGRI FOOD


TRUST IN FOOD

A new era for the food supply chain

 

People want to know where the ingredients they eat come from. And if that food is for their child, they rightly want a product whose traceability across the entire supply chain is visible and incorruptible.

Today, through blockchain, a supplier can guarantee this safety and quality for its products, improving its reputation, eliminating food fraud and waste, and delivering the promised quality.

Help us choose the best. Become a transparent, reliable supplier. Build on mutual trust... create acceptance. What we are asking of you is simple: record on the blockchain the data that made your product our safest food. In doing so, trust and acceptance may also arise from photo B.

The Value of Tracking

The ability to verify provenance... its whole history, from harvest to processing to transport, all the way to our table.

The article AGRI FOOD originally appeared on www.bloki-chain.com.


Aergo

CRISPY WHALES Makes a Splash on AERGO YouTube


In the quest to strengthen and expand the AERGO ecosystem of partners, AERGO has decided to incubate CRISPY WHALES through community Agora voting. There was 100% agreement from the community to approve the incubation funding for CRISPY WHALES.

How will CRISPY WHALES benefit AERGO? Let’s watch this video from CRISPY WHALES CEO Wonjin Lim:

https://youtu.be/JRTA4DjHmIs
Summary

Banana Clips is an online marketplace, created by CRISPY WHALES, where video content creators can sell or purchase short clips using web and mobile applications. Anyone with a smartphone can easily upload videos and anyone can easily search and purchase them via the web.

The uploaded short clips are analyzed by an AI system, and a unique steganographic watermark is then embedded in each clip through a copyright management system, to be used for identification and verification. The watermarked short clips are stored and protected on the AERGO platform.
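The identification step can be pictured as registering a fingerprint per clip and later matching candidate copies against the registry. A content hash is used below as a simplification of the watermark-plus-AI pipeline the post describes; all names are hypothetical, not Banana Clips' actual implementation.

```python
import hashlib

def register_clip(registry, clip_id, clip_bytes):
    """Store a fingerprint for an uploaded clip (simplified: a content
    hash rather than an embedded steganographic watermark)."""
    registry[clip_id] = hashlib.sha256(clip_bytes).hexdigest()

def identify_clip(registry, candidate_bytes):
    """Return the clip_id whose fingerprint matches the candidate, or None."""
    digest = hashlib.sha256(candidate_bytes).hexdigest()
    for clip_id, stored in registry.items():
        if stored == digest:
            return clip_id
    return None
```

A real watermark survives re-encoding while a plain hash does not, which is why the post's steganographic approach matters for copyright verification; the sketch only shows the registry-and-lookup shape of the system.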

Video content production and/or editing professionals using Banana Clips can purchase short clips and may use them free from copyright infringement through professional and advanced search methods.

CRISPY WHALES aims to provide a transparent marketplace through Banana Clips, ensuring that a wide variety of short clips is securely provided and that each user is fairly compensated. Furthermore, the company’s goal is to create a transparent and ethical culture by establishing a video copyright verification system, based on AERGO, that can be shared worldwide.

CRISPY WHALES Makes a Splash on AERGO YouTube was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 11. April 2021

KuppingerCole

Analyst Chat #71: Cybercriminal Behavior in the COVID Era

While the world tries to cope with the ongoing pandemic, cybercriminals have struck a gold mine. Annie and Matthias sit down again to chat about the overall picture of cyberattacks, including COVID-related lures.





Identosphere Identity Highlights

Identosphere #27 • Azure AD VC Demo • 7 Laws ID Standards • VC Use Cases

Welcome to Identosphere’s Weekly Update

If you haven’t already, you can subscribe and read previous issues at newsletter.identosphere.net.

Please support Identosphere on Patreon. Patron’s get special access to our quarterly report/summary. If you are new to the field this is a great resource highlighting the best of our coverage for the past 6 months.

Coming Up

The EOSIO Identity Working Group - Kickoff • April 12th
Gimly is excited to start the EOSIO identity working group (Twitter #eosio_id)! This open working group (WG) will create and foster identity solutions using EOSIO technology, by creating open W3C-compliant self-sovereign identity standards, interoperability, and ecosystem development for EOSIO-based identities.

Covid-19 Credentials Initiative: Use Case Implementation • April 13

Digital Wellness Passes: The Missing Links Plug and Play Travel • April 14
This event will expand on the takeaways from our first digital wellness pass event “The Missing Links” including interoperability, standards, authentication, backend data exchange with vaccine certificates/testing results, and much more.

Covid-19 Technology Innovations • April 14
“Explore the technology innovations being pioneered in response to the Covid-19 pandemic, and what potential for Scottish ventures this presents.” by Peter Ferry, Gary McKay and Julian Ranger (Siccar, APPII and digi.me)

Privacy by Design • Apr 14 
With Dr. Ann Cavoukian, Executive Director of the Global Privacy & Security by Design Centre.

Series on self-sovereign identity (SSI) Conditis • April 14
This monthly webinar series is an opportunity to hear from UK experts in distributed identity and learn the mechanics of decentralized identity systems.

Internet Identity Workshop XXXII (#32) • April 20-22 <—Register NOW. This is the main event for the whole community. The best place to dive in and get connected. 20% discount code: Identosphere_IIWXXXII_20

OpenID Foundation Virtual Workshop • April 29, 2021

PoCATHON by Affiniti • Mar 26 – May 9, 2021
We invite developers across the world to come and build applications that generate secure, portable and privacy-preserving credentials enabling trust across entities using Affinidi’s APIs

Identiverse 2021 • June 21-23 (Denver)

Covid-19: Everything You Need to Know About “Vaccine Passports” • IdentityWoman / Mother Jones

Andy Slavitt, a White House senior adviser for COVID response, specified at a March 29 briefing that “unlike other parts of the world, the government here is not viewing its role as the place to create a passport, nor a place to hold the data of citizens.” 

WHO goes there? Vaccination Certificates Technology and Identity Stephen Wilson

the proper goal of a digital vaccination certificate should be confined to representing nothing more and nothing less than the fact that someone received their jab. Such a Verifiable Credential would include the place, date and time, the type of vaccine, and the medico who administered or witnessed the jab.

We don’t need immunity passports, we need verifiable credentials Cointelegraph

In theory, their idea is great. In practice, it’s terrible. Or, as the Daily Beast put it: “Vaccine Passports Are Big Tech’s Latest Dystopian Nightmare.”

Standards and Organizations The 7 Laws of Identity Standards OpenID

An identity standard’s adoption is driven by the value of the reliability, repeatability and security of its implementations.

A standard’s value can be measured by the number of instances of certified technical conformance extant in the market.

Certified technical conformance is necessary but insufficient for global adoption.

Adoption at scale requires widespread awareness, ongoing technical improvement and an open and authoritative reference source.

When Libraries/Directories/Registries act as authoritative sources they amplify awareness, extend adoption and promote certification.

Certified technical conformance importantly complements legal compliance; together they optimize interoperability.

Interoperability enhances security, contains costs and drives profitability.

Verifier Universal Interface by Gataca España S.L.

This draft version can be found at https://gataca-io.github.io/verifier-apis/ and has been built using ReSpec.

This draft version of VUI currently includes 6 APIs:

Presentation Exchange

Consent Management

Schema resolution

Issuer resolution

ID resolution

Credential status resolution

DIF Steering Committee election coming up

Among the conclusions of this analysis was that a larger steering committee would garner more trust and visibility into DIF's internal governance as an organization. An operating addendum was adopted last month which formalizes procedures for periodic elections and distribution requirements.

Testing self-sovereign identity with the Lissi demo

We are convinced this demonstrated user flow can help to better understand the interactions in a digital identity ecosystem such as IDunion. [...] The Lissi team is in discussion with trust service providers, authorities, municipalities, agencies, associations and other relevant stakeholders to meet all the necessary requirements and provide you with the best user experience.

Verifiable Credentials Use Cases - Affinidi

Starting with an intro to VCs in March, Affinidi has been rolling out a series on Verifiable Credentials use-cases!

Driving License as a Verifiable Credential

Verifiable Credentials in Ben’s Serendipity

Think about it for a moment. No physical documents at all, but a simple and secure self-sovereign identity that Ben had complete control over. More importantly, look at the interoperability and flexibility as Ben could use them in different situations and across multiple platforms.

On-Demand Employment Endorsements

Opening a Bank Account

Accessing Medical Records Anywhere

Protecting Your Driver’s License

The EOSIO DID method specification

We have been working with the Decentralised Identity Foundation to shape this specification, and also want to thank the W3C Credentials Community Group for their support in the creation of the Verifiable Condition type, a necessary component to create the EOSIO DID document to represent EOSIO account permissions.

SSI In IoT, The SOFIE Project The Dingle Group

For the 22nd Vienna Digital Identity Meetup we hosted three of the lead researchers from the EU H2020-funded SOFIE Project. The SOFIE Project wrapped up at the end of last year; a key part of this research focused on the use of SSI concepts in three IoT sectors (energy, supply chain, and mixed reality gaming), targeting the integration of SSI without requiring changes to the existing IoT systems.

DIDComm has its own site

DIDComm lets people and software use DIDs to communicate securely and privately over many channels: the web, email, mobile push notifications, QR codes, Bluetooth, message queues, sneakernet, and more. 
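For a sense of what actually travels over those channels, here is a minimal sketch of a DIDComm v2 plaintext message before it is packed (signed and/or encrypted) for transport; the DIDs and the body content are illustrative assumptions:

```python
import json
import uuid

# Minimal DIDComm v2 plaintext message (illustrative DIDs and content).
# In practice this JSON is packed into a signed and/or encrypted envelope
# before being sent over HTTP, Bluetooth, a QR code, a message queue, etc.
message = {
    "id": str(uuid.uuid4()),  # unique id, used for threading and de-duplication
    "type": "https://didcomm.org/basicmessage/2.0/message",
    "from": "did:example:alice",
    "to": ["did:example:bob"],
    "created_time": 1618480000,
    "body": {"content": "Hello Bob"},
}

wire_form = json.dumps(message)  # what would be handed to the packing step
```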

Company Updates Elastos DID: What’s Ahead for 2021

Tuum Tech is working on an Elastos DID-based application called Profile, a rising competitor to LinkedIn – in large part by leveraging Elastos DIDs. Unlike in conventional applications where data and identities are managed and controlled by centralized systems, users will retain full ownership of their data when using Profile.

Spruce Developer Update #8

“We are currently working on a project that will enable creator authenticity for digital assets including NFTs.”

“focused on advancing did-tezos as the first formally verified DID Method.”

DIDKit Updates

Credible Updates

IDunion enters the second project phase initiated by the Federal Ministry of Economic Affairs and Energy of Germany

The goals of this new project phase include the establishment of a European cooperative, the launch of a production network, and the implementation of 40+ different pilot applications from several areas.

On Self-Sovereign Identity: What's the Business Value of SSI? Hackernoon

To businesses, immediate value comes from removing the cost and challenge of GDPR compliance. Forbes reports that in 2018, in the UK alone, $1.1 billion was spent by companies on GDPR preparation, while US companies spent over $7.8 billion protecting customers' personal information.

SAP Completes Pharmaceutical Industry Pilot to Improve Supply Chain Authenticity

today announced the completion of an industry-wide pilot utilizing self-sovereign identity (SSI) credentials to establish trust in the pharmaceutical supply chain for indirect trade relationships.

SAP Pharma Solution Supports Supply Chain Compliance

SAP has chosen an open, interoperable technology to validate all stakeholders in the pharma supply chain in order to provide customers with the best solution for compliance under the U.S. Drug Supply Chain Security Act (DSCSA) requirements. The DSCSA also limits stakeholders’ interactions to ATPs.

The benefits of Self-Sovereign Identity wallets: Collaborating on Self-Sovereign Identity wallets Rabobank (Part 2)

SSI and identity wallets will make it easier for citizens, organizations and governments to manage (digital) identities. It will make registrations and transactions across the internet private and secure. It will also make organizations more efficient and effective.

The Importance of Data Inputs and Semantics for SSI with Paul Knowles [Podcast]

The platform was an incredibly federated platform when I built it because I didn’t know that SSI existed. So as soon as I found that ecosystem, I tore up the rulebook and said, “This isn’t going to work; I have to rebuild it.”

Removing Anonymity Online Would Risk The Most Vulnerable Users Anonym

We all know online abuse can be incredibly damaging and Pew puts it like this: “In its milder forms, [online abuse] creates a layer of negativity that people must sift through as they navigate their daily routines online. At its most severe, it can compromise users’ privacy, force them to choose when and where to participate online, or even pose a threat to their physical safety.”  

Azure AD Verifiable Credentials Announcing Azure AD Verifiable Credentials MS ID Blog

We started on a journey with the open standards community to empower everyone to own and control their own identity. I’m thrilled to share that we’ve achieved a major milestone in making this vision real. Today we’re announcing that the public preview for Azure AD verifiable credentials is now available: organizations can empower users to control credentials that manage access to their information.

Azure AD Verifiable Credentials Entering Public Preview Kuppinger Cole

This solution enables organizations to design and issue verifiable credentials to their users, be it enterprises issuing employment credentials to their employees, universities enrolling students or issuing diplomas, governments issuing passports, ID cards, and countless other uses.

Azure Active Directory VCs - preview introduction Daniel Krzyczkowski

I have configured Verifiable Credentials according to the details in the documentation. I have an existing Azure AD B2C tenant, so it was much easier because users have to sign in first before they can be issued a verifiable credential.

Blogs On the Horizon: Tykn and Social Impact Through Digital Identity IdentityReview

The Turkish Government has recently announced that it will be using Ana to accelerate work permit distribution for its 3 million refugees. The Turkish Ministry of Foreign Affairs—alongside the United Nations Development Programme (UNDP), the INGEV Foundation, the World Food Programme (WFP), TÜBİTAK and the Istanbul Chamber of Commerce—developed this application with the intent of making refugees financially independent.

Self-Sovereign Identity and Government – Data Exchange Cybernetica

We then begin sharing portions of that data with third parties, leading to a situation where the data is now in three locations, the weakest of which is still most definitely the end-user’s method of storage, where all the data on them is accumulated.

Self Sovereign Identity Systems - The Passion Pad

We should have the right to manage our identity, free of any country or the place where we live. By giving this right to the government or any central authority, we give them much more power. Separating data rights from the actual data is important. Users should have the right to decide who has access to their data.

Creating Verifiable credentials in ASP.NET Core for decentralized identities using Trinsic

This article shows how verifiable credentials can be created in ASP.NET Core for decentralized identities using the Trinsic platform, a self-sovereign identity implementation with APIs for integration.

Videos Digital Identity, use Verifiable Credentials with Blockchain Microsoft Mechanics

Joy Chik, Microsoft’s Identity CVP, joins Jeremy Chapman to show you how it works and gives you the key steps to get up and running.

Jolocom's lightning talk at DWeb meetup - Self-sovereign Identity In Germany

A brief video introduction to use cases, strategies and challenges of the four German SDI projects.

Papers Blockchain, Self-Sovereign Identity and Digital Credentials: Promise Versus Praxis in Education

technology as a public good for the education sector. It leverages the lead author's perspective as a mediator between the blockchain and education sectors in Europe on high-profile blockchain-in-education projects to provide a snapshot of the challenges and workable solutions in the blockchain-enabled European digital credentials sector.

Not SSI but interesting NFTs on Holochain? Easy as passing the ball Holochain

in Holochain, every element on every user’s chain is already guaranteed to be unique, and you don’t get scarcer than that. You just need to track the history of who the unique thing has been transferred to.

What is up with Yat? NFTs.WTF

What’s strange about the project is that the people behind it (Richard Spagni and Naveen Jain) are not novices, but world-leading privacy experts and seasoned founders and investors in crypto, leading many to give them the benefit of the doubt that they will deliver on the promises to users and that there must be some reason for them to not share more technical details. [Yikes! -nfo]

Exclusive: Trust in tech cratered all over the world last year Axios

Edelman said the main reason for the trust fall is the increasingly “complicated” relationship between the public and technology — including the spread of misinformation, rising privacy alarm and bias in artificial intelligence.

What Are the Six Key Areas of the FATF Consultation? Elliptic

On March 19th, the Paris-based Financial Action Task Force (FATF), the global standard-setting body for anti-money laundering and counter-terrorism finance (AML/CFT), released its Draft Updated Guidance for a Risk-Based Approach to Virtual Assets and Virtual Asset Service Providers. Or, in compliance acronym-speak: the FATF's draft guidance for its RBA to VAs and VASPs.

Thanks for reading

Subscribe on newsletter.identosphere.net and support us at patreon.com/identosphere


KuppingerCole

The Future of Exchanging Value


by Anne Bailey

The European Central Bank (ECB) is exploring a digital euro. Early October 2020, the ECB Governing Council released a whitepaper discussing its initial research on such a project. One of the ECB’s clearest and most repeated messages is that a digital euro will not replace cash, but complement it. A digital euro is meant to facilitate electronic payments, while cash would remain a valid form of payment and the primary form of value storage.

The European Central Bank’s Exploration of a Digital Euro

This exploratory paper comes as a direct result of a digitalized economy whose momentum is ever growing because of new technologies, legislation meant to increase the competitiveness of digital competitors, and traction that digital currency projects are gaining around the world. Despite the changing world, and the changing needs and expectations of individuals, businesses, and other economic actors, a digital euro is not meant to upend the current monetary system. The paper is a painstakingly methodical exploration of how to architect, legislate, and standardize a digital euro to maintain financial stability and preserve the ECB's capacity to influence monetary policy.

The paper itself is a proactive effort from the ECB, dedicating time and resources towards determining the opportunities, risks, and costs of adopting a digital euro. This is also apparent in the scenarios that would motivate the ECB to pursue a digital euro project, the majority of them being a proactive step towards maintaining competitiveness, improving efficiencies, and increasing the stability of the monetary system.

Scenario (proactive or reactive?):

- A digital euro would foster the digitalization of the European economy: proactive
- Decline of cash payments: reactive
- A different form of money (digital currencies issued by other central banks, decentralized organizations, or private companies) becomes a credible alternative to the euro: reactive
- A digital euro would be necessary or beneficial to strengthen European monetary policy: proactive
- A digital euro would mitigate the negative effects that a cyber incident, natural disaster, or other extreme event would have on payment services: proactive
- Adopting a digital euro causes the euro to gain international relevance: proactive
- Adopting a digital euro improves the overall costs and ecological footprint of monetary and payment systems: proactive

This is a charitable view of the ECB, whose digital euro project is still in the proof-of-concept stage, with the decision on whether to formally launch it to be made in mid-2021 at the earliest. This means that in the time it will take to methodically prepare a digital euro project – as it should, this is not a project to leap into lightly – the ECB may fall into a reactive position, as cash payments continue to decline and other currency options become more prominent and potentially challenge the euro.

The New Way to Exchange Value

The ECB whitepaper explores the possible architectures that a digital euro could be built on, letting the decision be shaped by solution requirements such as privacy, user access to the digital currency, or protecting its use for payments rather than investment. The architecture question will be the focus of the ECB's future research, though we can predict that it will likely follow a centralized infrastructure with access facilitated by supervised intermediary banks. Decentralized architectures are part of the conversation, but options such as distributed ledger technologies (DLTs) are relatively untested at the scale that the ECB would require and pose higher risks to the stability and security of the Eurosystem. The overall risk-averse tone of the report, and the self-determined requirement that the back-end infrastructure of any digital euro project be controlled by the central bank, indicate a preference for known and/or centralized solutions, though many of the active central bank digital currency (CBDC) projects around the world are working with DLTs.

The interest the ECB is showing for alternative payment methods indicates two things: that the ECB has a desire to be a leader in shaping the future of payments, and that their decision whether or not to pursue a digital euro will have a strong influence on the emerging use cases for the digital exchange of value between parties. This is not simply a digital payment, but the digital exchange of a request accompanied by trusted identity, resulting in the receipt of value. For example, the user of a carsharing service unlocks the car door with a one-time key issued to their mobile app. Or a hotel booking automatically issues room keys and admission to the building upon payment. Or the owner of an electric vehicle agrees to sell back reserve power to the grid. These are use cases where a digital transaction does not end with the successful transfer of money, but where payment is the trigger for the exchange of value, making the transaction a two-way process.
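A toy sketch of such a two-way transaction, in which a successful payment is the trigger that releases the non-monetary value (here, a one-time car key for a carsharing service). All function names are hypothetical; the payment rail is a stand-in, not any real digital-euro API:

```python
import secrets
from typing import Optional

def settle_payment(amount_eur: float) -> bool:
    """Stand-in for a payment rail (e.g. a digital euro); succeeds for any positive amount."""
    return amount_eur > 0

def rent_car(amount_eur: float) -> Optional[str]:
    """Two-way transaction: payment in, one-time digital car key out."""
    if not settle_payment(amount_eur):
        return None  # no payment, no value released
    # Successful payment triggers the issuance of the value: a one-time key.
    return secrets.token_hex(16)

one_time_key = rent_car(9.50)
```

The point of the sketch is that the transaction does not end with the transfer of money; the payment and the release of value are one atomic, two-way exchange.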

Decentralized: How and Why?

The next months will be an interesting time of research and inquiry: Which of the global CBDC projects will succeed, and will one become the standard architecture for digital currencies? Will CBDCs be able to fulfill the future of two-way digital transactions to exchange value, or is that only possible on a decentralized architecture? And how will the ECB's goal of preserving the intermediary role of banks in the monetary system interplay with decentralized payment systems, whose major outcome is disintermediation?

We believe the future of payments and the exchange of value will be use-case driven. Grassroots solutions will emerge for particular use cases like carsharing to fill the time until major institutions develop their proposals and standards. There will be stiff competition to establish the platforms, architectures, and methods that gather a critical mass. And we will continually be surprised at the new use cases for the digital exchange of value.

Stay up to date on CBDC and the future of value exchange with our research, and join the conversation at our events.

Friday, 09. April 2021

Space Elephant \ Unikname

Launch of the Unikname Partner Programs

The article Launch of the Unikname Partner Programs appeared first on Unikname.

Launch of the Unikname Partner Programs Charlène Demaret
You asked for them (a lot), and we listened! We are proud to present our Unikname partner programs. And the idea doesn't come only from you: it also stems from a desire to go further, higher, and faster. We'll let you discover the range of options available to you by reading this article…

Where to start? For a bit of background, we already had partner programs, but they were aimed at our Uns Network community, the blockchain on which the Unikname solution runs. Apparently that wasn't enough, since some of you ALSO wanted to offer the solution to your own clients. No problem! Receptive as we are to this desire to recommend Unikname to other professionals, we didn't hesitate for a second and threw ourselves wholeheartedly into the subject.

OPEN Program

15% commission on all subscriptions
(i.e. €22.50 excl. VAT per One Web subscription)

Discover the Open Partner Program

At Unikname, we want to give EVERYONE the opportunity to spread the word about an innovative solution that contributes to the security of the web. We have therefore designed a program that lets anyone who wishes promote, at their own level, the made-in-France strong authentication solution to their network.

Collect 15% of the sales you generate

By helping raise the visibility of the next-generation authentication solution, you are rewarded with a 15% commission on sales of One Web subscriptions. Concretely, you earn €22.50 for every subscription you sell, on an ongoing basis. Your earnings can add up quickly, and Unikname can become a source of income in its own right!

What does it involve in practice?

It is essential to know the Unikname Connect solution in order to talk about it and recommend it; that goes without saying, doesn't it? To join the Open program, you must first be a Unikname customer, and therefore a user of the solution!

The goal behind this requirement is to take the partner through the entire Unikname account creation process and everything it involves. To recommend a service or product, it is best to use it yourself, so that you can speak about it with legitimacy…

No need to spend a fortune, of course! Start by securing 1 website (+ 1 free) by subscribing to the One Web plan, from €150 excl. VAT per year, i.e. €12.50 per month, to protect your administrator accounts against hacking. And if you do the math, your subscription can quickly pay for itself through your commission!

Where to start?

Once you have signed up at www.unikname.com, which lets you secure your web access, our partnership can begin! All you have to do is make the Unikname solution visible in whatever way you like and whatever seems most relevant to your community: a blog post, a web page, a webinar, an advertising banner, or posts on social media, for example.

So that Unikname can identify the source of the subscriptions you have generated, simply use your affiliate link in the materials you publish, and you're done!

Why join the Open program?

Joining the Open affiliate program means earning money easily! Being part of our partner network means:

- Benefiting from a made-in-France strong authentication solution that keeps your own access secure
- Supplementing your income easily, without constraints and at your own pace
- Showcasing your expertise and legitimacy by recommending an innovative solution that meets a real market need
- Getting exclusive early news of every new feature and joining an active community

You now know the basic Unikname partner program; now let us introduce our second partner plan, also open to professionals.

ALLIANCE Program

33% commission on all subscriptions
(i.e. €50 excl. VAT per One Web subscription)

Discover the Alliance Partner Program

If you are an integrator, a digital agency, a web developer, a software publisher, or a freelancer, the Alliance Program should catch your attention!

Earn a 33% commission on your sales

Unlike the Open affiliate offer, the Alliance Partner Program involves more responsibility but comes with more benefits!

As in the previous program, you are invited to promote the Unikname Connect solution by recommending it, but you can also embed it in your own offering, an opportunity to boost and round out your range of services. For all sales of the One Web subscription, collect a 33% commission, i.e. €50 per subscription in your pocket!

A win-win partnership: Unikname's commitment

To set its partners up for success, Unikname commits to providing everything needed to present the solution: a partner kit, a turnkey pack of marketing and sales materials and technical documentation, accessible from the Unikname dashboard.

Content and tools for success are good, but before you dive into the deep end, the Unikname team is at your disposal to support you in your first customer conversations if needed.

Of course, brand visibility works both ways! A full page listed in the certified partners category is included, letting you showcase your services and areas of expertise. Promotional actions such as social media posts or co-hosted webinars can also be arranged.

For closer collaboration, a dedicated, exclusive channel between the Unikname team and the partner is made available. Technical questions, advice, best practices, and other requests are all welcome! It is also the ideal channel for sharing early news with the partner network!

The Alliance Partner's commitment

The Partner's mission is to handle the integration, training, support, and maintenance of the Unikname Connect solution for their client. Indeed, joining the Alliance Program means becoming a Certified Unikname Integration Partner. You are therefore autonomous in managing the project from A to Z!

As with the Open Program, you must first be a user of the Unikname solution and use it in your own environments. You are free to choose whichever subscription suits you best, although we recommend the Agency Pack, which lets you manage your clients' websites or web applications. It's up to you!

One feature of this partnership is that you help secure the UNS Network! As a reminder, the Unikname authentication solution is built on the UNS Network blockchain, currently maintained by 23 delegates. Contributing to the security of this system ultimately means making part of one of your servers available to the network. To learn more about what this involves, contact our team at emarketing@unikname.com

It is also a good way to join the active community of users of the solution and take part in the product's evolution.

Recoup your Unikname subscription from your very first sale!

Selling Unikname subscriptions is good, but you can go even further and easily grow your revenue. Here are a few examples of services you can bill your end client for:

Integration and maintenance

Suppose the deal agreed between you and your client includes maintenance of their website. Seize this opportunity by offering your client an additional service: management and maintenance of the Unikname solution.

Team training

In another context, imagine you are working on an organization's internal project in which many people will be using a Unikname identifier. In that case, you can run a training session for the future user teams! Even though the solution is easy to pick up, presenting the benefits of Unikname Connect and guiding users through the transition is sometimes useful. So don't hesitate!

Login page design

Here is another example of how to earn money easily! Say you are a web agency with design skills and your client wants to install Unikname on their web interface or online store. In that case, offer graphic customization of the login area in the company's colors. You can even theme the page with the seasons, as Unikname tries to do on its own login interface.

Developing your own extensions

Say you are a software publisher or a web agency working in different technical environments depending on your clients' needs. In response to a client request, or out of your own interest, you can develop the extension that brings Unikname to an environment where the solution is not yet available! A chance to earn exclusive revenue from it and broaden the solution's reach.

How to become a partner?

To join the Alliance Partner Program, just follow these 3 steps:

1. Create your Unikname Pro account and use the Unikname Connect solution in your own environments
2. Recommend Unikname to your clients and support them through the whole process
3. Receive a 33% commission on the subscriptions generated and earn the Certified Unikname Integration Partner label

No need to spend a fortune, of course! Start by securing 1 website (+ 1 site free) by subscribing to the One Web plan, from €150 excl. VAT per year, i.e. €12.50 per month, to protect your administrator accounts against hacking. And if you do the math, your subscription can quickly pay for itself through your commission!

Why join the Alliance Partner Program?

To sum up everything we have just presented, here are the 6 main reasons to join the Certified Alliance Program:

1. Secure your own web access and that of your clients
2. Round out your service offering by teaming up with a French company specializing in web security
3. Strengthen your image as an expert and stand out from competitors by recommending turnkey solutions for your clients
4. Showcase your skills through a certifying solution and an integration partner label
5. Gain visibility and broaden your scope of work
6. Build trust and long-term loyalty with your clients

Partner programs open to EVERYONE

In addition to the Alliance and Open partnership offers open to professionals, there are 3 other programs for anyone who wants to get involved in the community.

FORUM Program (FREE PLAN)
Install the Unikname Connect solution for free on your public forum. Install the solution

INFLUENCER Program (AWARENESS)
Become an ambassador for the Unikname brand by recommending the solution to your network. Contact Unikname

COMMUNITY Program (UNS NETWORK)
Earn more UNS tokens by multiplying your logins on Unikname partner sites. Contact Unikname

Now it's your turn! Which partner program would suit your business best?

Get started with Unikname: quick and simple

Don't wait to protect your business: secure your administrator and user accounts now:

Create my account


L’article Lancement des programmes partenaires Unikname est apparu en premier sur Unikname.


IBM Blockchain

Blockchain newsletter for March: Exploring blockchain and interoperability


Moderna and IBM plan to collaborate on smarter COVID-19 vaccine management. Initial focus is on IBM Blockchain capabilities in end-to-end vaccine traceability in the supply chain and the IBM Digital Health Pass. IBM Digital Health Pass is the technology behind New York State’s Excelsior Pass, successfully piloted at Barclays Center and Madison Square Garden. Would […]

The post Blockchain newsletter for March: Exploring blockchain and interoperability appeared first on Blockchain Pulse: IBM Blockchain Blog.


1Kosmos BlockID

The Benefits of Passwordless Authentication


Metadium

META DID for 2.1 million Moneytree users


Dear community,

We are happy to share good news with you: MYKEEPiN, Coinplug’s identification and authentication service based on Metadium blockchain, will be used to offer a simple login to Moneytree users. Moneytree app is Galaxia Moneytree’s digital asset exchange platform.

With the support of MYKEEPiN’s simple login feature, Galaxia Moneytree can guarantee its users’ convenience while increasing their personal information protection. In addition, the company can reduce the risk of data leakage and its management costs by eliminating third parties.

The service will be officially launched on the 20th and, through it, the use of MYKEEPiN will continue to expand, bringing more users into the Metadium blockchain. As more companies join MYKEEPiN Alliance and implement MYKEEPiN, we expect to further grow the reach of Metadium.

We thank you for your continuous support.

Best,

Metadium Team

META DID for 2.1 million Moneytree users was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Azure AD Verifiable Credentials Entering Public Preview


by Anne Bailey

Microsoft announced on April 5, 2021 that its Azure AD Verifiable Credentials is now in public preview. This solution enables organizations to design and issue verifiable credentials to their users, be it enterprises issuing employment credentials to their employees, universities enrolling students or issuing diplomas, governments issuing passports, ID cards, and countless other uses. This is an exciting step to meaningfully give agency back to individuals to securely hold and exchange their identity data online and in the physical world.

Decentralized Identifiers (DIDs) are a critical technological component of Microsoft's solution. While there are many DID protocols, the viability of such an ambitious decentralized identity project required a scalable, independent, and decentralized network. ION is the result: an open-source, public, permissionless network that underpins Microsoft's decentralized identity projects. ION is a Sidetree implementation anchored to the Bitcoin mainnet, enabling anyone to generate and anchor DIDs in the public preview of Azure AD Verifiable Credentials. ION was designed by Microsoft team members in collaboration with members of the Decentralized Identity Foundation (DIF) and with teams from Transmute, SecureKey, Mattr, Consensys, and others.

From Concept to Adoption

The concept of using decentralized solutions for issuing, holding, and exchanging identity data has been viable for about 5 years now. We have seen proofs of concept, short-term projects, pitches, and customers beginning to adopt working decentralized identity solutions. These have mostly been user-facing, targeting the general public with self-sovereign wallets to hold credentials, onboard their own credentials, and exchange them in peer-to-peer networks. These products functioned, but stagnated without the ability to use these credentials to access many different types of services. But in the last year, momentum has picked up for solutions designed for enterprise usage to address IAM needs. And these products, Microsoft's own included, have matured to the point that widespread adoption just might be on the horizon.

There were many roadblocks that prevented mass adoption, and we are interested to see if Azure AD's Verifiable Credentials will dismantle them.

Integrations with directory services and authentication sources must be part of a solution. These are integral parts of an enterprise's IAM architecture, and even if an identity credential is held by an individual, it must be able to interoperate with the system.

Easy adoption in widely used systems: decentralized identity started out as a solution to be adopted on moral principle. To reach mass adoption, it must be chosen by enterprises and organizations of all types not because it is philosophically admirable, but because it makes sense. While the security and privacy benefits for the individual are very clear, it must have connectors to widely used systems across many sectors.

Common standards must be used to create compatibility between different systems. Decentralized Identifiers (DIDs) and Verifiable Credentials are both foundational standards that enable decentralized identity solutions to remain independent of any centralized registry, identity provider, or certificate authority, and still be sharable in a trustworthy fashion. DIDComm is another emerging standard to watch out for. Read this Market Compass for more information on the decentralized identity space.

Use for employee credential issuance as well as consumer: CIAM evolved out of IAM. If enterprises begin using decentralized identity solutions for their employees and partners, there will likely be a natural overflow to CIAM if those solutions are successful. Issuing credentials for an enterprise's own use within its own ecosystem is also a less overwhelming scenario (although issuing credentials to consumers may end up being a very similar process with decentralized solutions). Additionally, employee mobility and completing remote onboarding of new employees are very compelling use cases that can lower risk and expense for enterprises.

The link to identity verification is essential for mass adoption. The only way an enterprise can trust a credential that is issued by an unknown issuer, which the user holds on their own device, is if there is an additional verification. On top of the cryptographic verifications that accompany decentralized identity solutions, incorporating identity verification into them is a powerful way to verify in real time that the holder of a credential is indeed the one described in the credential.

Is Mass Adoption Possible Now?

The chances are better now than they have ever been. It's been a long journey since blockchain first became a buzzword, but it has evolved and matured into an entirely different beast, one where blockchain is still present but does not feature prominently.

Enterprises are also not alone when first dipping their toes into the decentralized identity space. The appeal of using a decentralized identity solution that is already compatible with existing infrastructure – Azure AD and Microsoft Authenticator, for example – is not insignificant. Microsoft is putting together a partner ecosystem of identity verification providers that give global functionality. And the list of customer references shows viability in education, health, and government sectors around the world.

The chances for mass adoption are higher because the use cases are more relevant, the solutions more mature, and the foundational pieces – such as identity verification – are being established. 2021 will be a decisive year for decentralized identity, perhaps for the better.


Dock

The DOCK Token Migration is Now Complete


The last four months have gone by in a flash and the token migration that began on the 8th of December 2020 is now complete. As can be seen on Etherscan, almost 963 million ERC20 tokens have been sent to the vault address. As per our token release schedule, 150M of these tokens will be released as emission rewards for validators on the Dock mainnet, and to fund the treasury, during the life of the network. This will put the circulating supply of Dock’s mainnet token at circa 850M.

In summary, a total of 962.8M tokens (of the 1B total supply) have been migrated which equates to 96.28% of all tokens.

Exchanges were heavily involved at different stages of the migration. At the time of writing, Dock mainnet token trading is facilitated on:

Binance
Gate
Huobi
Kucoin
HotBit

Now what?

With the significant milestone of the migration now complete, the network and the entire Dock ecosystem can start to benefit from the token's utility, and of course the significantly reduced and stable fees afforded by the Dock blockchain. As a point of interest, transfers on Dock start at 2.07 tokens, increasing for larger amounts.

As a reminder, Dock’s mainnet token plays a key role in the network. It provides:

An integrated payment mechanism for performing network operations, including the creation of decentralized identities (DIDs), issuing credentials, creating schemas, and more.

Emission rewards for validators who provide their resources to the network by producing blocks and confirming transactions.

Decentralized governance. As detailed in our earlier post on governance, DOCK will be utilized to manage changes to the network. How the mainnet is run, and by whom, will be decided by voting, and Dock tokens will be the mechanism by which members of the network vote. This functionality will become more apparent as Dock moves to Proof of Stake later this quarter.

Wallets and explorers

Moving to Dock’s own blockchain has of course required the community to switch to using new wallets. The current solutions are available here as are instructions on how to set each wallet up. The Dock team have also raised a pull request with hardware wallet Ledger and have spoken with their integrations team. As soon as the request has been reviewed and merged, the Ledger Nano S will support DOCK. It is worth noting that discussions with their integration team have revealed that due to a high workload it could be a few months before they support DOCK.

Another change for the community has been the switch to the Polkadot-based explorer. While functional, we recognise that it doesn’t have all the features some community members would like to see. We’re happy to confirm that the mainnet will be supported by a new explorer, which we anticipate making publicly available in the coming weeks. You can stay tuned for more details on any of our social channels.


Coinfirm

Coinfirm and Crypto.org Chain CRO Compliance

09 April, LONDON, UK  – The leading RegTech and blockchain analytics provider, Coinfirm, is proud to announce a partnership with Crypto.org Chain (CRO), a public, open-source and permissionless blockchain designed to be a public good that helps drive mass adoption of cryptocurrencies. The Crypto.org Chain (CRO) ecosystem benefits users for payments, NFTs and DeFi by...

Thursday, 08. April 2021

Microsoft Identity Standards Blog

What's New in Passwordless Standards, 2021 edition!


Hi everyone and welcome to chapter 14 of 2020! It’s been a little while since we talked about standards for passwordless so we’re excited to tell you about some new enhancements and features in FIDO2 land that you'll start seeing in the wild in the next few months!

 

Specification Status

 

The Web Authentication API (WebAuthn) Level 2 specification is currently a Candidate Recommendation at the W3C. "Level 2" essentially means major version number 2.

 

Version 2.1 of the Client to Authenticator Protocol (CTAP) specification is a Release Draft at the FIDO Alliance. This means the spec is in a public review period before final publication.

 

These new draft versions are on their way to becoming the next wave of FIDO functionality (as of the writing of this blog, we support Level 1 of WebAuthn and CTAP version 2.0). We think you might want to hear about what we think is especially fun about WebAuthn L2 and CTAP 2.1.

 

Enterprise Attestation (EA)

 

Enterprise Attestation is a new feature coming as part of WebAuthn L2 and CTAP 2.1 that enables binding of an authenticator to an account using a persistent identifier, similar to a smart card today.

 

FIDO privacy standards require that "a FIDO device does not have a global identifier within a particular website" and "a FIDO device must not have a global identifier visible across websites". EA is designed to be used exclusively in enterprise-like environments where a trust relationship exists between devices and/or browsers and the relying party via management and/or policy. If EA is requested by a Relying Party (RP) and the OS/browser is operating outside an enterprise context (personal browser profile, unmanaged device, etc), the browser is expected to prompt the user for consent and provide a clear warning about the potential for tracking via the persistent identifier being shared.

 

Authenticators can be configured to support Vendor-facilitated and/or Platform-managed Enterprise Attestation. Vendor-facilitated EA involves an authenticator vendor hardcoding a list of Relying Party IDs (RP IDs) into the authenticator firmware as part of manufacturing. This list is immutable (aka non-updateable). An enterprise attestation is only provided to RPs in that list. Platform-managed EA involves an RP ID list delivered via enterprise policy (ex: managed browser policy, mobile application management (MAM), mobile device management (MDM)) and is enforced by the platform.
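On the Relying Party side, an enterprise attestation is requested through the WebAuthn L2 attestation conveyance preference. A minimal sketch of the registration options follows; the RP ID, user fields, and algorithm list are placeholders, and a local type stands in for the browser's `PublicKeyCredentialCreationOptions`:

```typescript
// Sketch of WebAuthn L2 registration options requesting an enterprise
// attestation. A local type stands in for the browser's
// PublicKeyCredentialCreationOptions; RP ID and user fields are placeholders.
interface CreationOptionsSketch {
  challenge: Uint8Array;
  rp: { id: string; name: string };
  user: { id: Uint8Array; name: string; displayName: string };
  pubKeyCredParams: { type: "public-key"; alg: number }[];
  attestation: "none" | "indirect" | "direct" | "enterprise";
}

function buildEnterpriseCreationOptions(
  challenge: Uint8Array,
  userId: Uint8Array
): CreationOptionsSketch {
  return {
    challenge,
    rp: { id: "example.com", name: "Example Corp" },
    user: { id: userId, name: "alice@example.com", displayName: "Alice" },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }], // -7 = ES256
    // WebAuthn L2 conveyance preference: ask for an enterprise attestation.
    // Outside a managed context, the browser should warn the user or downgrade.
    attestation: "enterprise",
  };
}
```

In the browser, this object would be passed as the `publicKey` member of `navigator.credentials.create()`.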

 

Spec reference:

CTAP 2.1 - Section 7.1: Enterprise Attestation
WebAuthn L2 - Section 5.4.7: Attestation Conveyance Preference

 

Authenticator Credential Management and Bio Enrollment

 

Credential Management is part of CTAP 2.1 and allows management of discoverable credentials (aka resident keys) on an authenticator. Management can occur via a browser, an OS settings panel, an app or a CLI tool.

 

Here's an example of how the Credential Management capability is baked into Chrome 88 on macOS (chrome://settings/securityKeys). Here I can manage my PIN, view discoverable credentials, add and remove fingerprints (assuming the authenticator has a fingerprint reader!) and factory reset my authenticator.


Clicking on "Sign-in data" shows the discoverable credentials on the authenticator and allows me to remove them. This security key has an Azure AD account and an identity for use with SSH.


Bio Enrollment allows the browser, client, or OS to aid in configuring biometrics on authenticators that support them. This security key has one finger enrolled. I can either remove the existing finger or add more.


Here's an example of authenticator credential management via a CLI tool, ykman from Yubico.


Spec references:

CTAP 2.1 - Section 5.8: Credential Management
CTAP 2.1 - Section 5.7: Bio Enrollment

 

Set Minimum PIN Length and Force Change PIN

 

CTAP 2.1 allows an enterprise to require a minimum PIN length on the authenticator. If the existing PIN does not meet the requirements, a change PIN flow can be initiated.

 

An authenticator can also be configured with a one-time use PIN that must be changed on first use. This is an additional layer of protection when an authenticator is pre-provisioned by an administrator and then needs to be sent to an end user. The temporary PIN can be communicated to the end user out of band. We see this being used in conjunction with Enterprise Attestation to create a strong relationship between an authenticator and a user.
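On the RP side, the configured minimum surfaces through the WebAuthn `minPinLength` registration extension, which an authenticator only honours for allow-listed RP IDs. A hedged sketch of a policy check an RP might run on the reported value — `meetsPinPolicy` is a hypothetical helper, not part of either spec:

```typescript
// Hypothetical RP-side policy check on the minimum PIN length an
// authenticator reports via the WebAuthn minPinLength extension.
// The helper and its fail-closed behaviour are our own sketch, not spec text.
function meetsPinPolicy(
  reportedMinPinLength: number | undefined,
  requiredMin: number
): boolean {
  // The extension is only honoured for allow-listed RP IDs; if no value
  // came back, fail closed rather than assume compliance.
  if (reportedMinPinLength === undefined) return false;
  return reportedMinPinLength >= requiredMin;
}
```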

 

Spec reference:

CTAP 2.1 - Section 7.4: Set Minimum PIN Length

 

Always Require User Verification (AlwaysUV)

 

AlwaysUV is part of CTAP 2.1 and allows the user to configure their authenticator to always prompt for user verification (PIN, biometric, etc), even when the Relying Party does not ask for it. This adds an extra layer of protection by ensuring all credentials on the authenticator require the same verification method.

 

Spec reference:

CTAP 2.1 - Section 7.2: Always Require User Verification

 

Virtual Authenticator DevTool

 

This one is not tied to updates of either specification but we love it and wanted to share! Chrome and Edge (version 87+) now include a virtual authenticator as part of DevTools. It started as a Chromium extension back in 2019 and is now native! Oh, and the code is on GitHub!

 

It is a great tool for testing, debugging and learning! Try it with one of the awesome WebAuthn test sites: Microsoft WebAuthn Sample App, WebAuthn.io, Yubico WebAuthn Demo.

 

To access the tool, open Developer Tools (F12 or Option + Command + I), click the Menu icon on the top right (…), then More tools and WebAuthn.


Enabling the virtual authenticator environment will allow you to create a new authenticator by picking a protocol (CTAP2 or U2F), transport (USB, Bluetooth, NFC or internal), resident key (discoverable) and user verification support.


As new credentials are created, you’ll see them listed and the sign count will increase as the credential is used.
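That signature counter also matters server-side: per the WebAuthn spec, a Relying Party should treat a stored counter that fails to increase as a possible sign of a cloned credential. A minimal sketch of that check (the helper name is ours, not from the spec):

```typescript
// Sketch of the RP-side signature-counter check from the WebAuthn spec:
// a counter that does not strictly increase may indicate a cloned
// authenticator. (Authenticators without counter support always report 0.)
function counterLooksValid(storedCount: number, newCount: number): boolean {
  if (storedCount === 0 && newCount === 0) return true; // counters unsupported
  return newCount > storedCount;
}
```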


Want to know more? Here’s a great blog by Nina Satragno from the Chrome team at Google, who created this amazing DevTool!

How we built the Chrome DevTools WebAuthn tab

 

Wrap Up

That rounds out the major features we believe will have the most impact. Here’s a few other enhancements and features that are important to mention!

cross-origin iFrame usage
Apple’s anonymous platform attestation format
Large blob storage extension, to support storing a small chunk of encrypted data with a credential for use cases like SSH certificates

 

If you’d like to hear more about any of these enhancements/features (or anything else identity related, let's be honest), leave us a note in the comments.

 

Thanks for reading!

 

Tim Cappalli | Microsoft Identity | @timcappalli



Fission

Announcing Fission's Latest Developer Release: Optimus Prime


We're proud to announce the latest release of our command line tool for developer app publishing: Optimus Prime.

Account Linking for Desktops

The big new feature in this update is account linking between the browser and desktop.

Before the update, users had two separate accounts -- one they used in the browser as a webnative/WNFS user and another used to publish apps at the CLI. Now, if you already have an account you created in the web browser, you can link that account and use it at the CLI.

In this update, we also added a few great quality-of-life improvements, including:

Better handling of the managed IPFS daemon
More log messages at the CLI, which help you see exactly what is happening when you run a command
More emoji, which makes those log messages more colorful and fun. 🙂

Here's an image that shows the account linking feature nicely (it also demonstrates many log messages in all their emoji glory).

Full details are on the release page on GitHub. For install and upgrade instructions, see the guide, which includes notes for macOS users installing via brew.

Link your desktop

Be sure to hop over to our documentation guide to see the newly added section about account management and linking your browser-based and CLI-created accounts.


auth0

End-To-End Testing With Playwright Sharp

Automating web browser interactions is a great way to test the functionalities of our web application. Let's see how to create UI tests in C# using Playwright Sharp, a browser automation library.

Holochain

NFTs on Holochain? Easy as passing the ball

Can you do NFTs on Holochain? Yup!

Non-fungible tokens, or NFTs, have suddenly become rather popular. I don’t exactly know why now is their time — maybe it’s because celebrities are beginning to sell them, or maybe it’s because recent price increases of BTC and ETH have given hodlers a lot of new money to play with. Whatever the cause, the world is taking notice. And not just the crypto world — traditional auction house Christie’s recently sold an NFT for $69 million USD.

What is an NFT?

If you don’t know what an NFT is, it’s a record that tracks ownership of some special digital object, from its creation to its current owner. This record lives on a blockchain or other distributed ledger. The object, on the other hand, lives somewhere else, usually a traditional web server or decentralised file storage system. It might be a piece of digital art, an algorithmically generated cat, an item of digital clothing for your favourite virtual world, or (my favourite, on account of its subtle-yet-undeniable comment on the nature of crypto-trading) a tulip. That can fight other tulips.

The important thing to know is that, unlike normal ‘fungible’ crypto-tokens such as ETH, each NFT represents ownership of something special, unique, one-of-a-kind.

There’s a funny thing about that one-of-a-kind something. Because it’s a digital object, it can be cheaply copied and shared a zillion times, and there’s not much the owner can do about it.

But surprisingly, that’s what makes it so valuable.

What’s really interesting about an NFT?

If you buy a painting from a famous artist, you can do a few things with it. You can stick it in a vault where nobody can damage (or enjoy) it. Or you can put it in your living room to impress all your dinner party guests with. (As long as they’re in your bubble or you’ve all gotten vaccinated already.)

Or you can put it up in a public place where everyone can enjoy it. Something unexpected is about to happen now: the more people get to see it, the more famous it becomes, which makes it even more valuable. Think about Da Vinci’s Mona Lisa; it’s estimated to be worth $1 billion, and the fact that anybody can print up a copy for their living room hasn’t hurt its value one bit. But the distinction of owning the original — now that’s something special, something that people might be willing to drop a billion on.

Digital art is even easier to copy than the Mona Lisa. So far this has been seen as a threat; music, film, and software companies have fought hard against piracy. That’s because their business models have centred around restricting who can access the things they produce, then charging for access. At the beginning of the internet age, this suddenly became nearly impossible to enforce.

What if we could do for digital art what art galleries did for physical art? What if we decoupled ownership from possession in a way that safely ensured that creators could make a respectable living? What if digital art became more valuable the more freely it was copied and enjoyed? And while we’re at it, could digital tech be used to eliminate (or reduce) the middlemen and let more of the revenue reach the people who do the work to inspire, challenge, and delight us?

According to the website cryptomedia.wtf, this is exactly the sort of opportunity that NFTs hold. Reading this manifesto, I do really wonder if they might just upend the revenue strategy of content producers, allowing them to let go of artificial scarcity and profit from abundance. Could this be an example of a bridge from the extractive economics of the past into a cooperative economics that many of us are hoping for?

Why blockchain?

In order to do NFTs, we need some sort of registry that records the creation of art pieces, along with subsequent ownership transfers, so people can’t try to sell things they didn’t own. It’d be nice if this registry were publicly auditable — oh, and tamper-resistant, which implies that it shouldn’t be controlled by any particular organisation. And since it’s a piece of software, maybe it could automate ownership transfers and payments too. You can probably see now why blockchain technology is such a good match for NFTs.

Could we do NFTs on Holochain?

Yup!

How?

We talked to a couple of Holochain’s core devs about this, and while they came up with different designs, they all agreed that it wouldn’t be hard to do. You only need one simple rule, a ‘pass-the-ball’ rule, where you only get to pass the NFT to someone else if someone passed it to you (or you created it in the first place). They described this rule as ‘transitive’ — as long as it’s applied to every ball-passing event, it’s guaranteed to be correct all the way back to the object’s creation event.

Some people think you have to add some additional consensus to manage scarcity of NFTs, but in Holochain, every element on every user’s chain is already guaranteed to be unique, and you don’t get scarcer than that. You just need to track the history of who the unique thing has been transferred to.
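The pass-the-ball rule described above is, at bottom, chain validation. As an illustrative model only (plain TypeScript, not actual Holochain zome code; witnessing and entry types are omitted):

```typescript
// Illustrative model of the transitive "pass-the-ball" rule (plain
// TypeScript, not Holochain zome code): an NFT's history is valid only if
// every transfer was made by whoever held it at that point.
interface Transfer {
  from: string; // agent passing the NFT on
  to: string;   // agent receiving it
}

function currentOwner(creator: string, history: Transfer[]): string | null {
  let holder = creator;
  for (const t of history) {
    // Only the current holder may pass the ball; any other sender
    // invalidates the whole chain back to the creation event.
    if (t.from !== holder) return null;
    holder = t.to;
  }
  return holder;
}
```

A chain alice → bob → carol validates to carol, while a transfer from an agent who never held the object invalidates the whole history back to its creation.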

I find all this easy to understand, because Holochain is modelled on the idea of person-to-person interactions — and we all understand how those work. Holochain adds some extra magic to prevent cheating. I won’t go into much detail, except to say two things:

Every time you pass the ball, you need to ask a few people to witness the act. But you don’t get to choose who witnesses it, which helps keep you honest.
If someone passes the ball to you, you’re probably going to want to make sure they aren’t cheating. So this NFT app should let you check with a few prior witnesses to make sure the ball wasn’t secretly passed to someone else already.

Why Holochain instead of blockchain?

Plenty of reasons:

Cost. Because Holochain apps are so resource-efficient, there’s no need to charge a transaction fee.
Ethics. A lot of smart people have already investigated blockchain’s ecological impact, claiming that we can’t really afford to be adopting this kind of technology right now. A Holochain NFT app, by contrast, would be light enough to run on the basic devices its users already own.
Developer efficiency. Blockchain NFTs require a couple different stacks for payment, ownership, storage, and UI. With Holochain, those functions could all be integrated into one stack.
User experience. Not only are crypto wallets really confusing for the average user, but the disconnect between components has caused real losses for NFT art collectors. I don’t think I need to explain why this is a serious problem. Better component integration means a safer, and nicer, experience for everyone.
Scalability. Holochain apps don’t use global consensus and they run on user devices. That means they scale with the number of people using them.
Accountability. People have already been selling things they don’t control the rights to. An NFT app on Holochain could leverage some of the identity tools we're building.

Put together, all of these things could mean greater adoption for NFTs — as well as massive new opportunities unlocked by the efficiency, cheapness, and versatility of a future Holochain-based NFT platform.


Affinidi

How to implement driving license use case using Verifiable Credentials

How to Build Use Cases with Affinidi — Driving License as a Verifiable Credential

This article explains the importance of Verifiable Credentials for a Driving License, while this article gives you an insight into what verifiable credentials are in the first place.

As a continuation of the above articles, we’re now moving to the next step, which is the actual implementation.

STEP 1:

As a first step, get a React project template of your choice. Check out https://create-react-app.dev/docs/getting-started to get the React boilerplate in one command. Create three repos, one each for the issuer, holder, and verifier apps. (You can also use any other tech stack of your choice; in this article, we use React.)

STEP 2:

Moving on, get the API access key and API key hash to access the Affinidi APIs from https://apikey.affinidi.com/?source=vcms-api-sandbox and add them to the .env file in your React app.

// .env file
REACT_APP_API_KEY=
REACT_APP_API_KEY_HASH=
REACT_APP_ENVIRONMENT=

STEP 3:

Now, the user (Issuer/Holder/Verifier) can sign up and log in using the respective links. On the issuer side, both the applicant and the issuer can log in, but they see different dashboards after they log in.

The applicant logs in, fills in the form, and submits it to get a VC (Verifiable Credential) for their driving license.

When the issuer logs in, the list of all submitted applications is shown; the issuer reviews each application and issues a VC for it. Issuing the VC involves two API calls: creating an unsigned VC and signing the credential.
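The two-call flow can be sketched roughly as follows. Everything here is an illustrative placeholder, not the actual Affinidi API: the interfaces, function names, and proof format are assumptions made for the example.

```typescript
// Sketch of issuing a driving-license VC in two steps: build an unsigned
// credential, then sign it. All names here are illustrative placeholders,
// not the real Affinidi SDK; the actual API calls are linked in the article.
interface UnsignedVC {
  "@context": string[];
  type: string[];
  issuer: string;
  issuanceDate: string;
  credentialSubject: Record<string, unknown>;
}

function buildUnsignedVC(issuerDid: string, holderDid: string,
                         licenseNumber: string, licenseClass: string): UnsignedVC {
  return {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    type: ["VerifiableCredential", "DrivingLicenseCredential"],
    issuer: issuerDid,
    issuanceDate: new Date().toISOString(),
    credentialSubject: { id: holderDid, licenseNumber, licenseClass },
  };
}

// Signing would normally happen in the issuer's wallet/SDK; here we just
// attach a placeholder proof to show the shape of a signed credential.
function signVC(vc: UnsignedVC, verificationMethod: string) {
  return {
    ...vc,
    proof: {
      type: "EcdsaSecp256k1Signature2019",
      created: new Date().toISOString(),
      verificationMethod,
      jws: "<signature-goes-here>",
    },
  };
}
```

The unsigned credential carries the claims; the proof added in the second step is what makes it verifiable.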

STEP 4:

Once the VC is signed, it can be saved to the issuer’s wallet using this API and shared using the credential share API, which creates a shareable link with an expiry date and a QR code containing that link. The link can then be sent to the applicant via email or presented as a QR code.

STEP 5:

Finally, when holders/applicants click the link, they are redirected to their respective wallet app and prompted to store or reject the VC. Once they accept the request, the VC is stored in their wallet.

STEP 6:

The holder can now see the list of all VCs stored in their wallet.

STEP 7:

Let’s say the holder wants to rent a car from a car rental company (the verifier). To prove the authenticity of the holder’s VC and to avoid its misuse, the holder shares the VC with the verifier. A series of handshakes takes place to safely transfer the VC from the holder to the verifier, as follows.

The car rental company displays a link with a JWT token generated from the SDK. This JWT token encodes the list of credentials required for the share.
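A token request of this kind is just a JWT whose payload lists the required credentials. As a rough, Node-flavoured sketch (the payload field names are assumptions, not the real SDK output), the payload can be decoded and inspected like this:

```typescript
// Decode a JWT's payload segment without verifying the signature.
// Illustrative only: real verifier flows must check the signature via the
// SDK; "requestedCredentials" is an assumed field name for the example.
function decodeJwtPayload(jwt: string): Record<string, unknown> {
  const [, payload] = jwt.split(".");
  return JSON.parse(Buffer.from(payload, "base64url").toString("utf8"));
}
```

Decoding (without verifying) is useful for showing the holder what is being asked of them before they consent to share.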

STEP 8:

This link redirects to the wallet application with the token request in the query parameter. The holder accepts the request to share the VC with the car rental company.

STEP 9:

One or more VCs are combined to form a Verifiable Presentation. The VCs inside the Verifiable Presentation are not the actual VCs; this is intended to prevent misuse of the VC. At the same time, the presentation carries the information needed to prove the authenticity of the VC, in the form of a Response Token (a JWT token with the Verifiable Presentation encoded in it). This is generated by the holder as in this code sample and shared with the verifier through the encrypted messaging service, similar to this code sample.
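Structurally, a Verifiable Presentation can be pictured as a wrapper around the shared credentials, attributed to the holder. This is a simplified sketch of that shape, not the exact SDK output:

```typescript
// Simplified Verifiable Presentation shape, loosely following the W3C VC
// data model. The holder's proof is omitted here; in practice the SDK
// produces it when generating the Response Token.
interface VerifiablePresentation {
  "@context": string[];
  type: string[];
  holder: string;
  verifiableCredential: object[];
}

function buildPresentation(holderDid: string, vcs: object[]): VerifiablePresentation {
  return {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    type: ["VerifiablePresentation"],
    holder: holderDid,
    verifiableCredential: vcs,
  };
}
```

Wrapping the credentials this way lets the holder prove possession without handing over reusable originals.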

STEP 10:

When the verifier logs into their account, they can see the list of submitted applications through the messaging service (as in this code sample) and validate the Verifiable Presentation from the Response Token (similar to this code sample). On successful verification, the car rental request is approved.

Written by Sridharan Jayabal

How to implement driving license use case using Verifiable Credentials was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


On-Demand Employment Endorsements — A Use Case for Verifiable Credentials

On-Demand Employment Endorsements — A Use Case for Verifiable Credentials

For most of us, our first job is always special as it gives us a glimpse of how the next few decades could be. But tracking the responsibilities and roles in your first and subsequent jobs is not easy, and more importantly, getting someone to verify them can be a challenge, especially if you’ve worked in an industry with high attrition rates. Platforms like LinkedIn may state your past background with some degree of social vetting, but they are certainly not fool-proof!

Fast forward a few years, and you may end up in a situation where the proof of experience from your first job is needed! Verifiable credentials can be your savior here.

To get an idea of what these are, feel free to read this 101 post on VCs.

How Can Verifiable Credentials Get an Employment Endorsement?

Let’s say John has applied for a post-grad program in digital marketing at University ABC, but it requires work experience in a related field. John was a salesman ten years ago and wants to share this experience with University ABC in the form of a verifiable credential, without the need for any additional verification.

So, how can he do that?

As with any VC, there are three parties involved here too.

Issuer — Employers, job search sites, payroll providers, and startups like GoodWorker that empower seasonal workers and employers, one verified credential at a time. It matches employers with a large network of trustworthy and verified profiles of skilled workers, at a fraction of the cost, and allows workers to take control of their job search, with direct access to trustworthy employers.
Holder — John
Verifier — University ABC

As a first step, John logs into the issuer’s portal, and submits his personal details like his name and the month and year of employment. After verifying John’s credentials, the issuer looks through his work records, and sends a verifiable credential in the form of a QR code detailing John’s period of employment and his role as a salesman. In turn, John scans this QR code and stores these details in his digital wallet.

Next, he creates a verifiable presentation with the information that University ABC wants, and shares it with them.

Here is a sample VC that checks the past employment of John as a salesman.
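For illustration, a minimal employment VC of this kind might look as follows; all field names and values are assumptions made for the example, not Affinidi's exact schema:

```typescript
// Illustrative employment-endorsement VC. Every value here is an assumed
// example (DIDs, dates, role); the real credential follows the issuer's schema.
const employmentVC = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "EmploymentCredential"],
  issuer: "did:example:acme-corp",
  issuanceDate: "2021-04-01T00:00:00Z",
  credentialSubject: {
    id: "did:example:john",
    role: "Salesman",
    employmentStart: "2009-06-01",
    employmentEnd: "2011-05-31",
  },
};
```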

Finally, the university examines John’s verifiable presentation and learns about his experience as a salesman. This makes him eligible for the post-grad program, and coupled with his admission test results, he secures the admission he’d always wanted!

Sounds simple, right?

The obvious advantage of VCs is that they are readily available and accessible. Imagine having to go back to your first company and search through the physical records in a dusty storage room (no, we aren’t talking about the Flintstones here!).

Since these VCs are immutable and tamper-proof, the verifier could access authentic information that helped its decision-making process. In all, a win-win situation for everyone!

Overall, VCs are a huge leap forward in an ever-changing technological world, as they offer a hassle-free way of sharing personal information securely and quickly, with the explicit consent of the owner of that information.

At Affinidi, we’re working to build an ecosystem that would help developers and companies create such interesting VC-based applications.

To learn more about what we do, drop a line at contact@affinidi.com.

On-Demand Employment Endorsements — A Use Case for Verifiable Credentials was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


My Life Digital

Rebuilding consumer trust after Cambridge Analytica


#ThrowbackThursday

Remember the scandal of Cambridge Analytica and Facebook harvesting people’s data without their knowledge and then misusing it? This blog discusses how both companies could rebuild relationships in the aftermath. This practice was so far from being ethical that the world was outraged and started to vote with their clicks, creating the #DeleteFacebook campaign. People care about how you handle their data; how would your data ethics strategy hold up for your consumers?

Rebuilding consumer trust after Cambridge Analytica 

  
Having invested heavily in online services, the last thing brands and public bodies want is to see customers and service users abandon them. J Cromack, co-founder of MyLife Digital, advises organisations how to navigate the current crisis of confidence:

‘It’s concerning to see companies still treating people’s personal data as a tradeable commodity – something they can use however they like and sell on for a handsome profit. But all that is about to end.’

Many consumers have until now turned a blind eye to routine data collection – largely accepting it as part of the deal for having convenient and often free online services. But Cambridge Analytica’s activities and Facebook’s responsibility and subsequent response[1] have brought matters to a tipping point[2]. Online service providers are going to have to work hard to win back trust. 


Data custodianship is a privilege

 
Meanwhile, there are just weeks to go until the EU General Data Protection Regulation (GDPR) becomes enforceable, bringing data-privacy controls and organisational obligations up to date with the digital era. It still comes as a surprise to many companies to learn that the data they have collected is not their property. But now they will have to learn fast that they are merely custodians of that information – with a duty to use it circumspectly[3].
 
GDPR should be used by companies as an opportunity to kick-start something new: being honest and transparent with people about data intentions, and having a holistic strategy and approach to customer data that transcends individual systems and is easily auditable.
 
Where organisations are seen to use people’s data respectfully, with permission, in ways that provide tangible benefits to an individual – such as more direct access to what interests them and personalised promotions – consumers are much more likely to opt-in. 


What is changing? 


Under GDPR, every organisation that captures customer data – from internet giants to banks and retailers, to local authorities, health services, and charities – has an obligation to be transparent about the personal data it is collecting. It must seek permission for each specific use case, take appropriate steps to safeguard that data and ensure it is not shared beyond the parameters the customer has knowingly agreed to. 
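The per-use-case permission model described above can be sketched as a simple consent check: data may only be processed for purposes the individual has explicitly opted into. This is purely illustrative, not any specific vendor's API; the purpose names are assumptions for the example.

```typescript
// Toy per-purpose consent check: processing is only allowed for purposes
// the data subject has explicitly granted. Purpose names are illustrative.
type Purpose = "marketing" | "analytics" | "service-delivery";

interface ConsentRecord {
  subjectId: string;
  grantedPurposes: Set<Purpose>;
}

function mayProcess(record: ConsentRecord, purpose: Purpose): boolean {
  return record.grantedPurposes.has(purpose);
}
```

The key design point is that the default answer is "no": a purpose absent from the record is a purpose the customer never agreed to.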
 
This is nothing new; these principles were enshrined in the existing Data Protection Act[4]. But until now there has been too much emphasis on consumers reading the small print and taking proactive steps to protect themselves. Under GDPR, and certainly in the aftermath of the CA/Facebook breach of public trust, accountability will shift back to the organisations collecting, storing, and using people’s data. They already face an uphill struggle; the latest figures from the Information Commissioner’s Office suggest that across the UK only a fifth of people have trust and confidence in companies and organisations storing their personal information[5].


Reframing GDPR compliance as a priority for consumer trust 


All this means, once and for all, that use and safeguarding of people’s personal data has ceased to be something that directors can palm off to compliance/governance and IT departments. It is central to organisations’ relationship with their customers, to how they are perceived in the market, to whether they will succeed or fail. At best, the way that companies and service providers manage and communicate about their handling of customer data will become a brand differentiator; as a minimum it will be a condition of engagement. 
 
In this light, GDPR compliance takes on a different meaning. Suddenly it is less about how permissions are worded at each customer touchpoint or the individual security requirements applied to each IT system and cloud application. Instead, it is about being open and honest and true to the principles of data protection. 
 
In time, this should also be something that customers themselves can have some control over (by being able to more easily review and edit their data permissions, for example). A new survey by Boston Consulting Group found that people were much more likely to willingly share their data if they felt confident that harmful use would be prevented[6].
 


Avoiding ever decreasing circles 


Above all, what the Cambridge Analytica/Facebook story demonstrates most is that it is not the regulatory authorities that companies need to fear, it is the customers who will vote with their clicks and swipes about where they feel comfortable leaving their digital footprint.
 
If customers fear they are living out Dave Eggers’s book The Circle – and are driven off the grid where they can’t be tracked – no one wins. Organisations need to be seen to be using data with respect, not in ways that create suspicion and mistrust, causing customers to untick all the boxes, refuse cookies and shroud themselves in anonymity.
 

[1] Facebook’s floundering response to scandal is part of the problem, FT.com, March 19, 2018 
[2] Tech world experiencing a major ‘trust crisis,’ futurist warns, CNBC, March 20, 2018 
[3] Cambridge Analytica and Facebook – did they put the individual at the heart of their data strategy?, Consentric blog, March 19th 2018 
[4] The rights of individuals, Data Protection Act, Principle 6 
[5] ICO survey shows most UK citizens don’t trust organisations with their data, ICO/ComRes research, November 2017 
[6] Generating Trust Increases Access to Data by at Least Five Times (exhibit 2), Bridging the Trust Gap in Personal Data, Boston Consulting Group, March 2018 

The post Rebuilding consumer trust after Cambridge Analytica appeared first on MyLife Digital.


Holo

Monthly Roundup for Holo & Holochain

Monthly Roundup for Holo & Holochain — March Edition

Welcome to our monthly roundup — March edition. As always, I’m bringing you the scoop of the month. If you have been reading the regular updates we share on Telegram or Twitter, you probably already know that we’ve now got a working Elemental Chat application and that we are hard at work preparing the Holochain and Holo infrastructure so that we can release it to all hosts. We are almost there! The big news this month was Holo being granted a patent for the Holochain Distributed App Framework.

If you’re wondering what exactly we’ve been working on at Holo and Holochain this past month — the wait is over! Let’s jump right in.

Press Release: U.S. Patent granted for Holochain Distributed App Framework

In a big step forward for the P2P web, Holo Limited was granted a US patent for the rrDHT networking innovations within Holochain. rrDHT is a peer-to-peer networking design implemented in Holochain that describes a system of nodes communicating according to a relaxed, agent-centric distributed hash table.
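The basic intuition behind any DHT is that each agent takes responsibility for entries whose hashes are close to its own ID. The toy sketch below uses the classic XOR-distance metric from Kademlia-style designs to pick the agents closest to an entry; rrDHT's actual relaxed, agent-centric design differs, so this only illustrates the underlying idea, not Holochain's algorithm.

```typescript
// Toy DHT neighbourhood selection: rank agents by XOR distance from an
// entry's hash and keep the k closest. IDs are small integers here purely
// for illustration; real systems use large cryptographic hashes.
function xorDistance(a: number, b: number): number {
  return a ^ b;
}

function closestAgents(entryHash: number, agentIds: number[], k: number): number[] {
  return [...agentIds]
    .sort((x, y) => xorDistance(entryHash, x) - xorDistance(entryHash, y))
    .slice(0, k);
}
```

Whichever agents land in the k-closest set become the ones responsible for holding and validating that entry.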

Read the full Press Release: U.S. Patent granted for Holochain Distributed App Framework

How rrDHT Works: A Tech Deep Dive

Unpacking the technology behind Holochain’s secure P2P networking. This week we got word that a patent we submitted to the USPTO was approved. That means that Holochain’s special DHT design — rrDHT — is covered under intellectual property laws.

Read: How rrDHT Works: A Tech Deep Dive

Dev Tools in the Works

Mostly we’re working on performance, bugfixes, and features (proxy server, storage code, sharding, and validation receipts), but we’re also working on a few developer tools: the hc command (yes, it’s coming back), a mock HDK, and a nearly finalised hApp bundle spec.

Read Holochain Dev Pulse 92 — Dev Tools in the Works

Elemental Chat Gets Freshened Up

This week we rolled out an update to all HoloPorts. The most visible changes are in the UI of Elemental Chat, our demo app. Users now have ‘identicons’ beside their name; these auto-generated avatars are based on their actual agent ID in the Holochain network and can’t be forged. This is important because usernames are just labels; you can set yours to anything you like, even someone else’s name. If you’re talking to a friend and suddenly their identicon looks slightly different, there’s a good chance someone is trying to impersonate your friend.
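Why identicons resist forgery can be sketched in a few lines: the avatar is derived deterministically from a hash of the agent ID, so a different ID almost surely yields a visibly different pattern. This toy version is our own illustration (not Holochain's actual algorithm), mapping an ID to a colour and a symmetric 5x5 pixel grid:

```typescript
import { createHash } from "crypto";

// Toy identicon derivation: hash the agent ID, take a colour from the
// first bytes, and build a horizontally symmetric 5x5 grid from the rest.
// Illustrative only; not the scheme Elemental Chat actually uses.
function identicon(agentId: string): { colour: string; grid: boolean[] } {
  const digest = createHash("sha256").update(agentId).digest();
  const colour = "#" + digest.subarray(0, 3).toString("hex");
  // 15 bits, mirrored so each 3-cell half-row becomes a 5-cell row.
  const half: boolean[] = [];
  for (let i = 0; i < 15; i++) half.push((digest[3 + i] & 1) === 1);
  const grid: boolean[] = [];
  for (let row = 0; row < 5; row++) {
    const r = half.slice(row * 3, row * 3 + 3);
    grid.push(r[0], r[1], r[2], r[1], r[0]);
  }
  return { colour, grid };
}
```

Because the derivation is deterministic, an impersonator who picks the same username still gets a different identicon: they cannot choose the agent ID that produced the original pattern.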

Read Holochain Dev Pulse 93 — Elemental Chat Gets Freshened Up

Decentralized Next-level Collaboration Apps with Syn

My passion for Holochain has always been sourced in upgrading our collective intelligence, which means making it easier and more joyful to collaborate together. In early December, Art and I sketched out, in an afternoon, a generalized pattern on Holochain for real-time collaborative apps like Google docs where you can type in the same document as others simultaneously, and see their cursor moving around.

Read: Decentralized Next-level Collaboration Apps with Syn

Moving HoloFuel up on the Roadmap

Yes, you read that right! We are actually ahead of where we expected to be in some ways. While it is taking a while to complete infrastructure testing because that is one of the most complex aspects of our work, we have been able to move many other activities forward. Because of that we decided to shift the HoloFuel application up in the roadmap. Here is what it looks like with the new changes. Check it out on our website where you will find more details.

More Dev Documents are out

To install the new RSM go here. The new core concepts are out in the wild!

Regular Holo Dev Updates on Twitter

If you are curious about the hot-off-the-press dev updates, those can be found on Holo Twitter.

Low Code Zone

A series focusing on simple hApp dev solutions for real humans. Low Code Zone explores the possibilities when agent centric solutions are combined with simple, modular tools that make building an app as simple as configuring a website.

Low Code Zone [Episode 5] Why Holochain needs a ‘Membrane Maker’
Low Code Zone [Episode 6] Holochain can use generic Open Source Low Code tools

Build it! [Powered by Holochain]

Build It! is a short, live, guided tour of how to quickly develop working apps on Holochain using scaffolding and rapid application development tools.

Watch Build it! EP 8
Watch Build it! EP 9

You can submit tech questions here for the next Live AMA with Phil.

Holochain Ecosystem Sessions

Ecosystem Sessions are long-form conversations with long-standing Holochain app builders on progress, beliefs and the ins and outs of building distributed solutions and organisations around them.

Ecosystem Session with Sacred Capital

Rust In Blockchain Newsletter

RiB Newsletter #22 — A few tweaks

Next AMA 46

We will be hosting the next Holo AMA on the 13th of April at 3:30 PM UTC.

We want to hear your questions for our next AMA so go ahead & ask away!

Watch the replay of AMA 45

Just two weeks ago we hosted AMA 45; you can watch the replay here. Grab a sneak peek of the Elemental Chat demo.

AMA Quick Cuts — Playlist on YouTube

Watch short clips on YouTube straight out of the AMA! Check the ever-growing playlist here.

Our community and events team is working tirelessly to bring people together and create amazing things on Holochain. As always, we would like to give our appreciation to all the community organisers without whom it would not be possible to create this magic! Many events are conducted virtually to respect #SocialDistancing, so nothing is stopping you from participating!

Interested in building a hApp? Join Holochain Forum and start your journey!

Check out our upcoming events section and come along. Everyone’s welcome!

As always, if you have general questions about Holo, Holochain, or HoloPorts — or want to join our enthusiastic community — the best place to start is on our Twitter and Holochain Forum.

If you have specific questions or any technical issues regarding Holo or HoloPorts, you can read more, or contact us directly at https://help.holo.host.

We hope you’ve enjoyed this month’s short news collection and we will catch you next month — Paulii @ Holo

Monthly Roundup for Holo & Holochain was originally published in HOLO on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Managing Access to Your Business Applications: Breadth vs. Depth


by Martin Kuppinger

I’ve been in IT long enough to remember when business applications were home-grown and written in COBOL. In the early stages of my career, I even gave computer classes on the standard algorithms for good COBOL programming, such as sorting and grouping.

In the more than three decades since, I’ve seen SAP R/3 being discussed as the revolutionary approach for business applications, moving these from mainframes into client/server models. I’ve seen the emergence of the first SaaS applications, with some such as Salesforce or Ariba still being around. I’ve seen the broader adoption of the cloud for critical business workloads, such as SAP S/4HANA Cloud. And, I have also seen GRC (Governance, Risk & Compliance) become mandatory for organizations.

The hybrid reality of today’s business applications

For many, if not most organizations, the landscape of business applications has changed over the past year. With the advent of SaaS services, there are typically more vendors for the various business applications than there were a few years ago, and with the shift to the cloud, most organizations have ended up with a hybrid infrastructure. This makes managing such applications and providing GRC solutions for these environments more complex, starting with enforcing consistent identities, access controls, and SoD controls (Segregation of Duties) across these applications.
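A cross-application SoD check can be illustrated with a small sketch: collect every role a user holds across all business applications, then flag users holding any pair of roles defined as conflicting, regardless of which application each role lives in. The role names and conflict pairs here are illustrative, not any product's rule set.

```typescript
// Toy Segregation-of-Duties check across heterogeneous applications.
// A violation is a user who holds both halves of a conflicting role pair,
// even if the two roles live in different systems (e.g. SAP and a SaaS app).
type RoleAssignment = { user: string; app: string; role: string };

// Illustrative conflict matrix; real SoD rule sets are far larger.
const sodConflicts: Array<[string, string]> = [
  ["create-vendor", "approve-payment"],
  ["post-journal", "approve-journal"],
];

function findViolations(assignments: RoleAssignment[]): string[] {
  const rolesByUser = new Map<string, Set<string>>();
  for (const a of assignments) {
    if (!rolesByUser.has(a.user)) rolesByUser.set(a.user, new Set());
    rolesByUser.get(a.user)!.add(a.role);
  }
  const violators: string[] = [];
  for (const [user, roles] of rolesByUser) {
    if (sodConflicts.some(([r1, r2]) => roles.has(r1) && roles.has(r2))) {
      violators.push(user);
    }
  }
  return violators;
}
```

The point of aggregating roles per user before checking is exactly the breadth problem: a per-application tool would never see that the two conflicting roles belong to the same person.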

To further complicate things, other IT services are evolving towards becoming business applications.  ITSM (IT Service Management) solutions, for example, are morphing into ESM (Enterprise Service Management) solutions with central platforms for workflows and services that support several lines of business, well beyond IT.

A challenge of breadth versus depth

GRC and, within that, access control, are about finding the right balance between breadth and depth. In rather monolithic landscapes of business applications, there is logic in using a highly specialized solution. In hybrid, heterogeneous landscapes, however, there often isn’t a single GRC solution that covers all the business applications and services anymore. Some deliver depth, such as for the traditional SAP ERP solutions, while others provide breadth, starting with standard IGA (Identity Governance & Administration) solutions. There is no simple answer, and for many organizations, there won’t be a single solution.

This challenge is further complicated by organizational and ownership issues. Traditional SAP Access Control runs on SAP and commonly is owned by the SAP team. But how should the organization look in a world of hybrid and heterogeneous business applications? Is there still room for an SAP silo? Who should own what? CIOs, CISOs, and CROs (the Chief Risk Officers) must rethink the way GRC and access control are implemented for today’s ecosystems of business applications. This starts with revising the organizational structure and responsibilities.

For a more in-depth analysis of access control tools for SAP environments, read this Leadership Compass.

 


Ontology

Ontology Weekly Report (April 1st–6th, 2021)


Highlights

As we come to the end of the first week of April, we wrap up another successful week for Ontology. As our community continues to grow, Ontology’s decentralized identity solution continues to attract the interest of respected voices in the industry.

Latest Developments

Development Progress

We have completed 70% of the Ontology EVM-integrated design, which will make Ontology fully compatible with the Ethereum smart contract ecology after completion. We have completed 50% of the latest Layer 2 technology, exploring the integration of Ethereum Layer 2 on Ontology MainNet.

Product Development

We have released v3.8.0, including optimization of the front-end interface. The ONTO x OpenOcean event was successful and rewards will be issued soon.

dApps

113 dApps were launched in total on MainNet. 6,509,768 dApp transactions were completed on MainNet. 38,616 dApp-related transactions took place in the past week.

Community Growth

The Ontology community is growing at a rapid rate. This week we onboarded over 2,429 new members across Ontology’s global communities.

To keep up with our latest developments and community updates, please feel free to reach out to us on Telegram or follow us on Twitter. As always, we encourage anyone who is interested in Ontology to join us.

Global News

In global news this week, the Chinese and English versions of the “Digital Securities Industry Development Report 2020”, co-authored with Cabin VC, Binance China Blockchain Research Institute, Gibraltar Stock Exchange Group (GSX Group), Hong Kong Digital Asset Exchange (HKBitEX) and UPRETS, were released. Ontology’s Head of Business & Strategy was invited to attend the “New Form of Encrypted Assets in Grayscale Mode” online seminar.

In the Media

The Block — Ontology, A Blockchain For Decentralized Identity And Data

Leading crypto publication The Block released a research report on decentralized identity, mentioning Ontology as a key part of the infrastructure. Ontology, Bitcoin, Ethereum, NEAR, Sovrin and more are listed in the blockchain infrastructure category, and the ONTO cross-chain wallet is in the wallet & verification service category. Read the full report here.

Block Tribune — Expert Takes: PayPal Launches Crypto Checkout Payments

Li Jun, Founder of Ontology:

“PayPal’s decision to allow U.S. consumers to use their cryptocurrency holdings to pay at millions of its online merchants is likely to encourage many to make their first crypto transaction — even if it’s just out of curiosity. While some may be apprehensive, once tried, it’s unlikely they will notice much difference between paying with crypto or legal tender. So, why not?

With PayPal making big moves into cryptocurrency, it is more important than ever to get experts to weigh in on issues around security, and how decentralized identities like ONT ID can help.”

Decentralized Credit Rating Is Integral To DeFi Adoption

For years now, lending and borrowing have offered individuals a vast array of new opportunities, in which credit scoring plays an integral role. This makes OScore a vital missing piece of the DeFi puzzle: to offer these products, lenders need this data so they can be assured the purchaser or borrower will be reliable and in a position to repay them.

By utilising decentralized credit scoring that ensures the safety and security of user data, we are in the unique position of getting the best of both worlds.

Want more Ontology?

You can find more details on our website about all of our decentralized solutions across identity and data, or keep up with us on Twitter. Our Telegram is for discussion, whereas our Telegram Announcement channel carries news and updates in case you miss them on Twitter!

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

Ontology Weekly Report (April 1st–6th, 2021) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 07. April 2021

Shyft Network

The Shyft Network Roadmap

After a massively successful IDO on Polkastarter, the roadmap is laid out and the team is ready and excited to work towards it.

We started Shyft Network with the view that one day our ecosystem would require critical infrastructure to ensure its stability. Now, more than three years later, that day has come; as international regulatory guidance begins to take shape, we begin to introduce infrastructure that aims to solve global challenges faced by ecosystem participants.

The Shyft Network roadmap spans multiple years, but is not limited to our roadmap alone; rather, the future development of Shyft will also be dependent on where our industry goes, and the challenges that present themselves across institutional adoption and regulatory guidance.

The goal of introducing Shyft Network into the space is to provide a response framework that we, as an ecosystem, can build on and use to provide more options and create new systems that help us better comply with evolving regulations.

Shyft Network is meant to act as a layer 1 federated side chain, a smart contract tooling infrastructure, and a general purpose protocol that provides adaptability and optionality into systems.

Shyft Network also has general-purpose use cases that were built alongside its architecture in order to provide solutions to global problems. These use cases define the intent of why we built the Shyft Network. Although these are mission driven use cases, they do not limit the extent to which this infrastructure can be used to build other systems, and as we now deliver this infrastructure to the global ecosystem, we hope to work alongside the community as we all carry it forward into the future.

We’ve had so much interest from third-party developers, protocols, companies, enterprises, governments and more looking to implement Shyft Network and its aggregation infrastructure. Once the Node Federation takes control of the network, the network company, Shyft Network Inc., will help to provide resources and support to aid new parties entering the project.

Stage 1 — Launch

The initial deployment of Shyft Network smart contracts on Ethereum is where this project begins to enter the open world and goes into the hands of users; with it, we initialize our core KYC contracts and enable a proof-of-concept of our flagship use cases. Shortly after, the node federation takes over and Shyft Network’s mainnet is deployed. Stage 1 is the culmination of more than three years of development, planning and execution. We survived crypto winter and proved to our supporters and community that our project is stronger than ever.

What can the Shyft Network community expect?

The SHFT Token ready for use on digital identity, verification and KYC concepts.
VASPs we partnered with enter a second round and are now integrating into Veriscope, the global exchange- and institution-governed solution to the FATF Travel Rule.
The deployment of community incentivization structures, i.e., yield wrapping, liquidity bootstrapping and DAO enablement.
The creation of the Node Federation as an essential component of Shyft Network’s push towards decentralization.

March 24th, 2021: Ethereum <> Shyft Network version 1.0 launch

The deployment of the first of the Shyft Network Core KYC contracts; this enables trust anchoring, compliant liquidity pool development, coalition and identity creation, and cross-chain bridged asset and identity routing architecture. This deployment will focus infrastructure towards the integration of decentralized exchanges, liquidity pools, and existing Ethereum dapps for compliant, opt-in primitives. The importance of this infrastructure is that it creates a fundamental development apparatus and initial base structure so that the DeFi ecosystem can begin to build FATF-compliant endpoints into existing ecosystem utilities and platforms.

April 2021 — Shyft Network launches the Federated Mainnet (Layer 1 consensus nodes will now carry the protocol forward)

The Shyft Network mainnet is live and the Node Federation takes over the protocol and carries it forward. This is a major milestone in the decentralization of the Shyft Network. Once the Node Federation takes over the protocol, the Shyft Network team becomes another member of the community, and will work with partners and new applications to aid in providing developer tools and resources to new entrants. It will also launch educational marketing campaigns to increase awareness of the network and its capabilities.
VASPs that have been leading governance on Shyft Network through Veriscope, as well as national identity systems and other institutional use cases, will be able to use the attestation infrastructure on the Shyft Network Virtual Machine (SVM). The globally distributed federation will now be active, and support for Travel Rule-complying entities globally will now be live for utilization and deployment. Identity use cases and entity-to-entity data sharing requirements will initially take place on the Shyft Network Layer 1 until data portability and synthetic asset transmission into corresponding Shyft Network contracts on other chains is complete. ChainBridge technology, developed in collaboration with ChainSafe, will be deployed between the Shyft Network and Ethereum to support cross-asset transmission and minimal cross-smart-contract operations. This will not link coalitions or data use cases; attestations will not be routable at this stage, only asset transmission.

Stage 2 — Expansion

Additional use cases, e.g., our PerseID Government ID project, will be deployed. The Shyft Network team will also start a community- and industry-wide education and promotion campaign to bring new partnerships and developers into the ecosystem.

What can the Shyft Network community expect?

- Staking on the Shyft Network goes live.
- More incentives in the form of community DAOs and grants.
- Technological improvements to the Shyft Network protocol.
- Ecosystem expansion: external use cases and third-party developer use cases begin to deploy and scale the network.

Disclaimer: The plans, dates and other statements contained in this roadmap are an estimation only, are subject to change based on various factors and circumstances, and there are no assurances that they can be fulfilled.

Q2 2021

- Staking on Shyft Network goes live. Additional community token-based metagames and AMM incentivization are released. The goal is to improve token distribution and prepare the network for scalability and resilience.
- As an enhancement of Veriscope, the federation initiates discussions and prototypes of opt-in compliance products designed for decentralized finance; compliant DeFi will begin to take shape and aim to enter a new era of mass adoption and user empowerment.
- The LP Reward Program is released on Ethereum, allowing a whitelisted subset of participants to earn SHFT token rewards for supplying their liquidity to specific SHFT:asset pairs on Ethereum’s AMMs. We will be expanding this program and notifying the community on the construction of additional purely-compliant LP pools.
- Another major milestone for Shyft Network is the launch of the PerseID Government ID project; the PerseID team has been working with the Government of Bermuda for years and is proud to deliver technology that aims to improve the lives of its citizens. Once launched, the PerseID and Shyft Network teams will look to deploy public grassroots and government-wide marketing campaigns aimed at expanding this particular use case.
- Federation improvements & Shyft runtime inflation DAOs: one of Shyft Network’s core pillars is effective decentralization, and the team will further promote federation-wide technology improvements to integrate with Shyft Community DAOs. Runtime inflation DAOs are aimed at creating a solid developer base and the incentive structures that will help it flourish.
- First alpha iteration of the Shyft Byfrost bridging technology, allowing further economic and reputational finality across blockchains.
- Iterations on the “Compliant DeFi” project, with integrations at VASP scale to allow “passporting” of identities onto blockchain DeFi applications.
Stage 3 — Continuity

Q4 2021

- Further iterations, LP stability, and composability standards implemented and used across multiple platforms simultaneously.
- Shyft Node Federation integrates Proof-of-Work checkpointing.

2022

- Further instantiation of the “Veriscope”, “PerseID”, and “Compliant DeFi” products, cross-product integration, oracle support, and management of full cross-chain opt-in identity transactions and composability support.
- Node Federation transitions to Tor-only communication.

Conclusion

The team is ready to push forward and achieve these milestones; this is what we’ve been working towards, and we are very excited to share it with our community as we enter a new era of decentralization for the project. Join us as we unlock trust, identity, and credibility networks for humanity.

About Shyft Network

Shyft Network is a public protocol designed to aggregate and embed trust, validation and discoverability into data stored on public and private ecosystems, and facilitate information transfer between permissioned and permissionless networks. By incentivizing individuals and enterprises to work together, Shyft Network allows for the layering of context on top of data, turning raw data into meaningful information.

More INFO
Website: https://www.shyft.network/
Telegram Group: https://t.me/shyftnetwork
Telegram ANN: https://t.me/ShyftAnnouncements
Twitter: https://twitter.com/shyftnetwork
Medium: https://shyftnetwork.medium.com/
Discord: https://discord.gg/NJ7bZTCnm6

Northern Block

The Importance of Data Inputs and Semantics for SSI with Paul Knowles [Podcast]

Listen to SSI Orbit Podcast Episode #6 with special guest Paul Knowles from the Human Colossus Foundation, as he discusses the importance of data inputs and semantics for Self-Sovereign Identity (SSI) with Northern Block’s CEO, Mathieu Glaude (full podcast transcription available too). Listen, consume & share.


Click to Listen to this Episode on Spotify

Introduction: Music and SSI

Mathieu: Many different people have spoken to me about some of the work that you and the Human Colossus Foundation are doing: both within the foundation and also with different community initiatives over the past little while. I have been quite interested in the work you’re doing, and I’m looking forward to just going a little deeper into this with you today. Thanks for doing this.

When I started looking into your background and how you got into what you’re doing now, it seems that early on in your career, you had a passion for the music industry, with a lot of interest and activity there.

Paul: Yes, I’ve always had a bit of a dual career in music and the pharmaceutical industry. Like a lot of people in the music industry, sometimes you need a second job to keep your passion going. My crux was the pharmaceutical industry, but yes, the music industry. We’ll take it back a step: my degree at university was in I.T. Right after university, I did an audio engineering degree in London and learned how to work with the big mixing desks in recording studios. From there, I started working for a couple of recording studios, getting to understand the trade as an audio engineer. I fell into private events through my music industry career. I ended up doing showcases at the Gibson showroom for international artists travelling into the U.K. Gibson Guitars had some showrooms in London, so whenever anyone decent was coming through, I’d put on a showcase for the U.K. music industry, which would usually kick off their U.K. tour.

One of the artists that came through was an Indian artist called Raghu Dixit, who was actually the highest selling independent artist in India (outside of Bollywood). I put on one of these showcases for him, and through that, I ended up co-managing him for a while. That kept me busy. I got to fly around the world a little bit with him, but it was that journey that started getting me into the data space at the same time. I created a platform called the International Music Community, where the idea was to introduce up-and-coming artists to music industry professionals, but using data to build those algorithms. For an artist, we’d get all their social media traffic, put that into an algorithm, and then hook them up with industry professionals who had worked with artists of a similar level. It was an interesting space. We ended up white-labelling that platform and a lot of the music industry conferences really liked it. As you can imagine, you’ve got the president of Warner Brothers or something on a panel at a conference, and they don’t want to be bombarded with contacts from baby artists. Through this tiering thing that we’d built within the platform, I was able to protect the identity of the music professionals at the high end. But at the same time, they could look down into the filter to find the up-and-coming artists that they were looking for. So that got me into the data space in music. At the same time, I’d always had this parallel career in the pharmaceutical industry in data management and statistical analysis. That was paying the bills as I was doing all of the music stuff.

I suppose that’s how I became so interested in semantics. That’s how it all started; my journey into SSI was through that algorithm development within the music platform. I wanted a way of authenticating when an industry professional said that they worked with Rod Stewart or something like that. I wanted some way of authenticating that information, and that’s how I got into SSI, decentralized authentication, and key management, etcetera.

Mathieu: That’s super interesting! As you were explaining your marketplace solution of connecting up-and-coming artists or musicians with labels and so forth, right away, my head went to credentials, and showing proofs, and just doing matchmaking. With the way the platform had been built, but before self-sovereign identity, you could see the next evolution of the platform growing into its own ecosystem solution, potentially using verifiable credentials.

Paul: Exactly. The platform was an incredibly federated platform when I built it because I didn’t know that SSI existed. So as soon as I found that ecosystem, I tore up the rulebook and said, “This isn’t going to work; I have to rebuild it.”

SSI and Pharma

Mathieu: At the same time, you were definitely solving significant problems with the way it was built, and I’m sure it still does. Now that this type of thing is possible, we could explore all these new business opportunities with this. At least, that’s what gets me excited every day, when I get up and look at the self-sovereign identity stuff.

Working in pharma, it seems like big data is crucial to that field. Big data and identity management are at the core. Is that what got you thinking more about data structures? In pharma, I need to have very solid data inputs. But if I want to be able to share results across multiple parties, especially for clinical trials, that got you thinking about the structure of data a lot more, am I right?

Paul: Yes, absolutely. One of the first jobs I had in the pharmaceutical industry was actually data entry. They had these clinical report forms which were all handwritten, and I’d have to enter that into the system. In those days it was very dodgy: you’d have all sorts of doctor’s notes and comments, and you’re entering all of that, which is obviously full of PII and dangerous data. It wasn’t nearly as strict as it is now, but yes, it did get me thinking about semantics very early. Obviously, for all those free-form text fields, bringing in predefined entries and putting some structure around data is necessary so that it becomes useful for machine reading and analytics. One of the pain points with clinical data capture is that when you worked on two sister trials and tried to aggregate that data together at the end of the process, it was always little things that would break that pooling process. For example, maybe the same attribute had different formatting in each study, or the length of the attribute was different in each study — when you aggregated it, the data would truncate. These are pain points that are still going on today in the pharmaceutical industry. That’s where I started delving hard into semantics, and trying to find new ways of capturing data: different architectures, all that sort of stuff, to try and get some of those pain points ironed out.
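The pooling problem Paul describes can be sketched in a few lines. This is a hypothetical illustration (the field names and formats are invented, not from the podcast): two sister studies record the same attribute with different date formats, so a reconciliation step is needed before the data can be pooled.

```python
from datetime import datetime

# Two sister trials capturing the same attribute with different formats
# (hypothetical data; a shared, pre-agreed capture schema would make
# this reconciliation step unnecessary).
study_a = [{"subject_id": "A-001", "visit_date": "2021-04-15"}]   # ISO 8601
study_b = [{"subject_id": "B-001", "visit_date": "15/04/2021"}]   # DD/MM/YYYY

def normalize_date(value):
    """Try each format used across the studies and emit a single
    canonical ISO representation."""
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date format: {value!r}")

# Pooling only works once every row agrees on one representation.
pooled = [{**row, "visit_date": normalize_date(row["visit_date"])}
          for row in study_a + study_b]
```

Without the normalization pass, any downstream aggregation by `visit_date` silently splits what should be one value into two — the same "little things" that break pooling.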

Mathieu: A long time ago, I spent some time working in a clinic. We conducted clinical trials and later, staged trials on patients who would come in for the trials. I did all sorts of different stuff in this clinic, but I was more on the input side. I was managing the operation, the intake of the drug that was being administered, and the blood sampling and the testing. Because I was more on the input side of things, it wasn’t on the data and the analysis side of things. It seems that’s how you’re looking at the inputs and semantics: it’s the two sides of this. How do you describe inputs and semantics to someone?

Semantics and Inputs

Paul: The easiest way to think about it is: inputs are what is being inputted into a digital system or ecosystem, while semantics are what provides context and meaning to the information being entered into that system. Those are the two main differences. Another way to look at it would be data entry versus data capture. Pure data entry is just the keys that you enter on your keyboard into a system. But to bring in any context or meaning, it relies on data capture: when you enter something, you need to capture it into a schema structure or something like that. That’s where you have all your formatting and your predefined entries, your human-readable labels, the context, the metadata of the schema, and all that sort of stuff — all of the rich information that brings contextual meaning to the data that’s been inputted. On the input side, you want to know that the data has come from an authenticable source, whereas on the semantic side, it’s more about making sure that the context and the semantics around the data — those constructs — are immutable. That way, any actor who is interacting with any of those objects knows that the data capture structures are the same for everybody.
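The entry-versus-capture distinction can be made concrete with a small sketch. Everything here is hypothetical (the schema fields and helper are invented for illustration, not an OCA or podcast artifact): raw keystrokes only become meaningful once validated against a capture schema carrying labels, formats, and predefined entries.

```python
# Hypothetical capture schema: the constructs that give entered data
# its context (labels, formats, predefined entries).
CAPTURE_SCHEMA = {
    "blood_type": {
        "label": "Blood type",                 # human-readable label
        "entries": ["A", "B", "AB", "O"],      # predefined entries
    },
    "dob": {
        "label": "Date of birth",
        "format": "YYYY-MM-DD",                # format constraint
    },
}

def capture(field, raw_value):
    """Validate a raw input against the capture schema and return it
    with its contextual metadata attached."""
    spec = CAPTURE_SCHEMA[field]
    if "entries" in spec and raw_value not in spec["entries"]:
        raise ValueError(f"{raw_value!r} not in {spec['entries']}")
    return {"value": raw_value, **spec}

record = capture("blood_type", "AB")
```

The raw value `"AB"` is the data entry; the returned record, carrying its label and allowed entries, is the data capture — the part that makes the value usable for machine reading.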

Mathieu: You had a nice illustration diagram of semantics and inputs; you started splitting everything up into both of them. Does this go all the way down?

If you’re looking at the different inputs through credentials, does that push all the way through to attributes? Does that push all the way through to deeper objects?

It definitely seems that in the self-sovereign identity space, and entering with that mindset if these things are needed — you must have realized pretty quickly that for this to actually take off, there’s a lot of this infrastructure that needs to be in place.

Paul: Yes. As soon as you’re talking about a decentralized ecosystem, you are looking at a lot of innovation to enable that whole space to work. One suggestion that I would say to people is that you’ve almost got to treat the entire data economy as one company. Within a company, you would have a data analyst group or something similar, plus perhaps other groups that are more geared towards marketing or whatever. These sorts of differences within an organization work quite well, but as soon as you go into a decentralized space, it’s more challenging. Where do you categorize things in a totally decentralized space, so that you have communication throughout the entire data ecosystem? For that model that you’re talking about, I think the official term for it is “the model of identifier states.” At the Human Colossus Foundation, we call it the rugby ball model because of the shape of it.

I built that model because I thought the decentralized identity folks were overlooking semantics. Obviously, with my data management background, I wanted to get a model for them to understand the importance of semantics. When you talk about decentralized key management solutions, you’re really only looking at half of the puzzle. That puzzle is all about making sure that data comes from an authentic source. Regarding all the immutable semantics and stuff, that’s the other half of the model. I call that the semantics domain. That’s been fun; it took a little while to get down to the nitty-gritty of the dualism between every component within each half of that model. For instance, when you talk about an attribute in a schema on the input side, that’s a claim within a credential. Essentially, the input side is about real data, and the semantic side is about the constructs that were used to put context into those data inputs.

Mathieu: You’re involved in many different communities; we’ve seen each other in the different working groups, and you run your own working group in the Trust over I.P. movement. How does this model fit into the Trust over I.P. stack? There are different types of organizations or people playing at different levels in the Trust over I.P. stack. For example, you could have people just playing in Layer One: they’re building public identity utilities. You could have other companies, like the company I represent, that are playing more at the top layers: we’re trying to enable ecosystem solutions to be built. However, just as there is a governance layer across the stack, what you’re bringing here, through inputs and semantics, needs to be taken into consideration across the stack. How does that fit into the whole picture?

Paul: It’s just a different vantage point; it’s still a full ecosystem. With the dual-stack at Trust over I.P., you’re looking at governance versus technology across the four layers that you mentioned, from utility at Layer One right up to the ecosystems at Layer Four. That gives you a fairly big picture of a data economy, but only from a perspective of governance versus technology. The inputs and semantics group is just a different vantage point — thinking more about what is entered into a system, and the meaning of the data that you’re capturing, and stuff like that. It’s still a totally full model, but it’s complementary to the dual-stack. I’d suggest that data entry and data capture are valid for all four layers, and at each layer, it’s really about how you’re capturing that data. It’s not so much about what you’re capturing at the time; it’s thinking a little bit beyond that. For example, with all the vaccine stuff: it’s great if your apps can work for contact tracing and all that sort of stuff. But the way that I think about data is, “Okay, in the pandemic, you have the World Health Organization; how can we make this data useful for their processes and the analytics at their end?” That would enable them to get some real-time analytics to see how the pandemic is responding. I think there’s always a broader context than some developers sometimes think.

Human Colossus Foundation & KERI

Mathieu: You approached this through the Human Colossus Foundation; what were your steps to forming this? What was your journey of starting this organization, and why have you structured it the way you’ve structured it?

Paul: The foundation started because of Overlays Capture Architecture. The foundation was founded by three of us: myself, Robert Mitwicki, and Philippe Page. The idea of it really kicked off when I met Robert. Robert’s a deep-stack developer, but he’s an interesting developer — he’s always got a very broad perspective on the things that he’s building. We hit it off immediately when I showed him the Overlays Capture Architecture (OCA). Initially, we had built an innovation hub at a proprietary company called Dativa, but we soon realized, while building OCA in that environment, that it had to be open source. By that time, we’d also started developing a model called the “Master Mouse Model,” which is a conceptual model of what a dynamic data economy would look like. That was all spearheaded through the characteristics of Overlays Capture Architecture. Really, the reason we built the foundation was that we knew OCA was going to be an important part. We knew that what was being done with SSI, and in Hyperledger Aries and Hyperledger Indy, was an important space for authentic provenance chains. It was about taking these different parts and trying to have a foundation where we could knit some of these components together without having to disrupt too much of what was going on in those communities. We could do it within our own foundation. Everything we build is totally open source. As soon as we’ve done a proof of concept or we’ve knitted some components together for the benefit of the economy, we can showcase that, and people can simply pick up those components and integrate them into their own solutions.

Mathieu: For the work that’s being produced by the foundation (like the Overlays Capture Architecture, which was the starting point): is it all very use-case-oriented, or is it more generic frameworks that only need to be customized?

Paul: No, they’re totally generic everything; one of the components we’re developing at the moment is a trusted digital assistant. That would be a deeper component into the stack than a traditional digital wallet. You’re probably aware of KERI; have you come across KERI, the Key Event Receipt Infrastructure?

Mathieu: In the seminar you threw, you had Sam Smith, who was involved in this. We were personally quite excited. We’re excited about KERI, but we’re also excited about the network of networks concept, so those fit in nicely together.

Paul: KERI is a very interesting architecture because, with a data ecosystem, you can really go two different ways. You can have your Root of Trust be a node on a network, whether that’s Sovrin, or Ethereum, or Bitcoin, or whatever provides a Root of Trust. What KERI promises to enable — and I say ‘promises’ because it’s still being developed as we speak — is that you could change your Root of Trust, because it’s a ledger-less solution. Your Root of Trust could be what we call a trusted digital assistant, which is a component on your mobile phone. The importance of doing it that way is that you avoid all this potential network lock-in. That is, when some people are building on Hyperledger Fabric and others on Ethereum and the like, you could potentially get into interoperability issues with the way DIDs currently work. With decentralized identifiers, you have this method space that usually gives you the location of where the identifier is housed. So, if it’s a did:sov, that suggests it’s on the Sovrin network. There are something like 87 different methods, so it’s not great for interoperability.

What KERI allows, is if you have the identifier as a component on your mobile phone, then you can suddenly interact with any network: you’re not locked into those networks. That’s a really interesting technology that we’re going to be delving hard into at the Human Colossus Foundation, for sure.
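The method-space lock-in Paul describes is visible in the DID syntax itself, `did:<method>:<method-specific-id>`. A minimal parsing sketch (the example identifier is illustrative, not a real Sovrin DID) makes the point: the method segment binds the identifier to one network's resolution machinery.

```python
def parse_did(did: str):
    """Split a DID of the form did:<method>:<id> into its parts.
    The method segment (e.g. 'sov', 'ethr') is what ties resolution
    to a particular network -- the lock-in concern raised above."""
    scheme, method, identifier = did.split(":", 2)
    if scheme != "did":
        raise ValueError(f"not a DID: {did!r}")
    return {"method": method, "id": identifier}

parsed = parse_did("did:sov:WRfXPg8dantKVubE3HX8pw")
```

KERI's self-certifying identifiers sidestep this: because the identifier is cryptographically bound to its own key event log rather than to a ledger, no method-specific resolver is required.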

Mathieu: My perspective on the evolution of the self-sovereign identity stack, is that we’re at a point where there needs to be a lot more intelligence built into agents to create more value.

With your digital assistant utilizing KERI, is that a way to say you’re basically enabling a smarter agent, that fits the real-world business processes?

Paul: Yes, exactly. The advantage of it is that we’re not stopping people from building on any blockchain networks, but at the same time, we’re not reliant on that, as well. If I could describe the ‘trusted digital assistant’ in a short sentence, I’d call it an API plug-in gateway, into a dynamic data economy. Today, a browser is really your way into the internet. The way for individual citizens to get into the decentralized dynamic data economy, is through this component on your mobile device. So, you can think of it as a personal browser, if you like.

Mathieu: Yes. That truly is being open source. Whether I’m the ‘good health pass’ or if I’m some ecosystem solution, I could choose to incorporate this into my stack. That gives you all of the context and all of the other benefits that you need to have. If I need to worry about privacy, or user experience, this could fuel a lot of these benefits that we talk about every day in SSI.

Paul: Absolutely. I think that there are three main domains that you really need to decentralize: one is obviously the decentralized key management, the second is decentralized semantics, and then the last one is decentralized governance. I think the governance piece is maybe where KERI can help, because you’re not necessarily locked into those networks. I believe that can help a lot to decentralize governance. For instance, if I’m at border control in the U.K., or China, or any of those different countries, you can’t expect them to all be using the same network. By introducing KERI, it gets you out of those potential network lock-in issues. When we say ‘decentralized governance’, I think it sits in that sort of space.

Working Groups within Trust over IP

Mathieu: You’ve chosen the Trust over I.P. as a good home to conduct certain work with the community. Would you mind describing what’s going on right now within the Trust over I.P., inside the inputs and semantics working group?

Paul: You can think of the inputs and semantics working group as the innovation hub of Trust over I.P. You don’t have to be an identity network — that is, an SSI network. If you’re coming into the space and you have an identity solution of whatever kind, we can look at that within the inputs and semantics group. We would try to knit it together into a proof of concept or pilot, without disrupting some of the businesses that are already developing strong SSI solutions within the dual-stack as it stands. I think there’s a little bit more freedom in the inputs and semantics working group for people to work on brand new solutions.

Mathieu: Is it more new solutions, or do you see a mix? There are so many solutions that exist in the market today. I’m sure you could talk about so many things that go on in pharma, for example. You’re working on helping entrepreneurs or helping new ecosystems, but is it a mix of both that come through there?

Paul: It’s absolutely a mix of both. You’ll get people who want to set up new ecosystems, but maybe they’re sometimes working with some legacy solutions. They want to figure out how to deconstruct what they’ve done to make it more decentralized. All of those issues can come into the inputs and semantics group. At the same time, we’re also looking at new solutions such as semantic containers. Some of the transient portability solutions can also fit into inputs and semantics, and we see how we can work them into moving data around in a safe and secure manner.

Mathieu: It’s such a cool space. I was looking into it: as you said, there are task forces on storage and portability; there’s privacy, and risk, and notice, and consent.

There are so many things that need to be thought of from the user experience standpoint, or from the business case standpoint: to figure out what to show, what not to show, and how certain functions should work. You’re able to absorb that, and build it into the models.

Paul: Probably one area that needs strengthening up a little bit is the user experience side of things. There is a human experience working group set up at Trust over I.P. It’s not housed in the inputs and semantics group, but it’s a very new group, so I don’t think they’re totally up and running yet. It’ll be interesting to see what comes out of that task force.

Mathieu: For just a simple example of what you mentioned earlier in the pharma segment: the dates can get screwed up in different formats. Being in Canada, that happens very easily if dates are written in English and in French — you write it in one format if it’s in English, and another format if it’s in French, so there’s a mismatch, to begin with. That’s a simple user experience example where, if you want to internationalize a solution, or if you want to make it applicable, it needs to be taken into consideration.

I would love to hear more on how you thought about general user experience overall, but it seems like the structure that you’re building and bringing up enables contextualization to happen inside of the actual wallets, and apps and systems that are being used by end-users.

Paul: The semantics part is very interesting. I think we’ve stumbled across a really nice architecture, in that it doesn’t matter what language you want your credential to resolve in, or your predefined entries; they can all be resolving in your mother tongue. Everything in OCA is a layer, so depending on the country that you’re in, we can easily swap in a layer here and there, rather than rebuilding the entire schema structure. With you being a Canadian, you’ll be interested in this: I’ve been building a COVID-19 vaccination specification for the COVID-19 credentials initiative group. It’s an OCA-formatted spreadsheet, if you like. I showed it to the Canadian guys, and I had both Canadian French and English as some overlays. They asked, “Could you build some overlays for the Indigenous communities in Canada, like the Inuit?” I said, “Yeah, I think so.” Then, I had to google Inuit, because I have no idea — I’ve never seen the language before. It looks almost like hieroglyphics or something, and I thought, “I don’t think this is UTF-8 compliant.” Interestingly, with Overlays Capture Architecture, the character set encoding is a separate layer altogether. So, for an interesting character set such as what the Inuit use, you can just change the character setting encoding overlay, and then rebuild all the human-readable labels, and all the predefined entries, and everything in their language. The idea is that when you resolve that on your mobile phone, it comes through as a form. The whole form can be structured in their language, so it’s great. If you’re travelling and you don’t speak a language, you can enter everything in your own language. If the guy at border control only knows Russian, for example, all you have to do is hit a little scroll-down at the top from English to Russian, and the whole credential form changes into their language. It’s a really cool architecture, and we’re super proud of it.
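The layered architecture Paul describes can be sketched very simply. This is a hypothetical, heavily simplified illustration of the OCA idea (the attribute names and overlay structure are invented, not taken from the OCA specification): one immutable capture base, with per-language label overlays swapped in at display time.

```python
# One immutable capture base; language-specific presentation lives in
# separate, swappable overlays (simplified, hypothetical structure).
CAPTURE_BASE = {"attributes": ["full_name", "vaccine", "dose_date"]}

LABEL_OVERLAYS = {
    "en": {"full_name": "Full name", "vaccine": "Vaccine",
           "dose_date": "Date of dose"},
    "fr": {"full_name": "Nom complet", "vaccine": "Vaccin",
           "dose_date": "Date de la dose"},
}

def render_form(language):
    """Resolve the form in the requested language by swapping in a
    label overlay -- the capture base itself is never touched."""
    labels = LABEL_OVERLAYS[language]
    return [labels[attr] for attr in CAPTURE_BASE["attributes"]]
```

Switching the border-control scroll-down from English to French is then just `render_form("fr")` against the same underlying credential; adding an Inuktitut rendering would mean adding one more overlay (plus, as Paul notes, a character-set encoding overlay), not rebuilding the schema.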

Managing Privacy

Mathieu: I won’t lie, it was heavy the first time I started looking into it; there are so many layers. I think the fact that you’ve been able to componentize it like that makes it very easy to swap in or swap out what you’re trying to do, based on your use case. I don’t know if it would be the masking layer, but if we’re talking about selective disclosure or privacy, there’s another layer in there that addresses that too, right?

Paul: Yes. If you look at data management as a three-step process — data collection, data capture, data exchange — that second step of data capture is really important, because that’s where you rebuild the semantics for the data exchange. That way, when it’s gone out the other end, if you like, it’s all beautiful for machine learning, etcetera. Some of the overlays that you use in that second stage, the capture stage, are things like your human-readable labels, your predefined entries, or your formats. Some of the overlays that you might use on the exchange side might differ, so that’s what we’re looking at at the moment. When you talk about a masking overlay, that requires additional processing on certain attributes that you’ve flagged in the schema base. Some of that masking might be more suited to the data exchange side, rather than the data capture side, as a secondary process. We’re looking into all those nuances at the moment. I think that OCA has enough flexibility to be very granular on which overlays are used at what stage of the data life cycle.
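A masking overlay of the kind Paul mentions can be sketched as a redaction pass at the exchange stage. This is an invented illustration (the flagged attributes and function are hypothetical, not OCA-defined): attributes flagged as sensitive in the schema base are masked before the record leaves the capture environment.

```python
# Attributes flagged as sensitive in the (hypothetical) schema base.
FLAGGED = {"date_of_birth", "passport_no"}

def mask_for_exchange(record):
    """Apply the masking overlay at the data-exchange stage: flagged
    attributes are redacted, everything else passes through intact."""
    return {key: ("***" if key in FLAGGED else value)
            for key, value in record.items()}

shared = mask_for_exchange(
    {"full_name": "Ada Lovelace", "passport_no": "X123456"}
)
```

Because the flagging lives in the schema base while the redaction runs at exchange time, the same captured record can be exchanged with different counterparties under different masking policies — the granularity across the data life cycle that Paul is pointing at.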

Mathieu: That makes sense. You touched on this earlier when we were talking about KERI being interesting for governance, because you’re not locked into a specific utility. Thinking about privacy, and governance around whatever needs to go into an ecosystem solution; this is where ecosystem builders would be able to integrate this architecture, and to be able to create the rule-set based on this, right?

Paul: Yes, absolutely. The privacy folks are almost a totally different community in themselves. When I talk about the ‘Master Mouse Model,’ which is our vision of a dynamic data economy, what we talk about is really the data layer. However, there’s actually an entirely new layer that goes on top of that, which is the jurisdictional layer. For example, is a company compliant with jurisdictional law to be able to operate within that sort of economy?

When you talk about services: are those services authenticated by a jurisdictional authority? Are they allowed to perform those services? Those sorts of legal things are almost like a different ecosystem altogether. When I was talking about the ‘trusted digital assistant’, that’s really where you’d have APIs from that jurisdictional layer, plugging into your trusted digital assistant. That way, you know that when you’re looking for certain services, they’re already legal and they’ve been authorized properly. Obviously, it shouldn’t be up to the human being to decide whether something’s authorized or not; that’s a legal component.

Mathieu: Yes. So, in a COVID credentials or vaccination credentials use case, based on the jurisdiction you’re in, you don’t want to allow anyone to breach privacy or share PII. It’s all built into the system based on the jurisdiction: if I’m in Canada, or in a specific province in Canada, versus if I’m in a U.S. state, or if I’m in France or wherever, you could have this built right in.

What other use cases do you find interesting in what you see now? You came from the media/music space, you have a lot of experience in pharma, you’ve been spending a lot of time with the COVID projects: what do you find super interesting, or where are you spending most of your time right now?

Paul: There are a couple of interesting things that we’re looking into. One of the projects that we built at the Human Colossus Foundation was a digital immunization passport. We haven’t pushed it that hard, because it’s actually working on brand new technologies that are not standards yet. I think there’s going to be a second wave of digital immunization passports; the first wave that goes out might come across a few issues with governance and all that sort of stuff. We’re sitting in on discussions about the next generation of immunization passports. One of the interesting things in that solution was trying to cryptographically link verifiable credentials to semantic containers.

Our view of a credential is that it should allow you to do something. When you have a credential on your mobile phone, you can show it to border control. They can see, “Oh, you’ve got a green tick mark; you’re good to go.” All of the sensitive health information data that sits behind that, we’ve been storing that in what we call a semantic container. That’s where the holder of the data can give access to that container using a token, and there is obviously a cryptographic link between the container and the credential. What’s interesting is that the credential is great for saying, “Yes, this data has come from an authentic source.” Essentially, by cryptographically tying that credential to the container, you can say that anything in that container has been authenticated by me, and that has some interesting properties.

When you’re talking about the music industry, I think that for music publishers it could be really interesting. They’ve always had trouble with people copying MP3s and giving them to their friends; the artist doesn’t make any money. However, if you put an MP3 in a container and you’ve got a verifiable credential saying, “Yes, this container has been authenticated by AC/DC or whoever the band is”, then you can track it very well, using these decentralized technologies. So, if anybody is accessing that MP3, firstly, they need a token from, let’s say, the publisher acting on behalf of the band. That would probably be a good way to do it, because there’s always a provenance chain back to where it came from. There’s an interesting technology that we’ve been looking at, called digital watermarking. Basically, it changes the composition of the digital asset within the container a tiny bit. So, if somebody leaks that for any reason, without proper authorization, you can track who leaked it. That’s super cool: it’s more of a defensive approach to data sharing. I think with all these things, the more protection you can have, the better. That’s just another method that we’re looking at, for how to stop illegal data sharing.

Mathieu: Is a good analogy that a credential is like a recipe item, while the semantic container is the meal? You’re able to put everything in together, you could take it from different places, and just have it as its own package.

Paul: Yes, you can think of it exactly like that. It’s a bundle of information; so it could be a usage policy, plus a huge audio file, plus a couple of attachments — that can all be put into a container. The credential is saying that everything in that container is authentic; you can take a cryptographic hash of the contents of the container, and also a cryptographic hash of the data capture structure that was used to capture that data. If either of those two items changes, the hash breaks, and the credential is automatically nulled.
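The hash-breaking behaviour Paul describes can be sketched as follows. This is an illustrative outline, not the Human Colossus implementation: the credential here simply pins a SHA-256 digest of the container contents and of the capture structure, so any change to either voids it.

```python
import hashlib
import json

def digest(data: bytes) -> str:
    # SHA-256 digest of arbitrary bytes, hex-encoded
    return hashlib.sha256(data).hexdigest()

# The container bundles the payload; the credential pins two hashes:
# one over the contents, one over the capture structure used to collect them.
contents = b"usage policy + audio file + attachments"
capture_structure = json.dumps({"schema": "vaccination-record", "version": 1}).encode()

credential = {
    "contents_hash": digest(contents),
    "structure_hash": digest(capture_structure),
}

def credential_valid(credential, contents, capture_structure):
    # If either the contents or the capture structure changes, a digest no
    # longer matches and the credential is treated as void.
    return (credential["contents_hash"] == digest(contents)
            and credential["structure_hash"] == digest(capture_structure))

assert credential_valid(credential, contents, capture_structure)
assert not credential_valid(credential, contents + b" (tampered)", capture_structure)
```

In a real deployment the credential would also carry the issuer’s signature over these digests, so the verifier can check both authenticity and integrity in one step.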

When you have a UPS package or something like that, the package is the container, and the slip you’ve signed to get the package is your credential. That’s your way of authenticating that it’s yours, all under your control.

Community Activities

Mathieu: All of this work, and thinking, and pilots, and effort: you’re bringing all this to the Trust over I.P., and you seem to be working with Sovrin. Is it more specifically on the UX or guardianship side of things? Are you also involved with Kantara?

Paul: Yes, I’m involved in all of those communities in varying capacities. I work with the MyData community, all about personal data. I set up the ‘MyData Health’ thematic group over there, so it’s really about making sure that that health data is treated properly throughout the data life cycle; that’s one thing I do. For the COVID-19 credentials initiative, I run the schema task force there. We’re basically creating schema specifications for COVID-19 tests: that includes test reports, test certificates, vaccine reports, vaccine certificates, those sorts of things. At Sovrin, I lead the communications team. So, whenever you see blog posts going out from Sovrin, it’s usually pulled into our group. We’ll vet it, and make sure that it’s accurate and supportive of the community. And then, with Kantara, I work very closely with the ‘notice and consent’ group over there. Notice and consent is totally not my expertise, but I know how important that piece is to the whole ecosystem, so I’m trying my best to learn as much as I can. There are some amazing people in that space, who have as much knowledge in their space as I do in semantics. I’m totally in awe of those guys, as well.

Mathieu: For the different communities advocating for trusted data, or self-sovereign identity, or decentralized identity, however you call it; I think everyone has the same ethos of what we’re trying to do, and the community is amazing. Everyone is trying to help each other out; everyone’s trying to give. Honestly, we’re very fortunate to work with these types of people every day. It’s just amazing; it’s very motivating.

Paul: Absolutely. You’re dealing with people who have 25 years of identity experience, and others with 25 years of consent experience, and you’re getting these people to the same table to agree on what a dynamic data economy should look like. I’m totally blown away by the level of knowledge.

Mathieu: To close, where could you use help? How could people help you to keep pushing the dynamic data economy forward, and all these things forward?

Paul: It’s very much a cross-community effort, really. The Human Colossus Foundation is pretty much as agnostic as you like; we don’t really have an agenda. We talk to all the communities, and we try hard to make sure that everybody has a place at the dinner table to put their objectives forward, for the good of the economy.

Moving forward, if anybody’s really interested in this space, I would suggest that the inputs and semantics working group within Trust over I.P. is a great place to join. You’ll get a very broad perspective on the economy from a lot of specialists all working together. For a quick fix on a lot of information, that’s a great place to go. At Trust over I.P., looking at the dual stack, there’s also space for governance across the governance stack, and also new technologies in the technology stack. I think Trust over I.P. is a great space to join. When it was first set up, there were a few murmurings that this is just Sovrin, rebranded. In reality, it’s much broader than that. Sovrin is a utility for self-sovereign identity and will probably always be seen as the genesis point of that whole ecosystem, but Trust over I.P. goes much broader than that. It’s self-sovereign identity, but it’s also decentralized consent, semantics, ecosystems — it’s a much broader space. The reason I say that is because for anybody that is uneasy about SSI or has any doubts about it: just go into Trust over I.P., and there will be a space for your expertise, for sure.

Mathieu: Agreed. Paul, thank you very much for doing this with me. It was a great conversation. I’m sure people are going to find this quite interesting and want to go a little deeper into the stack that you guys are bringing forward. I know this is an area that I’m quite interested in going into a lot deeper, so thank you for doing this!

Paul: My pleasure, thanks for having me; I really enjoyed it.

The post The Importance of Data Inputs and Semantics for SSI with Paul Knowles [Podcast] appeared first on Northern Block | Self Sovereign Identity Solution Provider.


Anonym

Removing Anonymity Online Would Risk The Most Vulnerable Users

It’s a sad reality that Web 2.0 and social media have made it easier for online abuse and hate speech to flourish.  

In 2017, Pew Research revealed that 41 percent of Americans had been harassed online and 66 percent had witnessed abuse of others. Updated 2021 data show roughly the same rates of abuse but more severe abusive encounters, such as physical threats, stalking and sexual harassment. Significantly, 75 percent of the most recent encounters for those affected happened on social media platforms.  

We all know online abuse can be incredibly damaging and Pew puts it like this: “In its milder forms, [online abuse] creates a layer of negativity that people must sift through as they navigate their daily routines online. At its most severe, it can compromise users’ privacy, force them to choose when and where to participate online, or even pose a threat to their physical safety.”   

For years, solutions to this growing problem have largely centred on ridding the internet of anonymity—forcing people to reveal their real names and identities when they engage online. But as Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, recently pointed out on Malwarebytes’ Lock and Code podcast, removing anonymity unacceptably risks exposing the world’s most vulnerable populations. And that’s where we start to see that anonymity isn’t all evil for its shielding of malicious actors (a common charge); it’s also an essential protection for those who need it most. 

Ms Galperin says: “Being able to speak anonymously is extremely important for people in vulnerable populations, for people with unpopular opinions, for minorities, for women, for anybody whose identity is non-standard.” Included in that, she says, are survivors of domestic and sexual abuse, whistleblowers, and human rights activists working against oppressive governments. For these people, anonymity can be essential because, she says, “the systems are not there to protect vulnerable communities.” 

Ms Galperin points out anonymous speech and pseudonyms are “an extremely important tool for people who want to speak truth to power.” Journalists can do more dangerous reporting; whistleblowers can lift the lid on corporate malfeasance and so on. For domestic abuse survivors, anonymity allows vital sharing of experiences. “Often speech is the only outlet survivors have because often abuse is difficult to prove … and accusations often come a long time after the abuse. Getting legal action or even action by police on sexual abuse is incredibly difficult.” She says this action often doesn’t result in consequences for the abuser that are proportional to what they’ve done, “so all survivors can do is tell their stories.” 

If we remove anonymity online, we remove that vital outlet. “Not only will they not be able to tell their stories, they also won’t be able to continue hiding from their abusers. One major concern is ‘how am I going to continue my life online without continuing to be harassed by [my] abuser’? Starting new social media, getting a new phone … having a pseudonym, hiding … are all important to these communities,” Ms Galperin says. 

Ms Galperin says using real identities online won’t make the internet a safer place for anyone. She argues many people happily post harassing content under their own names all the time, so revealing their identity wouldn’t deter their behavior. Further, users need to be able to compartmentalize their personal and professional lives so they can express themselves safely in different forums. Ms Galperin says: “We don’t have one single incorporated identity. We’re often different people to co-workers, school friends, church etc. That’s fine. But jumbling all those identities together [to use a single, real online identity] can result in really negative consequences.” 

Another long-mooted solution to online abuse is to abandon Section 230 of the Communications Decency Act, which protects companies that publish or republish third-party content and, as Ms Galperin points out, “has enabled social media platforms and our current media environment to thrive.”  

CDA 230 says “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230). This means any platform that allows users to post content online (e.g. Facebook, Instagram, TikTok) is not legally responsible for that content. Platforms don’t have to take down defamatory, abusive or inflammatory content (except where content is copyrighted) and they can’t be sued for what others say on their platforms.  

“People see [CDA 230] as problematic, because they think [removing] it would rein in Big Tech and social media companies … but it is not a special law just for SM companies. These protections are essential to everyone and to competitors to Facebook, Twitter etc. Getting rid of it is not going to help.” 

Instead, Ms Galperin points out that “issues with CDA 230 are really issues with the First Amendment.” When you engage on a platform, it’s not your free speech rights that apply, it’s the platform’s. This means the platform owners can say what they want and remove whatever content they want. Abolishing CDA 230 won’t matter because platform owners can still apply their rights under the First Amendment. 

“Most of our speech online now takes place on platforms controlled by someone else. Companies are entirely within their free speech rights to edit, ban, leave out whatever they want. Users’ rights are largely limited.”  

In a nutshell, if we change the laws and make companies, particularly Big Tech, liable for their users’ content, they could become “extremely cautious and hyper moderated”, and again this removes one of the few safe outlets vulnerable people have to express their experiences. 

“Survivors need to publish their story on someone else’s platform. It’s often the only option survivors have for anything even vaguely resembling justice … Then if there’s no CDA 230 … now the survivor is completely silenced,” Ms Galperin said. 

As Lock and Code podcaster David Ruiz asked: So if using real names and changing the law won’t work, what will? What would make the internet a safer space for survivors? 

Ms Galperin says it’s about makers of tools and platforms taking responsibility for their products’ effects on vulnerable users.  

“Primarily the onus for making safe platforms is on the makers of platforms and the onus for safer tools and IoT objects is on their makers … I encourage them to think about how their tool will be used for harassment, how this tool will be used by a domestic abuser, to think about empowering the user to get away from someone they used to trust who also used to use this tool with them.”  

Importantly, she points to messaging: “One of the most important things is if you have a way of sending messages through your platform please make it possible for people not to use their real names. Ideally make it possible so they don’t have to hand over their phone number … and to block other users or mute certain keywords. If you give the power to the user, they can decide what is harassment and what is abuse and it really takes the onus off the makers of the platform to be judge, jury and executioner for every communication that somebody has online.” 

We’re already seeing a lot of positive movement in making it better for users online. Ms Galperin points to organizations like EFF and Malwarebytes, and we’d add Anonyome Labs to the list. 

At Anonyome Labs, our solutions help mitigate tech-facilitated violence and abuse. We empower people to be able to determine what information they share, and how, when and with whom they share it. We provide survivors and those at risk of harassment, violence and abuse, real and effective tools for protecting their personal information.  

In May 2020, we proudly became a partner of the Coalition Against Stalkerware, a global working group uniting advocacy groups, software developers, security firms, victims and survivors, in the fight to protect consumers against stalkerware and to eliminate abusive technology and software. 

We are also growing our own safety initiative, known as Sudo Safe. Through Sudo Safe, we partner with organizations that promote safe use of the Internet as well as those organizations that support at-risk people who may have a specific need for privacy and security in their life. Our goal is to support at-risk users, and to be a privacy resource for organizations supporting their members. Sudo Safe encourages the use of our privacy and cyber safety app, MySudo, to help people keep their personally identifiable information private. 

We’re meeting the real need for users not to have to expose their real name or phone number online. MySudo is the world’s only all-in-one privacy solution with strong privacy and security built in. Find out more. You can listen to the Lock and Code podcast with Eva Galperin here.


The post Removing Anonymity Online Would Risk The Most Vulnerable Users appeared first on Anonyome Labs.


Tradle

LMA Developing Markets Virtual Conference

The Loan Market Association are the voice of the syndicated loan market in EMEA.

They are hosting a conference 28 April 2021, 8.30–18.30 BST (UK).

We will be joining a panel of experts to discuss FinTech in developing markets which you can watch at 16:05.

The event is for members only.

To register for the event go to https://www.lma.eu.com/events/lma-developing-markets-virtual-conference

LMA Developing Markets Virtual Conference was originally published in Tradle on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Introducing Auth0 Organizations

Easily manage the business customers that access your applications

Authenteq

Identity Verification and… Funeral Homes?

The post Identity Verification and… Funeral Homes? appeared first on Authenteq.

KuppingerCole

OneTrust Acquisition of Convercent

by Anne Bailey

OneTrust, a provider of data privacy, security, and governance solutions, has announced that it will acquire Convercent, an enterprise GRC solution with an ethics and compliance portal. In what is billed as a strategic acquisition, the two companies will align and merge their products to actualize trust for the enterprise. The deal is expected to be finalized in mid-April.

Expanding the ways to build trust

Building trust is a multifaceted task that encompasses the relationships an organization has with all stakeholders. OneTrust works with a trust framework where establishing trust and assessing risk of third-party vendors is at the foundation. There are also compulsory elements of trust like maintaining adequate security and compliance. The relational aspect of trust is powerfully influenced by transparency, ensuring privacy, and taking measurable action on environmental and social issues. The acquisition bolsters OneTrust’s activities in the third-party due diligence and GRC domains, and helps to make progress for new products: OneTrust Ethics and OneTrust ESG.

Third-party risk solutions are becoming ever more urgent, especially those that support supply chain management and accounting for supplier risk. COVID-19 showed the massive impacts on supply chains, which is only the most recent and most visible example of how unforeseen disruptions to a supply chain can wreak havoc. Pressures to manage vendor relationships come both from regional initiatives, such as the German Lieferkettengesetz, and from large enterprises that can put stringent requirements on their suppliers. There is a clear and growing market for such vendor risk management solutions.

To communicate and operationalize the trust reputation of a company, OneTrust Ethics is intended to function like a privacy portal. It will handle the communications, ethics policies, requests, and even gamification of taking ethical action. This could be employee facing as a centralized, standardized, and private way of handling ethics cases, as well as being vendor or customer facing as well.

OneTrust ESG is a groundbreaking product aiming to standardize reporting on environmental, social, and governance initiatives. This product will begin by assisting in reporting on specific environmental initiatives, such as when an enterprise makes a net-zero carbon emission pledge. The product will assist in normalizing data from utility bills, collaborate with remediation providers, and work to provide transparent and standardized reporting of the initiative’s progress. It will eventually expand to cover other environmental and social initiatives, like empowering diversity in the workplace. While it is a conservative approach towards measuring corporate ESG initiatives, simply providing support in standardizing reporting is a huge step forward.

Both of these are timely additions because corporations are under regulatory pressure and intense public scrutiny to protect whistleblowers, demonstrate meaningful action in diversity or sexual harassment cases, and show commitment to environmental causes. Contributions to establish metrics and define reasonable expectations of enterprises to create public value come from entities like the World Economic Forum and nonprofits such as Your Public Value. OneTrust joins the effort by striving to provide resources for transparent reporting and internal structures to mitigate risk.

Focus on growth

OneTrust has seen rapid growth overall of its original portfolio consisting of privacy, preference, GRC, and third-party risk management solutions. There will be continued development to expand the privacy platform to stay current with the increasing global privacy regulations. Convercent’s ethics and compliance solutions will be a recipient of this growth, with particular focus on R&D and customer support.


Forgerock Blog

Five Ways Identity and Access Management (IAM) Cuts the Cost of Unemployment Insurance Fraud

Recent web searches show the massive volume of unemployment insurance fraud occurring across the United States. The increase in unemployment claims due to job losses stemming from the COVID-19 pandemic, coupled with aging IT systems, has led to an increase in identity theft, fraud, and loss of federal tax dollars. The State of Rhode Island estimates 43% of its unemployment claims in the past year may be fraudulent. California lost over $11 billion to fraud in the past year. In Minnesota, thousands of accounts that use Social Security numbers (SSN) as the primary account identifier have been stolen.

What’s Behind the Fraud?

There is no single answer, but the surge in COVID-19 related unemployment claims, stolen personally identifiable information (PII) from massive data breaches, and antiquated state unemployment systems that rely on out-of-date technology have generated the perfect storm for unemployment fraud. 

The United States government recognizes that state unemployment insurance systems are in dire need of upgrades or replacement. To address this crisis, the U.S. Department of Labor recently announced it will distribute $100 million in funding to state agencies to help them combat fraud and recover improper payments. This is a good start, but a much more comprehensive plan is needed. 

Legacy Applications 

Some state unemployment insurance applications rely on technology that is at least 20 years old. This older technology does not provide an agile and efficient way for average citizens to quickly, securely, and safely access their unemployment claims. The systems are not user-friendly, yet many users want to log in to manage their claims using a smart device. Some states may not be in a position to upgrade their applications and systems until the pandemic surge in unemployment claims recedes.  

Legacy Identity and Access Management (IAM) 

Deploying a modern identity and access management system on the front end of these legacy applications can help reduce fraud. Keeping the applications on premises and deploying an IAM solution in the cloud is the best approach. Leading analysts state that hybrid cloud and hybrid IAM will continue to be the best approach in the coming years to support legacy systems. Legacy applications will continue to prevail until they are migrated either to SaaS applications or to applications developed with DevSecOps practices.

The ForgeRock Approach

ForgeRock protects several state unemployment systems with millions of constituents authenticating daily. Identity Gateway allows state governments to continue using their legacy unemployment applications with a more modern approach by providing an authentication framework for legacy applications. By coupling Identity Gateway with Identity Management, Access Management and Intelligent Access, state agencies now have a modern, more secure, and agile IAM platform that will meet their needs. 

“Identity and Access Management is a key foundation in modernizing unemployment insurance.” — Mike Wyatt, Cyber Risk Leader focusing on Identity Management and State Government at Deloitte

What Can States Do? 

The U.S. Department of Labor has published guidelines that reference the National Institute of Standards and Technology (NIST) digital identity guidelines.

While these guidelines attempt to address commonly encountered flaws, they can be addressed more effectively by taking advantage of modern security tools such as those provided by ForgeRock. Here are the Labor Department’s recommendations and how ForgeRock can help:

1) Stop Using Social Security Numbers as a Constituent Login Method 

Social Security numbers (SSN) can be compromised in a variety of ways, including stolen mail and breached databases. The best approach is for constituents to log in using a unique identifier instead of their SSN. Unique identifiers offer additional benefits: they can have a user-friendly syntax built from data that only the owner would possess, and they can be reused in other programs within the state without the fear of disclosure to inappropriate agents.

If an email address is used as the login key, ForgeRock’s user validation and orchestration tools within Identity Management can provide the necessary assurance of the email’s accuracy. In combination with the fraud detection analysis available in ForgeRock Intelligent Access, agencies can ensure that the email login key is unique and approved for use. 

2) Implement Robust Password Policies 

ForgeRock suggests having different password policies based on different types of users. Although complex password policies are not considered user-friendly, many backend unemployment systems running on mainframes require passwords for access. ForgeRock’s user interface (UI) ensures that passwords meet your desired security parameters and support rigorous password refresh/reset policies when a user is going through registration, thereby maintaining a high level of confidence throughout the experience. 

3) Implement Geo-Fencing

If a constituent is logging into the unemployment site from another state, that user should be flagged. Claimants should be in the state they are filing a claim in, ready and available for work. For example, if a login occurs from Montana for a New Jersey unemployment claim, this should be flagged as suspicious. 

ForgeRock Intelligent Access can use geolocation to verify that the user is logging in from a plausible location. For example, ForgeRock Access Management can allow a login event to occur only if it is physically possible, preventing a person from logging in from two widely separated locations within a timeframe in which travel at the required speed would be impossible. Intelligent Access user journeys can be designed to transparently detect fraud through behavioral biometrics. If a login attempt does not fit the established use patterns, it can require the person logging in to provide stronger assurance of their identity. 
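An impossible-travel check of the kind described above can be sketched with a great-circle distance and a speed threshold. This is an illustrative example, not ForgeRock's actual implementation; the 900 km/h threshold and the coordinates below are assumptions chosen for the Montana/New Jersey scenario.

```python
import math
from datetime import datetime, timedelta

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in kilometres
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

MAX_SPEED_KMH = 900.0  # roughly airliner speed; faster implies impossible travel

def is_impossible_travel(prev_login, new_login):
    # Each login is (timestamp, lat, lon). Flag the new login if the implied
    # speed between the two events exceeds what physical travel would allow.
    (t1, lat1, lon1), (t2, lat2, lon2) = prev_login, new_login
    hours = abs((t2 - t1).total_seconds()) / 3600.0
    distance = haversine_km(lat1, lon1, lat2, lon2)
    if hours == 0:
        return distance > 1.0  # simultaneous logins from different places
    return distance / hours > MAX_SPEED_KMH

# Login from Newark, NJ, then "from" Helena, MT 30 minutes later: flag it.
t0 = datetime(2021, 4, 15, 10, 0)
nj = (t0, 40.73, -74.17)
mt = (t0 + timedelta(minutes=30), 46.59, -112.04)
print(is_impossible_travel(nj, mt))  # True
```

A production system would combine this velocity check with device fingerprinting and behavioral signals rather than relying on geolocation alone, since IP-based location is coarse and VPNs shift it deliberately.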

4) Enable Contextual and Adaptive Multi-Factor Authentication

ForgeRock includes the cost of multi-factor authentication (MFA) with Intelligent Access, while other vendors charge for it. Most users now understand that MFA makes their login experience more secure, so they are generally willing to sign up for it. Depending on the perceived fraud risk and the type of transaction, ForgeRock Intelligent Access can also adjust the MFA method and require more validations if needed. Often, an MFA challenge requires minimal contribution from the user, for example, by using facial recognition through an iOS device along with a PIN.

5) Incorporate Identity Proofing Solutions

ForgeRock offers pre-built identity proofing solutions through its Trust Network partners. These vendors extend the digital experience by leveraging fast, accurate, and contextually driven flows. These journeys surpass the user-validation experience achieved in face-to-face transactions and provide a global review of a person’s assertions to further decrease any chance of fraud. The dynamic nature of the ForgeRock identity proofing flow enables organizations to fine-tune proof of identity. This ensures the level of assurance always corresponds to the level of access or the type of assets a user may need to reach. The experience is tailored to the type of user and the risk associated with the flow, without the need for manual intervention from state agents.

There’s no question that the applications and IAM systems behind state unemployment agencies desperately need to be modernized. ForgeRock has the technology to make unemployment insurance better, stronger, and faster.

To learn more about how ForgeRock can help retrofit aging unemployment insurance infrastructure, check out the following assets:

Product Brief: Identity Gateway
White Paper: 10 IAM Capabilities Key for Governments to Support Citizen Access

SelfKey

All Data Breaches in 2019 – 2021 – An Alarming Timeline

Your data is valuable and should belong to you. Nevertheless our online records are exposed on an almost daily basis, with potentially devastating consequences. This blog post aims to provide an up-to-date list of data breaches and hacks. The post All Data Breaches in 2019 – 2021 – An Alarming Timeline appeared first on SelfKey.



Affinidi

Verifiable Credentials in Ben’s Serendipity


Benjamin Cox (Ben) was a lawyer and a working partner at an up-and-coming law firm in New York City. He was born in the inner city of NYC and had worked his way to this position.

At the same time, he had been sensing fatigue and a lack of interest in his work over the last few months. Today, in particular, he was not ready for work at all and decided to call in sick. Bored, he simply darted out of his apartment, took his car, and decided to go on a drive.

12 hours later…..

Ben was still driving. He was somewhere in Georgia and he booked into a hotel and spent the night contemplating his future. The next day, instead of driving back to NYC, he kept going in the other direction.

He had finally taken the plunge! Though there were many uncertainties, he knew he could handle it all as he had the emotional strength and more importantly, he had his verifiable credentials that could help him anywhere.

He got behind the wheel and started driving again. By the end of the day, he reached Houston! That night, the Houston Rockets were taking on the New York Knicks, and for the first time, he cheered for the Rockets, wanting to break away from the shackles of his life in NYC and anything that even remotely connected him to the city.

The next day dawned and Ben decided to go further south, driving right up to the Mexican border between El Paso and Juarez. When he was asked to show his ID, he quickly turned to his driver’s license verifiable credential. All he had to do was share a verifiable presentation from his digital wallet with the border authorities, who, in turn, checked its validity and authenticity. Since these verifiable credentials were tamper-proof, the authorities had no qualms about his identity or the authenticity of his VCs, and approved his entry. So, just like that, Ben breezed into Mexico.

The next morning, the same story, and more driving! A couple of days later, he reached Cancun and decided to stay there for a few days.

While relaxing on the seaside one afternoon, he started feeling a bit nauseous, so he headed back to the hotel. As he was mulling over the possibilities, he realized he hadn’t been taking his regular meds for high blood pressure. He went to the local clinic to get a prescription for his restricted blood pressure drug. But the authorities there weren’t forthcoming, so Ben shared the verifiable credentials that proved his medical condition and his use of the drug. These VCs proved to be a life-saver, as he was able to get a prescription for the restricted drug even in a different country!

A week later, he decided it was time to decide what he really wanted to do.

He had no clue what, but it was definitely not back to NYC and the firm. With such thoughts, he picked up the local paper, and his eyes fell on an advertisement for an English-speaking guide in the small town of Barra de Potosi.

With a smile, Ben drove down to Barra and met the hotel manager at the Laguna de Potosi, the only hotel in this town of about 400 residents. Since it was a tourist town where hundreds of Americans landed each year to enjoy the watersports, the hotel was looking for a guide to talk to them.

Though Ben was over-qualified and his weekly pay was a pittance compared to what he was earning in NYC, he accepted it with all his heart! But there was a problem: to employ him, the hotel had to submit his ID and some proof that made him eligible for the job.

Verifiable credentials came to Ben’s rescue again as he dug through his digital wallet and found an employment verifiable credential from an English teaching job he had held many years ago, when he was a sophomore in college. He shared this credential with the local labor department, and the officer was able to verify it based on its schema. This was deemed sufficient for the job, and Ben started his new career as an English-speaking guide.

A week later, it was time for his pay, and the hotel wanted to pay him through a bank account to avoid tax penalties. So, Ben had to open a bank account. The bank manager wanted a ton of documents, but all Ben had to do was share his verifiable credentials: he created a verifiable presentation containing all the documents the bank wanted and shared it with the bank manager. In turn, the manager checked the identity of the issuer and the validity of the documents, and opened the account.
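The verify step the bank manager performs (check the issuer's signature over the credential inside the presentation) can be sketched in a few lines. This is a toy: real verifiable credentials use public-key signatures (e.g. Ed25519) and the W3C VC data model, while this sketch substitutes an HMAC with a shared key so it stays self-contained. All names and structures are illustrative.

```python
import hmac
import hashlib
import json

ISSUER_KEY = b"dmv-secret"  # stands in for the issuer's signing key

def sign_credential(claims):
    """Issuer signs the canonicalized claims; proof travels with them."""
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "proof": sig}

def verify_presentation(presentation, issuer_key):
    """Verifier recomputes the signature over the claims and compares it
    to the embedded proof; any tampering breaks the match."""
    cred = presentation["verifiableCredential"]
    payload = json.dumps(cred["claims"], sort_keys=True).encode()
    expected = hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(cred["proof"], expected)

license_vc = sign_credential({"type": "DriverLicense", "holder": "Ben Cox"})
vp = {"verifiableCredential": license_vc}
print(verify_presentation(vp, ISSUER_KEY))  # True: credential is untampered
license_vc["claims"]["holder"] = "Someone Else"
print(verify_presentation(vp, ISSUER_KEY))  # False: tampering is detected
```

Because the verifier only needs the issuer's key material and the presentation itself, no call back to the issuer is required at verification time, which is what lets Ben use his credentials offline and across borders.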

Ben settled into a life of peace, solitude, and freedom. His law firm had given up searching for him and had moved on. Ben lived the rest of his life at Barra de Potosi, the magical place where he truly found himself.

So, what do you think is the reason for Ben’s serendipity? A stroke of luck? Destiny?

The real reason is verifiable credentials, which came in handy at every step of the way. He had complete control over his credentials and could share them with just the entities he wanted to. And although he had no plan, his VCs made it possible to move seamlessly across international borders.

Think about it for a moment. No physical documents at all, but a simple and secure self-sovereign identity that Ben had complete control over. More importantly, look at the interoperability and flexibility as Ben could use them in different situations and across multiple platforms.

That’s the true power of VCs!

Do you think he could have changed his life so drastically without his VCs? What about your life? How can VCs help you?

Imagine the endless possibilities they can create for people and society at large.

This is undoubtedly the future — a world of secure and portable credentials that individuals can control completely.

Are you ready to be a part of this future?

We have the building blocks on which you can build any VC-based application that you think will benefit everyone. Reach out to info@affinidi.com to know more.

Verifiable Credentials in Ben’s Serendipity was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


Coinfirm

Coinfirm Partners with nagel + associates

7 April, LONDON, UK – nagel + associates, forensic and investigative accountants, and the leading RegTech and blockchain analytics provider, Coinfirm, have announced their partnership to enhance the security of the blockchain economy. By deploying advanced analytics to combat market manipulation and fraud, this partnership will complement nagel + associates’ crypto advisory practice and will...

Tuesday, 06. April 2021

KuppingerCole

Frontier Talk #1: Dr. Harry Behrens - Decentralizing Mobility, Coopetition and Platforms

Raj Hegde sits down with Dr. Harry Behrens, Head of Blockchain Factory at Daimler Mobility, to discuss how decentralization is transforming the fragmented mobility industry. Tune in to this exciting episode for a deep dive on decentralized identity, explore the rise of the platform economy and access the playbook required to kick start decentralization initiatives at your organization.





Spruce Systems

Spruce Developer Update #8


At Spruce, we’re building the most secure and convenient way for developers to share authentic data. Here’s the latest from our open source development efforts:

Work-in-Progress: Creator Authenticity

We are currently working on a project that will enable creator authenticity for digital assets including NFTs. The initial smart contracts are written, as well as a CLI/library to interact with web applications. We plan on alpha testing the application this week.

Formally Verifying the Tezos DID Method

The Tezos DID method is a DID method that optimizes for privacy, enables formal verification, and scales to billions of identifiers by using “off-chain updates,” which allow private networks to extend and update on-chain data. A lot of our current work is focused on advancing did-tezos as the first formally verified DID Method.

We’ve continued work on improving the DID method’s core smart contract for on-chain updates. A first version of the formal proof has also been written, and a CI pipeline has been established.

DIDKit Updates

DIDKit is a cross-platform toolkit for working with W3C Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs).

- Added a Python package.
- Added a Django example app.
- Added a Flask example app.
- Added a JavaServer Pages (JSP) example app.
- Added a Svelte example CHAPI wallet.
- Enabled DID methods to use HTTP(S) on WASM and Android.
- Conducted a test with the VC HTTP API v0.0.2 test suite (test report available).
- Worked on support for relative DID URLs.
- Improved DID URL dereferencing to support more DID documents.
- Added publicKeyBase58 support for Ed25519.
- Implemented did:onion.
- (WIP) Implemented did:pkh, a DID method for deterministic handling of public key hashes by curve.
- Released ssi v0.2.0.
- Published to crates.io: ssi, ssi-contexts, did-web, did-method-key, did-tz, did-sol, did-pkh, did-ethr, did-onion.
- General bug fixes.

Credible Updates

Credible is a credential wallet for the verification, storage, and presentation of Verifiable Credentials using Decentralized Identifiers. In addition to our native mobile editions, we’ve since written a browser extension version of Credible along with an SDK to enhance any web application with decentralized identity.

The extension version of Credible is in progress and will promote increased accessibility.

If you would like to discuss how we would deploy the architecture described above for a specific use case, please take 30 seconds to leave us a message, and we will respond within 24 hours.

Follow us on Twitter

Follow us on LinkedIn


GIMLY

The EOSIO DID method specification

Gimly has built a full draft of the EOSIO Decentralised Identifier (DID) method specification. This specification guides the implementation of DIDs on EOSIO powered blockchains.


Excerpt from the EOSIO DID method specification

Decentralized Identifiers (DIDs) are an important part of the self-sovereign identity (SSI) technology stack. The DID is the lowest level on top of which other identity components such as Verifiable Credentials (personal information or information about organisations and other entities) can be built. Using the SSI tech stack allows applications to use an interoperable identity that is driven by a well curated set of privacy and security knowledge. Using an SSI architecture brings privacy and security to digital human identity, and significantly removes technical and legal friction for governments, enterprise and SMEs.
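As an illustration of what sits at that lowest level, a minimal DID document in the W3C DID Core shape might look like the following. This is a hedged sketch: the identifier, key id, and key type are placeholders, not output of the actual EOSIO DID method or its permission model.

```json
{
  "@context": "https://www.w3.org/ns/did/v1",
  "id": "did:eosio:eos:exampleaccnt",
  "verificationMethod": [{
    "id": "did:eosio:eos:exampleaccnt#key-1",
    "type": "EcdsaSecp256k1VerificationKey2019",
    "controller": "did:eosio:eos:exampleaccnt",
    "publicKeyJwk": { "kty": "EC", "crv": "secp256k1" }
  }],
  "authentication": ["did:eosio:eos:exampleaccnt#key-1"]
}
```

Verifiable Credentials then reference a DID like this one as the subject or issuer, which is what makes the layers above interoperable.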

Read the EOSIO DID Specification

We have been working with the Decentralised Identity Foundation to shape this specification, and also want to thank the W3C Credentials Community Group for their support in the creation of the Verifiable Condition type, a necessary component to create the EOSIO DID document to represent EOSIO account permissions.

Join the EOSIO Identity WG


Proudly presenting our draft EOSIO Decentralized Identity (DID) Method Specification for finalization with the community, by @theblockstalk and @casparroelofs. #SSI meets #EOSIO 💣 @w3c #eosio_id #ssi #DID https://t.co/qpMPbQ8Z0f

— Gimly Blockchain Projects (@gimly_io) April 7, 2021

Gimly is hiring!

Gimly will be participating in the eSSIf-lab second infrastructure call with our connector for using NFC smartcards in SSI solutions, and we will be contributing to the NGI-ontochain project with a self-sovereign data vault and identity management panel.

For these projects and ongoing work with clients, we are looking for a new mid-level full-stack developer as well as a senior product development manager. See the jobs below, and if you know someone interested, we will be happy to hear from them!

See jobs  

Global ID

The GiD Report#154 — Digital identity is cool now (Big Tech is not)

The GiD Report#154 — Digital identity is cool now (Big Tech is not)

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

This week:

1. Digital identity is cool now
2. Athletes as Storytellers (via Greg)
3. The New Yorker explains Clubhouse
4. Microsoft wants to buy Discord
5. No one trusts Big Tech
6. Facebook’s massive data breach
7. This week in NFTs
8. Tweet of the week (digital wallet edition w/CZ)
9. Stuff happens

1. Digital identity is cool now.

Got a text from a friend this morning about the latest craze — a new platform called y.at — let me know if you’d like an invite to check it out.

In a nutshell, Yat wants to turn unique emoji combos into universal identifiers:

Yat lets you use emojis as your universal username and identity on the internet. Imagine being known as 🔥🐍 or 🤖👻👑 instead of coffeequeen98 or jake.smith2438@gmail.com.
By owning a Yat — let’s say 🌊🔱🌴 — it’s yours forever. You are the ~only~ one on earth who owns these emojis.
The possibilities are endless with Yat. You can receive payments, use it on your socials, and eventually much more. Your Yat can be used as a link like this: https://y.at/🌊🔱🌴 (click it to see it!) and automatically redirect visitors to any website you want.

Now, it’s unclear to me how natural or intuitive it is to associate people with emojis — but it’d be easy to dismiss me as a boomer. (The fact that the platform is selling off emoji combos based on their own internal score for rarity makes it feel like a bit of a short term cash grab. Although my friend did buy his own emoji real estate for $120, apparently.)

“I don’t know, man. It seems like a low risk/high reward thing. People are hype on it! I’m just trying to be an early adopter. It’s called an investment, my dude.”

At the very least, there’s a compelling narrative here about digital identity — how it’s becoming more playful/fun/compelling in a way that’s attracting mainstream attention. People are actively thinking about this stuff. They’re willing to engage in new ideas. Most of all, they understand why identity is so important.

Here’s /m’s take:

Maybe it’s just me as I don’t use emojis much (except for 10 of them). They are really hard to find. Some of them are also very similar (e.g. skin tones, small changes) so I see a lot of problems with this. I guess this might be much easier for anyone that knows e.g. Chinese and Japanese characters but learning an alphabet with more than 3000 characters won’t be easy. But this might be where the 🌎 is going 😉
2. We need an identity. We want a voice.

Here’s the New Yorker from /gregkidd on The Rise of the Athlete Podcaster:

In the past, if athletes wanted to speak candidly, they would write a tell-all book, do a sit-down interview, maybe phone in to a radio show. If they aspired to work in media, they would try to land a cushy network job, providing expert commentary or analysis. But the Internet, which allows any of us to air the slightest thought, has changed those rules. Players have grown infatuated with sharing their perspectives in real time, in direct, unfiltered ways. Retired greats have realized that they possess endless content — stories, memories, behind-the-scenes morsels — that fans crave. And athletes everywhere are seizing the means of production. Around the time that Jeter launched the Tribune, LeBron James got funding for a new company, Uninterrupted. Its aim was to produce content from players’ points of view, and to show that those players could be “more than an athlete.” People like Jeter and James no longer had to settle for being talking heads. Now they barely had to settle for sports at all.

Greg’s take:

“Athletes as Storytellers — one more indication that the world is getting flatter.”
3. This week at the Clubhouse (naturally)

Here’s the New Yorker — Clubhouse Feels Like a Party:

Here would be the place to speak of the history of mass communication; to produce historical analogues for Clubhouse, such as call-in radio shows, the teen party lines of the nineteen-eighties, and the Agora; to cite Habermas or Gramsci and nod to Marshall McLuhan. I kept thinking of a minor section in Don DeLillo’s “White Noise,” from 1985, in which the narrator completes an A.T.M. transaction and experiences a sparkling moment of integration with the global financial apparatus: “The system was invisible, which made it all the more impressive, all the more disquieting to deal with. But we were in accord, at least for now. The networks, the circuits, the streams, the harmonies.”
This was basically how it felt to be on Clubhouse. With each new experience — a lullaby room, in which most avatars wore the same cartoon, baby-blue nightcap; a beatboxing and freestyle session between musical-theatre performers; a “dopamine reset” silent meditation, which was actually silent — I felt increasingly as if the app, whose creators have not articulated any particular theory of technology or media, belonged to a great, surging lineage.
In 1999, John Durham Peters, who is now a professor of English and film and media studies at Yale, published “Speaking Into the Air: A History of the Idea of Communication.” “In richer societies, much of our interaction is enabled by interpersonal media such as email and telephone,” he wrote — forums “in which the broadcast and the interactive are hard to tell apart.” In such a society, where the boundary between mass communication and in-person conversation is more porous, everyday talk can assume the tones and tics of media. “In private life, many of us talk like Beckett characters,” Peters wrote, while “in public discourse, celebrities present themselves as if they are our friends.” When we spoke, over the phone, he had not yet used Clubhouse. Still, he considered parallels to the use of CB radio among truckers in the nineteen-seventies, to the Sandinistas’ experimental shows in the eighties, and to French salons. He contemplated feminist scholarship on the synchronization of soap operas and domestic labor, and described the inside-out dynamic of advice columnists — the “structural inversion of the single voice crying out, and the advice columnist crying out to everyone all at once.”
“In the nineties, you were stigmatized if you were a lurker, but this legitimates the listener,” Peters said, of Clubhouse. “Listening is a democratic thing. It’s not passive. It’s one of the hardest things we do.” The concept also reminded him of shortwave radio. “Very early on, you had amateurs talking back and forth to each other, first in Morse code, then in voice,” he said. “You could presume intimacy because very few people had the receiving equipment. That’s sort of the same idea as Clubhouse. Not everybody has the receiving equipment. They don’t have access.”

Like I said, it’s the New Yorker.

And as you all know by now, everyone is Clubhouse — Via MIT Tech Review:

+ LinkedIn is working on a Clubhouse clone. (TechCrunch)
+ And so is Spotify. (The Verge)
+ And Discord! (The Verge)
+ The people who will be left behind in the shift to spoken social media. (Slate)
4. Speaking of Discord, Microsoft wants to buy it, apparently.

Via /m

Which everyone hates. Like Linus (of Linus Tech Tips): Do you WANT Discord to be Owned by Microsoft??

Probably because everyone hates Big Tech at the moment. This is wild:

5. Chart of the week:

Axios — Exclusive: Trust in tech cratered all over the world last year:

Why it matters: High public esteem has helped protect the tech industry from critics and regulators, but that shield is weakening.
Edelman said the main reason for the trust fall is the increasingly “complicated” relationship between the public and technology — including the spread of misinformation, rising privacy alarm and bias in artificial intelligence.
In the U.S., tech fell from the “most trusted” sector in the 2020 study, to ninth in the latest survey (taken in October and November) — behind food and beverage, health, transportation, education, consumer packaged goods, professional services, manufacturing and retail.

Yikes.

Also:

- Google, Facebook, Amazon, Apple and Microsoft. Call Them Tech Frenemies for Life.
- Big Tech dollars are becoming toxic in Washington

6. Speaking of Big Tech — the massive FB data breach

Not great: 533 million Facebook users’ phone numbers and personal data have been leaked online

Also, a little awkward: ACLU, a defender of digital privacy, reveals that it shares user data with Facebook

And yet… You May Not Like Facebook, but Its Stock Deserves Better — at Least 20% Better

7. This week in NFTs:

- Sports cards have gone virtual, and in a big way
- Mark Cuban on Why DeFi and NFTs are the Future
- NFT developers say cryptocurrencies must tackle their carbon emissions
- Via /coddsquad — NBA Top Shot maker Dapper Labs just nabbed another $305 million investment from the likes of Michael Jordan and Will Smith
- Circle Launches Comprehensive NFT Platform & Marketplace Payments Solution
- Via /toddjcollins — The Prof G Show with Scott Galloway: Crypto, NFTs, and Blockchain ft. Raoul Pal on Apple Podcasts

8. Tweet of the week:

CZ (on the age of the digital wallet—referencing the below image):

Not financial advice.
9. Stuff happens:

- Stefan Thomas: Chipotle is giving away $100,000 in bitcoin
- How people are thinking about vax passports from a social/political perspective
- Panel: Are Vaccine Passports ‘Inevitable’ And Should They Be Resisted?
- Coinbase to direct list on April 14th, provide financial update on April 6th — TechCrunch
- Lagarde Says ECB Could Have Digital Currency Within Four Years
- MobiKwik investigating data breach after 100M user records found online — TechCrunch
- You shouldn’t post a picture of your vaccine card on social media
- Sift: Exposing the Multi-billion Dollar Fraud Economy
- Why Media Brands Aren’t Dead
- Briefing: Andreessen Horowitz Doubles Down on Substack
- Briefing: Epic Games Takes Fight Against Apple to U.K.
- Google launches Android Ready SE Alliance to drive adoption of digital credentials • NFCW
- Visa Trials USDC Payment Service With Crypto.com, Anchorage
- Miami Mayor Wants City to Become Bitcoin Mining Hub
- Ripple to acquire 40% stake in Asia's Trianglo
- Via /JVS — #506: Balaji Srinivasan on The Future of Bitcoin and Ethereum, How to Become Noncancelable, the Path to Personal Freedom and Wealth in a New World, the Changing Landscape of Warfare, and More
- Slack CEO: More companies will be “digital first” post-pandemic

The GiD Report#154 — Digital identity is cool now (Big Tech is not) was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Secrets Management in .NET Applications

Learn the best practices to store and manage secrets in your .NET applications.

Fission

Meet the Fission Team: An interview with Co-Founder Brooklyn Zelenka

We talked with our co-founder and CTO Brooklyn Zelenka about her unusual career path, amazing achievements and experiences as a woman in tech. Read on to get to know one of our favorite fearless leaders.


Q. Can you tell us about your programming journey and your career in computer science/development?

A.  I have a weird winding path to programming, and kind of "fell into it".

I started out as a classical music composer (this is relevant for a few reasons, as you'll see). Back in music school, I had the best Photoshop/Illustrator skills in my cohort, so I ended up making a bunch of the concert posters and programmes (back when they were paper). I was also very interested in the technical aspects of music: What made it tick? What are the underlying structures? How many lines of polyphony can I write while keeping it coherent? It turns out that a lot of this theory is closely related to CS.

Many years later, I ended up at a startup doing graphic design. You end up doing a little bit of everything at startups, and they asked me if I'd be willing to implement parts of the designs I was making. So I read a book that weekend, and showed up on Monday writing HTML/CSS and a tiny bit of jQuery. It turns out that I was better at that than design, and so I switched to development.

This was back before it was clear that Node would win the JS-on-the-server space. We were using a JVM-based framework that let you write in a dozen-or-so JVM-based languages, so I had to learn a bunch of languages early on to contribute to this project: Java, JS (Rhino), JRuby, Groovy, Clojure, and a couple more. I got really into Clojure, and thus into functional programming, and started reading the classic CS literature from the 70s and 80s. I quickly became a programming language nerd, which means also picking up a bunch of related CS. The rest, as they say, is history 😉

Q. Could you share an example of a hurdle or obstacle you experienced during your developer journey and how you overcame it?

A. You know, I honestly don't feel like I've had a ton of the classic blockers in my career! I'm very passionate about what I do, and work feels like play. The hardest part has been self-inflicted burnout. Burnout is the flip side of being so enthusiastic about my work: I take on a ton and work myself into the ground. I've burned out several times, and it's deeply unpleasant.

I'm learning to take better care of myself, but it's an internal struggle of short-term vs long-term. When people talk about mindfulness, you usually think of meditating and clearing your mind to focus on the here and now, but the bigger part of that for me is recognizing when I'm red-lining too much and eustress is turning into distress.

Q. You are one of the industry's most sought after keynote speakers. How did that come to be and how are you received as woman speaker in a male-dominated industry?

A. I think that I ended up keynoting conferences for a couple of reasons: I was pretty engaged in tech community organizing, so I was fairly visible. I have a fair bit of public speaking experience, so I can generally (but not always :P) put together a coherent presentation. Finally, I like to push the boundaries of my tools, so I've done some OSS work that folks have found interesting (e.g. Witchcraft, Exceptional).

Brooklyn Zelenka keynoting at a 2017 event

I've definitely experienced sexism in tech, though as my career has progressed and I've become more visible, there's less of that. There have been comments that the women speaking at conferences are just there to fill a quota. I've had comments on talk feedback forms that were just one word: "girl", or people assuming that I'm Boris' assistant during a workshop about a technical standard that I developed. I've at times overcompensated by trying to "look the part" with programmer joke t-shirts and hoodies. Now, I'm pretty nerdy by nature, and I do like these things, but there's also a level of defense mechanism there.

That said, on the whole, people have been incredibly nice and supportive! Part of this is my choice of communities -- I just don't have time for constantly fighting, so I go where the people are friendly!

Q. What advice would you give young women about a career in tech?

A. Honestly: just try it! I've spoken to a bunch of women over the years that just don't see themselves as the unfeeling, uncreative, evil-genius programmers that our culture likes to portray. I had the same bias! It turns out that programming is a very broad field. It's also extremely creative, and provides a ton of flexibility. Yes, bad programming jobs exist, but you have way more agency and power than really any other career I can think of: you can really choose your own adventure. It's not even that hard to get started, though like any skill, learning the basics is the hardest part. Just try it; it's much better than you think!

Q. What has been your biggest career and/or personal achievement?

A. Wow, that's a hard one! I feel really lucky to have done a lot of things that I find meaningful, from activism to building products to seeing the world as a digital nomad. I think my biggest success has been forging my own path: choosing interesting experiences over cash, curating where I spend my energy to support communities that I care about, and learning as much as possible.

Thanks to Brooklyn for taking the time to answer questions about her background and career. Over the coming months Fission will be sharing profiles of each of our awesome team members with the community. Fission's team is one of its biggest strengths and we want to give them each a moment to shine.

Interested in learning more about the Fission community? Visit the Fission events page to sign up to get notified about new events, and join us most Thursdays for new tech talks.


auth0

Identity, Unlocked... Explained: Season 2, Ep 6

Explore how FastFed is looking to shorten the time it takes to join organizations into a federation.

SWN Global

The Partnership between Digital Economy Incubation Center and Sovereign Wallet Network

SovereignWallet Network (SWN) is pleased to announce a partnership with Digital Economy Incubation Center, a corporation with IT PARK Uzbekistan. The DEIC aims at fostering all-round innovative and scientific development, with the goal of improving the intellectual and technological potential of Uzbekistan.

The Sovereign Wallet Network (SWN), developer of the world’s first identity-based enterprise blockchain — MUI MetaBlockchain, joined hands with Digital Economy Incubation Center (DEIC) on March 23rd, 2021. SWN aims to aid DEIC in its endeavour to accelerate innovation-driven growth across all Uzbekistan’s economic and social sectors. The SWN-DEIC partnership brings together deep skills in business and technology mentoring and consulting, product ideation, technology development and deployment, and organizational change management strategy to help Uzbekistan through successful technological transformations across fields.

SWN and UMSF will work together to implement blockchain-based e-government projects in Uzbekistan, such as blockchain-based digital identity certificates, legally effective digital asset issuance, digital certificates issued by e-government, a digital notarization system, etc.

SWN and DEIC will work together on the research and development of CBDC (central bank digital currency) and Asset Tokenization on top of the MUI MetaBlockchain Network which will be beneficial for the Uzbekistan Government.

This partnership creates a huge opportunity for the MUI MetaBlockchain network to be utilized and scaled successfully, as SWN and DEIC connect new innovative technologies in the Republic of Uzbekistan to MUI MetaBlockchain. In particular, SWN will support and curate growth-stage projects with the best high-growth innovators worldwide to utilize the MUI MetaBlockchain. Sovereign Wallet Co., Ltd (SWN) will also provide education and mentoring to DEIC's projects. Blockchain is an important technology for the next generation of entrepreneurs. Blockchain technologies provide efficiency, security, and traceability to financial system-related activities and fundraising for projects.

Prior to this, SWN also entered into a technical and mentoring partnership with the Young Entrepreneurs Generator (YEG) under the Uzbekistan Government Innovation Center in February. YEG has over 500 corporate projects and has established a partnership with SWN to build and expand successful blockchain projects and business networks.

About Sovereign Wallet

Founded in 2015, Sovereign Wallet is a company with a globally spread, multicultural team that is committed to revolutionizing the financial space by providing a highly secure, highly efficient 4th gen blockchain platform to global citizens. We empower self-sovereign finance.

Our Products: Digital Asset Wallet, Metablock Exchange, and Identity-Based next-gen blockchain: MUI MetaBlockchain.

More About The Project

Follow us on Twitter
Join our Telegram Community Chat
Get the latest from our Telegram announcement Channel
Signup to our Bi-weekly community Update
Read the MUI MetaBlockchain Whitepaper
Download the MUI MetaWallet on Android and iOS


IDunion

IDunion enters the second project phase initiated by the Federal Ministry of Economic Affairs and Energy of Germany

The Federal Ministry for Economic Affairs and Energy (BMWi) has selected IDunion for the second phase of the innovation competition “Showcase Secure Digital Identities”. The three-year implementation phase began on April 1, 2021. The goals of this new project phase include the establishment of a European cooperative, the launch of a production network and the implementation of 40+ different pilot applications from several areas.

Frankfurt am Main, April 6, 2021 – IDunion, an open ecosystem for trusted identities, entered its second project phase funded by the German Federal Ministry for Economic Affairs and Energy on April 1, 2021 as part of the innovation competition “Showcase Secure Digital Identities”. With these showcase projects, the German Federal Government aims to promote the development of digital solutions for identity management in Germany within the framework of an innovation competition. This competition is based on the framework “Development of Digital Technologies“. The aim of this competition is to develop and test outstanding approaches, which are designed for everyday use and are usable in a wide range of possible applications in the field of secure digital identities.

Based on the concept of self-sovereign identities (SSI), IDunion has set itself the goal of establishing an ecosystem for natural and legal persons as well as things. Use cases will therefore be piloted in a wide range of areas: education, e-commerce, mobility, e-government, e-health, finance, identity & access management (IAM) and industry/IoT. Some further concrete examples of applications are the following: the derivation of the ID card into available wallets, campus management and educational credentials for students, legal identities for companies, or product identities.

Moreover, the network has several other ambitious goals, such as building a comprehensive framework to further develop trust in the underlying technology. In addition to this, the partners are working together on the conception and implementation of security-related aspects. The wallets and other software applications developed within the consortium will also be further enhanced to ensure an optimal user experience and a wider distribution. 

A European cooperative will be established and will serve as a legal entity for the management of the network and the operation of the ecosystem. The cooperative will act in the interest of all members and will be a not-for-profit organisation. A draft of the statutes, which constitutes the set of rules for the members, can be found here.

Within this innovation competition, the following partners are funded: 

Bank-Verlag GmbH, Bundesdruckerei GmbH, DB Systel GmbH, Deutsche Telekom AG, esatus AG, GS1 Germany GmbH, ING-DiBa AG, Main Incubator GmbH, Robert Bosch GmbH, Siemens AG, Stadt Köln, Spherity GmbH, Technische Universität Berlin, Institut für Internet-Sicherheit (Westfälische Hochschule) and YES Payment Services GmbH.

The following institutions are associated partners: 

51nodes GmbH, Berlin Partner für Wirtschaft und Technologie GmbH, Berliner Senatsverwaltung für Wirtschaft, Energie und Betriebe, Bundesamt für Migration und Flüchtlinge, BWI GmbH, CodeCamp:N GmbH, Commerzbank AG, CompuGroup Medical Software GmbH, D-Trust GmbH, Datarella GmbH, DATEV eG, Deutsche Bank AG, Deutsche Post AG, Deutscher Sparkassen- und Giroverband e. V., Energy Web Stiftung, Festo SE & Co. KG, Fraunhofer-Institut für Angewandte Informationstechnik FIT, gematik GmbH, mgm technology partners GmbH, Ministerium für Wirtschaft, Innovation, Digitalisierung und Energie des Landes NRW, msg systems AG, R3 LLC, regio iT gesellschaft für informationstechnologie mbh, targens GmbH, TrustCerts GmbH and Verband der Vereine Creditreform e.V.

With the foundation of the European cooperative, the IDunion ecosystem will be open to new partners. Interested parties can reach out to contact@idunion.org


Ocean Protocol

Meet the Fleet — Spring 2021

Learn about the latest additions to the Ocean team and how they fit into #ANewDataEconomy

We’ve dubbed 2021 as #TheYearofScaling at Ocean Protocol. We’ve added members from all over the world to our product, ecosystem, and communications teams to reach our traction-based goals. In a celebration of Ocean’s diversity and transparency, get to know the newest members of our crew.

From left to right: Alex Napheys, Andrea Burzi, Bogdan Fazakas, Călina Cenan.

Alex Napheys is the OceanDAO Lead. His experience in relationship building and growing products and communities makes him an invaluable asset to our team. When he’s not investing in and researching crypto, he’s likely enjoying a good coffee, playing basketball, and talking about crypto.

Favorite book, movie, or album: Zero to One.

Why he joined Ocean: “You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete.”

Andrea Burzi has worked as a Smart Contract Engineer for the past several years. Lately, he’s built full dApps in the NFT space, helped open-source projects, and audited contracts. Andrea believes that data is a new asset class, and he wants to help make publishing and consuming data fairer.

Favorite book, movie, or album: Snatch.

The perfect day: Seaside, crypto, and friends.

Bogdan Fazakas has been working as a software engineer since 2017 and most recently co-founded his own start-up! Bogdan joined Ocean because he was amazed by the cutting-edge technology behind the project and was excited by the challenge it presented.

Favorite book, movie, or album: Rich Dad Poor Dad, Open: An Autobiography, Sapiens: A Brief History of Humankind.

The perfect day: A day of learning and improvement.

Călina Cenan started her career in PHP, working for big retail companies, building online campaigns, email management systems, and other web software tools. She then switched to Python and started working with data-driven and finance-related web applications. She has worked as a freelancer for the past year.

Favorite book, movie, or album: “Impossible question: my favorite authors are Isaac Asimov, Irwin D. Yalom, Eric-Emmanuel Schmitt, Marquez, and Llosa.”

Why she joined Ocean: “The planets aligned perfectly, with people who knew people who knew me. I was impressed, intrigued, and also skeptical at first since it seemed like a huge challenge. But this amazing team quickly made me feel at home, and it turned out to be one of the best decisions of my career.”

From left to right: Claudia Holhos, Daniel Tóth, David Hunt-Mateo, Dimo Dzhurenov.

Claudia Holhos is a recent Computer Science graduate currently working as a Junior Front-End Developer. Impressed by Ocean’s commitment to delivering quality services to its users, she joined the team to apply her knowledge and learn from her teammates.

Favorite book, movie, or album: I loved reading “Buddenbrooks” by Thomas Mann, and I always enjoy watching Benigni’s “La vita è bella.”

The perfect day: Home, sun, music, movies, cooking, books, painting.

Daniel Tóth spent seven years at Google in Trust & Safety — communicating, implementing, and enforcing content policies on misinformation, elections, hate speech, scams, and other hot topics. He’s also heavily involved in LGBTQ+ activism on the civil society front, from fundraisers for school workshops to panel discussions about the stigmatization of HIV.

The perfect day: A well-deserved celebration of a milestone — ideally outdoors with champagne.

Why he joined Ocean: “During my Master’s in Internet Studies, I remember learning about the theoretical possibility of a more equitable data marketplace than the current one. When I discovered that Ocean is actually making it happen, I knew I had discovered something important and worthwhile.”

David Hunt-Mateo, a technical generalist and skilled communicator with an affinity for developer tooling, has worked in various roles spanning from digital hardware to embedded software to desktop applications. He believes that users are entitled to privacy & data ownership and that contributing to Ocean Protocol is the most important thing he could be doing right now.

Favorite book, movie, or album: The Discworld series by Terry Pratchett.

The perfect day: Hiking with my wife in the Rocky Mountains.

Dimo Dzhurenov has worked with a range of technologies. For the past 4–5 years, he has dedicated his time and effort to helping devs, companies, and organizations understand and utilize blockchain technologies, focusing on business and market problems while working on different blockchain projects.

The perfect day: Coffee, good music, progress, challenge, adrenaline, spiritual and mental growth.

Why he joined Ocean: “Ocean is working on a problem that is yet to be recognized by society as we move towards more and more digitalization of businesses and personal data. We need to have a say in what’s happening with our data since it’s becoming our most precious and valuable asset in a new world economy. Working on that problem and contributing to its solution is what got me excited about Ocean.”

From left to right: Jamie Hewitt, Maria Cretu, Norbert Katuna, Robert Alcantara.

Jamie Hewitt started his career working in marketing and communications and then transitioned into being a software developer. His most recent role before Ocean was as a tech lead for an online fashion company. His passion for working with blockchain and Ethereum intensified last year, as he spent six months working on a decentralized version of GitHub, which used Ethereum and IPFS.

Favorite book, movie, or album: Book: Wild Swans. Film: Queen of Katwe. Album: Stadium Arcadium, Red Hot Chili Peppers.

Why he joined Ocean: “I joined Ocean Protocol because I am incredibly excited about the technology and the great success it’s had in early adoption. I believe Ocean Protocol will completely transform the data industry in the near future.”

Maria Cretu is starting her software engineering career here at Ocean! She joined Ocean because it is a fast-paced environment for her to learn and gain experience while having a meaningful impact. She also has an interest in cryptocurrency and is working on a passion project in the space.

Favorite book, movie, or album: The Wolf of Wall Street.

The perfect day: A day at the beach, nice weather, programming.

Norbert Katuna is a frontend developer with experience in multiple domains, mainly in education and fitness. He has worked on creating web applications and web pages from design to the latest stages. He believes in the values of Ocean and is learning a lot while bringing value to the team.

Favorite book, movie, or album: The Shack.

The perfect day: Go outside and explore a new country.

Robert Alcantara has woven a thread between Data Engineer, Director, Founder, Game Designer, and Analyst. He believes that Blockchain is the needle. He joined the team for one simple reason: to contribute to a sustainable and equalized data economy.

Favorite book, movie, or album: Small Giants: Companies that choose to be great instead of big.

The perfect day: Outdoors, physically exhausted, knowing I have tried my best.

For more information about our team, visit our website at oceanprotocol.com

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

Meet the Fleet — Spring 2021 was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Affinidi

Opening a Bank Account — A Use Case for Verifiable Credentials


Opening a bank account is something that we have all done at some point in our lives. Though the process has become a lot simpler over the years, it still entails a ton of documents to prove your identity.

At the bare minimum, you’d need:

A piece of government-issued identification to prove your identity
Your Social Security Number or a Taxpayer Identification Number

Carrying these documents is a hassle because they can get lost or damaged in transit. Plus, it is highly risky in the event of theft because your Personally Identifiable Information can fall into the wrong hands.

A more secure and easier way to share your information for opening a bank account could be through verifiable credentials.

How to Use VCs to Open a Bank Account?

Here is a sample workflow on how verifiable credentials can be used to open a bank account.

There are three entities to these transactions, and they are:

Issuer — A startup company that collates all the government-issued documents and stores them in the form of a verifiable credential. A good example would be DigiLocker from the Government of India, which provides citizens’ digital document wallets with access to authentic digital documents.
Holder — The individual/company/entity looking to open a bank account. In this sample use case, let’s assume this entity is an 18-year-old girl opening a bank account for the first time.
Verifier — The bank where the holder wants to open an account.

As a first step, the holder logs into the issuer’s portal and submits a request for verifiable credentials. After checking the holder’s identity and request, the issuer sends a QR code to the holder, and in turn, the holder scans the code and saves the information in her digital wallet.

The scanned information would look something like this:

Sample VC
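The screenshot from the original post isn’t reproduced here; a credential of this general shape, following the W3C Verifiable Credentials data model, might look as follows. All identifiers and values are illustrative, not Affinidi’s or DigiLocker’s actual output:

```python
# A minimal W3C-style verifiable credential (all DIDs, keys, and values are made up)
sample_vc = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "GovernmentIDCredential"],
    "issuer": "did:example:digilocker",           # hypothetical issuer DID
    "issuanceDate": "2021-04-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder",               # the account applicant
        "name": "Jane Doe",
        "taxIdNumber": "XXX-XX-1234",
    },
    "proof": {
        "type": "Ed25519Signature2018",
        "verificationMethod": "did:example:digilocker#key-1",
        "jws": "eyJhbGciOiJFZERTQSJ9..",          # detached signature over the credential
    },
}
```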

When the verifier requests the documents, the holder creates a verifiable presentation of her SSN/TIN and any other government ID and sends it. Finally, the verifier examines the schema to determine the authenticity and validity of these documents. Specifically, the verifier checks the following.

Is the credential in a format that’s acceptable to the verifier?
Does it contain the data that the verifier needs, such as information that identifies the holder, the credibility of the issuer, and more?
Is the data still valid?
Are the credentials issued by a trusted issuer?
Do the credentials or their signature provide cryptographic proof that the holder is the subject of this verifiable presentation?

If the verifiable presentation sent by the holder meets these criteria, the verifier approves the application for account opening, and the holder can start transacting through it.
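The verifier’s checklist above can be sketched in code. This is a minimal illustration, not Affinidi’s API: a real implementation would resolve DIDs and verify a digital signature, whereas here an HMAC stands in for the proof and all names (issuer DID, field layout) are assumptions:

```python
import hashlib
import hmac
import json
import time

TRUSTED_ISSUERS = {"did:example:gov-issuer"}  # hypothetical trusted-issuer registry

def verify_presentation(vp: dict, secret: bytes) -> bool:
    """Run the five checks from the article against a verifiable presentation."""
    cred = vp.get("verifiableCredential", {})
    # 1. Acceptable format: must declare the W3C VC context
    if "https://www.w3.org/2018/credentials/v1" not in cred.get("@context", []):
        return False
    # 2. Contains the data the verifier needs (a subject identifier)
    if "id" not in cred.get("credentialSubject", {}):
        return False
    # 3. Data still valid (not expired; a Unix timestamp for simplicity)
    if cred.get("expirationDate", float("inf")) < time.time():
        return False
    # 4. Issued by a trusted issuer
    if cred.get("issuer") not in TRUSTED_ISSUERS:
        return False
    # 5. Cryptographic proof binds the presentation to the holder
    #    (HMAC over the canonicalized payload stands in for a real signature)
    unsigned = {k: v for k, v in vp.items() if k != "proof"}
    expected = hmac.new(secret, json.dumps(unsigned, sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, vp.get("proof", {}).get("signatureValue", ""))
```

If every check passes, the verifier would approve the account-opening application; any tampering with the payload breaks the proof check.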

This sample workflow gives you a glimpse into the world of self-sovereign identity, where an entity has complete control over their credentials and can choose to share these secure and tamper-proof identities with the appropriate entities.

Does this enthuse you to create SSI applications that can change the face of our society?

Check out how Affinidi’s APIs can help you get started.

Opening a Bank Account — A Use Case for Verifiable Credentials was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Join The Ontology Team!

We are looking for YOU to join our growing team!

While we are thrilled with the tremendous amount of growth we have seen at Ontology this year, we are still eager to expand even more! As a result, we have compiled a thorough list of all the positions we are looking to fill with awesome new team members!

If you, or anyone you know, is a perfect fit for any of the roles above, please feel free to contact us at careers@ont.io for more details! We look forward to meeting you and having you join us in our mission to bring true decentralization to identity & data practices!

Want more Ontology?

You can find more details on our website about all of our decentralized solutions across identity and data, or keep up with us on Twitter. Our Telegram is for discussion, whereas our Telegram Announcements channel is for news and updates in case you missed them on Twitter!

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

Join The Ontology Team! was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

What Are the Six Key Areas of the FATF Consultation?


On March 19th, the Paris-based Financial Action Task Force (FATF), the global standard-setting body for anti-money laundering and counter-terrorism finance (AML/CFT), released its Draft Updated Guidance for a Risk-Based Approach to Virtual Assets and Virtual Asset Service Providers. Or, in compliance acronym-speak, the FATF's draft guidance for its RBA to VAs and VASPs. The newly released draft updates FATF's June 2019 standards for VAs and VASPs and further integrates guidance set out in its 12-month review report completed in July 2020.

Monday, 05. April 2021

KuppingerCole

Analyst Chat #70: AdTech and Future Alternatives to 3rd Party Cookies

Annie Bailey and Matthias continue the conversation around privacy, targeted marketing, and the end of the era of the 3rd-party cookie that they started two weeks ago. They discuss the characteristics and the pros and cons of upcoming approaches, as this technology area continues to evolve.





Holochain

Elemental Chat Gets Freshened Up

Holochain Dev Pulse 93

NOTE: Beginning this week, we’re trying something new with the Dev Pulse. We’re going to start publishing smaller pieces more regularly. We think this will make them both fresher and more digestible.

This week we rolled out an update to all HoloPorts. The most visible changes are in the UI of Elemental Chat, our demo app. Users now have ‘identicons’ beside their name; these auto-generated avatars are based on their actual agent ID in the Holochain network and can’t be forged. This is important because usernames are just labels; you can set yours to anything you like, even someone else’s name. If you’re talking to a friend and suddenly their identicon looks slightly different, there’s a good chance someone is trying to impersonate your friend.

This is an example of how cryptographically-secured data is a good first step to ensuring authenticity and accountability without the need for a central authority.
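As a sketch of the idea (illustrative only, not Holochain’s actual identicon algorithm), an avatar can be derived deterministically from a hash of the agent ID, so the same key always renders the same pattern:

```python
import hashlib

def identicon_grid(agent_id: str, size: int = 5):
    """Derive a symmetric size x size pixel grid from the hash of an agent ID.
    Deterministic: the same ID always yields the same avatar, so it can't be
    forged without controlling the underlying key."""
    digest = hashlib.sha256(agent_id.encode()).digest()
    half = (size + 1) // 2  # generate the left half, then mirror it
    grid = []
    for row in range(size):
        bits = [digest[(row * half + col) % len(digest)] % 2 == 0
                for col in range(half)]
        grid.append(bits + bits[-2::-1])  # mirror, skipping the centre column
    return grid
```

Because the grid is a pure function of the ID, a slightly different identicon immediately signals a different underlying agent, exactly the impersonation cue described above.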

There are some under-the-hood changes to Elemental Chat as well. The network stats are calculated more accurately (you can see them by clicking that graph icon in the upper-right corner). And the real-time performance of chatting (using signals) has been tweaked to be more efficient; this should result in a small performance improvement.

Under the hood, the HoloPort’s operating system is using a new version of Holochain RSM. The reason this matters is that it prepares us for the next infrastructure milestone: hosted hApps! The speedups so far are modest, but we expect that other near-term updates will make Holochain substantially faster, fast enough to support thousands of users per app.

These future updates will include:

A switch to a new storage engine based on SQLite, the world’s most popular database engine. I know, I know, last year we said we were switching to LMDB because it was the best thing ever. And in comparable tests (hash table lookups) it does outperform SQLite. But we need more than just hash table lookups — we need fast querying with arbitrary filter parameters. This is especially important if we want to do neighbourhood sharding well. And sharding will lift the lid on scaling, allowing hApp networks to grow far beyond what traditional blockchains can reasonably handle.

The proxy server, which helps HoloPorts and Holochain users punch through home firewalls, is in the midst of a large rewrite. So far, benchmarks predict that it’ll be 5× faster for raw throughput, at the very least. Once this is deployed, it will enable us to see places where DHT gossip and validation performance need to be improved.
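The difference between the two engines can be illustrated with Python’s built-in sqlite3 module (the schema here is made up for illustration, not Holochain’s actual one). A key-value store like LMDB answers “what is the value for this key?”, while SQL can also answer arbitrary-filter queries of the kind sharding needs:

```python
import sqlite3

# In-memory DB standing in for a node's local DHT store (illustrative schema)
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE entries (hash TEXT PRIMARY KEY, author TEXT, "
    "entry_type TEXT, timestamp INTEGER)"
)
conn.executemany("INSERT INTO entries VALUES (?, ?, ?, ?)", [
    ("h1", "alice", "chat_message", 100),
    ("h2", "bob",   "chat_message", 150),
    ("h3", "alice", "profile",      200),
])

# An LMDB-style lookup: one value for one key
by_hash = conn.execute(
    "SELECT author FROM entries WHERE hash = ?", ("h2",)
).fetchone()

# An arbitrary-filter query: recent entries of one type by one author —
# the kind of question a pure hash-table store can't answer efficiently
recent = conn.execute(
    "SELECT hash FROM entries WHERE author = ? AND entry_type = ? "
    "AND timestamp >= ? ORDER BY timestamp",
    ("alice", "chat_message", 50),
).fetchall()
```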

For developers, all of the above means that Holo hosting is on its way. The Holo Web SDK is being readied for public use as well, and we’re exploring the possibility of offering automated hApp testing services in a real hosting network on real HoloPorts.

Special note for hApp devs

The develop branch of Holochain has advanced quite a bit past main in the past month or two, bringing features like the hc dev tool and hApp bundles. For that reason, I’m now recommending that you use https://nightly.holochain.love to install/enter the development shell. Also, take note that we might be changing installation URLs in the future, but we’ll always keep the installation guide up to date for you.

Cover photo by Paweł Czerwiński on Unsplash


Shyft Network

The Regulatory Shift


I’d like to take a dive into the deep end and talk about the regulatory shift — no pun intended — that is happening today because Bitcoin has performed exactly as we expected. Institutions are now coming aboard. The momentum we see today is more justified than that of 2017’s, during the days of the ICOs. Today’s momentum is here to stay, and that’s exciting. That’s why there is a regulatory shift happening.

The ecosystem has started to understand the implications of the ongoing regulatory shift. The fact that crypto doesn’t fit into the regulatory box is the problem. Bitcoin doesn’t operate in the same context as traditional finance. The space must figure out how to merge these two disparate worlds and build a bridge between them.

There will be an intersection at which blockchain and traditional finance meet, and we’re at that Goldilocks moment now. In the children’s story, Goldilocks tastes three different porridges and finds that she prefers the one which is neither too hot nor too cold, but just the right temperature. The question is: what is “just the right amount” of regulation when it comes to blockchain?

From 2010–2016, there was an impression that the crypto ecosystem was untouchable, invincible, censorship resistant, and much more. The whole Honey Badger sentiment, which holds that Bitcoin will progress no matter what, had a few flaws. For one, Bitcoin doesn’t care what anyone thinks. Bitcoin itself is censorship resistant, but its users are not honey badgers: we are bound and influenced by the policies and regulations that shape the societies we live in. If we don’t get that right, we as a society lose the ability to utilize at scale what Bitcoin and our industry fix and enable.

I had a very deep conviction that the traditional world could make impactful and negative changes to our ecosystem and influence the future of blockchain. In 2017, there were no early crypto OGs working on the regulatory challenge to ensure it progressed the right way.

Over the next several years, the ecosystem continued to grow, causing the regulatory landscape to swiftly become the main battleground for blockchain. We must work with international organizations to educate them and drive the ecosystem into the light.

While lobbying on behalf of the industry, I began in 2016 to look at the problem from a technical perspective. I saw the demand for a blockchain identity infrastructure that preserves openness, decentralization, and the privacy of individuals, and began working on it.

This work comes with tradeoffs. For instance, full anonymization and privacy of money is a big ask in a world that combats terrorist financing. Whether you like Know Your Customer (KYC) or not, it is a function of how institutions operate, just like they must adhere to securities law, international taxation, and so on. Since KYC is a critical requirement when it comes to open payment systems, you can imagine how the regulations would evolve.

However, our ecosystem doesn’t work the way to which they are accustomed. Their traditional approach to regulation will transform the ecosystem into something barely recognizable. We can avoid that, however, it’s a battle with lots of complexity. There’s plenty to overcome. The challenge requires collaboration amongst the blockchain industry.

Tackling the problem isn’t as attractive as DeFi or NFTs, but this battle has the potential to empower the space to move forward. We need open infrastructure and tools that support the ecosystem and this integration.

Written By Joseph Weinberg, Co-Founder, Shyft Network.

About Shyft Network

Shyft Network is a public protocol designed to aggregate and embed trust, validation and discoverability into data stored on public and private ecosystems, and facilitate information transfer between permissioned and permissionless networks. By incentivizing individuals and enterprises to work together, Shyft Network allows for the layering of context on top of data, turning raw data into meaningful information.

More INFO

Website: https://www.shyft.network/
Telegram Group: https://t.me/shyftnetwork
Telegram ANN: https://t.me/ShyftAnnouncements
Twitter: https://twitter.com/shyftnetwork
Medium: https://shyftnetwork.medium.com/

IBM Blockchain

Creating a more interoperable blockchain future

Calls for interoperability began almost as soon as the second blockchain framework was built. Today there are permissioned blockchains (like The Linux Foundation’s Hyperledger Fabric), which are preferred for many enterprise solutions, public blockchains for cryptocurrencies, and emerging blockchain identity frameworks that hold great promise for streamlining transactions in nearly every industry.


The post Creating a more interoperable blockchain future appeared first on Blockchain Pulse: IBM Blockchain Blog.


HYPR


The HYPR Cloud Platform 6.10 helps organizations keep up with internal and external compliance regulations.

The HYPR Spring Release (6.10) brings exciting new features and enhancements for a stable, secure, and simple passwordless experience. From improving the user experience to adding security features that’ll make compliance auditors smile, the HYPR Spring Release creates a clear and safe path to passwordless security.

Addressing Security & Compliance

HYPR strives to ensure the most secure passwordless experience without disrupting the user experience. HYPR is a favorite of end-users because they’re able to employ their own smartphones to authenticate. This may give administrators pause, however, as admins have little to no control over the devices themselves.

To keep up with the internal and external compliance regulations our customers face, the spring release introduces admin-defined mobile version control, which establishes minimum OS and HYPR mobile app version requirements across the entire user base.

Enforce minimum OS requirements to maintain the latest security updates.

With these rules in place, users cannot pair devices or authenticate into protected resources without updating their device to the required version. To make the rollout process easy, administrators can define an authentication warning period during which users are still able to authenticate successfully, but are prompted to update their device or mobile application to avoid being blocked in the future.
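
The enforcement and warning-period logic described above can be sketched as a simple policy check. This is a minimal illustration, not HYPR's actual implementation; the function and field names are hypothetical:

```python
from datetime import date

def check_device(app_version, min_version, today, warning_ends):
    """Evaluate an authentication attempt against an admin-defined
    minimum app version with a grace (warning) period.

    Returns one of: "allow", "warn" (authenticate but prompt the user
    to update), or "block".
    """
    if app_version >= min_version:
        return "allow"
    # Below the minimum: still permitted, with an update prompt,
    # until the warning period ends; blocked afterwards.
    if today <= warning_ends:
        return "warn"
    return "block"

# Example policy: minimum app version 6.10, warning period ends May 1.
print(check_device((6, 10), (6, 10), date(2021, 4, 15), date(2021, 5, 1)))  # allow
print(check_device((6, 9),  (6, 10), date(2021, 4, 15), date(2021, 5, 1)))  # warn
print(check_device((6, 9),  (6, 10), date(2021, 5, 2),  date(2021, 5, 1)))  # block
```

Representing versions as tuples lets ordinary tuple comparison stand in for semantic version ordering in this sketch.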

HYPR has always had rules in place to prevent jailbroken devices from being used for authentication: jailbroken devices cannot run the HYPR mobile app or the HYPR mobile SDK. Our spring release brings additional enhancements to keep the solution up to date and secure.

Updated User Login Flows

When logging into web resources with HYPR, only devices that the user paired appear as an authentication option, making it easier to know which method to authenticate with. If the user has only one type of device paired, then HYPR automatically sends the request, reducing the required clicks during the flow.

In addition to a smoother web login experience, RDP users can now authenticate in a single step. By shifting more of the login control from Microsoft to HYPR, users are not burdened with unnecessary process points and are no longer wasting time in the login operation.

Finally, Mac users can rejoice as HYPR enables the use of Touch ID to unlock their Mac workstations. Touch ID is a FIDO-certified authentication method that employs the security of platform biometrics to validate a user’s identity in lieu of a password. Combined with HYPR unlock, all users can log into their computers with FIDO authentication using the method that works best for them.

The Takeaway

HYPR continues to improve the user experience for end users as well as for administrators who rely on ease of deployment and upgrade. Every second counts in performance, and it is our goal to deliver a lightning-fast experience at every HYPR touch point.

To learn more about what 6.10 has to offer, read our latest release notes for Workforce Access 6.10 and Customer Authentication 6.10. Or, contact us and our team will be in touch.

Sunday, 04. April 2021

Identosphere Identity Highlights

Identosphere #26 • ezcap library • ESSIF-Lab's second tranche winners • DID:DID

Another week's thoughts, clarifications, updates, and developments toward verifiable credentials, decentralized identities, and the end of (FB) data silo honey pots
Thanks for joining us!

Hopefully our Patrons appreciate the Quarterly issue we just released, because we really appreciate you! We think it’s an impressive testament to the industry that’s grown around decentralized identity.

Coming Up

PoCATHON by Affinidi • Mar 26 – May 9, 2021

We invite developers across the world to come and build applications that generate secure, portable and privacy-preserving credentials, enabling trust across entities using Affinidi’s APIs.

Oktane21 • April 5-9

Inspiring keynotes. Hands-on training with experts. Oktane21 is the place to learn new skills, gain an Okta certification, engage with new ideas, and emerge ready to create the next transformative experiences.

The EOSIO Identity Working Group - Kickoff • April 12th

Gimly is excited to start the EOSIO identity working group (Twitter #eosio_id)! This open working group (WG) will create and foster identity solutions using EOSIO technology, by creating open W3C-compliant self-sovereign identity standards, interoperability, and ecosystem development for EOSIO-based identities.

Covid-19 Technology Innovations • April 14

“explore the technology innovations being pioneered in response to the Covid-19 pandemic, and what potential for Scottish ventures this presents.” by Peter Ferry, Gary McKay and Julian Ranger (Siccar, APPII and digi.me)

Internet Identity Workshop XXXII (#32) • April 20-22

OpenID Foundation Virtual Workshop • April 29, 2021

Identiverse 2021 • June 21-23 (Denver)

Jobs Gimly is Hiring

POSITION 1: FULL-STACK BLOCKCHAIN DEVELOPER

POSITION 2: PRODUCT DEVELOPMENT MANAGER

Explainer

@GeraldSantucci

Getting Started with Self-Sovereign Identity (SSI)

This blog is my introduction to getting started with Self-Sovereign Identity. I plan to explore developing solutions using Self-Sovereign Identities and the different services, and to evaluate some of the use cases in the next couple of blogs. Some of the definitions are explained, but mainly it is a list of resources and links for getting started. I’m developing this blog series together with Matteo, and we will create several repos and blogs together.

Introduction to Self-Sovereign Identity Jakubkoci 

In this article, I will do my best to explain self-sovereign identity from the end-user perspective, without any technicalities.

The way towards self-sovereign identity Ines Duits

This series of blogs focuses on self-sovereign identity (SSI). This post explains where SSI originated by giving a timeline of how digital identity has changed over the years. The second blog focuses on what SSI is exactly. The third blog gives two examples of use cases (IRMA and Sovrin) where SSI plays an important role.

Verifiable Credentials Use Cases – Affinidi Protecting Your Driver’s License

The biggest advantage of such an SSI-based driver’s license is that there’s absolutely no possibility of loss. Furthermore, there is no question of the PII on the license falling into the wrong hands, because the holder has complete control over how it is used and with whom it is shared.

Accessing Medical Records Anywhere

This workflow doesn’t involve any third party to store your medical data, which also means no worrying about medical data storage policies and the laws associated with them. The holder completely owns his or her medical data and stores it exclusively in his or her digital wallet, making it secure and hassle-free.

MyData  Me2BA Claims Victory in Contest Over California Privacy Regulations 

On March 15th, the AG’s Office of Administrative Law (OAL) approved additional CCPA regulations promulgated by the Department of Justice. Notably, the Department withdrew its original language mandating the “Privacy Options” icon. In its place is new language making commercial use of the icons optional only.  In other words, our stated concerns about the icons were well received, and ultimately adopted. 

Not Just Personal Data Stores Alan Mitchell

This is the fifth in a series of blogs which provide edited extracts of key points made by Mydex CIC in its response to the UK Government consultation around a new National Data Strategy.

This blog focuses on the main ingredients needed to unleash the full potential of personal data — in addition to personal data stores.

Thought Leadership  Why framing “data” as an asset or liability is dangerous MyDigital Footprint

If there is one thing that can change finance’s power and dominance as a decision-making tool, it is the rest of the data. According to Google (2020), 3% of company data is finance data when considered part of an entire company’s data lake. McKinsey reports that 90% of company decisions are based on finance data alone, the same 3% of data.  

If you are in accounting, audit or finance shoes, how would you play the game to retain control when something more powerful comes on the scene?

SSI Updates Our Language Reflects Our Values Auth0

At Auth0, we believe that consistent, iterative improvement leads to incredible results. We recognize that addressing our use of biased language is an ongoing process rather than a one-and-done effort. With our guidelines and principles in place, all employees are empowered to address biased language as they find it. We all have an obligation to be intentional with our language and consider how all words we use have the potential to reflect our values and beliefs.

Digital identities – steps on the path to an ID ecosystem Bankenverband

This article is very good at articulating the big picture of how SSI systems relate to older systems and to emerging fragmented systems.

An answer to these challenges is an ecosystem in which digital identity data can be exchanged in a way that is secure, reliable, scalable and convenient. This will have a positive impact on the economic future of Germany and Europe while at the same time enhancing the private sphere of the individual.

The EU Digital Green Certificate Program Evernym

Although the EU’s approach to COVID-19 health certificates (the Digital Green Certificate) implements existing technology and supports both paper and digital credentials, offline usage, and speedy verification, it makes a number of security and privacy compromises. Our analysis found it to be inherently centralised and better suited for low assurance use cases.

Elastos DID: What’s Ahead for 2021

DID 2.0’s primary objectives are to provide a superior developer and user experience, and to support more complex business models and use case scenarios enabling the expansion of DID’s implementation and adoption potential. 

Drilling down: Co-development DIF

What “standardization” means to DIF and what DIF means to standardization.

A newbie-friendly survey of how DIF relates to nearby organizations with overlapping or related foci.

What “co-development” and “coöpetition” really mean, concretely

Spherity launches New Product to Support Pharmaceutical Supply Chain Compliance

The product establishes trust in digital interactions between trading partners in pharmaceutical supply chains and ensures compliance with the U.S. Drug Supply Chain Security Act (DSCSA).

One woman’s open-source journey to decentralized identity Indicio

Noha Abuaesh, a Bahrain-based computer scientist, has been exploring decentralized identity for the last year, often with assistance from Indicio.tech’s open-source tools and free communications channels. 

Community Credentials Resonate

Verifiable Credentials are a new web standard for proving things digitally, thanks to some clever cryptography.  We are building Community Credentials to be the ‘Know Your Co-operator’ equivalent of KYC (Know Your Customer for business) for co-op social trust, all without reliance on centralised providers (or blockchains).
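
For reference, a Verifiable Credential in the W3C data model is a JSON-LD document with a small set of core properties. The sketch below shows that minimal shape as a Python dict; the issuer and subject identifiers are hypothetical examples, and a real credential would also carry a cryptographic proof section signed by the issuer:

```python
import json

# Minimal shape of a W3C Verifiable Credential (illustrative values).
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:coop123",          # hypothetical issuer DID
    "issuanceDate": "2021-04-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:member456",        # hypothetical holder DID
        "memberOf": "Example Co-operative",   # the claim being attested
    },
}

print(json.dumps(credential, indent=2))
```

The `credentialSubject` block is where arbitrary claims live; everything else is standardized plumbing that lets any conformant verifier process the credential.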

Meet the eSSIF-Lab’s ecosystem: The Infrastructure Development Instrument second tranche winners NGI Community

2nd tranche winners are the following:

Verifier Universal Interface by Gataca España S.L. – Building Standard APIs for Verifier components to enable SSI interoperability

Automated data agreements to simplify SSI work flows by LCubed AB (operated under the brand iGrant.io) – Adopt SSI and make it consumable for both organisations and end-users

Presentation Exchange - Credential Query Infra by Sphereon B.V. – Presentation Exchange Interop and Integration

Letstrust.org by SSI Fabric GmbH – Self-Sovereign Identity for everyone: Enterprise & Consumer Cloud Wallet (OIDC-based), Credentials & SDKs as a basis for applications - free

SSI Java Libraries by Danube Tech GmbH – Improving and completing a set of generic, open-source Java libraries for working with DIDs and VCs

WordPreSSI Login by Associazione Blockchain Italia – SSI Login for every WordPress site

NFC DID VC Bridge by Gimly – Enabling the use of NFC secure elements as DID and VC transport for off-line and online identity, authorizations and access management

DIF SDS/CS WG: CS Refactoring Proposal 0.2 Hyperonomy

Latest Version of the Proposal (0.2 – March 24, 2021)

Agent-Hub-EDV Architecture Reference Model (AHE-ARM) 0.1

Transcription of Selected Parts of the DIF SDS/CS March 11, 2021 Zoom Call

OSI Stack Proposal for Confidential Storage Specification

Based on the March 11 Zoom discussion where we worked hard to discern the differences between Agents, Hubs, and EDVs (and I believe we were largely successful), I’d like to propose to the SDS/CS WG that we refactor the current Confidential Storage specification into 3 separable parts/specifications.

EU Grant to Help Build Blockchain Infrastructure – Sphereon

We’ll be providing a Presentation Exchange that creates interoperability between W3C DIF-compliant Verifiable Credentials and Hyperledger Aries-based Verifiable Credentials for the European Blockchain Services Infrastructure (EBSI).

Podcast Self-Sovereign Identity for Social Impact & Importance of UX Jimmy J.P. Snoek, Tykn

when you go to somewhere in Sub-Saharan Africa, that’s going to be pretty difficult, when there’s maybe one phone in a village and it’s not even necessarily a smartphone. It’s very easy to say, “Oh yeah, but within SSI, everything has to be stored on the edge wallet.” What we saw was that if you make that this hard requirement, and keep working from that, then all these population groups are just going to be left behind more and more.

The Future of Authenticating Your Data with Doc Searls, Katherine Druckman and Dave Huseby

Across time and space, immediately being tracked and falling victim to what I call casual surveillance or corporate surveillance, right? As your data flows through systems, businesses are able to observe that movement of your data; your information is aggregated to develop some kind of psychological model, which can then be sold to people who wish to manipulate you.

PSA Today with Julian Ranger, founder of Digi.me 

Personal data governance (in a world of surveillance capitalism)

COVID-19 Digi.me creates first working UK vaccine passport capability

It is verified, fully private, secure and tamper-proof due to multiple robust security measures including encryption.

This health pass has been designed to be fully interoperable with other international standards, such as the UN Good Health Pass Collaborative, of which digi.me is a member.

Video SSI eIDAS Legal Report – Ignacio Alamillo – Webinar 55

The European Commission developed the SSI (Self-Sovereign Identity) eIDAS bridge, an ISA2 funded initiative, to promote eIDAS as a trust framework for the SSI ecosystem. It assists a VC (Verifiable Credential) issuer in the signing process, and helps the verifier to automate the identification of the organization behind the issuer’s DID (Decentralized Identifier). Simply by “crossing” the eIDAS Bridge, a Verifiable Credential can be proven trustworthy in the EU. 

What BBS+ Means For Verifiable Credentials Evernym

In a recent Evernym blog post, we discussed why BBS+ LD-Proofs are the privacy-preserving VC format that everyone should implement. In this webinar:

- A brief history of verifiable credential formats, and how a lack of convergence makes scale and interoperability an ongoing challenge

- How BBS+ Signatures are the breakthrough that combine the best of the JSON-LD and ZKP formats, while still allowing for selective disclosure and non-trackability

- The path forward: What remains to be done to fully converge on the BBS+ format
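
To make "selective disclosure" concrete: the holder reveals only some attributes of a credential, while the verifier can still check that those attributes were attested by the issuer. The toy sketch below illustrates the idea with salted hash commitments; this is emphatically not BBS+ cryptography (which additionally provides unlinkability via zero-knowledge proofs), and all names are illustrative:

```python
import hashlib
import os

def commit(attrs):
    """Issuer commits to each attribute with a per-attribute salt.
    The digests would be signed; the salts go to the holder."""
    salts = {k: os.urandom(16).hex() for k in attrs}
    digests = {k: hashlib.sha256((salts[k] + str(v)).encode()).hexdigest()
               for k, v in attrs.items()}
    return salts, digests

def disclose(attrs, salts, reveal):
    """Holder reveals only the chosen attributes plus their salts."""
    return {k: (attrs[k], salts[k]) for k in reveal}

def verify(disclosed, digests):
    """Verifier recomputes digests for the revealed attributes only."""
    return all(
        hashlib.sha256((salt + str(v)).encode()).hexdigest() == digests[k]
        for k, (v, salt) in disclosed.items()
    )

attrs = {"name": "Alice", "over18": True, "city": "Berlin"}
salts, digests = commit(attrs)
proof = disclose(attrs, salts, ["over18"])  # reveal a single attribute
print(verify(proof, digests))               # True
```

The hidden attributes ("name", "city") never leave the holder's wallet, yet the verifier can confirm the revealed one matches the issuer's commitment.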

NFT and music, NFT:DID for turning NFT's into identities, and critical updates to mainnet. Ceramic Community Call

you can go to ceramicnetwork/nft-did-resolver on github to see the prototype

so this is the minimal implementation that allows you to verify signatures of the most recent owner of the NFT DID as being valid

public-credentials@w3.org Technical Report on the Universal RDF Dataset Normalization Algorithm - Bill Bradley

The goal of this technical report is to review the Universal RDF Dataset Normalization Algorithm (URDNA2015) for correctness and to provide satisfactory evidence that possible issues with URDNA2015 have been considered and dismissed.

did:did - DID Identity DID (DID) DID method

We hope the community will find this useful to help increase adoption and interoperability of Decentralized Identity technology.

Specification: https://did-did.spruceid.com/

Source: https://github.com/spruceid/did-did/

Registration request: https://github.com/w3c/did-spec-registries/pull/280

The ezcap library - Manu Sporny

Now might be a good time to announce some open source tooling a few of us have been working on related to zcaps that is being created to simplify the developer experience when developing with zcaps.

ezcap (pronounced "Easy Cap") - An easy to use, opinionated Authorization Capabilities (zcap) client library for the browser and Node.js.

Literature Blockchain, Self-Sovereign Identity and Digital Credentials: Promise Versus Praxis in Education

This article is primarily interested in the affordances of the technology as a public good for the education sector. It levers on the lead author’s perspective as a mediator between the blockchain and education sectors in Europe on high-profile blockchain in education projects to provide a snapshot of the challenges and workable solutions in the blockchain-enabled, European digital credentials sector.

Identity not SSI V2 of FIDO2 CTAP advanced to Public Review Draft

The FIDO Alliance has published this Public Review Draft for the FIDO2 Client to Authenticator Protocol (CTAP) specification, bringing the second version of FIDO2 one step closer to becoming a completed standard.

FIDO Recognition for European Digital Identity Systems and eIDAS Grows

Recognition of the value of FIDO in European digital identity systems and eIDAS continues to grow.  This month has featured two new updates in Europe on the FIDO front: the release of a landmark ENISA report that discusses the role FIDO2 plays in eIDAS, and the accreditation by the Czech government of a new eID solution using FIDO2.

Not ID Blockchain Ecosystem’s Response to MiCA Regulation Proposal INATBA

INATBA believes that the interaction between policymakers and industry representatives should continue throughout the whole regulatory process. The period between the publication of the proposal and the enactment of the regulation is estimated to be three years. Within this time, many aspects of the blockchain (and DLT) ecosystem may change, the rather nascent technology may further evolve in unexpected ways and novel business models may emerge.

Thanks Again, See you next week!

Saturday, 03. April 2021

Europechain

Top Blockchain Service Providers In The EU In 2021

BaaS provides easy access to blockchain services for enterprises. In this article, we discuss some of the top BaaS providers in the EU in 2021.

2021 is still relatively young, but the year is shaping up to be as eventful as the last, if not more. In early January, US democracy faced a deadly challenge, sparked by a hitherto unthinkable incitation by the very individual who’s supposed to uphold that democracy against the nation’s enemies, foreign, and domestic. The Covid-19 pandemic rages on, though the scientific community’s ingenuity and...

Source

Friday, 02. April 2021

IDnow

IDnow welcomes Bundesnetzagentur decision and predicts turning point for digital identity verification


Munich, 2 April 2021 – The decision of Bundesnetzagentur to allow AI (artificial intelligence) based identification methods for new use cases in Germany is a confirmation of IDnow’s platform strategy and at the same time a turning point in international competition for German technology companies.

With the publication “Video identification with automated procedure” for automated optical identity verification by Bundesnetzagentur in cooperation with the Federal Office for Information Security (BSI), AI-based solutions for the verification of customers are now approved for certain use cases in Germany in addition to the proven methods. (Order here.)

The published order is based on the German Trust Services Act, which implements the eIDAS Regulation in Germany. The catalogue of criteria published by Bundesnetzagentur and BSI prescribes high technology standards for automated identity verification. IDnow is one of the few European providers that has already been officially confirmed with other methods by Bundesnetzagentur within the scope of an accredited conformity assessment and is now also already working on the confirmation for AutoIdent.

IDnow has been using automated identification technology for many years. With its product IDnow AutoIdent, IDnow is one of the leading providers for automated identification in Germany. As early as 2012, IDnow filed a patent application for fully automated identification procedures, which has been granted by the European Patent Office. “We are pleased that the regulatory authorities have now finally opened up to technological development in Germany and we see our efforts and years of research into the development of our own AI technology confirmed,” says Armin Bauer, Co-Founder and Chief Technology Officer of IDnow. In addition to this patent, IDnow holds other patents and patent applications at the European level.

IDnow’s technology uses so-called Deep Learning to perform an identification at the same security level as an in-person identification – for example in a branch. For this purpose, the self-developed artificial intelligence automatically recognises the security features of the ID documents in a video stream and performs a biometric face comparison of a video selfie against the ID document. This is complemented by new security mechanisms such as “liveness detection”, which prevents attacks using recorded videos and photos.
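
The flow described, document security-feature checks, a biometric face comparison, and liveness detection, amounts to a pipeline in which every step must pass before the identity is confirmed. A schematic sketch of that orchestration (purely illustrative, with stand-in checks; not IDnow's code):

```python
def verify_identity(video, document, checks):
    """Run each verification step in order; the identity is confirmed
    only if every step passes. Returns the overall result plus the
    per-step detail for auditing."""
    results = {name: check(video, document) for name, check in checks}
    return all(results.values()), results

# Illustrative stand-ins for the real ML models.
checks = [
    ("document_features", lambda v, d: d.get("hologram_ok", False)),
    ("face_match",        lambda v, d: v.get("similarity", 0.0) > 0.9),
    ("liveness",          lambda v, d: v.get("blinked", False)),
]

ok, detail = verify_identity(
    {"similarity": 0.97, "blinked": True},  # from the video stream
    {"hologram_ok": True},                  # from the ID document scan
    checks,
)
print(ok)  # True
```

Keeping the per-step results makes it easy to report which check failed, which matters for regulated identification flows.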

IDnow sees the regulator’s decision as an important step towards a digital society in Germany and as a fundamental turning point for German companies in international market comparison. The established and very successful services such as the electronic ID card, BankIdent or the VideoIdent procedure can now be supplemented with AI-based AutoIdent methods for new use cases. IDnow expects a significant change for German consumers: The new decree will make innovative technologies such as digital identities accessible to the broad population in the future – regardless of the individual citizen’s affinity for technology. This is because the new regulation enables platform companies such as IDnow to offer the appropriate digital methods for identity verification in each case.

“The decision of Bundesnetzagentur is an important milestone for platform companies like IDnow. In the future, IDnow, as one of the leading European platform providers, will be able to offer additional, automated identification methods to numerous other industries in Europe. This is an essential step towards a digital future in Germany and for Europe and shows that the time for secure digital identities has come,” says Andreas Bodczek, CEO of IDnow. “Consumer security is always at the forefront of the development of our procedures. For many years, we have therefore been actively working with various organisations in politics and regulation to jointly shape this secure, European future,” he adds.

IDnow is one of the leading providers of digital identities with a broad portfolio of products and solutions and identifies several million citizens per year with its platform. The portfolio includes identification methods from offline to online, from automated methods to procedures carried out by identity specialists, available flexibly online, offline in branches and even by a courier at the user’s doorstep. IDnow has expanded its role in recent years far beyond simply offering individual ident methods and has become the overarching platform for digital identities with several million transactions per year.

Just last month, IDnow announced the acquisition of identity Trust Management AG, one of the leading international providers for on- and offline verification. This is the second acquisition in the last six months for IDnow and represents an important milestone on the way to becoming the leading identity platform in Europe. The acquisition of identity Trust Management AG enables IDnow to expand into new industries and offer its services to a broader customer base in Germany and beyond.


One World Identity

State of Identity Rewind: March


Rewind and replay! Each week, State of Identity host Cameron D’Ambrosi is joined by experts across the identity landscape and enterprises alike to unpack the latest trends, hottest topics, and things you need to know – and this past month was no exception.

In March, we hosted identity experts at the helm of Parallel Markets, Teramind, Mastercard, and DueDil for discussions on:

The creation and rollout of a truly portable digital identity
Leveraging employee data while balancing privacy at home
The impacts of the pandemic-induced digital shift one year later
The power KYB solutions can have on retaining customers

Parallel Markets: Financial Onboarding for Individuals and Businesses

Tony Peccatiello, Co-Founder & CEO at Parallel Markets, joins the State of Identity podcast to unpack the difficulties of doing KYC and AML for businesses. He shares how his organization is taking a “come for the tools, stay for the network” approach to building and bringing a truly portable digital identity to the market – simultaneously attacking the complicated issue of accreditation and KYB.

Listen Now

Teramind: Balancing Compliance & Culture

As the world welcomes remote work as the new standard for many jobs, how can companies manage their workforce while providing workers with privacy at home?

In this episode, Eli Sutton, VP of Global Operations at Teramind Inc., joins OWI to discuss the balance of compliance and culture in a “new normal” and how employers can and should leverage the data they receive from employee monitoring and project management tools.

Listen Now

Mastercard: Revolutionizing Digital Identity for the Modern Times

Mastercard’s SVP, Digital Identity, Sarah Clark, joins the State of Identity Podcast to dive further into her insights shared at our Digital Forum, Identity Verification in Patient Health Credentials.

She and host Cameron D’Ambrosi go beyond healthcare to discuss how verticals, like fintech and payments, have been transformed by the pandemic-induced digital shift, providing an expert view into the macro trends for 2021 and how digital identity continues to impact the average consumer one year later.

Listen Now

DueDil: Understanding True Business Identity

What can a deeper understanding of customers and UBOs do for risk, compliance, and growth potential?

To unpack this, Justin Fitzpatrick, Co-founder and CEO of DueDil, joins us to explore how the past year accelerated the need for digitization in financial services, why it’s now more challenging than ever to attract and retain customers, and how KYB and identity solutions can keep them.

Listen Now

Looking Ahead

Next month we’re excited to host conversations with SailPoint, Playground Global, GBG, and Persona. Stay up to date on each conversation and subscribe on Spotify now!

The post State of Identity Rewind: March appeared first on One World Identity.


Infocert (IT)

Forrester confirms InfoCert as a “Large Provider” of Digital Signature solutions among 19 global players

The post Forrester confirms InfoCert as a “Large Provider” of Digital Signature solutions among 19 global players appeared first on InfoCert.digital.
In its latest report, “Now Tech: Digital Signature And Trust Services, Q1 2021”, Forrester identified InfoCert as a Large Provider in the Digital Trust Platform category, recognizing its position among the leaders in the Digital Signature sector.

On 25 March, the independent analysis and research firm Forrester Research published the report “Now Tech: Digital Signature And Trust Services, Q1 2021”, classifying InfoCert as a “Large Provider” of Digital Signature solutions. The purpose of the report is to help professionals and companies evaluate the solutions offered by the various providers in the field of Digital Signatures and Trust Services.

Forrester’s report divides the Digital Signature and Trust Services market into four segments. InfoCert was placed among the “Digital Trust Service Platforms”, i.e. providers that meet the requirements of European regulation on electronic identification and trust services by offering a range of qualified solutions for customers who must comply with very high assurance standards.

Download the report published by Forrester Research free of charge to learn about developments in the digital signature market and leading global players such as InfoCert:

Download the Report

The report is very useful for differentiating providers according to the specific capabilities and service levels they offer. The “Digital Trust Service Platform” segment, in which InfoCert was recognized, is the highest level of trust attainable for a digital transformation services platform. This important mention confirms that InfoCert can provide its customers with high-level compliance and governance standards, covering a range of needs from the simplest to the most complex.

The report goes on to analyze three other segments: Digital Transformation Platform, e-Signature Specialist and PKI (Public Key Infrastructure) Specialist.

The team that produced the report, led by Senior Analyst Enza Iannopollo, emphasizes that customers must be able to access services tailored to their needs, and must therefore be able to choose a provider that meets their functionality and compliance criteria at every stage of the process in question.

GoSign, InfoCert’s signature solution, has become a benchmark in the market thanks to its versatility. The ability to sign on any device, to use different types of signature and to customize the solution with vertical processes allows companies to speed up approval and signing processes with customers and suppliers.

Danilo Cattaneo, CEO of InfoCert, commented:

“Being mentioned by Forrester among the global “Large players” in Digital Trust Platforms confirms the extraordinary work InfoCert has done in the field of Digital Trust in recent years. Our mission has always been to support customers with end-to-end solutions that enable secure digital transformation, with a level of Trust and Governance that scales with the needs of the business. Believing in the continuous improvement of our services, over the last year our group has invested significantly by acquiring three companies specializing in Cyber Security; thanks to these investments, customers have at their disposal a digital trust platform that stands out from the rest of the market for its advanced cyber security features, responding to an increasingly pressing need for companies of all sizes. We are taking digital trust solutions to the next level.”

Download the Report

The post Forrester conferma InfoCert come “Large Provider” di soluzioni di Firma Digitale fra 19 player mondiali appeared first on InfoCert.digital.


Affinidi

Accessing Medical Records Anywhere — A Use Case for Verifiable Credentials


If there’s one thing we’ve learned from this COVID pandemic, it’s the importance of good health and the need to stay on top of our medical records at all times.

Whether it’s our vaccinations, prescriptions, medical tests, conditions, or even a history of doctor visits, keeping this information readily accessible has become more imperative now than ever before.

At the same time, this is easier said than done, as our medical records are all over the place (literally!). You’ll know this pain if you’ve moved from one state or country to another in the last few years, because medical records are stored only at the institution or physician’s office where they were generated. Though this is done partly to preserve privacy, transferring or granting access to these records is a whole different story altogether!

But all this can change with Verifiable Credentials and Self-Sovereign Identities (SSI).

For starters, verifiable credentials are a secure and accessible way to share your medical records with whomever you want. You can read more about verifiable credentials and how they work in this post.

So, how do VCs work? How can you keep all your records in one place and share them within seconds with just those you want?

Let’s understand with a possible real-life scenario.

Lisa is on a business trip to London, and she suddenly feels dizzy and nauseous. The same night, she develops a high fever and decides to visit a nearby clinic. Now, the doctor suspects it to be a case of food poisoning and wants to know her medical history and pre-existing conditions before prescribing a medicine.

To give this information, Lisa simply logs into her wallet, creates a verifiable presentation of the medical records that the doctor wants, and shares it with the doctor. After looking at the records, the doctor finds that she is allergic to sulfide and prescribes a mild dose of antibiotics.

Soon, Lisa gets well and is back to work!

Sounds simple and quick, right? What went on in the background that made it so easy for Lisa to share her medical records within seconds?

That’s the magic of VCs, and here’s how it works.

As with all VCs, three parties are involved and they are:

Issuer — Lisa’s primary care physician’s office

Holder — Lisa

Verifier — The doctor’s clinic in London

As a first step, Lisa logs into her primary care clinic’s portal and submits a request for a VC. The primary care physician’s office checks Lisa’s immunization status and her history of allergies, and accordingly issues a verifiable credential in the form of a QR code containing the results of those checks. Lisa scans this code and saves it in her wallet.

Next, she receives a Credential Share Request from the doctor’s clinic in London and approves it to share a Verifiable Presentation containing her medical history to the doctor’s office in London. The receptionist at the office logs into the admin account, verifies the credentials, and notifies the doctor of allergies and pre-existing conditions. That’s it, really!
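The issue-and-verify flow above can be sketched in a few lines. This is a deliberately simplified illustration, not Affinidi's API: real verifiable credentials use public-key signatures (e.g. Ed25519) anchored to DIDs, whereas this sketch stands in a shared HMAC secret, and every identifier and field value is invented for the example.

```python
import hashlib
import hmac
import json

# Hypothetical issuer key; a real issuer would hold a private key and
# publish the matching public key via its DID document.
ISSUER_SECRET = b"primary-care-office-signing-key"

def issue_credential(subject_id: str, claims: dict) -> dict:
    """Issuer (Lisa's primary care office) signs a credential over its claims."""
    credential = {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential", "MedicalRecordCredential"],
        "issuer": "did:example:primary-care-office",
        "credentialSubject": {"id": subject_id, **claims},
    }
    payload = json.dumps(credential, sort_keys=True).encode()
    credential["proof"] = {
        "type": "HmacSha256Signature2021",  # simplified, illustrative proof type
        "signature": hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest(),
    }
    return credential

def verify_credential(credential: dict) -> bool:
    """Verifier (the London clinic) recomputes the signature over the claims."""
    unsigned = {k: v for k, v in credential.items() if k != "proof"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["proof"]["signature"])

# Lisa's office issues the VC; the London clinic verifies it.
vc = issue_credential("did:example:lisa", {"allergies": ["sulfide"], "immunized": True})
assert verify_credential(vc)               # untampered credential checks out

vc["credentialSubject"]["allergies"] = []  # any tampering breaks the proof
assert not verify_credential(vc)
```

The key property the sketch demonstrates is tamper-evidence: because the proof is computed over the canonicalised claims, any change to the credential after issuance fails verification.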

If you notice, this workflow doesn’t involve any third party storing your medical data, which also means no worrying about medical data storage policies and the laws associated with them. The holder fully owns their medical data and stores it exclusively in their own digital wallet, making it secure and hassle-free.

Also, VCs carry secure, tamper-proof, and authentic information that can be quickly shared with anyone you want. Such streamlined access is sure to transform the way we create and handle our Personally Identifiable Information. And undoubtedly, it’s our future!

Are you ready for it?

We are!

We’ve built a bunch of APIs you can leverage to create such applications.

In the meantime, you can also check out our PoCathon to build Proof of Concepts using Affinidi’s resources and get a feel of the endless possibilities that our APIs offer.

Accessing Medical Records Anywhere — A Use Case for Verifiable Credentials was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 01. April 2021

Dark Matter Labs

Scaling the Right to Retrofit

Example sketch of street transition in Malmö — as part of the Climate KIC Deep Demo project

This article proposes key principles which we believe could be the foundation for replicating democratic models of retrofitting. It is part of the ‘A Right to Retrofit’ series, and you can read all other related blogs by following the links at the bottom of the page.

The way we retrofit needs to be redesigned to meet the urgent need for climate transition in the built environment. To overcome the systemic constraints on the potential impact of retrofit, a fundamental shift in the system is required. [For more on this, see The System Challenges to Retrofit]

We believe this systemic shift can be achieved by testing strategic interventions in the ‘dark matter’ of the retrofit ecosystem: through piloting and proving out new infrastructures, new standards, new legal patterns, or new institutions. Our ‘dark matter’ experiments and propositions will be oriented around these key principles which we believe will begin to deliver a sustainable, socially-just future.

01 — Collective and democratic

The individualisation of responsibility for retrofit continues to frame retrofit solutions as individualised approaches. [For more, see this section of The System Challenges of Retrofit] Instead, when a whole street is considered as the subject of retrofit, its ambition grows by an order of magnitude beyond the sum of individual households, considering not just these private spaces themselves but also the public spaces in which they coexist. The ambition of collective retrofit could include the public realm, green mobility, and community amenities; these are necessary in a climate transition that encompasses all of the built environment. As a minimum unit of organisation, a street-wide community provides a critical mass to test systemic changes that reshape the retrofit market and the wider ecosystem, such as exploring new models of collective financing that remove cost burdens from the individual. The street as a unit of organisation is also a familiar, natural scale for deliberative decision-making as a community.

02 — Performance-oriented

The justification for retrofit is still viewed primarily through its ability to deliver decarbonisation, yet the retrofit ecosystem is rarely structured around its actual performance in doing so, relying instead on theoretical and indirect measures. By restructuring incentives around measured performance, the dynamics of the system will not only shift in favour of better-performing retrofit, but will also have wide-ranging systemic impacts: from providing the upstream impetus for improving skills, tools, and services to deliver on this demand for performance, to broadening the definition of what makes a high-performing retrofit project by analysing and accounting for its full range of co-benefits. We think that with this new knowledge infrastructure, built on data from IoT, sensor technology, and automation using smart contracts, the definition of ‘performance’ can evolve from energy use into a more holistic and precise resident-oriented metric such as ‘comfort’, and be built right into the incentives of contracting for retrofit work. [For more on how performance-based incentives can shift the retrofit system, see this section of The System Challenges to Retrofit]

03 — Empower the local

Moving from being “done to” their communities to being “done with” them, a future retrofit ecosystem must empower residents to be in the driving seat of their community’s climate transition. Building their trust, agency and capability is fundamental to securing enough community buy-in for its long-term success, and to ensuring a democratised transition. This also means challenging the false economies of delocalised, value-extracting approaches. Instead, leveraging local supply chains, local skills, and local networks of knowledge and trust ensures that retrofit is not just a short-term investment in emissions reductions, but also a long-term investment in community wealth and resilience. This emphasis on the local also means moving away from assumptions of one-size-fits-all retrofit service models, towards modular, adaptive models that may be built on wider systemic learnings but are tailored to the specificities of the local context. [For more on our research on local retrofit models, see this section of The System Challenges to Retrofit]

04 — Built on open knowledge

Knowledge and data around retrofit drive better decision-making, yet data gaps exist in the ecosystem for many stakeholders, from residents to suppliers and municipal governments. Building this open knowledge infrastructure serves as an evidence base for new infrastructures, new standards, new legal patterns, and new institutions, and each future retrofit project has the potential to become a longitudinal study, offering a wealth of data on the comprehensive impact of deep community retrofit. This knowledge infrastructure must be in place to harness the future utility of these insights. Building it for the long term ensures that innovation capability in the retrofit ecosystem is not exclusive to proprietary services and governments, but is democratised too. [For more on our research on why this knowledge infrastructure is critical, see this section of The System Challenges to Retrofit]

Future retrofit starts now: testing and demonstrating

These principles need to be translated from ‘thinking’ into ‘doing’. The work of Dark Matter Labs is currently scoping and investing in strategic experiments to make this future retrofit ecosystem a reality. These ‘system demonstrators’ consist of what we believe to be the minimum viable propositions to prove out a systems change, based on the dynamics of the system we’ve mapped out so far, and in line with the principles we believe are necessary for a democratised climate transition.

It is important to note that these propositions and demonstrators are part of an ongoing iterative process: as our learning evolves from sketch to prototype, concept to context, so too will the nature of these propositions. Based on our work so far, we believe there are some interesting lines of enquiry that include:

Service journey for community retrofit

Can we redesign the citizen experience of retrofit for a community at the scale of a street or block to maximise the potential of its ‘beyond the individual’ approach? By building and ensuring processes to empower the community’s democratic and deliberative decision-making capability, can this collective approach form the critical mass needed to reshape the wider ecosystem, such as broadening the range of financing and reinvigorating local supply chains?

Data platform

Can we lay the foundations for retrofit’s knowledge infrastructure, and design an open platform that presents relevant insights to stakeholders across the retrofit ecosystem for holistic, data-informed decision-making at all stages of the service journey? Can this foundation include a protocol to capture the potential insights from future retrofit projects to help grow this knowledge infrastructure?

Smart comfort contracts

Can we structure contracts so that their incentives are not based solely on completing agreed retrofit work, or relative energy savings, but instead contract around broader definitions of performance, such as comfort? Can potential beneficiaries of retrofit outcomes extend beyond the residents? Might these standards be co-created with the communities who will use them, and might they be structured with financing to overcome high upfront costs and tenure barriers?
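As a rough sketch of how a comfort-based contract might settle payments against measured comfort rather than completed works or modelled energy savings: every threshold, weight, sensor field, and the payment formula below are hypothetical assumptions for illustration, not a specification from this series.

```python
from dataclasses import dataclass

# Illustrative settlement rule for a "smart comfort contract".
# All comfort bands and the pro-rata payment formula are invented;
# a real contract would be co-created with residents and likely
# automated via an IoT platform or on-chain logic.

@dataclass
class ComfortReading:
    indoor_temp_c: float   # from an indoor temperature sensor
    humidity_pct: float    # relative humidity
    co2_ppm: float         # air-quality proxy

def comfort_score(r: ComfortReading) -> float:
    """Score a reading 0..1 against assumed comfort bands."""
    temp_ok = 1.0 if 18.0 <= r.indoor_temp_c <= 24.0 else 0.0
    humidity_ok = 1.0 if 40.0 <= r.humidity_pct <= 60.0 else 0.0
    air_ok = 1.0 if r.co2_ppm <= 1000.0 else 0.0
    return (temp_ok + humidity_ok + air_ok) / 3.0

def settle_payment(readings: list[ComfortReading], max_payment: float) -> float:
    """Pay the retrofit provider in proportion to measured comfort delivered,
    rather than on completion of agreed works or modelled savings."""
    if not readings:
        return 0.0
    avg = sum(comfort_score(r) for r in readings) / len(readings)
    return round(max_payment * avg, 2)

winter_week = [
    ComfortReading(20.5, 45.0, 800.0),   # comfortable
    ComfortReading(16.0, 70.0, 1200.0),  # cold, damp, stuffy
    ComfortReading(21.0, 55.0, 950.0),   # comfortable
]
print(settle_payment(winter_week, max_payment=1000.0))  # 666.67: two of three readings in band
```

The design point is that the provider's incentive now tracks the resident's lived experience: a retrofit that looks complete on paper but leaves the home cold and damp earns less than one that measurably delivers comfort.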

These interventions aim to target fundamental and interconnected leverage points in the retrofit ecosystem, with propositions that we think could shift the underlying dynamics, rebalance incentives, and empower actors. In the next phase of this work, our ambition will be to build and test these interventions in-situ with the residents of these retrofitted homes, to ensure these interventions create cascading, multiplied impacts and prioritise deep, comprehensive retrofit through systemic changes that are sustainable in the long-term.

Over the next few months, Dark Matter Labs will be pursuing these and other lines of enquiry, applying future retrofit principles in real places to test propositions with system-shifting potential. We believe that the system not only needs to change to deliver wider adoption of deeper retrofit, but these changes must ensure that the climate transition of the built environment is democratised, equitable, and just. We will begin building this collective, democratic vision of retrofitting, and also rebuilding trust in the systems of how our built environment is made and remade. As our testing progresses, we will be updating blogs in this series with a deeper look into some of the prototypes/propositions that are emerging.

To outline our work on reimagining retrofit, we have curated our thinking from the last 9 months into a series of blogs which form the basis for ongoing work.

These blogs examine

Retrofit & ‘Building Back Better’ The System Challenges to Retrofit Scaling the Right to Retrofit (this blog)

You can find the full series here.

This blog and its visuals have been co-authored by Calvin Po, Ariane Porter and Jack Minchella.

Scaling the Right to Retrofit was originally published in A Right to Retrofit on Medium, where people are continuing the conversation by highlighting and responding to this story.


The System Challenges to Retrofit

Pawel Czerwinski on Unsplash

This article explores the system challenges that underpin the current retrofit ‘ecosystem’. It is part of the ‘A Right to Retrofit’ series, and you can read all other related blogs by following the links at the bottom of the page.

Over the past nine months Dark Matter Labs has been working with EIT Climate-KIC and partners in cities across Europe to understand the retrofit ‘ecosystem’, and how it can be reformed to meet the urgent need for climate transition in the built environment.

In each of the local contexts, alongside our partners Polmi and Poliedra in Milan, REGEA in Zagreb, Jibe in Sofia and the City of Edinburgh Council, we identified factors across different parts of the retrofit system leading to barriers to retrofitting.

Through this work we’ve come to understand that addressing retrofit, its uptake, and its impact must go deeper than market dynamics alone, or we risk continuing to over-rely on supply- and demand-side levers (such as one-off training schemes and homeowner grants). Retrofit and its market exist in a wider context where factors from across the stack, from institutional infrastructure to community dynamics, have complex interrelationships. To develop a system-oriented understanding of retrofit and its problems, retrofit must be considered as part of an entire ‘ecosystem’.

In the C40 Cities’ framework and energy municipal learning exchanges, we see emerging consensus in the retrofit field that home retrofit has intrinsic, multiple benefits, beyond just improvements of energy efficiency and emissions reductions. This work recognises that retrofitting has potential positive impacts across health and well-being, fuel poverty and inequality, local economic recovery, job creation, and community resilience, because retrofit exists not in isolation, but sits across strategic intersections of multiple systems.

The diagram below is a summary of common factors found over the course of working across many contexts, where problems that appear simple on the surface (such as lack of cash, lack of awareness, and lack of suppliers) in fact have complex, interlinked causes. These are represented as ‘causal loops’ or vicious cycles, where the identified factors mutually reinforce one another, and all eventually lead to the symptoms of low retrofit uptake rates and a lack of deep, comprehensive retrofit approaches, ultimately contributing to the exacerbation of the climate crisis. The factors that are connected to many others begin to indicate strategic areas and leverage points for interventions in the system, with potentially multiplied impacts.

A system map of retrofit ecosystem — original file here. 01 — Individualised Responsibility and Risk

Pre-21st century institutions are designed around individual interests and responsibilities, but in the face of an unprecedented collective climate crisis, these institutions are being stretched to their limits.

Retrofit, in attempting to deliver a collective climate transition of our built environment, still operates within the primarily individualised institution of property. Common across almost all the contexts we worked in, the onus to pay for retrofit work remains by default with the individual property owner, with the risks of high energy costs, maintenance costs, and substandard housing conditions borne by the individual. An owner’s inability to pay the high upfront costs required, and the lack of available financing, becomes a barrier to retrofit with cascading consequences. As a result, credit/cash-poor households, who may benefit most urgently from energy and cost savings, are often excluded, exacerbating a vicious cycle of fuel poverty and inequalities, despite the solid long-term investment case of retrofit.

Efforts to mitigate this so far still operate within the framework of individual responsibility: governments and local authorities continue to subsidise individual homeowners with one-off grants for individual interventions. This funding is prescriptive and bureaucratic for the homeowner to access, and often requires onerous supplier accreditation. The UK’s Green Homes Grants are a perfect example: not only were there delays in payments and a distinct lack of suppliers to do the work (leaving 95% of the funding set aside by the Treasury unspent), but the funding that was spent found its way to construction firms currently under criminal investigation for the Grenfell tragedy. Grant schemes like these tend to concentrate on ‘easy wins’ by limiting eligibility to certain types of intervention, and lend the weight of government authority to mainly conservative, piecemeal approaches rather than deep, comprehensive retrofit.

In the long term, this has been shown to be insufficient in reducing energy usage and emissions, and within 30 years’ time will require further retrofit. The knock-on consequence of this funding barrier is that, with retrofit’s limited demand, contractors have little incentive to invest in training or to develop dedicated retrofit services, so work is completed with conventional construction methods with little guarantee of performance, in turn worsening the perception of retrofit and becoming a cascading risk.

When retrofit approaches operate under the assumption that responsibility lies with the individual property owner, it ignores the more complex reality that property interests are relational and hierarchical. The only situation where such an individualised approach is even remotely appropriate is the owner-occupied detached home. For anything else, these hierarchies can be the relationship between landlord and tenant, between owners in a block, or between freeholder and leaseholders. Each of these stakeholders may have different incentives and different agencies to act on them, and when these coexist within a single building and are misaligned, the barriers to retrofit are multiplied.

This may be as obvious as tenants who pay their own energy bills having the incentives to retrofit but being dependent on their landlord to commission retrofit work. In the urban contexts we worked in last year, such as Milan, Zagreb, and Edinburgh, homes were predominantly in blocks of flats. Multiple parties sharing the same building fabric and maintenance costs means commissioning retrofit work comes with an added dimension of complexity. There is a need to enable trust and consensus-building across the community to proceed, subject to property laws and governance of shared buildings that vary by context. More often than not, these governance structures are still built around dividing collective responsibilities (and costs) to individual owners (in the words of a partner in Zagreb, “deciding who can sue whom”). The inability of some residents in a shared building to afford retrofit work may lead to those residents pressured to sell and move out in an ecological gentrifying phenomenon of ‘renoviction’, or limit the comprehensiveness of retrofit, or even hinder a retrofit project altogether.

For example, in our work with the City of Edinburgh Council, we found that while the council has set ambitious decarbonisation targets, with council-funded retrofit as a means to achieve them, the privatisation of individual council flats has created mixed-tenure blocks with private owners who may not be able or willing to invest in retrofit. The buildings considered viable for the council’s retrofit programme end up being limited to those where the council itself owns all units. The barrier of a resident’s ability to fund upfront, along with limits on how public investment can be used in mixed-tenure buildings, calls for an alternative tenure-independent model with long-term collective financing that can overcome the affordability and accessibility barriers to retrofitting and to high-standard housing.

This individualisation of responsibility does not just concern money, but also time and knowledge: the complex and inconsistent retrofit landscape of contractors and products requires the individual homeowner to invest time in research, with little experience or expertise to guide them. While the retrofit ecosystem would benefit from a more established knowledge infrastructure, approaching retrofit collectively rather than individually could allow this burden to be shared. A well-designed collective approach could create enough aggregated demand for new models to be viable, such as dedicated, resident-oriented retrofit services providing beginning-to-end guidance throughout the retrofit journey.

In our work in Milan, condominiums (apartment blocks or collections of homes) are often managed and maintained by a Building Manager. When redesigning the end-user journey of retrofit around a collective approach, we could leverage the Building Manager’s unique existing community relationship to engage and guide residents through the retrofit process, instead of relying on the efforts of motivated individual owners. By working within governance structures that communities already know, trust and rely upon, the perceived complexity and misaligned incentives that discourage residents from embarking on retrofit projects can be overcome as a community.

02 — Knowledge, performance and trust

Retrofitting homes is understandably perceived by residents to be a risky activity. These risks are often not intrinsic to the retrofitting process, but are perceptual. As experienced by one of our municipal partners, these perceptual risks can act as barriers to retrofitting for households who can afford retrofit costs and, at least in theory, should be the “low-hanging fruit” and obvious targets for boosting retrofit uptake. The crucial missing piece is the knowledge infrastructure around retrofit, where the accessibility of accurate information, in particular data on retrofit performance, can mitigate these concerns. For a resident, the perceived risks of retrofit, namely the large investment and the temporary disruption to their home, are balanced out when they can be sure that their retrofit project and its use of new energy technologies is built on a performance track record, and will deliver promised results. This knowledge infrastructure around performance, both as an evidence base and a metric for successful retrofit completion, is critical in transforming the reliance on ‘faith’ in retrofit into genuine trust.

EU-wide schemes such as Energy Performance Certificates (EPCs) are attempts to implement a knowledge infrastructure around building performance and retrofit, under the economic rationale of “de-risking” and creating a “market signal for efficient buildings”. However, the implementation of EPCs has been inconsistent across member states, and the fact that EPCs rely on modelled or indirectly measured energy performance, with no open access to their data in some countries, limits their utility in progressing the knowledge infrastructure around built environment transition. Worse, when EPCs are used as the only performance metric, they can end up incentivising inappropriate or ineffective retrofit interventions because of their pro forma approach. Long-term financing based on achieved performance outcomes is used in energy performance contracting provided by ESCOs (Energy Service Companies), where energy savings finance the retrofit cost; however, these models are inevitably based on the few available sources of performance data, primarily metered energy use, and leave the other benefits of retrofit unaccounted for, reinforcing only energy- and emissions-related understandings of retrofit performance instead of incentivising a comprehensive, deep approach with multiple benefits.

Knowledge infrastructure is also crucial for the planning of retrofit works: for a local authority or retrofit service provider, data on the neighbourhood can scope out potential retrofit sites that can deliver maximum impact, and support a targeted resident engagement strategy or leverage a local skilled labour force. This data picture can not only survey the condition of a neighbourhood’s built environment and its energy performance, but can be broadened to include its demography and its social, economic and environmental conditions, embedding in the planning process a fuller range of impacts when assessing a retrofit project’s cost-benefit balance, and identifying stakeholders and potential beneficiaries beyond the residents themselves.

From our work in Zagreb, state funding allocated after the March 2020 earthquake primarily for reconstructing homes to a safe standard also provided a unique opportunity to initiate a neighbourhood-scale climate transition, by simultaneously retrofitting those homes and minimising overall disruption. As part of developing the knowledge infrastructure around retrofit, we conducted a data study using a combination of earthquake damage data, census and open datasets in a prototype of a mapping tool that could identify building typology, indicators of fuel poverty, potential for nature-based solutions and public realm interventions, and even public attitudes based on social media data.

When holistic knowledge of the city is built into the retrofit process, such as in an open mapping platform, neighbourhoods can be targeted for engagement that fits a comprehensive balance of factors, and a portfolio of retrofit sites can be built where returns are not considered just in terms of energy savings per capital expenditure, but in terms of maximising impacts across a broad range of environmental and socioeconomic factors.
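To illustrate the kind of prioritisation logic such a mapping platform might run, here is a minimal sketch. The indicator names, weights, and sample values are invented for illustration and do not come from the Zagreb study; only the district names are real.

```python
# Hypothetical neighbourhood-prioritisation logic for a retrofit mapping
# platform: combine normalised indicators (0..1) into a weighted composite
# score and rank neighbourhoods for engagement. Indicators and weights
# are illustrative assumptions.

neighbourhoods = [
    {"name": "Trešnjevka", "earthquake_damage": 0.8, "fuel_poverty": 0.6, "green_potential": 0.4},
    {"name": "Donji grad", "earthquake_damage": 0.5, "fuel_poverty": 0.3, "green_potential": 0.7},
    {"name": "Novi Zagreb", "earthquake_damage": 0.2, "fuel_poverty": 0.7, "green_potential": 0.9},
]

# Assumed policy weights: damage repair first, then fuel poverty,
# then potential for nature-based / public realm interventions.
WEIGHTS = {"earthquake_damage": 0.5, "fuel_poverty": 0.3, "green_potential": 0.2}

def priority(n: dict) -> float:
    """Weighted composite of the normalised indicators."""
    return sum(WEIGHTS[k] * n[k] for k in WEIGHTS)

ranked = sorted(neighbourhoods, key=priority, reverse=True)
for n in ranked:
    print(f"{n['name']}: {priority(n):.2f}")
```

In a real platform the indicators would come from the earthquake damage data, census, and open datasets the study describes, and the weights themselves could be deliberated with the community rather than fixed by the tool.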

03 — Supply-side cycles

One of the key causal loops is that low demand for retrofit, caused by a myriad of barriers including those above, leads to a relatively underdeveloped supply side of the retrofit market, with few dedicated retrofit service providers and a reliance on traditional contractors or installers of individual products. This lack of competition to deliver on performance (which affects quality of construction, timeliness of completion, and the appropriate specification of products, in addition to actual achieved energy use and emissions) in turn reinforces lower uptake. As a result of these supply-side barriers, many existing retrofitting services, such as Energiesprong, have adopted centralised industrial supply models based on standardised design, procurement and funding, targeting neighbourhoods as ‘critical masses’ for viable community retrofit.

Standardised approaches are often justified by an assumption of economies of scale: that they afford greater speed in reaching the scale of renovation and energy efficiency measures needed. This model is used by many local authorities and housing associations to retrofit their own stock, for example Edward Wood Estate, Fulham and New Barracks Estate, Salford. However, these large-scale initiatives mean that the resulting innovation, knowledge and economic growth is centralised and delocalised, reducing the opportunity for alternative services to develop and compete. These centralised models can deliver fast results; companies like Energiesprong have developed off-site manufacturing methods that allow for quick turnover, designing and completing prefabricated panels that are brought to site and installed within 14 days. This speed makes centralised approaches an attractive route given looming legal decarbonisation targets.

However, the concern is that the speed of a centralised industrial supply model comes at the cost of the local community’s agency over the deliberation and consensus-building processes of retrofit, with residents encouraged to accept proposals adapted from standard approaches. Emerging alternatives, such as Carbon Co-op, are developing models that rely on networks of practice, favouring existing networks of knowledge, skill and delivery within a neighbourhood to develop local and collective responses to the challenge of retrofit. In the case of Carbon Co-op, a local team is employed to coordinate households requiring renovation and to understand demand through detailed sessions with residents. The same team manages various specialist contractors, organising actors to collaborate and build a ‘critical mass’ on both the supply and demand sides for local retrofit. For a climate transition that is just and democratised, as much agency as possible should be devolved to the level of the community. This is not just a desirable principle; it also builds a solid foundation for ambitious retrofit with a broad and deep scope, through a process of co-design and trust-building.

One effect of the centralising tendencies of some service models is that, by leveraging large, centralised suppliers, they remove the impetus to retrain and up-skill the local workforce. The benefits of retrofit are economic as much as environmental, with spending on retrofit projects having the potential to create demand for 515,157 new jobs between 2020 and 2024 in the UK alone. In contexts such as our work in Zagreb, there have been mostly publicly funded training and accreditation schemes, such as CroSkills, to up-skill trades with construction skills for low-energy buildings and energy efficiency renovation. However, these training schemes tend to be funded on a one-off basis, and without mandatory licensing or public recognition of their value, the accreditation is not widely adopted by the industry and the wider market. These programmes are therefore often not sustained, as they operate in isolation without creating demand for the improved skills. When training is implemented alongside a service model that draws on local suppliers, and contractual incentives for retrofit are based on delivered performance, these system reforms can sustain up-skilling in the long term as a critical part of the climate transition.

To outline our work on reimagining retrofit, we have curated our thinking from the last 9 months into a series of blogs which form the basis for ongoing work.

These blogs examine:

Retrofit & ‘Building Back Better’
The System Challenges to Retrofit (this blog)
Scaling the Right to Retrofit

You can find the full series here.

This blog and its visuals have been co-authored by Calvin Po, Ariane Porter and Jack Minchella.

The System Challenges to Retrofit was originally published in A Right to Retrofit on Medium, where people are continuing the conversation by highlighting and responding to this story.


Retrofit & ‘Building back better’

Photo credit - Nick Fewings on Unsplash Retrofit & ‘Building Back Better’

The implicit promise of any government policy with the ambition of a ‘green recovery’, ‘building back better’, or fulfilling net zero targets, is that every citizen has the right to a sustainable, socially-just future.

Yet as we emerge from the current stage of the pandemic there remains a nagging sense of uncertainty over how we get from a pre-pandemic world, beset with complex problems and contested ideas, and move towards a future that is radically more just and sustainable. If ‘getting back to normal’ is an illusion, and any ‘new normal’ is ladened with the same vested interests and structural injustices as before, then the promise of ‘building back better’ should also be finding ways of rebuilding the trust that such a just and sustainable future is possible.

One area of convergence that holds promise is retrofitting.

Taken at face value, retrofitting is the process of upgrading our homes to be more energy efficient; viewed through a much wider lens, it is the conscious adaptation of our streets and neighbourhoods to become a more sustainable and equitable platform for our everyday lives. It is this broadened idea of retrofitting that can serve as a transition strategy for society to recover and rebuild.

Over the course of the last nine months Dark Matter Labs have been working with partners across Europe to understand how this wider remit of retrofitting, and the systems that support it, can form a vehicle for a post-COVID recovery, and act as a transition model to a 21st century economy.

Retrofitting. Where to begin?

Building on the work of other organisations, we now understand the need for retrofit and its scale, why we should fund it, and some great ideas about localising how it can be done. All of them agree on one thing: retrofitting is difficult. This is partly because it is riddled with complexity (needing to draw on many kinds of expertise, skills and networks) and partly because it is unpredictable and costly (no two homes are the same). But beyond these technical and logistical hurdles, retrofitting seeks to address a much more complex crisis — the crisis of trust.

Like so many of the crises that define our current economy, a big part of this crisis of trust has its roots in our broken housing system. This is a sector defined by and designed for a carbon age: cramped, expensive, unhealthy homes; chronic air pollution; streets dominated by cars and roads; expensive and unaccountable energy costs; disappearing local services and spaces; failing high streets; rising isolation, anxiety, and domestic abuse. These are challenges that have been exacerbated by the economy into which they were born, and unequally amplified during the pandemic. It is no wonder that proposals to tinker with our homes are met with mistrust when set against the scale of these challenges.

Lockdown policies have exacerbated inequality
For many, lockdown policies have forced our homes to host our entire lives. Four walls now define not only our domestic space, but also the office, the school, the playground, and everything in between. That means more heating, more energy, more internet, more cooking, more cleaning, and more stress. All of this sits against a backdrop of anxiety over our health and wellbeing, job security and rising fuel costs. These added burdens remain largely unaccounted for and invisible, and inevitably fall on the individual.

The home is the fundamental unit of our housing system, and the pandemic has made it increasingly clear that the home is also the determining factor in our access to education, support networks and public space, fundamentally shaping our health and wellbeing. When all of this is filtered through an unequal housing system, the illusion of equal access to these basic standards of living is laid bare, especially for those at the intersections of socioeconomic class, race, gender and ability.

Any political remedy that proposes a regenerative recovery strategy with housing at its focus needs to recognise these cascading impacts of the home. It must move away from the idea of retrofit as a superficial investment in fixing technical problems in individual buildings, to one which addresses the underlying systems that are tied to our domestic spaces. Retrofit should not only see the home as a strategic site for decarbonisation, but also a site for addressing the inequalities in wider, interlinked systems, of which housing sits at the centre.

A crisis of trust needs a democratic transition
To address the challenge of climate breakdown, retrofitting will mean large-scale, long-term plans that propose to take our homes and streets apart and build them back with us still inside. For this to be possible, the way we approach retrofitting needs to empathise with just how precarious many people’s lives have become, not only during the last 12 months, but as a result of the systemic injustices in how our built environment gets made and remade, and the suspicion that it has caused. A legitimate climate transition cannot address the technical challenges alone, but must tackle the social ones as well. For retrofitting to be a part of this kind of transition, it must be done with the consent of all those who hold a stake in it. This transition must be democratic, not just about voting, but in the widest sense: the agency to shape the future, with the right support to exercise this agency capably.

A green recovery at the scale of the street

Conventional approaches to retrofitting have attempted to reduce domestic carbon emissions on a piecemeal basis, relying on the ‘easy wins’ for individual homes such as one-off grants for wall or loft insulation, replacing boilers, or adding heat pumps. Although useful in the short term, these measures will only make a small dent in our 2050 goals.

From Home to Neighbourhood
Government recovery plans tend to centre on individual households as the site for both decarbonising the built environment, and also the site for attempting to address systemic inequalities. But approaching retrofitting this way, through the lens of an atomised single home, made up of atomised individuals, won’t work. A recovery strategy built on retrofitting should try to multiply the value of green investment and economic stimulus by recognising that the home is inextricable from the spatial, social, and economic relationships of its neighbourhood. Retrofitting is critically situated in a knot of wider networks of local services, spaces, skills, ownership, regulation, ambition, and trust.

Fig. 1 — Understanding retrofit as a whole system of recovery

In practical terms, this means designing a retrofit strategy that sees the home in the context of its block, its street, and its neighbourhood. It is only at these scales that retrofit can happen across a continuum of private to public spaces, and work with economic and social networks together as a complete unit of change.

The challenge we face now is how we can deconstruct and redesign the ‘dark matter’ around retrofit to deliver on this vision of a democratised climate transition.

This publication series aims to do just that, outlining the challenges we see in the current retrofit ecosystem, and how we think this approach to retrofitting could be scaled democratically.

These writings are a curated version of our thinking from the last nine months. Over the next few months, Dark Matter Labs will be pursuing these and other lines of enquiry, applying future retrofit principles in real places to test propositions with system-shifting potential. We hope to begin rebuilding the ecosystem on a collective, democratic vision of retrofitting that delivers deep and broad impacts, and to rebuild trust in the systems by which our built environment is made and remade in our shared climate transition. As our testing progresses, we will update this series with a deeper look at some of the prototypes and propositions that are emerging.

Stay tuned.

This blog and its visuals have been co-authored by Calvin Po, Ariane Porter and Jack Minchella.

Retrofit & ‘Building back better’ was originally published in A Right to Retrofit on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Introducing Auth0 CRAPTCHAs!

Easily maximize user friction to protect your applications with our improvement on the CAPTCHA™: The CRAPTCHA!

Finicity

Finicity named Best Fintech to Work For for fourth consecutive year


Finicity was named one of the Best Fintechs to Work For in 2021 by Arizent, publisher of American Banker and National Mortgage News. Finicity is one of nine companies that have been included on every year of the list’s existence dating back to 2018.

Even though many fintechs saw growth accelerate over the last 12 months, they were not immune to the political, social and financial turmoil wrought by the COVID-19 pandemic. Companies had to respond nimbly, as the crisis demanded shifts in business strategy and leadership style. You can read what leaders learned over the past year, as well as view the full list, here.

Finicity was also profiled for its timely business strategy and acquisition by Mastercard. You can read it here ($).

You can also explore open positions at Finicity on our Careers page.

 

The post Finicity named Best Fintech to Work For for fourth consecutive year appeared first on Finicity.


Authenteq

Life at Authenteq: Roman Kierzkowski on biometric data and building tools for everyone

The post Life at Authenteq: Roman Kierzkowski on biometric data and building tools for everyone appeared first on Authenteq.

Ontology

Ontology Monthly Report — March 2021

Ontology Monthly Report — March 2021

An Update from Ontology

March 2021

“I don’t want to define us as a revolution. We want to enhance the link to the traditional world to make things better and better. Not just ‘okay, break everything and rebuild everything’.”

Li Jun, Founder of Ontology

You can find more detailed updates below.

中文

繁體中文

한국어

日本語

Española

Slovák

Tiếng Việt

русский

Tagalog

සිංහල

हिंदी

বাংলা

Developments / Corporate Updates

Development Progress

This month Ontology released v2.2.0. We have completed 65% of the Ontology EVM-integrated design, which will make Ontology fully compatible with the Ethereum smart contract ecosystem after completion. We have completed 45% of the latest Layer 2 technology, which explores the integration of Ethereum Layer 2 on the Ontology MainNet.

Product Development

This month we released ONTO v3.7.8, enhancing the OKExChain TestNet features. We have launched the OKT TestNet token campaign on ONTO, which has attracted over 50,000 users. We have also launched the ONTO engagement fund joint campaigns with DeFiner, AntFarm, WePiggy, CherrySwap and GAP Cash. This will allow users to receive bonus rewards once they finish specific tasks.

dApps

As of March 24th 2021, there have been 113 dApps launched in total on MainNet. In February, there were a total of 6,474,213 dApp transactions completed on MainNet. We recorded 22,443 dApp-related transactions in the past month.

Community Growth & Bounty

Another astounding month of growth for the Ontology community! This month we onboarded 5,687 new members across Ontology’s global communities. As always, we welcome anyone who is interested in Ontology to join us!

New Team Members

Our team is growing! This month we welcomed a new Business Development Manager, Head of Community, and Operations Intern. Welcome onboard!

Recruitment

At Ontology, we are always looking to welcome new members to our team. We currently have a list of open roles and are looking to hire ambitious and hardworking individuals. See a list of open roles below:

Global Marketing Manager
Solution Architect
Europe Ecosystem Growth Manager
Content Manager
Product Manager
Golang Developer
Senior Front-end Developer
Senior JAVA Developer
Senior Quality Assurance Engineer
Community Operations Associate

Out & About — Event Spotlight

Ontology assisted in the successful release of IEEE’s first blockchain-based Internet of Things (IoT) data management framework. This month, MicroWorkers integrated Ontology’s ONTO wallet to facilitate additional payment options for workers. This news was covered by top-tier outlets such as AP NEWS, Yahoo Finance, BENZINGA, Finanzen.net, Business Today, Europa Press, and Asahi Shimbun Digital, covering the Americas, Europe, Asia, and Africa. It was also announced that Wing Finance sponsored Chainlink’s new price feeds for Ontology-based assets. The Chainlink-powered oracle networks will maintain on-chain reference contracts that supply the most up-to-date USD price of Wing’s cross-chain assets, starting with WING and ONT. Another exciting development for Wing Finance.

Want more Ontology?

You can find more details on our website for all of our decentralized solutions across identity and data, or keep up with us on Twitter. Our Telegram is for discussion, whereas our Telegram Announcement channel carries news and updates in case you missed them on Twitter!

Ontology Monthly Report — March 2021 was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Crypto Regulatory Affairs: The Taxman Continues to Collect, this Time in Canada


🇨🇦 Canada's tax authority, the Canada Revenue Agency (CRA), has won a federal judge's court order against Coinsquare, a cryptoasset exchange, which must now provide information on its high-value Canadian customers and their crypto trading activities. Coinsquare estimates that between 5 and 10 per cent of its 400,000 customer records could be caught in the sweep. Coinsquare has 15 days to hand over the information to the CRA.


My Life Digital

Regtech to the Rescue


#ThrowbackThursday

This blog from 2018 discusses what has been termed RegTech, or Regulatory Technology. The surge of new regulations across multiple industries gave rise to new technological solutions, intended to enable organisations to manage the responsibilities required by each regulation.

Regtech to the rescue 

MiFID II, PSD2 and GDPR all focus on data – either opening up access or strengthening its protection. Without the right data solution, financial service providers risk coming unstuck, writes Ren Watson of MyLife Digital. 


Trust is at the heart of the financial services industry. Without trust, it would go bust. We trust it with our assets and we trust it with our data. And when that trust takes a knock, so do the share prices of the companies involved. 

 
Recently, millions of South Africans were told by Liberty Life that its IT systems had been compromised. The market reaction was to immediately wipe 4.7 per cent off the share price – or R1.68bn from its market value of R34bn. 

 
But it’s not just hacking that gets people worried. The misuse of personal data has become front-page news. Cambridge Analytica’s role in the 2016 US presidential elections comes immediately to mind. 

 
The financial services industry is awash with personal data that must be kept safe, and there is plenty of regulation to protect it. MiFID aims to protect customers through increased transparency; PSD2 opens up data to permitted third parties in a bid to increase fair competition (open banking); GDPR protects personal data from misuse. Some initiatives, like MiFID II and PSD2, are specific to the financial services industry, while others such as GDPR are applicable to all. 

 
“Data protection is a common theme for the regulator,” says Simon Morris, partner in the financial services and products practice at London law firm CMS. “And there are three key principles. Confidentiality – personal data is given willingly but over highly risky media such as emails, over the phone or by text, so it must be kept safe over these channels. Secondly, it is important to require firms to only use and process that data fairly. Thirdly, we require very robust and resilient IT protection against phishing and hacking. Covering all three is a real challenge.” 

 
With each new piece of regulation comes what Morris calls “repapering” – the collection of more data and its storage. This is often done in silos to stop data from being used by others. Ben Robinson, head of strategy at core banking IT specialist Temenos, believes that when it comes to consent, data silos are a particular problem. 

 
“Each time a bank collects data it has to record consent for its use. It’s difficult to get the correct consent and often the data is replicated, perhaps with different consents in different silos. They need a dynamic consent database,” says Robinson. “Open banking promises so much, but silos kill its fluidity and makes compliance harder.” 

 
Morris echoes this, saying: “The problem is repapering each time for a new piece of regulation. And getting the right consents. A company that can offer a credible solution to all or even part of all this will be offering a general good.” 

 
He cautions that lots of financial services companies are still getting consent wrong. “They are not getting consents or relying on opt-outs. That’s simply not good enough very often. Or they do not get the correct consents,” he says. 

 
This should worry customers and shareholders alike. “We know regulators do regular reviews. If a company is hacked or doesn’t have the right consents, the regulator will turn on you,” he warns. 
 

The fines for non-compliance are potentially huge – breaches of GDPR are up to €20m or 4 per cent of global annual turnover. And reputational damage, as we have seen, can cost billions in lost market value. 
 

“We trust banks – it’s to their advantage and it’s a bigger issue for them than for anyone else. They need to maintain and not squander our trust. It’s the biggest single thing they must safeguard after our assets,” says Robinson. 
 

With patchy or faulty compliance and the associated risks in plain sight, the financial services industry needs to get its house in order. Technology has the answer — a dynamic solution directly connecting customers with their personal data to provide permissions for certain uses, together with robust security. Regulators will brook no excuses if banks are found wanting. Customers might be even less understanding.

The post Regtech to the Rescue appeared first on MyLife Digital.


Coinfirm

Introducing Non-Fungible Tokens to the AML Platform

Non-fungible tokens (NFTs) are rapidly becoming more and more popular. What is the importance of NFTs? Each non-fungible token is composed of unique data (e.g. visual, audio, metadata), as opposed to fungible tokens that are all uniform such as 1 BTC, USDT, etc. To incorporate this element of uniqueness, as well as a financial value,...
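The distinction drawn here, unique per-token data versus interchangeable balances, can be sketched in a few lines of Python. This is a hypothetical illustration only; the ledgers, names and transfer helpers are invented for the sketch and do not reflect Coinfirm's platform or any chain's actual implementation:

```python
# Hypothetical sketch: fungible vs non-fungible token ledgers.

# Fungible ledger: 1 unit is interchangeable with any other, so only balances matter.
fungible_ledger = {"alice": 3, "bob": 5}

# NFT ledger: each token ID carries its own unique metadata and a single owner.
nft_ledger = {
    42: {"owner": "alice", "metadata": {"media": "visual", "name": "artwork-a"}},
    43: {"owner": "bob", "metadata": {"media": "audio", "name": "track-b"}},
}

def transfer_fungible(ledger, sender, receiver, amount):
    # Any `amount` units are as good as any other; the transfer is divisible.
    assert ledger.get(sender, 0) >= amount, "insufficient balance"
    ledger[sender] -= amount
    ledger[receiver] = ledger.get(receiver, 0) + amount

def transfer_nft(ledger, sender, receiver, token_id):
    # The token is indivisible: ownership of the whole unique record changes hands.
    assert ledger[token_id]["owner"] == sender, "not the owner"
    ledger[token_id]["owner"] = receiver
```

For AML purposes the practical consequence is that an NFT's provenance can be traced per token ID, whereas fungible funds can only be traced as flows of amounts between addresses.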

Dark Matter Labs

Accelerating City Transitions

How redesigning the dark matter of city systems can unlock sustainable, democratic urban environments

Madrid is one of 10 European cities DML are working with on EIT-Climate KIC’s Healthy Clean Cities Deep Demonstration [Photo by Alex Vasey on Unsplash]

Looking back and forward on our work in cities

After a year like no other, and faced with what may well be permanent disruptions to their natural rhythms and rituals, cities around the world are grappling with how to navigate their uncertain, post-COVID, climate-defined futures. Such transitions have, of course, been long overdue. The entrenched inequality and unsustainable metabolism that have come to define our cities are what drive many to look for ways to transition them towards the sustainable and democratic urban environments we need them to be. On current trends, temperature rises of 3 degrees Celsius are more likely than 1.5 or even 2 degrees, exacerbating a series of calamitous, cascading risks in our cities, whether as tangible places or as urban systems. The urgency and depth of the transitions required to enable more hopeful scenarios mean we have to act across a range of systemic levers in order to achieve a democratic, equitable pathway to a liveable future.

Our journey into cities

Over the past five years, Dark Matter Labs has found itself involved in several collaborations with this focus — where efforts to transition society in response to technological revolution and climate breakdown look to the city as a critical problem-space and tangible unit of change. From our early work to develop impact movements in Camden, to our experiences with Civic Square and Alternative.Camden, then the development in 2019 of our Micro-Massive urban transition thesis with UNDP and EIT Climate-KIC and our involvement in the latter’s Healthy Clean Cities Deep Demonstration, we’ve cultivated a deeper understanding of the critical role ‘dark matter’ — the invisible structures and infrastructures that shape our systems, from regulation and procurement to contracting and financing mechanisms — must play in driving transitions in cities.

Cities as systems of systems: regulation, tech, participation, and finance

Cities, of course, are anything but one-dimensional. Even a first glance at their impact on climate change reveals that their footprint is much larger than the emissions taking place ‘within the gates.’ Scope 3 emissions in particular are driven by complex global supply chains in which urban politics is nested and entangled. But working with cities is still worthwhile — as long as we recognise that cities form a ‘system of systems’, where several levers of change come together, and can be acted upon — whether at the scale of a city-regional portfolio of interventions, or district-focussed ‘living labs’.

How cities update and in some cases revolutionise their regulatory landscape will determine the speed and depth at which urban environments can transition. Since co-founding the Network of Regulatory Experimentation in partnership with BLOXHUB, Community of Federal Regulators, Dot Everyone, MaRS Solutions Lab, The McConnell Foundation and Waag Society, Dark Matter Labs has continued to explore the promise of a new generation of regulatory sandboxes based in cities, with Montreal and Madrid playing early hosts.

How cities leverage the promise of technology — digital tools and platforms, open data systems and artificial collective intelligence — will in many cases be pivotal not only in accelerating change, but also in either bypassing or actively ensuring deeper legitimacy and ownership of transitions amongst those most affected. Seeing deeper civic participation as inherent in regulatory or technological innovation is integral to the future of cities: the promise of new technology and agile regulation moving us to a more sustainable future rings false unless people and communities can truly be part of collaboratively understanding why it is necessary, what opportunities it could hold for them, and how different design principles can drive towards very different outcomes. In this context, Dark Matter Labs and Lucidminds, with support from Nesta, have been developing three use cases for near-future ideas where Civic AI can help equip communities with the tools to collectively respond to the climate crisis and achieve the goal of a carbon-neutral society. While such technology scenarios may feel distant, controversies such as those around the Sidewalk Toronto proposition show the urgency of building civic legitimacy around tech-enabled sustainability pathways. In response, we are working with city governments like Amsterdam to explore how to bring collaborative dynamic regulation and tech-enabled renewable energy communities closer to reality, as well as with the Korea Agency for Infrastructure Technology Advancement, tech enterprises and local citizens in Daegu, South Korea, to develop new frameworks and strategies for combining technology and civic innovation.

Visual from our Daegu Creative City project with Daegu Technopark

How cities finance their transitions will be crucial in ensuring capital best serves the future, both in terms of what they invest in and how they invest. Dark Matter Labs has been working with partners across North America and Europe to explore how the transition capital embedded in the coming Green Deals can be best deployed. As cities will be significant recipients of transition capital over the coming decade, they must develop new financial instruments that enable long term investment; new mechanisms to capture and share the multiple spillover effects and value flows of a climate investment strategy, like better health and jobs; and new public governance models capable of limiting rent-seeking incentives traditionally structured into investment propositions. In short, we need to re-code capital so it privileges the shared public value of transition investment. Cities are starting to see the significance of the challenge, and our work with the cities of Madrid and Vienna to explore such ‘re-coding’ mechanisms will accelerate in 2021.

What we’ve learned along the way

Needless to say, we’ve learned a few lessons from these encounters with cities and their dark matter. We have a growing appreciation of the need to combine discovery work and creative imagining with the hard graft of enabling change to happen on the ground. We have a deeper sense of the undeniable role cultural competence alongside local leadership and partnerships must play to drive and sustain urban transformation in what are always unique places with powerful and complex identities. In response, our team is even more geographically distributed than it was a year ago; aspiring to be more deeply embedded in the local contexts in which we work. We are explicitly working on what it means in practice to deal with urban systems by both reaching down into neighbourhoods and up into sprawling economic and ecological geographies and supply chains. We recognise the need to mobilise national and regional authorities as much as municipal governments if we are to truly rewrite the DNA of cities and their dynamic flows beyond the traditional city boundaries. Each of these points probably warrants a post in themselves, and over the next months we’ll be sharing in more detail some of our insights from the work in different places.

Why cities — revisited

These experiences and reflections have pushed us to refocus on the why, what and how of city transitions. Of all the problem and opportunity spaces our world has to offer, why do we choose to ‘lean into’ cities?

Aggregate and compound value

Cities are where many of the most complex challenges we face combine into webs of cascading risks and legacy lock-ins. It’s widely acknowledged that cities are net contributors to climate breakdown: covering less than two percent of the earth’s surface, they consume 78% of the world’s energy, produce more than 60% of all carbon emissions, and reinforce the habits, lifestyles and extractive economies most associated with the causes of the climate crisis. And the past year has been a particularly stark reminder that this takes place in the context of chronic and deepening ‘multiple horizon’ emergencies, of growing disparities in wealth and power, and of the double-edged sword of runaway technological capabilities.

While the scale and entanglement of the problems are enormous, so is the possibility for impact. If we succeed in transitioning cities to a more sustainable and democratic future, the aggregated value of improving the lives of the vast urban populations cities serve is clear. But cities are also particularly powerful hubs of knowledge and collaborative innovation capacity, driven by their innate diversity, their assertive plurality of institutional and non-institutional actors, and their capacity to attract capital. This, together with the ‘once-in-a-generation’ investments soon to be made available, can, if leveraged well, shift them from engines of unsustainable growth to catalysts of healthy, green, caring transitions.

The pathways for deploying this capital need to be explored in full acknowledgement of complexity, uncertainty and entanglement. This also means that we cannot just engage deeply with neighbourhoods and communities, but also have to think widely across (bio)regional and even national economies and ecologies. These wider geographies provide fertile ground for strategic innovation and new lead markets (e.g. bio-based zero-embodied-carbon building, nature-based solutions financing, industrial hydrogen) that represent significant compound value and multiplier effects beyond the city’s municipal limits. This, in turn, is critical to overcoming the growing political divides and the spectre of culture wars between cities and hinterlands when it comes to the winners and losers of transition.

An opportunity to build institutional capacity for deep transitions

More fundamentally, we’re seeing more and more clearly the urgent need to enable the deep institutional shift from incrementalism to transformative change. Cities are expressing that they see the need to accelerate this shift — but it is evident they currently struggle to provide the conditions for this to happen. They want systemic change — but can they change their systems? Siloed organisations; analogue-era regulatory approaches; resistance to policy-making that actively shapes markets; limited capabilities in blended finance and multiple outcome accounting; underdeveloped local innovation ecosystems; and rudimentary infrastructures for collaborating with citizens are just some of the critical institutional capacity gaps present in cities. At the same time, the Deep Demonstration in Healthy Clean Cities has seen emergent progress in many of these fields: from Vienna’s success in bringing together a remarkably large number of departments around its decarbonisation agenda, employing ‘T-shaped’ team members to connect opportunities, to Amsterdam’s dynamic outcome-based regulation aspirations for last-mile logistics, cities are starting to embrace pathways to action that can match the ever-more ambitious target-setting. If we succeed in redesigning the dark matter of city systems, and unlock such fertile institutional ground for creating sustainable, democratic urban environments, we will go a long way towards accelerating transitions to rapid decarbonisation and a healthier, more sustainable form of human thriving.

What’s at stake?

Of course, institutional capacity is just one side of the equation. The other is where it takes us. Directionality matters. Intentionality matters. Values matter. So what are we trying to achieve with our work in cities?

At Dark Matter Labs, our aspiration for city transitions is to create a thriving urban everyday life, with radically improved standards of civic agency and the opportunity for human flourishing afforded to all, within planetary boundaries and alongside a regenerated natural realm. We see this manifesting across a range of domains and lived experiences in cities and peri-urban areas.

Visual from our Malmö Example Streets experiment proposition

In the near and now, we aspire to ‘cities-as-commons’ — places where participatory governance of civic assets shapes our communities, public spaces and local economy. New institutional tools, such as multiple-level climate contracts and civic endowments for shared infrastructure and innovation investment, are building blocks for this. We see cities as carbon-positive urban environments — not just because of decarbonised homes and transport, but also with vast urban nature capable of turning concrete jungles into their very own carbon sinks. We aspire to ‘caring cities’ — places that privilege and properly invest in wellbeing and empathy, but also in shared learning and active participation in deep democracy. We see cities as circular and collaborative, where food and (re)construction are part of digitally integrated local value chains capable of building community wealth and local resilience, underpinned by next-generation logistics systems and agile regulation that includes real-time environmental performance as a key metric.

Urban futures like these would mean people having more power over their places, a deeper relationship with nature despite their urban setting, a sharpened consciousness of the materiality of their city, greater security and capacity to focus on the pursuit of equitable human flourishing, and the ability to benefit from the value generated by collective endeavours, all the while preserving and nourishing our shared ecosystems. It is a vision in which cities finally deliver on their promise to each and every one of us.

How might we get there?

As we’re often reminded, it’s not enough to have a clear case for change and a compelling vision for how things can be different. This is particularly true in the uncertain, complex and emergent times in which we live. So if there is no clear linear pathway to unlock such a city transition, how do we proceed?

Our thesis at Dark Matter Labs is that there are a set of key ingredients we must start with.

Strategic risk and future liability

Even though 2020 showed in brutal tangibility what ‘interconnected and cascading risk’ can mean for cities, urban governments and the national equivalents that support them systematically fail to properly account for risk and liability. In too many cities we see ambitious urban strategies for decarbonisation sitting in juxtaposition with stable consumption of high-embodied-carbon materials such as concrete. In too many cities we see aspirations for eliminating poverty alongside extractive development models reliant on housing and land asset bubbles. In too many cities we see ambitions for once-in-a-lifetime deep energy retrofitting investments that are unconnected to other needs and opportunities in neighbourhoods, like reimagining streets, creating digitally enabled local supply chains or building community wealth. In too many cities we see a growing awareness of mental health risk being undercut by an ongoing inability to make long-term investments in social infrastructure that overcomes loneliness and strengthens individual and collective resilience.

To address these disconnects, we’re focussing on building capabilities and mechanisms for understanding and acting on strategic risk. That means assessing risk and liabilities not at a project level, but rather at a whole-place transition level: identifying the extent to which a proposed course of action or set of interventions impacts on the ability of a city to deliver on its vision and mission. In our age of the long emergencies, cities must take steps to address multiple levers of change, connect several domains of vulnerability and opportunity, and unlock new lead markets.

This new relationship with foundational risk and liability at the urban level can have a ripple effect on some crucial practices that define the trajectory of cities, perhaps most profoundly in how we move towards strategic investment portfolios inextricably linked to the transitions we pursue. We are exploring a range of instruments and financial models such as smart and tradable perpetual bonds that can amplify the capacity of public sector and other public interest actors to raise capital for long term multiple outcome interventions; urban-scale carbon sequestration certificates as a pathway to broader ecosystem services investment certification; and a Settlement Risk-Innovation Facility to enable systemic innovations to resolve key liabilities (e.g. urban air pollution) with solutions that can only be proven over time. Alongside similar efforts such as those of the Transformation Capital Initiative, we are bringing together problem owners, solution providers and financiers in new relationships that again are geared towards aligning interests for the long term.

Portfolios of interconnected interventions

As cities make this shift towards more co-beneficiary investment strategies, the necessity of a portfolio of interconnected interventions becomes clear. EIT Climate-KIC’s 2019 strategy document Transition, in Time lays out how, in the context of uncertainty, a portfolio approach is essential to discover options and pathways. What we have seen over the last year and a half working with EIT Climate-KIC and a range of other partners in fifteen cities across Europe is how the portfolio of interventions needs to focus on both domain-based and transversal approaches. The two intertwine to recognise the complex and adaptive nature of systems, but also to generate systems learning and mark out possible transition pathways, rather than seeking to simply validate whether a single point intervention has impact or not.

Overview of one strategic experiment in Madrid’s Healthy Clean Cities portfolio of interventions

To start with the transversal: the next generation transition finance capabilities we mention above are just one of the cross-cutting innovation capacities that cities need. Regulatory innovation capacity will be equally essential: updating cities’ regulatory frameworks cannot be limited to a one-off fix to include, for example, embodied carbon emissions in their planning systems and public realm specification sheets. Rather, it needs to be a permanent capacity for change that can work with civic and private stakeholders as well as with other public authorities in an ongoing, agile and transparent manner to integrate new progress metrics and strategic learnings stemming from accelerated decarbonisation and pollution targets into policy frameworks. As our Legitimacities publication set out, this requires us to build a new set of collaboration settings (‘sandboxes’) and capabilities so that urban regulatory experiments can be created with and co-governed by people from across sectors, rebuilding trust and revealing transition pathways.

And regulatory innovation is just one aspect of a broader transition governance capacity, which is multifaceted and needs active investment at a time when large amounts of recovery capital and transition needs will, at the local level, often meet deeply ingrained scepticism and mistrust of ‘change’ and its winners and losers. It is extremely unlikely that a truly just transition can be forged through the decision-making patterns of the past. Hence we are working to develop a new range of ‘transition’ governance tools and mechanisms, ranging from new contracting architecture for collective action with Leuven2030, to new collaborative governance arrangements with the Orleans Metropole. Our collaboration with Viable Cities and nine Swedish cities to establish Climate City Contracts — an emergent and agile contracting architecture for complex and entangled challenges like the climate emergency — is another example of an alliance of partners working together to reimagine governance.

In a city portfolio, such ‘horizontal’ innovation capabilities meet with a range of ‘vertical’ missions or integrated interventions such as fostering circular economies, at-scale nature-based solutions, and whole-district climate retrofitting (or actually, we’d prefer to call it future-fitting — as it also needs to take stock of shifting working patterns after Covid-19, counter the loneliness epidemic, and build resilience in adapting to a 3°C+ scenario of climate extremes). In each of these domains there are interconnected financial, regulatory and broader transition governance questions, alongside technology and data questions. Responding to these entanglements through a portfolio approach can enable cities to become early movers in lead markets that can build new inclusive value and supply chains, attract investment and drive sustainable recovery. While confronting the managerial orthodoxies of city administrations, this more dynamic approach inevitably gives rise to new and valuable institutional and sector-specific tools such as comfort contracts that can bind together households, energy solutions providers and financiers in a district retrofit; or material registries that can enable the construction and real estate finance sector to embrace circularity, and make decisions based on long-term considerations.

Towards more hopeful urban futures

As we continue down the multiple transition pathways we have embarked on with cities, we are making some strategic bets where we will invest our time and energy over the coming years. We’re working with cities and a range of partners on developing strategic transition portfolios, on place-based proofs of possibility exploring how the full stack of institutional innovations can be brought together, and on particular tools and instruments for next generation governance and finance. Across these three types of work, we increasingly pay attention to how we learn from strategic experiments and build capabilities in the process. We believe that by better structuring the demand for such change, building new functions of governance, and showing tangible demonstrations of new and legitimate transition pathways, we can reorientate our cities’ current trajectories towards more hopeful futures.

This is our emerging thesis on redesigning the dark matter of city systems. It has been built through the support from, and conversations with many of our partners, and we are encouraged by (and immensely grateful for) their trust in together navigating the opportunities for positive change in an uncertain world. Equally, we are keenly aware that our approaches and practices are inevitably incomplete — and we’d like to invite feedback and discussion in order to spot gaps, next opportunities, and blind spots as well as complementary perspectives. It’s not quite true that this shared endeavour is ‘just getting started’ — but given the terrain ahead, that’s often what it feels like. We look forward to exploring the way forward with you.

Written by Joost Beunderman, Tom Beresford, Linnea Rönnquist, Dan Wainwright, Jonathan Lapalme and Eunji Kang in deep acknowledgement of contributions to our work with cities by other DM colleagues, our kin (the 00 ecosystem) and our partners — particularly EIT-Climate KIC, Bankers Without Boundaries, Democratic Society and Material Economics.

Funded by EIT Climate-KIC

Accelerating City Transitions was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.


Tokeny Solutions

Tokeny Solutions Delivers T-REX Billboard, the Secondary Market Solution for Security Tokens

The post Tokeny Solutions Delivers T-REX Billboard, the Secondary Market Solution for Security Tokens appeared first on Tokeny Solutions.

Luxembourg, April 2021 – Tokeny Solutions has announced a white-label secondary market solution for securities issuers and digital asset marketplaces to improve private market access for investors. The Billboard module of the T-REX Platform allows an issuer’s investors to connect with one another, express trading intentions and ultimately execute peer-to-peer security token transfers on a decentralized infrastructure. Asset issuers can utilise the solution today, and improve the experience for their investors.

The private market industry has grown impressively over the past decade, offering lucrative returns for investors. Yet, because the industry operates on a plethora of centralized, manual and inefficient networks, there is barely any transferability in the secondary market. Investors are resigned to long lock-up periods and ultimately face costly liquidity problems. Today, this is a barrier to entry for many, as liquidity is often needed, especially in times of economic uncertainty.

Billboard leverages decentralized technologies to digitally identify stakeholders and their assets in order to facilitate transactions. Thanks to on-chain identities, the ownership and eligibility claims of the counterparties are automatically verified. Once verified, the transaction is rapidly executed peer-to-peer by the investors. This easy-to-use transfer model reduces many of the fees seen in traditional private markets, such as those for settlement and custody.
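The gating logic this describes (a transfer executes only after both counterparties' identity claims check out) can be sketched in a few lines. This is an illustrative model only, not Tokeny's actual API; the `OnchainIdentity` class, claim names and `can_transfer` function are invented for the sketch:

```python
# Minimal sketch of claim-gated peer-to-peer transfer eligibility.
# All names here are hypothetical, not Tokeny's real interfaces.

from dataclasses import dataclass, field

@dataclass
class OnchainIdentity:
    address: str
    claims: set = field(default_factory=set)  # e.g. {"KYC", "ACCREDITED"}

def can_transfer(seller: OnchainIdentity, buyer: OnchainIdentity,
                 required_claims: set) -> bool:
    """A transfer is eligible only if BOTH identities hold every required claim."""
    return required_claims <= seller.claims and required_claims <= buyer.claims

seller = OnchainIdentity("0xSELLER", {"KYC", "ACCREDITED"})
buyer = OnchainIdentity("0xBUYER", {"KYC"})

print(can_transfer(seller, buyer, {"KYC"}))                # True
print(can_transfer(seller, buyer, {"KYC", "ACCREDITED"}))  # False: buyer lacks a claim
```

In the real platform the claims live on chain and the check is enforced by the token contract itself; the point of the sketch is only that eligibility is verified before, not after, settlement.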

Luc Falempin, CEO of Tokeny Solutions: “Since 2017, security tokens have promised to bring liquidity to an industry that has long been starved of it. So far, this hasn’t happened due to the lack of a complete, end-to-end solution, from the identification of investors down to the on-chain transfer. With T-REX Billboard, we are delighted to be the first company to finally deliver on this promise and open a broad spectrum of opportunities to a wider group of investors, and improve the liquidity as a result.”

Pierre Davoust, Head of CSDs at Euronext and Non-Executive Director at Tokeny: “As the leading pan-European market infrastructure, we invested in Tokeny with the shared vision that blockchain technologies and the use of security tokens could further support the development of private markets. Billboard, the new module of the T-REX security token platform, is a significant step in this journey and completes Tokeny’s offerings: financial institutions now have all the technological pieces to efficiently interact with security tokens, and thus grasp the benefits of blockchain technology in the world of private markets.”

Blockchain technology is now recognized as an infrastructure to issue and transfer securities across capital markets. These securities, or security tokens, fall under the same rules and regulations as traditional financial securities in many European countries, including France, Germany, Italy, Luxembourg, the Netherlands, Romania, Spain and the UK. Asset owners utilising Billboard also benefit from the built-in compliance of the T-REX security token standard, which has been used to tokenize over €8.5 billion of assets to date.

Learn More About T-REX Billboard

About Tokeny Solutions

Tokeny Solutions allows financial actors operating in private markets to compliantly issue, transfer and manage securities using distributed ledger technology, enabling them to improve asset liquidity.

Due to disconnected and siloed services that are currently used to enforce trust, private markets experience poor asset transferability with little to no liquidity. By applying trust, compliance and control on a hyper-efficient infrastructure, Tokeny Solutions enables market participants to unlock significant advancements in the transferability and liquidity of financial instruments.

Tokeny Solutions is the leader in its field and in 2020 was named one of the top 50 companies in the blockchain space by CB Insights. The company is backed by Euronext.

tokeny.com | Press | LinkedIn | Twitter



Evan Network

Blockchains Strengthens Decentralized Identity Capabilities with Acquisition of evan GmbH


Dresden, Germany and Sparks, Nevada, April 1, 2021 – Blockchains, Inc., a Nevada-based blockchain technology company committed to protecting and empowering individuals, today announced its acquisition of evan GmbH. Headquartered in Dresden, Germany, evan focuses on building for the new economy through its blockchain-based decentralized identifier solutions.

“Developing a trust infrastructure based on digital identity is key for cooperation on the blockchain. Enabling parties to transfer trust from the real world to the digital world is what led us to Blockchains,” said Thomas Müller, evan’s co-founder and CEO. “Digital identity is in Blockchains’ DNA. We are very excited to be joining the Blockchains’ family and working with like-minded innovators and visionary developers to empower individuals and businesses.”

In 2018, the founders of evan GmbH set out to jump-start the movement to Web 3.0 by developing decentralized identifiers for businesses to interact directly with each other without an intermediary, by leveraging automation and a trust and trace-based methodology. Evan’s core technologies have significant implications for Blockchains’ suite of solutions currently in development. The companies plan to integrate evan’s solution stack with Blockchains’ digital platform to empower individuals by allowing them to control their personal information and assets.
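The decentralized identifiers described above follow the W3C DID data model, in which an identifier resolves to a DID document listing the keys its controller can use to prove control. A minimal, hypothetical example (the identifier and key material below are placeholders, not real evan.network values):

```python
# A toy W3C-style DID document and the basic check a relying party performs.
# Identifier and key material are invented placeholders.

did = "did:example:org-123"

did_document = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": did,
    "verificationMethod": [{
        "id": f"{did}#key-1",
        "type": "Ed25519VerificationKey2020",
        "controller": did,
        "publicKeyMultibase": "z6MkPLACEHOLDER",  # placeholder public key
    }],
    "authentication": [f"{did}#key-1"],
}

# A relying party resolves the DID and checks that each authentication
# reference points at a verification method controlled by that same DID.
methods = {m["id"]: m for m in did_document["verificationMethod"]}
valid = all(methods[ref]["controller"] == did_document["id"]
            for ref in did_document["authentication"])
print(valid)  # True
```

Because the document, not a central registry, declares which keys speak for the identifier, two businesses can verify each other directly, which is the intermediary-free interaction the announcement refers to.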

Recognized as one of Gartner’s Cool Vendors in Blockchain Business in 2020, evan is a contributor to the W3C, an international consortium where member organizations and the public work together to develop Web standards. Additionally, the company is a member of the Decentralized Identity Foundation and partners with Global Legal Entity Identifier Foundation network organizations.

Evan’s expertise extends to employing identity through IoT devices for use-cases in a variety of industries, including energy, government services, transportation, and healthcare. “How IoT connects to decentralized identifiers provides significant privacy benefits when these devices are communicating directly with the outside world, and when the devices operate with multiple parties in different contexts, such as when part of a supply chain or when integrated into smart city infrastructure,” said Thomas Herbst, co-founder and CTO, evan GmbH.

“The value that evan brings to Blockchains is not only in the tremendously talented people who built a technology portfolio based on a much-needed trust infrastructure but also a deep understanding for developing decentralized solutions with global interoperability standards in mind,” said Lee Weiss, executive vice president, Blockchains. “Whether for companies to be able to work together and with us in our blockchain-based smart city in Nevada, or for us to be able to partner with a wide scale of enablers for our digital ecosystem, interoperability is key.”

Together, evan and Blockchains are writing a new chapter in the evolution of the web by breaking dependencies on central platform providers and giving individuals back their autonomy in the digital space.


SWN Global

Details of The Final Round of the MetaMUI Pre-IEO Sale Now Available!


2 months ago, after the successful mainnet launch of the MUI MetaBlockchain, we announced the Pre-IEO sale of our MetaMUI mainnet coins. After the first two rounds, we are truly humbled by the response from our ever-supportive community. In the first Pre-IEO round, all 14,000,000 (14 Million) MetaMUI coins sold out in a record time of 35 minutes; we were stunned by this level of response from the community. For the second Pre-IEO round, nothing prepared us for what the community had in store for us, as all 10,000,000 (10 Million) MetaMUI coins were sold in 4 minutes. Yes, you read that correctly: four minutes!

And what better way to reward the community than by announcing the third and final round of the MetaMUI Pre-IEO sale. The SWN Global team is thrilled to announce the details of the final Pre-IEO sale of the MetaMUI coins.

Details of the 3rd Pre-IEO Round

A total of 6,000,000 (6 Million) MetaMUI coins will be available for the 3rd round of our Pre-IEO sale
The 3rd round will take place on 20th April 2021, at 7:00 am Estonian time (GMT+2)
Each MetaMUI coin will be priced at $0.18
The Reserve feature will be activated on 16th April 2021
The Pre-IEO round will take place on Metablock Exchange

Further Details

Acquisition: 100% unlocked at listing
Min. Contribution: $100
Max. contribution: unbounded

About The Reservation Feature and Details

Due to the high demand in the First Round of Sale, we experimented with the Reservation feature, which proved instrumental in the record-breaking Second Round of sale. We believe this will allow users to ensure they can participate in the final round.

Accepted assets: BTC/USDT/ETH
Activation date: 16th April 2021
End Date: 19th April 2021

ETH, BTC, USDT are the only accepted cryptocurrencies. Also, only Metablock Exchange addresses will be accepted.

Please note that citizens of China and the United States are not allowed to participate in this Pre-IEO sale. MetaMUI is a utility coin that powers the entire MetaMUI ecosystem — which includes our products MUI MetaBlockchain, MUI MetaWallet, Metablock Exchange, and other upcoming products.

Note: To participate in the Pre-IEO sale, make sure to register on Metablock Exchange and verify your identity (KYC). We have prepared a step-by-step guide that walks you through the entire process: how to create an account on Metablock Exchange, how to pass KYC on Metablock Exchange, and finally, how to buy MetaMUI in 6 simple steps.

For institutional investors or users with any questions about participating in the sale, feel free to contact us through our email support@metablock.exchange.

Thank you for being a part of the MetaMUI Community!

Sovereign Wallet team

— — — —

More About The Project

Follow us on Twitter
Join our Telegram Community Chat
Get the latest from our Telegram announcement Channel
Read the MUI MetaBlockchain Whitepaper
Download the MUI MetaWallet on Android and iOS


HYPR


Say goodbye to login. Forever.

Our April 1st platform update brings increased performance, noticeably faster speed, and groundbreaking security enhancements. 

The highly anticipated No-Factor Authentication (NFA) feature is the pinnacle of cybersecurity innovation. It removes the hassle of logging into workstations and applications by eliminating the need to log in…altogether.

That’s right — we’ve gotten rid of the login screen completely, so there is no barrier to entry.*  

Instant Access for Anyone, Anywhere

No-factor login means you don’t need to prove who you are. By eliminating the authentication flow, your teams will enjoy an unrivaled sense of freedom when logging in. Just sit down at your computer, and begin using it. Nothing else is required — no password, token, biometric, nothing.

Arguably, this approach is more secure than a password.

Easy Logout

Logging out has never been easier. You don’t need to log out, because you were never really logged in. Feel free to get up from the computer, walk away from it (leaving it in any state), and when returning it is simply open and available to you. 

A seamless logout experience is only possible with No-factor authentication.

How It Works: Bleeding-Edge Technology

From machine learning to Zero Trust architecture to immutable blockchain-based identity stores, NFA combines the latest buzzwords to achieve bleeding-edge innovation.

All the user needs to do is show up. HYPR’s NFA architecture handles the rest. Our R&D team has been working tirelessly on eliminating login and what you see today is the culmination of decades of hard work.

Super Low Pricing

Looking for great ROI when sourcing a security product? How about one that requires no investment?

If you’ve come this far, then by now you’ve probably realized that this feature doesn’t exist, so you’ll never have to pay for it. Happy April Fool’s Day 2021.

* Electricity and internet connection are still required. 

Wednesday, 31. March 2021

Finicity

Finicity Strengthens Data Access Agreements with Partnerships from Leading, National Financial Institutions


Data has driven incredible improvements in the way people have experienced financial services over the last decade. Services that once could only be done at bank branches can now be easily accessed online. And people now have powerful financial information, products, and management tools at their fingertips via their mobile devices. 

The expansion of open banking is creating a new turning point in the money experience. Open banking allows consumers to share their financial data with third-party developers and fintechs in exchange for new services—unlocking the potential of financial data to catalyze another generation of innovation across the industry. Services such as providing financial data to build credit and gain immediate access to capital, initiating a direct payment to a business or individual, and managing household expenses and budgets fundamentally change the boundaries of banking relationships.

Responsible data practices are required for participants to best take advantage of this technology. The key differentiator that sets the world of open banking apart from previous innovation environments is the requirement for consumer consent in data sharing and adherence to core data principles: control, access, transparency, traceability, and security (CATTS).

In order to enable open banking and empower customers, data access agreements (DAA) must be in place to ensure access to financial data that financial service providers need to innovate and provide new services and benefits in a digital world. These agreements define common rules for how two parties—usually an open banking platform or financial data aggregator and a financial institution—will communicate and exchange financial data. More secure data access agreements mean more connections that financial service providers can use to empower consumers with greater control over their financial health. 
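The gatekeeping role a data access agreement formalizes can be illustrated with a toy consent check. The class and field names below are invented, but the logic mirrors the control and access principles (CATTS) described above: data flows only to the recipient the consumer authorized, only for the scope granted, and only while the consent is valid:

```python
# Illustrative consumer-consent check; names and fields are hypothetical.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Consent:
    consumer_id: str
    recipient: str       # the fintech the consumer authorized
    scopes: frozenset    # e.g. {"balance", "transactions"}
    expires: datetime

def authorize(consent: Consent, recipient: str, scope: str,
              now: datetime) -> bool:
    """Serve a data request only for the right recipient, an in-scope
    data type, and an unexpired consent."""
    return (consent.recipient == recipient
            and scope in consent.scopes
            and now < consent.expires)

consent = Consent("user-1", "budget-app", frozenset({"balance"}),
                  expires=datetime(2021, 12, 31, tzinfo=timezone.utc))
now = datetime(2021, 4, 15, tzinfo=timezone.utc)

print(authorize(consent, "budget-app", "balance", now))       # True
print(authorize(consent, "budget-app", "transactions", now))  # False: out of scope
```

In a real open banking deployment this check runs behind the financial institution's API rather than in the client, and every decision is logged, which is where the transparency and traceability principles come in.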

Finicity’s Market Leadership in Financial Data

Finicity’s connections cover 95% of direct deposit accounts in North America. And thanks to 20 signed direct access agreements with some of the nation’s largest financial institutions, we support 60% of our traffic with direct API access. We expect that direct-API traffic share to grow to over 80% by the end of 2021, greatly reducing the use of user credentials and screen scraping in the financial services market.  

Since 2017, we have led in signing data access agreements with the top FIs including Wells Fargo, Bank of America, Chase, Fidelity and many others.  And, in just the past year, Finicity has worked to strengthen the open banking ecosystem by creating stronger, more secure data access agreements and partnerships with key financial institutions and fintechs, including Charles Schwab, TD Bank, Citi, Brex, Chime, US Bank, and BMO Harris.  

And where Finicity has been leading, the market is following. Along with an increasing number of financial institutions and other financial service providers, more payroll providers are recognizing that consumers can benefit from the data they hold. As a result, these providers are adopting API connections to open banking platforms that expand the use of financial data in lending, tenant screening, background checks, government verification and personal financial management.  Finicity announced a Direct Access Agreement with the leading payroll provider in September 2020, representing 16% of the payroll provider market.

DAAs provide the broader fintech and financial services community with access, through Finicity, to consumer-permissioned financial data that enables a variety of apps and services across financial management, payments, lending, and beyond. And a key result for all parties is enhanced data stability, accuracy and improved security through reduction of user credentials.

A direct access agreement with Finicity ensures the most reliable data security for financial institutions while still enabling fintechs and financial service providers to deliver solutions that empower consumers and foster financial inclusion. To learn more about becoming a supported financial institution and how that benefits you and your customers, be sure to check out our financial institutions page.

And request a demo today to see how the power of data from Finicity’s open banking platform can accelerate your fintech innovation to get to market sooner.

The post Finicity Strengthens Data Access Agreements with Partnerships from Leading, National Financial Institutions appeared first on Finicity.


auth0

Build a Flutter Wishlist App, Part 1: Introducing Flutter and Building a Basic Wishlist App

Learn the basics of the Flutter development kit and the Dart programming language and build an app to manage your shopping wishlist.

Anonym

Cisco: The Age of Privacy Has Arrived, Helped by COVID-19


Cisco has released its Data Privacy Benchmark Study 2021 and it’s got some revealing insights for enterprise. 

Unsurprisingly, this year’s study is framed around COVID-19 and reveals the pandemic has significantly elevated the importance of privacy as a business priority.  

Cisco says: “The reaffirmation of privacy’s value even during the pandemic positions it as a priority for years to come. Privacy is no longer an afterthought; it is core to how we work and interact with each other. The Age of Privacy has arrived.” We wholeheartedly agree with these sentiments.  

Cisco’s Data Privacy Benchmark 2021 is a significant study. It’s an annual survey which, for this 2021 installment, anonymously surveyed more than 4,700 security professionals from 25 geographies in mid-2020. Respondents represented all major industries and organization sizes. It also drew on data from Cisco’s 2020 Consumer Privacy Survey, which was completed in mid-2020 by 2,600 adults in 12 countries.*  

The four key messages that flow from Cisco’s Data Privacy Benchmark 2021 are important for enterprise:  

Privacy is helping organizations to overcome the challenges of the pandemic. 

Cisco reports 93 percent of organizations used their privacy team to help navigate and manage their pandemic response in 2020, particularly the rapid shift to remote work and balancing individual rights and public safety. Private and secure remote work access and systems and policies for when and how to share employees’ personal information and to limit access to and use of that data are urgent priorities in the pandemic.  

Investment in privacy has increased and ROI is positive.   

As you’d expect in a pandemic, the study shows privacy budgets across all organization sizes roughly doubled in 2020. The average privacy spend is now USD 2.4 million. While ROI on this spend was slightly down compared with 2019, it tracked similarly to 2020 and remained solid. Thirty-five percent of respondents reported benefits at least two times their investments in areas such as mitigating losses from data breaches, spurring innovation, achieving operational efficiency, and building trust with customers.  

Tighter privacy laws are welcomed and positively regarded. 

We recently told you about the new “privacy actives”, a term Cisco coined following its 2019 Consumer Privacy Survey. “Privacy actives” are highly motivated, privacy-aware individuals who care about privacy, are willing to act to protect it, and are proactive about doing so. The latest Cisco data shows organizations are recognizing and responding to this influential new demographic and accept that consumers won’t buy their products and services if they perceive a risk to their personal information from the company’s data practices and policies. 

As such, the steady tightening of privacy laws globally, which promote greater transparency, fairness and accountability, is being welcomed and positively regarded by both enterprise and consumers. Cisco reports that 79 percent of organizations indicated privacy regulations are having a positive impact (and only 5 percent a negative impact).   

Read more about how privacy regulations are picking up pace at the state level in the US and around the world.  

Privacy is becoming even more critical to the bottom line. 

Cisco has several key findings in this area:  

90 percent of respondents agreed that external privacy certifications, such as ISO 27701, APEC Cross-Border Privacy Rules, and EU Binding Corporate Rules, are an important buying factor when choosing a product or vendor.

Organizations with more mature privacy practices are enjoying greater business benefits than average and are in a stronger position to handle new and evolving privacy regulations.

More security professionals say data privacy is now a core requirement of their role, and 93 percent of organizations are reporting privacy metrics such as privacy program audit findings, privacy impact assessments, and data breaches to their board. 

All these findings point to the fact the pandemic has strengthened the commitment to privacy globally. As the report summarizes: “Privacy budgets have increased over the last year, organizations have more resources focused on privacy, and privacy investments going above and beyond the law are translating into real business value. Privacy legislation and external certifications are providing assurance in a business environment where it’s hard to know whom to trust. Consumers are exercising their privacy rights and demanding enforcement of existing privacy protections.” 

Indeed, Cisco could not be plainer in its assessment of privacy’s importance to business as we move further into 2021: “Organizations that get privacy right improve trust with their customers, operational efficiency, and both top-line and bottom-line results.” 

If you’d like to explore how Anonyome Labs can help your company to rapidly produce branded data privacy and cybersecurity solutions, check out Sudo Platform, our complete business privacy toolkit, and MySudo, our exemplar consumer privacy app. 

* The 2020 Consumer Privacy Survey, which we’ll also report on soon, surveyed participants from Australia, Brazil, Canada, China, France, Germany, Hong Kong, India, Indonesia, Italy, Japan, Malaysia, Mexico, Philippines, Russia, Saudi Arabia, Singapore, South Korea, Spain, Taiwan, Thailand, The Netherlands, UK, US, and Vietnam. 

Photo By theshots.co

The post Cisco: The Age of Privacy Has Arrived, Helped by COVID-19 appeared first on Anonyome Labs.


Spherity

Spherity launches New Product to Support Pharmaceutical Supply Chain Compliance

Already integrated by SAP and rfxcel, the Spherity Credentialing Service is now ready to be shipped to the market

Spherity announces the launch of its new product: The Spherity Credentialing Service, which sets the benchmark for compliance solutions in the field of trading partner verification and is available from today. The product establishes trust in digital interactions between trading partners in pharmaceutical supply chains and ensures compliance with the U.S. Drug Supply Chain Security Act (DSCSA).

Spherity Credentialing Service — Photo by Aron Visuals

We are proud that Novartis, as an innovation leader, is looking to adopt the Spherity Credentialing Service. David Mason, Regional Serialization Lead at Novartis says that “Using credentialing is the first proven digital solution for our industry that addresses the ATP compliance gap of knowing if the counterparty is an Authorized Trading Partner. This is a foundation to meet DSCSA requirements by 2023.”

SAP and rfxcel have integrated the Spherity Credentialing Service within their verification routing service solutions to be able to share and verify the Authorized Trading Partner (ATP) status in product verifications. Herb Wong, Vice President of Marketing & Strategic Initiatives at rfxcel says “The Credentialing Service is the most comprehensive effort to address the upcoming Authorized Trading Partner requirement for DSCSA. rfxcel was impressed to see how seamlessly it integrated with our solution.” Dr. Oliver Nuernberg, Chief Product Owner at SAP says “For SAP, one of the key requirements was to ensure that the existing returns verification process is not impacted by adding credentialing. By making the credentialing optional, we further ensure that our customers can add this capability over time without disrupting existing processes.”

The Spherity Credentialing Service enables supply chain actors to verify in real time that they are exchanging information only with Authorized Trading Partners (ATPs), as per DSCSA requirements, even when they do not yet have a direct business relationship. The Spherity Credentialing Service integrates Legisym as a credential issuer and is based on the ATP architecture tested in an industry-wide pilot. Beyond DSCSA compliance, Spherity unlocks process efficiencies in exchanging data with indirect business partners by avoiding manual, time-consuming due diligence processes, saving significant time and money for all participants in the ecosystem.

To drive the utilization of decentralized digital identity technologies across the industry, Spherity participates in the newly founded Open Credentialing Initiative (OCI). As an industry consortium, this initiative incubates the ATP architecture and governs further standardization efforts.

“Using ATP credentials for product verification interactions is just the tip of the iceberg. The established enterprise identities and associated verifiable credentials will leverage efficiency to exchange data in regulated environments”, says Georg Jürgens, Manager Industry Solutions at Spherity.

About Spherity

Spherity is building decentralized digital identity management solutions to power the fourth industrial revolution, bringing secure identities to enterprises, machines, products, data and even algorithms. We provide the enabling technology to digitize and automate compliance processes primarily in highly-regulated technical sectors like pharmaceuticals, automotive and logistics. Spherity’s decentralized cloud identity wallet empowers cyber security, efficiency and data interoperability among digital value chains. Spherity is certified according to the information security standard ISO 27001.

Stay sphered by signing up for our newsletter, follow us on LinkedIn or Twitter.

Press Inquiries
For press relations contact:
Marius Goebel
communication@spherity.com

Spherity launches New Product to Support Pharmaceutical Supply Chain Compliance was originally published in Spherity on Medium, where people are continuing the conversation by highlighting and responding to this story.


Shyft Network

Shyft Network Partners with Polygon (Matic) to Build Opt-In Compliance Infrastructure


Bridgetown, Barbados — We are delighted to announce our partnership with Polygon (formerly known as Matic Network) to collaborate and build an enhanced regulatory-compliant infrastructure for Decentralized Finance (DeFi).

Shyft Network’s compatibility with the Ethereum Virtual Machine will enable Polygon to deploy opt-in compliance infrastructure, such as whitelisted addresses and Anti-Money-Laundering (AML) and GDPR-compliant systems, that helps projects comply with AML regulation while protecting users’ personally identifiable information.

Developers will be able to integrate Shyft Network core contracts into their Dapps to enable Know-Your-Customer (KYC) and identity primitives, compliant asset pools, and routed reputation and verification across Dapps, and to comply with incoming FATF global guidance requirements for Decentralized Finance and custodial and non-custodial services.

“With the advent of mainstream interest into crypto and DeFi, we will see a new wave of adoption of web3 technologies. Protocols like Shyft will be instrumental in making this a reality with their ability to trustlessly verify and validate data between permissioned and permissionless networks.” — Arjun Kalsy, VP-Growth, Polygon.

Furthermore, through this partnership, Shyft Network and Polygon are working together to build the first opt-in compliant zkRollup protocol, enabling mass scaling for regulated networks and, more importantly, for large volume users that are under AML regulation.

“We are very excited about this partnership as it enables Polygon’s users to build KYC and identity primitives into their networks. Upcoming regulation will deeply affect the way this industry moves forward; Shyft Network’s identity and reputation frameworks will help open up a path with opt-in compliance primitives to projects using Polygon’s infrastructure.” —Fredrico Nassire, Co-Founder, Shyft Network
About Polygon (Matic)

Polygon is the first well-structured, easy-to-use platform for Ethereum scaling and infrastructure development. Its core component is Polygon SDK, a modular, flexible framework that supports building and connecting Secured Chains like Plasma, Optimistic Rollups, zkRollups, and Validium, and Standalone Chains like Matic POS, designed for flexibility and independence. Polygon’s Layer 2 Chains have seen widespread adoption with 90+ Dapps, ~7M txns and ~200K unique users.

If you’re an Ethereum Developer, you’re already a Polygon developer!

Website | Twitter | Reddit | Telegram

About Shyft Network

Shyft Network is a public protocol designed to aggregate and embed trust, validation and discoverability into data stored on public and private ecosystems, and facilitate information transfer between permissioned and permissionless networks. By incentivizing individuals and enterprises to work together, Shyft Network allows for the layering of context on top of data, turning raw data into meaningful information.

Website | Twitter | Github | Telegram


KuppingerCole

Time CISOs Stopped Trying to Speak to the Board?


by Paul Fisher

I have been covering cybersecurity issues, first as a journalist then as an analyst, since 2006. In that 15 years I have heard the mantra that security is a boardroom issue hundreds of times. The subject has filled countless conference talks and media articles.

It appears that the message is still not getting through, if a speech by the new CEO of the UK National Cyber Security Centre (NCSC) is anything to go by. In her first public speaking engagement in March this year, Lindy Cameron said, you guessed it folks, that security must be given more attention in the boardroom.

“Cybersecurity is still not taken as seriously as it should be, and simply is not embedded in UK boardrooms. The pace of change is no excuse — in boardrooms, digital literacy is as non-negotiable as financial or legal literacy,” she said.

“Our CEOs should be as close to their CISO as their finance director and general counsel, and we want to help them to develop this knowledge, as we’re all too aware that cyber-skills are not yet fundamental to our education — even though these are life skills like wiring a plug or changing a tyre as well as skills for the future digital economy.”

Fine words and reported diligently by a few security media outlets in between the latest sensational cyber-attacks. But why has it not changed – and more importantly, does it even matter?

Have a cigar?

I take issue with the claim that cybersecurity is “not taken seriously” by CEOs. I believe they are more than aware of the risk of cyber-attacks. But there is a crucial difference between taking an issue seriously and making it a regular “embedded” boardroom issue - one discussed and approved by directors at all times. This is what many aspire to, and to get there, security people are told they must speak the “language of business” for budget approval and to get things done.

I am not sure those who want a seat at the table will ever find one. Perhaps security is not a regular boardroom issue because the board simply expects the CISO and everyone in the hierarchy beneath them to get on with the job. It also betrays a lack of awareness by the “cyber is a boardroom issue” lobby as to what board meetings consist of.

Mind your language

Very often this debate is framed around language – the Board does not care because CISOs only talk in technical terms they do not understand. If security people can frame security needs and projects with “normal” business language, then more budget may be forthcoming or new policies could be approved. I’m not so sure this makes a difference.

Other than reducing people-based incidents such as phishing through awareness training (not usually successful in the long term), reducing cyber risk is almost 100% based on the successful management of policies, IT and software. Therefore, it is not unreasonable for a CEO and the board to consider it primarily an IT issue.

There are articles, books even, which explain how to get boardroom buy-in – yet the amount of activity and social interaction this entails would preclude the CISO from doing much else. Many organizations are simply too small to afford an expensive, dedicated CISO function – so they rely on managed security services. The board then simply expects that company to do what it says on the tin.

Unlike finance, HR, marketing, R&D and other LOBs, IT security is a function of risk and event management, not one of creative development. It is a negative asset – one that organizations would gladly pay less for if they could. Security does not lead to innovation, create new markets, design new products and services or consider the impact of M&A, ROI, or P&L. It is, however, a cost entry in the auditor’s report.

Focus on technology not theory

For her first speech, Lindy Cameron would have done better to focus on how advanced technology, automation and better designed security tools are doing more to keep organizations safe, rather than trotting out a tired line about security’s battle to be taken seriously by the board.

Outside of this diversion, the best CISOs, CIOs and IT managers are putting in place security technologies that focus on the end user by working with vendors and analysts to find the right solution. They are making employees’ lives easier by deploying solutions that provide privileged access, single sign-on, secure remote access and the like, without security controls getting in the way. This is surely making security a business issue – without needing boardroom approval.

Given the workload of most CISOs right now, not having to worry about speaking to the board would probably come as a relief. Of course, the one time a CISO may be summoned to the board is when there is a serious breach, but no-one ever said life was fair.


Forgerock Blog

Six Essential “Must Haves” for Your IoT Deployment

Getting these elements right can be the dividing line between IoT chaos or control

Too many vendor-based IoT blogs and articles go like this:

1. AMAZING STATISTIC on the exponential growth of IoT! (Hint: It’s huge!)
2. SCARY STATISTIC on how out of control the situation is. (Note: It’s very, very out of control!)
3. (ANOTHER) SCARY STATISTIC on how a single compromised device can imperil your entire organization.
4. CONCLUSION: Buy our stuff.

These basic tenets are obvious and generally well understood by anybody in the orbit of IT. What’s missing, however, are the essential elements needed to get your Internet of Things (IoT) project off the ground. What should you be looking for, or be careful not to miss? Based on my conversations with many customers across industries and in all phases of their IoT projects, I’ve put together six must-haves that belong in any IoT plan. 

Must-Have #1: Support for Smart Devices

IoT devices come in several different flavors. A common one is the “smart device.” Smart devices are small computers: they are IP addressable, have a microprocessor, communicate using a protocol (Wi-Fi, Z-wave, or other), and have memory storage for an X.509 certificate, which is a public key cryptographic document used to secure the device. A smart IoT device can perform some complex tasks, and some even have a user interface (UI) that can be accessed. Your IoT plan needs to make provisions for how to manage these smart devices, including how to update software, if allowed. Your plan should also take into account how to install and refresh a security certificate. And, it should address how to manage these devices like any other computer on your network. But aren’t all IoT devices “smart” you may ask? Read on to find out.

Must-Have #2: Support for Constrained Devices

Not all IoT devices are “smart”. In fact, quite a few are just inexpensive, single-purpose devices that sit at the edge of the network and communicate through a gateway. These are known as “constrained” devices because they are limited – or constrained – in how much you can interface with the device or customize it for your environment. An example is a simple temperature sensor. But that doesn’t mean you can ignore them. The truth is you’ll have both smart and constrained devices on your network, and you need a solution that can manage both.

Must-Have #3: Support for Offline Capabilities

Think of where IoT devices are installed. While most operate within the comfortable confines of the network Wi-Fi, many others are mobile or far afield in places like factories, oil platforms, utility buildings, cars, and the like. Communications may be intermittent. A critical, often overlooked element in IoT plans is how to support the offline nature of IoT devices. In particular, how do you authenticate devices to ensure they can continue to do their jobs while momentarily unable to connect to the network? A well-executed IoT plan should account for how offline devices can be secured and authenticated while off the network, and for what happens when they rejoin it, in terms of synchronizing policy changes and event logs.
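One common way to support this offline pattern is with short-lived signed tokens that a device can present, and a gateway can verify, without a network round trip. The sketch below is illustrative only, not ForgeRock's implementation; the token format, the shared-key scheme, and all names are hypothetical (production systems would typically use asymmetric keys and signed policy).

```python
import hashlib
import hmac
import time

# Hypothetical sketch: a token issued while the device is online can later be
# verified locally, with no network access. Shared-key scheme for brevity.
SHARED_KEY = b"provisioned-at-enrollment"  # installed during device enrollment

def issue_token(device_id, ttl_seconds, now=None):
    """Create a token the device can present while offline."""
    now = time.time() if now is None else now
    expires = int(now + ttl_seconds)
    payload = f"{device_id}:{expires}"
    sig = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{sig}"

def verify_token(token, now=None):
    """Check signature and expiry locally, with no network round trip."""
    now = time.time() if now is None else now
    device_id, expires, sig = token.rsplit(":", 2)
    expected = hmac.new(SHARED_KEY, f"{device_id}:{expires}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and now < int(expires)

token = issue_token("sensor-042", ttl_seconds=3600)
assert verify_token(token)                              # valid while offline
assert not verify_token(token, now=time.time() + 7200)  # expired after TTL
```

When the device rejoins the network, the same gateway would refresh the token and synchronize any queued event logs, which is the synchronization step the paragraph above calls out.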

Must-Have #4: Manage IoT and Humans With a Common Platform

Who wants one more platform to manage? My guess is nobody. In that case, you need to manage your IoT devices like you do your users – with the same identity platform. This can be done because users and devices share similar management and lifecycle needs. Just as users join your organization, gain various levels of access, and eventually leave, you also need to verify your IoT devices, authenticate them, give them various levels of access, and eventually terminate their access and retire them. After all, the principle of least privilege and the problem of entitlement creep apply to non-human identities as well. Doing this on a common platform will allow you to manage both human and IoT assets through a common interface.

Must-Have #5: Integration with Leading IoT Platforms

The leading cloud platform providers – Google, Amazon, and Microsoft – all have their IoT management features. Best of all, their basic capabilities are often low-cost or even free to use. This is great, but it’s just a starting point. While those vendors offer deep analytics capabilities, they stop where real identity management capabilities are needed. The cloud providers hand these capabilities off to the IoT application that can manage and secure the IoT devices. That’s why selecting a vendor that has tight integration into your chosen cloud vendor’s platform is essential. 

Must-Have #6: Automatic Authentication, Authorization, and Registration

Just as it is increasingly difficult – or almost impossible – to manually manage the sprawl of identities in your environment, it is nearly impossible to manually manage all the IoT devices connecting to your network on a daily basis. Once devices are known on the network, a good IoT platform can automatically register them when they turn on and automatically authenticate them. Automation is the only way to handle IoT projects at scale. Make sure you include vendors who have an automated approach.
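As a rough illustration of automatic registration and authentication, the sketch below enrolls a device on first contact using a factory-provisioned bootstrap secret, then authenticates it on later connections with a per-device credential. Everything here, including the bootstrap scheme and the class and method names, is hypothetical rather than any vendor's API.

```python
import hashlib
import secrets

# Hypothetical sketch: devices auto-enroll on first appearance and then
# authenticate automatically, with no operator involvement per device.
BOOTSTRAP_SECRET = "factory-enrollment-key"  # shared at manufacture time

class DeviceRegistry:
    def __init__(self):
        self._devices = {}  # device_id -> hash of the per-device credential

    def register(self, device_id, bootstrap_secret):
        """Auto-enroll a device the first time it appears on the network."""
        if bootstrap_secret != BOOTSTRAP_SECRET:
            raise PermissionError("unknown bootstrap secret")
        credential = secrets.token_hex(16)  # issued back to the device
        self._devices[device_id] = hashlib.sha256(
            credential.encode()).hexdigest()
        return credential

    def authenticate(self, device_id, credential):
        """Subsequent connections are authenticated automatically."""
        stored = self._devices.get(device_id)
        return stored is not None and stored == hashlib.sha256(
            credential.encode()).hexdigest()

registry = DeviceRegistry()
cred = registry.register("thermostat-7", BOOTSTRAP_SECRET)
assert registry.authenticate("thermostat-7", cred)
assert not registry.authenticate("thermostat-7", "wrong-credential")
```

The design choice worth noting is that only a hash of the per-device credential is stored, so compromising the registry does not directly yield usable device credentials.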

ForgeRock IoT can provide all the must-haves on our list. Getting your IoT projects off the ground and doing it right the first time with these critical capabilities can mean the difference between having a smooth, automated, secure IoT infrastructure that drives your business forward with confidence, or one that takes you into chaos. Contact ForgeRock to learn more about how our experts can help you with your IoT projects.


Affinidi

Protecting Your Driver’s License — A Use Case for Verifiable Credentials

Protecting Your Driver’s License — A Use Case for Verifiable Credentials

A driver’s license may seem like a simple card that you carry around with you every day, but it can have serious consequences if it falls into the wrong hands.

Well, let’s imagine for a moment that your wallet containing a few credit cards, your driver’s license, and some cash, is stolen. You would immediately call the credit card companies and let them know about the theft. But what about the driver’s license?

Reporting the theft can be a time-consuming process, by which time the thief may already have impersonated you using your Personally Identifiable Information!

What’s a better way to avoid this situation?

Go for self-sovereign identities (SSI) and verifiable credentials: they are private, immutable, secure, and available only in your digital wallet, so you can access them at any time and share them with anyone without ever worrying about losing them.

Driver’s License and SSI

Let’s take a peek into how you can apply, share, and safeguard your driver’s license through SSI.

In this example, let’s assume that the applicant wants to use the driver’s license to rent a car from a car rental company.

There are three parties involved, and they are:

Issuer — A startup company that issues a standard, interoperable driver’s license Verifiable Credential (VC) after validating the government-issued driver’s license
Holder — An individual applying for the interoperable driver’s license VC
Verifier — The car rental company

Creating and Managing a Driver’s License VC

Here’s the step-by-step process.

1. The holder goes to the issuer’s portal to submit the details needed for the driver’s license verifiable credential: he/she fills in the form and submits it.
2. The issuer checks the application and issues the verifiable credential to the holder.
3. The issuer sends the verifiable credential to the holder in the form of a QR code, by mail.
4. The holder receives the QR code, scans it using the Wallet application, and saves it to his/her wallet.
5. The holder goes to the verifier’s website or office and scans the QR code.
6. The holder checks the Credential Share Request and approves it to share his/her Verifiable Presentation with the verifier, then waits for confirmation.
7. The verifier logs into the admin account, checks the list of applications, and verifies the credentials.
8. On confirmation, the verifier rents a car to the holder.
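The issuer, holder, and verifier roles above can be sketched in a few lines. For brevity this sketch signs with an HMAC shared secret; real verifiable credentials use asymmetric signatures (e.g. Ed25519) so the verifier never holds the issuer's private key, and all field names here are hypothetical rather than Affinidi's actual schema.

```python
import base64
import hashlib
import hmac
import json

# Hypothetical sketch of a signed credential envelope. HMAC shared secret
# is used only for brevity; production VCs use public-key signatures.
ISSUER_KEY = b"issuer-signing-key"

def issue_credential(holder_did, claims):
    """Issuer: wrap claims in a signed credential for the holder's wallet."""
    body = json.dumps({"holder": holder_did, "claims": claims}, sort_keys=True)
    sig = hmac.new(ISSUER_KEY, body.encode(), hashlib.sha256).hexdigest()
    envelope = json.dumps({"body": body, "sig": sig})
    return base64.b64encode(envelope.encode()).decode()  # e.g. QR payload

def verify_credential(credential):
    """Verifier: check the issuer's signature before trusting the claims."""
    envelope = json.loads(base64.b64decode(credential))
    expected = hmac.new(ISSUER_KEY, envelope["body"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(envelope["sig"], expected):
        return None  # tampered or forged credential
    return json.loads(envelope["body"])

vc = issue_credential("did:example:alice", {"license_class": "B", "valid": True})
verified = verify_credential(vc)
assert verified["claims"]["license_class"] == "B"
```

The point the flow illustrates is that the verifier never contacts the issuer at rental time: the signature carried inside the QR payload is enough to establish that the claims are authentic and unmodified.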

As you can see, the biggest advantage of such an SSI-based driver’s license is that there is virtually no possibility of loss. Furthermore, there is no question of the PII on the license falling into the wrong hands, because the holder has complete control over how it is used and with whom it is shared. So, no more worrying about losing your driver’s license and facing the repercussions.

At the same time, it’s simple to implement, foolproof, and saves time and effort for everyone.

So, are you ready to leverage this power of SSI? Reach out to contact@affinidi.com to explore how we can help you in this exciting journey.

You can also join our PoCathon to build a similar Proof of Concept using Affinidi’s resources.

Hop on the bandwagon and be a part of a new revolution in privacy and trust generation.

Protecting Your Driver’s License — A Use Case for Verifiable Credentials was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


Metadium

Metadium based NFT platform, ‘METApie’ will launch in June


Dear community,

We want to share some exciting news with you today: the Metadium-based Non-Fungible Token (NFT) platform 'METApie' will launch in June.

METApie will support NFT issuance and transactions through a marketplace based on the Metadium blockchain.

The new platform will work with two other Metadium-based apps for DID identification and authentication: MYKEEPiN and THEPOL. Thanks to DID, users can create NFTs while choosing to disclose their identity or not.

Coinplug, the leader of this project and Metadium's main technical partner, announced its intention to expand its business into entertainment, gaming, luxury brands, e-sports, and more by developing METApie. The platform will facilitate the creation of NFTs linked to digital content owned by members of the MYKEEPiN Alliance.

METApie will launch as an open beta service in June.

We thank you for your continuous support.

Best,

- Metadium Team

Metadium based NFT platform, ‘METApie’ will launch in June was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 30. March 2021

Northern Block

Self-Sovereign Identity for Social Impact & Importance of UX Design with Jimmy J.P. Snoek from Tykn

Listen to SSI Orbit Podcast Episode #5 with special guest Jimmy J.P. Snoek from Tykn, as he discusses Self-Sovereign Identity for Social Impact and the Importance of UX Design with Northern Block's CEO, Mathieu Glaude (full podcast transcription available too).


Click to Listen to this Episode on Spotify

Mathieu: Jimmy, thank you for doing this with me.

Jimmy: Thank you for having me on; I’m looking forward to it.

Mathieu: We’re doing some cool stuff in the self-sovereign identity space, and we’ll get to that in a few minutes. I was curious — I know you come from the crypto space. It seems that a lot of people in the self-sovereign identity community come from crypto backgrounds. Conversely, there are a lot of people in the self-sovereign identity community who try to distance themselves from crypto, but there is some tie there. Did you come from the crypto space?

Jimmy: I’d say, technically, yes. Although, at this moment, all of that is very nuanced. It’s logical that a lot of people come from a crypto background, because the whole idea of decentralization and everything associated with it, is the same mentality that’s carried over. I think it’s good that crypto and SSI have been parting ways a bit.

I entered the crypto space in 2015-2016 through Bitcoin and then Ethereum; mostly being super-interested in the technology, and just geeking out, diving into it. At some point, I met my co-founders from Tykn, and they came from the same space. We saw this big identity problem, and I thought, “Okay, can we marry this idea of decentralization with the problem of identity and privacy and everything?” That was at a very naive stage in early 2017, when everything to do with self-sovereign identity and decentralized identity began to emerge. All those ideas were still very rudimentary; you just hashed a certain identifier and put it on the chain. No one had said, “Maybe that’s a privacy concern and a correlation concern.”

Everyone was very excited about the potential of these ideas. We built some early proofs-of-concept on Ethereum, at a time when you didn’t have ERC-725; it was just ERC-20s. We saw that this is a lot more complicated than it should be when you’re talking about people needing to have tokens in their wallets to interact with this. The UX was truly a nightmare, even conceptually.

At the time, people also had this idea of verifying credentials when paying for things like gas, which didn’t seem like a very good idea. At a certain point, we saw a group of like-minded people within Sovrin, which had only a couple of other stewards at the time (I think we were the ninth when we onboarded in June or July 2017). They’d been working on this concept of self-sovereign identity for a while. We hooked in on that; we saw, “Okay, these guys have done a lot of work that we thought we still had to do.” So, we dove into that and started flying the Sovrin/SSI flag.

Shortly after that, we started working with The Netherlands’ Red Cross through the 510 data team. For us, that was a very valuable experience in needing to make these things work for people on the ground floor and for high-stakes situations. That was very valuable from the perspective of privacy and the implications. We didn’t want to have any perverse incentives, which you also see in certain models.

For example, this is one that’s still around: it’s the idea of data marketplaces. In the humanitarian sector, at least, people were pitching this idea of data marketplaces, where refugees and people who’d been struck by disaster could sell their data. They would get money for it, and then they could use that to sustain themselves. That sounds good, in the sense that, “Oh, you’re empowering people with the power of data; we are giving all this data away anyway, and now — power to the people who can use this to get financial freedom.” But, when you’re talking about people who have been put into such a situation, they will, of course, always sell their data. That’s a perverse incentive, because, at that point, they don’t really have a choice — which doesn’t scale very well. Coming face to face with those realizations was very good for us early on.

Mathieu: We came from the crypto space as well; it was similar to Tykn, in that we had done some early POCs on Ethereum and on some other chains. It was the same problem back in the early days: “Yeah, let’s just throw everything on here, let’s disregard anything else. Yay for immutability, yay for not being able to rewrite data and history, let’s go!”

Then, you realize quite quickly that this doesn’t work. It doesn’t even work for a use case of buying gas, or going to buy a bottle of alcohol. It is interesting; people who come from that space, they get decentralized identity, or they get self-sovereign identity. It’s more the concept, and not necessarily the implementation, the technology, or the governance, or everything that you could go into. Conceptually, it makes sense. I believe that many people who are advocates for crypto get that right. You gave a presentation earlier this year at the North American Bitcoin Conference. I attended that conference in December 2017, which was when Initial Coin Offerings (ICOs) were peaking, and it was crazy.

Jimmy: We were there at the same time, then. It was probably January 2018, the one in Miami.

Mathieu: It was crazy! There were close to 100 presenters at that conference, and half of them were ICO pitches. It was unbelievable.

Jimmy: That was nuts. We were actually one of the companies presenting, but we weren’t presenting an ICO, and so we stood out in that way. We actually had a booth there as well, and that was the conference where we unveiled our collaboration with the 510 global initiative of the Red Cross, which was done the month before. There were so many ICOs. I was there with Khalid, and I remember the guys next to us. They didn’t really feel right to me; there was something about how they interacted with people that was very defensive. Two months later, they were indicted for securities fraud for 40 million or something.

Mathieu: You can still see some of those classic memes today in crypto: there’s a guy at a party with a Bitcoin Christmas sweater and a guy with an Ethereum sweater, that comes from around that timeframe. It was just pure craziness.

Jimmy: That was peak madness, because that’s when the market had topped the week before. It was just all-round “Lambo Moon” hype madness.

At the same time, barring those projects, a lot of good things have come out of this as well. What was exciting about that time to me, is that I looked around and saw, “Okay, most of these projects; it’s ridiculous, and they shouldn’t exist.” At the same time, I saw that the underlying technology was still so early, and in a way, it was really shitty; there wasn’t much you could do with it. But to me, this was something that’s super early that’s still going to evolve a lot. You could start to ideate what things might look like in five to ten years. So, in that sense, I’ve tried to remain optimistic but also neutral. I can tell you that a lot of the information is pure hype, and in the short term, that’s going to go very wrong. But then, in the long term, there’s an element of this that’s going to survive, and that element is going to be super interesting.

It’s the same with Non-Fungible Tokens (NFTs) — a huge hype. A lot of it doesn’t make sense, in that it’s non-persistent. Much of this art is hosted on centralized domains, and the tokens themselves just contain pointers to the centralized domain. If that domain goes down, or if some of these marketplace startups shut down in five to ten years, then your art is gone, and your token’s worthless. That doesn’t make any sense to me. There’s an underlying idea that some of this art that’s created is generative and some not, but it’s actually stored on chain, which is a lot more persistent. I find it interesting that some art will be fully persistent. If we think about how much we are interacting on Zoom and other platforms nowadays, it’s not weird to think that in five years, we’ll get into this “metaverse” idea.

Jamie Burke from Outlier Ventures wrote about the “Open Metaverse” and being involved in these virtual worlds. Of course, having scarcity there is a good idea, just like in the real world. Then that idea becomes appealing, but at the same time, a lot of it now is hype and it shouldn’t really exist. To clarify, I don’t mean it shouldn’t exist at all; of course, it should, but it doesn’t have persistent, lasting value, and will go through the same cycle.
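The persistence problem Jimmy describes can be made concrete with a minimal sketch. In a typical ERC-721-style NFT, the chain persists only the token record and a metadata URL (the "tokenURI"); the artwork itself lives on an off-chain server. The names and URLs below are hypothetical placeholders, not a real marketplace or contract.

```python
# What the chain actually persists for a typical NFT: an owner and a
# pointer (tokenURI) to metadata hosted off-chain. Hypothetical values.
token_record = {
    "tokenId": 42,
    "owner": "0xabc123",
    "tokenURI": "https://some-marketplace.example/meta/42.json",
}

# What the off-chain host serves -- only for as long as it stays online.
hosted_metadata = {
    "https://some-marketplace.example/meta/42.json": {
        "name": "Cool Art #42",
        "image": "https://some-marketplace.example/img/42.png",
    }
}

def resolve_artwork(record: dict, host: dict):
    """The token persists on-chain, but the art is only as durable as its host."""
    return host.get(record["tokenURI"])  # None once the domain is gone

assert resolve_artwork(token_record, hosted_metadata) is not None
hosted_metadata.clear()  # the marketplace startup shuts down
assert resolve_artwork(token_record, hosted_metadata) is None  # token remains, art is gone
```

Fully on-chain (e.g. generative) art avoids this failure mode because the metadata itself is persisted by the chain rather than referenced by a pointer, which is the distinction Jimmy draws above.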

Mathieu: If you’re thinking about the NFTs today, it’s the same thing as the ICOs were, three years ago. The properties behind it are amazing; the ability to be able to tokenize and create liquidity around pretty much any type of unique asset is quite interesting. Forget “just art”: imagine being able to borrow against it, and to lend against it. There are so many cool things you could do with that.  When you went back to the virtual conference this year, you pitched self-sovereign identity to the crypto crowd: How did you approach that?

Jimmy: I try to approach it from a neutral point of view. Oftentimes at these conferences, people pitch their own project or at least shill it, and there’s some financial motivation behind it. The organizers have said, “Don’t do that; try to keep it informational and educational.” That’s what I tried to do. In a couple of slides, I talked about some of the stuff Tykn’s been doing, just from a use case perspective. Really trying to keep it straightforward, as an explanation of how this really works from a high level: how credentials flow, and data flows, and what hits the underlying ledger, and what doesn’t. But then, you already saw in the comments, people asking, “Oh, what’s this project? What’s the token?” No token, no ICO. Just a tech company, building tech stuff.

Mathieu: But, by the way, it could bridge a significant gap between the institutions and crypto, so you want to pay attention.

Jimmy: I wonder when that’s really going to happen. I thought maybe that in this cycle, there would be more of this linkage between the crypto space and SSI space because they’ve separated. I thought maybe some of these SSI use cases would start to drip into the crypto space, even just for KYC (Know Your Customer). But now, because of more decentralization in the crypto space, with decentralized exchanges and such, you see that, in fact, demand for KYC is becoming less and less — they’re actually moving away from it. For instance, ShapeShift moved to working with decentralized exchanges. Is there really a demand for that now in the crypto space?

I don’t know. Maybe by the time where you have to essentially dox all your wallets, then it might become more interesting. Maybe once it’s necessary to prove that you own certain wallets that becomes interesting, but so far I haven’t really seen that — which is slightly surprising to me. On the other hand, throughout the bear market, blockchain crypto had a bad rep; so, perhaps it’s good that SSI moved itself away from that. In my opinion, that was rational and a good move, because blockchain and that idea of SSI were so strongly connected. There were many misconceptions. There were journalists writing about us, who vaguely understood blockchain and they vaguely understood the premise of SSI. They made assumptions that you could take someone’s identity and put it on a blockchain, and then it can’t be lost anymore, and it’s there forever.

Of course, that isn’t how it works, and that’s not a good message to perpetuate. We got flack from people saying, “Ah, Tykn’s putting personal information on chains.” No. This is the opposite of what we do. I would be calling these journalists asking, “Oh, could you please amend this in the article? Thank you for writing about us, but could you please correct this?”

Mathieu: But, people get stuck here. I see this often: as soon as the word blockchain enters someone’s mind, they really get stuck on blockchain. It’s not unimportant, but it’s not what you should be focusing on when we’re talking about decentralized identity or self-sovereign identity.

Digital Identity for At-Risk Populations

Mathieu: Let’s go back in time to the founding of Tykn: you, Khalid, and there was a third person, I believe. A lot of companies come out of a real-world problem, and this one seemed like a perfect example: an early contributor to the project had lost his identity. Would you mind talking through that, and how Tykn came about?

Jimmy: Yes, that was one of the motivations. He struggled through this problem himself; being a refugee and coming to The Netherlands, without having a birth certificate or anything to prove his history. As a refugee in The Netherlands, there are certain processes with many steps that are required. You go through ten, eleven different kinds of interviews, with different institutions. You have this whole sheet of interviews that you go through to establish things such as your age and everything else. He did have a passport, so he could still have something to show, but for most of the requirements, he didn’t. That lack of track record did impede his ability to integrate better, and to get access to certain opportunities along the way. Of course, while he was within these refugee camps, he saw that there were many people who were worse off, who just had nothing to prove about themselves. Perhaps from a place of good-natured naiveté on our side, we thought — okay, we can solve this.

I feel that a lot of companies are born out of the idea of underestimating how big the problem is: “Oh, we can do this!” Of course, the problem is way bigger than one company could strive to solve, but it did put us on that right track. What’s important for me, and what’s always been important, is to at least make some sort of impact on the status quo to make it better than it is. Maybe we can’t retroactively give billions of people an identifying document — you can’t retroactively give someone a birth certificate. But, you can try to help those people to have a higher degree of access to the current systems without needing that document. To me, that has become an important part of the motivation within Tykn. Of course, there are big regulatory, legal, policy implications. When you’re talking about legal documentation, it’s very hard to change. It will happen eventually; at least, I think that it will happen over time, but that’s not something you can change within the timeframe of five years. Over the past four years, we’ve spoken frequently with policy-makers within different governments. We have worked with different governments through giving awareness sessions, and with large institutions within the ICRC (International Committee of the Red Cross) and IFRC (International Federation of Red Cross), with the United Nations, the Dutch government, the Turkish government. We’ve tried to at least create that awareness and create a discussion. On that front already, it’s done quite a bit of good.

What we were talking about, at the time, was really the far end of the scale for them: the idea of self-sovereignty and coming from a very puristic point of privacy and data control. Some of the things that stuck practically, were the ideas of things such as data minimization. There is a looming idea of what they call ‘surveillance humanitarianism.’ It’s growing to be a more important term and idea, as the opposite of ‘surveillance capitalism.’ It’s the idea that by digitizing a lot of these humanitarian aid processes, you are also gathering a lot of data about different people who are at risk of being exposed. You can get a lot of insights into these populations, which can be dangerous. You’ve seen it with the Rohingya people; through the best humanitarian efforts, they became a lot easier to identify. I don’t think that was even a tech problem, but just because of the humanitarian intervention, suddenly, these people became very centralized in certain areas, and they had identification cards. Suddenly, they were an easy target for genocide. That’s of course on the worst end of the spectrum, but it’s also not strange to think that a lot of this tech is being tested in the field. I feel that we’re putting privacy concerns on the back burner, coming from an idea of, “Oh, these people don’t have much anyway”; they are almost entirely digitally excluded. So it doesn’t really matter that much if we don’t safeguard their privacy optimally, or if we store all their information in a central database, or collect their biometric information. That scales very poorly, especially when you’re talking about Sub-Saharan Africa which is digitizing at lightning speed. Over the course of twenty years, it becomes quite dangerous. For us, that’s why that sense of control, data minimization, privacy has to be there as a first building block. 
Seeing that the conversation has started flowing over the past couple of years, that makes me feel good at least, because it’s a conversation that we didn’t see four years ago. We are definitely not the only ones that have been pushing this.

There are a lot of activists in that space that have been contributing to making this impact. I have personally seen that ‘click’ once we’ve explained it, but at least those core principles got stuck somewhere and are identified as something we have to take seriously. So at least the idea of self-sovereign identity can already produce quite a bit of good.

Mathieu:  I guess that naive optimism when jumping into anything is probably not a bad thing, because if you knew the complexity underneath, you might have stayed away from it, right?

Jimmy: Yes, I think about that a lot. If I had known what was coming and what was ahead in 2017, I’d probably have said, “I’m going to do something else, and just sit on my crypto.” Honestly, I would have been better off if I did that. But I have no regrets at all. We actually talked about that this morning during our stand-up. Of course, in the beginning we had to bootstrap a lot before we started raising funds, and even then, we barely paid ourselves out in terms of salary or anything. I was compensating for that within crypto, and even through the bear market, I sold a lot. I’m thinking now that I could have probably just retired at 25, which I don’t think is a good thing, necessarily.

Mathieu: I think you guys are clearly trying to make a social impact here, and it’s quite important. Anything that falls outside of that, it’s nice to think about; maybe, it hurts sometimes to think about the missed opportunities.

Jimmy: Yes. I don’t think about it too much. What we’ve been doing with Tykn is much more fulfilling to me, and I wouldn’t trade it for a second. So, no regrets.

Mathieu: When I think about privacy, I like to break it down into just two pieces: one being control, or having the ability to control what happens. The second piece is all about transparency or traceability. That could happen with centralized systems but it could also happen with decentralized systems, as well.

Going back to the early days of Tykn; you’re looking to empower people to have control, and transparency and to own their identity. Is that the project that you guys did with the Turkish government? Was that along the same lines?

Jimmy: Exactly. We put that press release out a couple of weeks ago. This is a project that we started in 2019, at least in the research and conceptual phase. We took part in this UN/Turkish government-led accelerator for identity companies to tackle some of the aspects of the refugee crisis in Turkey. Turkey has taken in a lot of Syrian refugees, and they want them to integrate into society. Of course, there are several different facets to different problems within that whole sphere. We went to Turkey multiple times; we went from Istanbul to Gaziantep, which is very close to the border. In Gaziantep, out of the 2.5 million people that live there, 500,000 are refugees. Surprisingly few are still in refugee camps; they’ve put in a lot of effort to be able to integrate them into society.

We went there with a bunch of assumptions, and again, with a certain degree of naiveté. They have this temporary protection card. We know it’s hard for them to move across different geographical areas and to do certain things because they need to get this card updated, and sometimes, they don’t have the mobility. We knew that smartphone penetration was very high amongst the refugee population; a lot higher than people assume. It’s about 94-95 percent, and a lot of families have multiple devices: iPhone, iPad, they love Apple products. We thought, “Okay, we’ll just digitize this temporary protection card and that’s a job done. They can receive the money on their phone, and they can update it remotely, etc. etc.”

However, we got to talk to the refugees on the ground floor, and the people around that; with people from government to local NGOs to United Nations (we try to talk with everyone). Then, we saw that this didn’t even come close to being the actual most pressing issue. They thought the card was annoying, because it’s a piece of plastic that’s just a bit too big to put in your pocket. Further, they said, “that’s annoying, but I don’t really care about that; I have bigger things to worry about. I can’t get a job on paper; my eight-year-old daughter is working full-time in a textile factory for 30 euros a month,” and so on. Those were very heavy and shocking conversations, as they always are. I had a conversation with a colleague about this: the difference between knowing, and understanding. You can know a lot of these facts, and read them from reports and articles, and see them on the news. Understanding: we probably never will, because we’ve never had to go through those experiences. At the least, we can approximate understanding once you have those conversations face-to-face, and when you sit at the table with people and they tell you their eight-year-old daughter is working almost full-time in a textile factory.

We started to look at, and think about this. We saw that one of the pain points was that there’s a work permit application process that employers need to go through in order to employ a refugee. This process is a bit cumbersome; quite long, usually. What often happens, is that the employer passes this process onto an accountant. You have to pay an initial fee for this work permit, and then you have to pay a recurring fee, and then the accountant also charges markup. The way that the accountant does this application process, is that the employer basically gives their information from their portal. Essentially, they give government identity portal login information to the accountant, who also has a lot of other data. The accountant charges a surplus on top of it.

This is not very legal, but what we saw happening often, is that the costs of this work permit were being passed on to the refugee. We started working it out, and we’ve found that for a refugee, oftentimes they end up with less money if they actually work on paper. Of course, that’s a huge disincentive to work, when you can still get your emergency social credit every month, and work under the table and get paid more, because you need to sustain your family. So, we thought maybe we shouldn’t work with the refugees directly, but maybe we should focus on this part with the employers. If we make this work permit application process easier for them to do, then it would give more incentive to hire refugees, and they would be the primary social benefactor. The UN really liked that idea, and the Turkish government really liked that idea.

We started working that out with our product Ana, and we ran a pilot on that in Istanbul, with the Istanbul chamber of commerce, INGEV (a local NGO), and the Ministry of Labour. Essentially, we made it super easy for an employer to receive a certificate of business ownership, and then use that with a statement from an accountant, to be able to apply for a work permit within the application. The pilot was received very well, and it’s now escalated to the top of the presidency office, where we’re now talking about how we can move this forward within their digital transformation framework.

Turkey is very advanced in terms of its digital infrastructure. For some reason, that is not on a lot of people’s radar. For example, Tubitak, the research arm of the Turkish government, has been working on Indy wallets and Sovrin test networks since 2018.

Many other governments are not thinking about it even now, and they were already looking at that three years ago. So, in that sense, they are very advanced. That’s why it’s been such a pleasure to work with them, because they already know everything. I remember meeting their chief researcher at a UN function somewhere in Ankara, and I was talking to someone else. I was talking about self-sovereign identity, and trying to obscure the terminology a bit to make it easier to understand, like “explain it like I’m five.” This guy came over and said, “Hey, are you talking about self-sovereign identity? Are you talking about Sovrin? Because then we need to talk.” I said, “Yes, absolutely,” which was a really nice moment.

Mathieu: It’s nice when you can skip that education part; when people are already sold on the concepts. Now, we could focus on what the economics are, and what the business case is.

It seems that going on the ground there really helped you guys understand the dynamics within that ecosystem; to figure out how the economics work. That’s the crucial piece to figure out, if you’re going to make anything work. There’s another key piece too. I know you guys put a lot of focus on UX and design as a company. There are many conversations happening around design in the SSI space. Even just around wallets: more specifically, how do you internationalize wallets? It’s more than just having a switch that you flip the language; people use things differently in different countries. I’m sure that was noticeable as well in Turkey, or with Syrian refugees, around how they use technology. Even looking at intricacies within the apps; how can you make it a better experience, get more engagement, and get more usage?

Would you mind speaking about the importance that you guys place on that, and then how it’s resulted in Ana? I know we’ve talked before, about cloud products and browser products. How does that all tie together for you guys?

Jimmy: It’s a shame Khalid isn’t here, because he’s the chief UX-er. There are some huge things that we’ve seen there, which were very valuable in terms of learning. Like a lot of other SSI companies, we went from being a bunch of geeks working on the whiteboard to actually having to speak with people and understand, “If we built this, could you use it?” Oftentimes, in the beginning, the answer was “no.” Then, you find out that across population groups, there are a lot of user experience assumptions that we make nowadays. Many things are very intuitive and natural for us now, because the digital products that we’ve been using for the past ten years have certain interactions that are just completely logical to us: filling out a form, scrolling, going back using a button. All those things seem very natural to us, but they aren’t inherently natural for a lot of other population groups. When you actually go to test those designs, you see people get stuck. One thing we saw within the Syrian refugee population was that they are a lot more tech-savvy. As I said, a lot of them have iPads; they have iPhones. Syria was generally a technologically advanced society, so they understand those interactions a lot more and there isn’t as much of a gap.

However, with some of the design work with the Red Cross, we did see people for whom it wasn’t ‘intuitive’ when having to scroll and fill out certain forms. You have to think hard about, “Okay, how can I make this as intuitive as possible,” and that also inspired some of the designs for Ana. This probably won’t come as a big surprise, but the services used almost ubiquitously across population groups are WhatsApp and Facebook. Even people who didn’t understand any other apps; they understood WhatsApp, they understood texting people, and they understood Facebook and Facebook groups. Many other interfaces are sometimes quite abstract, and didn’t quite ring globally. With some of the interface within Ana, we try to make it feel like a WhatsApp flow when you go through certain parts of the onboarding, for instance. It was the same with 121 (one-to-one, cash-based aid), which is what the Red Cross project ultimately ended up being called.

From an SSI perspective, we took a lot of learnings from that, because we came face-to-face with some of the realities of having to go into the field and make it useful for people. We saw pretty early that the puristic view of SSI, in terms of having everything stored on edge wallets — when you go somewhere in Sub-Saharan Africa, that’s going to be pretty difficult, when there’s maybe one phone in a village and it’s not even necessarily a smartphone. It’s very easy to say, “Oh yeah, but within SSI, everything has to be stored on the edge wallet.” What we saw was that if you make that a hard requirement, and keep working from that, then all these population groups are just going to be left behind more and more.

SSI for Privacy and Security Concerns

For those cases, perhaps it makes more sense to take some of those qualities of SSI in terms of having a higher degree of control and being able to guard that, and have these guardianship models where the information isn’t per se stored locally. For instance, that would allow them to use a feature phone, which is what we’ve been working on: being able to accept credentials and verify credentials using a feature phone, which inherently needs some sort of SSI cloud infrastructure. Obviously, the credential isn’t stored on the feature phone, on the edge in that case. That does open up a lot of these interactions and use cases to these population groups. It’s also better than the status quo, which is something we feel strongly about. Perhaps it’s not in the puristic sense of ‘pure SSI,’ but it is much better than the status quo.

As long as you take certain aspects into account, such as avoiding that degree of correlation, still having that degree of privacy, operating from a starting point of user control, and data minimization, users still have portability. From that point of view, it is a lot better than the status quo, and it’s important to be able to see that nuance across different sectors and within SSI. We’ve also seen it from the other side, within enterprises that didn’t especially want something purely SSI, because they wanted to get certain insights. That’s not a popular idea within SSI, for good reason, I think, because that’s something we ultimately want to move away from.

Mathieu: Whether for business reasons or for regulatory reasons, there’s no reason why data could not be received, stored and used. It goes back to what you’re saying about removing certain correlations, minimizing the data, and ensuring user control and transparency, the different privacy properties, and all the other principles we live by. As long as that’s built into the architecture of what you’re doing, then if you want to do business with a bank or with an employer and they need certain information, it’s up to you to opt into it.

Jimmy: Yes. Opt in, and be able to opt out, as well. That’s important to me within SSI, as well as for complying with GDPR. It’s good that we can request our data back, and request that it be forgotten. I don’t know about you, but I don’t have a list anywhere of the companies who have my data, or that I’ve interacted with. So, even being able to have that, and having a ‘nuke’ button that says, “Okay, opt out of all of these,” would already be pretty great.

At a minimum, because of the principles of SSI, I think that things like that will become a lot easier within the next five years or so. Ultimately, perhaps it’ll even become a requirement.

Mathieu: On the consumer side, I like what you said about taking messaging into consideration, inside of the user experience. Messaging is the killer application of mobile technology; there’s nothing more used than that. So, I love that this makes sense to literally everyone in the world. If you could try to incorporate a bit of these digital credentials within a messaging experience, it’s very intuitive. It makes sense, and doesn’t force someone to learn something new or get used to using something new, it’s just easy.

Jimmy: That’s something we need to build on. We’ve all become very excited about SSI because of its potentially disruptive properties. You start ideating about what society would look like, in a perfect utopia where this is the norm. You have the people who go a step further, and they’re suggesting, “Oh, maybe we don’t even need governments anymore, because we could just attest to our own existence.” But, at the end of the day, if you build something that’s so completely different, the regular person will just not understand how to even use it, or why they would want it.

A lot of people say, “Oh, but I trust my government,” or “Oh, but I trust my bank, why would I want to step away?” We still see it in crypto: it’s “Oh, I don’t feel comfortable with that. I like having everything in my bank,” and that’s super logical. However, I think that working within SSI, it becomes very easy to have your nose to the whiteboard, and forget that there are other people out there who aren’t as intimately familiar with the technology, its propositions, and the ideology around it. Most people simply want things to work more smoothly, and perhaps as a secondary goal, they want to be less at risk.

This is something that we’ve also seen. In a lot of SSI companies, including us in the beginning, much of what we were saying was based around privacy and security. To many people, that isn’t an appealing value proposition. That assumption is baked into a lot of products. When you use your bank, you assume that with all these regulations it’s all good; it’s a bank, it’s secure and private until proven otherwise. For a lot of people, it’s never proven otherwise. Many people don’t get their identity stolen, so they’re sure it will all be fine. In The Netherlands, they did some research on this and found a privacy paradox: everyone’s concerned about their privacy, but no one necessarily takes the proper actions to ensure that degree of privacy and online security, even with things like password management. I still meet people who have no idea that they shouldn’t reuse passwords. These are people my age, and they just have no idea why that would be risky.

That’s something we have to consider as well. When you put this out there, a lot of people simply want things to go more smoothly, and they don’t necessarily have a strong opinion on how that happens. Many people don’t understand why biometrics may be bad, especially if we consider what’s happening in China. It has taken quite a lot of education about the tail risks of implementing something like that; of having all your biometrics in a central database that can be accessed by governments and who knows who else. They only see, “Oh, but wouldn’t it be great to walk through the airport, and they scan your face, and you get on the plane.” From a user experience point of view that could be pretty great, but from a point of view of privacy and security, maybe not.

Mathieu: At our company, we laugh every time we ask a company we’re talking to about their security: “I don’t know, we hash our data, and we do this and that.” It’s quite funny. To your point here, it comes down to the lowest friction and the most utility. I saw numbers the other day: today there are 2.8 billion people who use Facebook products, and they’re awesome to a lot of people. Why would they not want to use them, when there’s so much utility to it?

Jimmy: That’s also a point. It’s really hard to get away from that grip now. Even if I think about it myself; I still have a Facebook profile, just because my grandparents have a Facebook profile. Especially now, because my grandfather’s pretty sick and it’s magic; it’s the only way for him to see me, and my sister in the UK, and my parents in Spain. We can all see him at the same time while he’s in bed. To him, it’s magical. For me, that’s huge, so it does bring a lot of good things to people’s lives. But then, most people don’t understand the potential risks of having that power within Facebook. We spoke briefly about things like the Open Metaverse in the future. Facebook is already making a big bet on that with Oculus and all these other devices that you’ll need a Facebook account to use. They’re already extending that control which in the long term is very scary. A lot of people don’t inherently care about that, because they just want to receive that utility, which is logical.

Mathieu: For sure. We’re having these utopian conversations about self-sovereign identity and how everything’s going to be on the edge; everyone’s going to own their keys, everyone’s going to own their credentials. There are similar conversations if we jump back on the crypto side; we want to have the right principles baked in.

Over the past decade, the rise of cloud, mobility, and connectivity has created so much usability and so much utility. It’s lowered friction for so many things that you have to use these products. Again, to your point earlier about still interacting with banks: banks are custodians of your money, your fiat money. There are several issues with banks, but they do provide very valuable services. The same thing should be true when we talk about identity and credentials. It’s crazy to think that people are going to store their credentials on an edge wallet, similar to the way the maximalists store their crypto on an edge wallet. That’s such a small percentage of the population, and it doesn’t work well. So, how do you bake the correct principles that we’re advocating into existing models and augment them? We’re not jumping from 20 percent to 100 percent, but we’re jumping from 20 to 25; we have some goals to get to 30, and to keep improving that. I love what you guys are doing and how you guys are thinking about that, with your product suite and your vision for the future.

Jimmy: We’re totally aligned on that. I think ultimately, as you said, it’s a very puristic view that’s come from the crypto space, where it’s natural. You have to manage your own keys, unless you want to keep it on a centralized exchange, which historically hasn’t gone very well. Then, you have these hardware wallets, where you store it, or at least your keys, essentially on the edge. For a regular user, that’s not going to fly; it’s not going to happen.

I have many friends and family who are now diving head-first into crypto — of course, because things are at an all-time high. I even have to persuade them to spend 80 euros on a hardware wallet, so that it’s not on an exchange and they’re not at risk of huge loss. Compared to that, storing it in something like MetaMask is something they already find more comfortable. Ideally, you would use MetaMask with a hardware wallet. That way, you could have it in your browser, but you can still sign with your hardware wallet, that is, your physical wallet.

I think for identity, something will need to happen where I can at least sign it with my phone, perhaps. Most of the applications are just browser-based, even for different use cases. We talked about this last time, in terms of different stakes and levels of assurance needed. For certain use cases you wouldn’t even need to sign it off. I would love it if a “one-password solution” were able to integrate verifiable credentials. I would simply use my one password, and be able to access all these services. In a way, you could say that approach defeats the point of decentralization, but for a lot of use cases that makes a lot more sense. There’s so much nuance in this, because it’s such a new space.

I think we’ve only seen the tip of the iceberg of the change that this technology will provoke. Take the example of your diploma: we use our diploma to get access to certain job opportunities. You use the presentation of your diploma to prove that you’ve put in the work. But what we mostly use it for is signalling; we put that information on our LinkedIn profile. It’s the same with our job history. I might use an employment statement to get a mortgage or something, but mostly, it’s on my LinkedIn profile for signalling. That is data that I want to be public information, and what most of us want to be public information. So, we should have an option to make certain credentials public. I should have the power to turn that off myself and make it private again. I should have the option to not disclose it to anyone at all, but then I also should have the option to make it public. A lot of this is still relatively new, and we’re only just thinking about it.

It’s the same with peer-to-peer credentials. I started thinking about that because my girlfriend is from Canada. She said that in Canada (and you probably know more about this than I do) if you want to get a new passport, you need to get a bunch of people from your circles to attest that you exist. You need to get five people to say, “Oh yes, I know Mathieu. He’s my brother, or he’s my colleague, or he’s my friend, or he’s my neighbour.” Of course, for some applications, p2p credentials would be quite interesting. There are so many other uses that we haven’t even thought of yet. So, yes, I think within that nuance, we’re going to see a lot of billion-dollar companies spinning up within these little corners. This is what excites me about SSI, because it’s still so new. It’s so young and so early, relative to what it will look like ten years from now.

Mathieu: Yes, I totally agree with you. I would love to have a further conversation with you; there’s a bunch of other stuff we could talk about. I think the whole concept of using verifiable credentials for signalling is quite a funny one, if you look ahead. I love that idea. Thinking about the peer-to-peer credential: you start talking about the value of relationships, and how that could start building up. I think we could go on for a long time on these subjects, but Jimmy, I want to thank you so much for doing this with me today.

Jimmy: It was a pleasure, it was really nice. Maybe we do need to have a part two, someday. Let’s plan for that.

The post Self-Sovereign Identity for Social Impact & Importance of UX Design with Jimmy J.P. Snoek from Tykn appeared first on Northern Block | Self Sovereign Identity Solution Provider.


KuppingerCole

Introducing Frontier Talk - The World’s First Podcast on Decentralized Identity


Frontier Talk goes beyond technical jargon to stimulate conversations that matter. In this series, we take you inside the minds of influential leaders, innovators, and practitioners from eclectic areas (enterprise, startups, academia, venture capital, etc.) to extract their experience working with emerging technologies such as Blockchain and AI.

Join Raj Hegde on this journey to redefine the ‘I’ in Identity!




Leadership Compass Identity Fabrics


by Martin Kuppinger

This report provides an overview of the market for Identity Fabrics, comprehensive IAM solutions built on a modern, modular architecture, and provides you with a compass to help you to find the solution that best meets your needs. We examine the market segment, vendor service functionality, relative market share, and innovative approaches to providing solutions that serve customers best in building their Identity Fabrics.


auth0

How Parallel Computing Will Affect The Security Industry

Why GPUs crack passwords much faster than CPUs

Global ID

The GiD Report#153 — The age of the digital wallet has arrived


Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

ICYMI: GlobaliD is HIPAA compliant

This week:

Digital wallets will gut traditional banks
The U.S. isn’t behind in digital payments anymore
Visa lets you settle in crypto
This week in tech policy
The rise of the Sovereign Writer
Thinky longreads on Likes and music
Stuff happens
What’s in your digital wallet?

Photo: NeONBRAND

1. Digital wallets will ‘gut’ traditional banks

(According to Cathie Wood, who manages the largest actively managed ETF in the world. H/T Grit Capital)

Here’s a crazy stat /gregkidd pointed out last night:

Market cap of top 4 U.S. banks => $1.1 trillion
Market cap of top 4 cryptocurrencies => $1.2 trillion

Pretty wild.

Remember when Marc Andreessen claimed that software would eat the world? Well, apparently fintech is eating the world now (according to Genevieve Roch-Decter in her latest newsletter — also via /gregkidd).

Some key charts:

Because the future won’t be about banks.

It’ll be about digital wallets:

A DIGITAL WALLET is a service that allows you to pay for things, usually through a mobile phone app connected to a credit card, debit card or even a bank account. It stores a number of other items, such as a driver’s license, gift/loyalty cards, tickets for entertainment events, and transportation passes. It can also contain private keys for cryptocurrencies and some even let you hold stocks.

Would be cool to keep my COVID creds and NFTs there, too.

Welcome to GlobaliD (powered by Uphold).

2. Remember when the U.S. used to be way behind in digital payments?

That’s no longer the case, according to The Economist:

It has been a while coming. In 2018 Ant Financial, China’s payments giant, raised private funds at a valuation of $150bn. It was common then to hear Chinese executives say that America, land of the posted cheque and the hand-signed credit-card receipt, was years behind, held back by a cosy club of banks and credit-card firms.
Now investors have decided the moment has come. Take PayPal, a digital-payments firm set up in 1999 to allow users of Palm Pilots, a forebear of smartphones, to “beam” each other money. It was later bought by eBay, an online marketplace, which spun it out in 2015 for $45bn. Today it is worth $275bn, more than Citigroup or Wells Fargo. It is also more valuable than Ant, which has fallen out of favour with regulators in China and has been forced to cancel its initial public offering.
Enthusiasm for digital-payments companies has been whetted by the pandemic. The share price of PayPal has jumped 186% in the past 12 months while shares in Square, an American rival, have more than quintupled and those of Adyen, based in Amsterdam, have nearly tripled. The digital boom is luring credit-card colossi and tech titans, such as Visa and Google, to online payments.
3. Speaking of Visa, they let you settle in crypto now

“This is huge.” — /pstav

The news:

Visa has processed a cryptocurrency payment directly on the Ethereum blockchain as part of a new service the payment giant plans to introduce to its partners later this year.
The move, the latest sign of increased adoption of digital currencies by the old-guard financial industry, bumped the price of bitcoin (BTC, +2.57%) and ether (ETH, +6.05%) roughly 5% each.
Per a press release shared with CoinDesk, Crypto.com sent a USDC (-0.06%) stablecoin transaction on Ethereum to an account at Anchorage custody under Visa’s name. Crypto.com issues “crypto-backed” Visa cards that allow its users to spend the coins in their Crypto.com wallet.
United States dollar coin, or USDC, is a stablecoin pegged 1-to-1 with the dollar. It is the second-largest stablecoin with an $11 billion capitalization.

“Yep.” — /shadi

All the other fintech stuff:

Germany’s Central Bank Tests Blockchain Solution to Counter CBDCs — CoinDesk
Covid-related fraud has cost Americans $382 million
Fintech: How COVID Affected Payments and the Underbanked | Hacker Noon
Federal Reserve’s Digital Dollar Push Worries Wall Street
America used to be behind on digital payments. Not any more

4. This week in tech policy:

On new tech laws:

Why it matters: Democrats, empowered in Congress and enraged by misinformation over vaccines and the election, agree it’s time to legislate on tech policy, including updating the key law that shields them from liability from user-generated content. The path to passing a bill is a little clearer, and there have been signs that the largest tech platforms are ready to embrace some changes.
Driving the news: Pallone said he wants to take aim at online platforms’ financial incentives to amplify misinformation and extreme content.
“The more outrageous and extremist the content is, the more engagement and clicks they get, and therefore the more advertising dollars they get,” Pallone told Axios. “If nothing else, I’d like to create a disincentive for these companies to amplify this content that leads to violence.”
In an op-ed ahead of the hearing, Facebook disputed the notion the company has a financial interest “in turning a blind eye to misinformation,” saying, “We have every motivation to keep misinformation off of our apps and we’ve taken many steps to do so at the expense of user growth and engagement.”

On last week’s hearing:

Lawmakers at Thursday’s hearing on misinformation were less interested in getting answers from the CEOs of Facebook, Twitter and Google than in warning the social media giants that a legislative hammer is about to land on them, Kim, Ashley and Margaret report.
Driving the news: In a gruelingly long session conducted entirely by video conference, members of the House Energy and Commerce Committee told the CEOs their businesses prioritize ad revenue and engagement over rooting out misinformation and content that harms users, especially children.
Why it matters: The relatively consistent lines of questioning, sometimes crossing party lines, displayed a new unity among members of Congress in their concern about the companies — and a stronger likelihood that they might pass punitive laws.
5. The rise of the Sovereign Writer

Here’s a cool Stratechery piece, highlighted by /anej, who noted:

Ben Thompson made a good piece on so-called sovereign writers that thrive off of Substack and he makes a good point with “the problem with the social networks is that they want to own the reader, but the entire point of the sovereign writer is that they own their audience.”
With the feed and publishing capabilities and addition of wallet we can easily compete in that space as well. Email per se is still just a channel.

And here’s Ben on The Sovereign Writer:

I am by no means an impartial observer here; obviously I believe in the viability of the sovereign writer. I would also like to believe that Stratechery is an example of how this model can make for a better world: I went the independent publishing route because I had no other choice (believe me, I tried).
At the same time, I suspect we have only begun to appreciate how destructive this new reality will be for many media organizations. Sovereign writers, particularly those focused on analysis and opinion, depend on journalists actually reporting the news. This second unbundling, though, will divert more and more revenue to the former at the expense of the latter. Maybe one day Substack, if it succeeds, might be the steward of a Substack Journalism program that offers a way for opinion writers and analysts to support those that undergird their work.
What is important to understand, though, is that Substack is not in control of this process. The sovereign writer is another product of the Internet, and Substack will succeed to the extent it serves their interests, and be discarded if it does not.
6. And If you want more thinky stuff:

Here’s a longread about Likes:

And yet, as new communication technologies expand and come to structure more and more of our interactions, it is hard to avoid feeling there is something lost in living by the Logic of the Like. We reach for terms like “homogeneous,” “simple” or “flat” in order to express some truth about life in a world reduced to vectors of attraction and revulsion. Homogeneous, because of the tendency to steer ourselves toward others who already view the world as we do. Simple, in that complex emotional life is compressed into a single scale with only two sides: like and not like. Flat, because it seems to reduce the depth and texture of life.
But these terms fail to capture what is perhaps the most potent feature of the Logic of the Like, which is its seeming inescapability. It is similar to Thomas Hobbes’s theory of religion. We create a God whose needs and desires in turn come to control us. And the Logic of the Like is a jealous God. It admits nothing outside of itself, even its own negation. For example, suppose you decided to reject homogeny and seek out difference, contingency and surprise. A noble pursuit, perhaps, but easily assimilable: “I think we should expose ourselves to difference” is exactly what someone like you would say. It is a self-organizing system but also a self-enclosing one.

And another one about the plight of musicians (in a world full of monopolies).

7. Stuff happens:

Via /m — Microsoft in Talks to Buy Discord for More Than $10 Billion
Slack rolls back parts of its new DM feature over harassment concerns
Via /carolyn — Startup ID.me Reaches $1.5 Billion Valuation For Its Identity Software Used By 22 U.S. States
The ambitious goal, according to Hall, is to build “the Visa in the identity layer of the internet.” Hall’s startup thinks it can slot in alongside the tech giants as a neutral vendor that isn’t interested in its users’ data. At the same time, ID.me’s digital calling card can be reused wherever the company’s got a contract, meaning the same DMV visitor in California could drive to Las Vegas and use it to check in at the MGM Grand casino, skipping the front desk. “Portable logins aren’t trusted, and trusted logins aren’t portable,” Hall says.
Via /pstav — ID.me brings in $100M at unicorn valuation | PitchBook
Via /antoine — IDEMIA Brings Mobile ID Technology to Arizona
Via /anej — WhatsApp for work: Slack is turning into a full-on messaging app

The GiD Report#153 — The age of the digital wallet has arrived was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Oracle Cloud Guard


by Mike Small

Poorly managed security controls within a cloud services tenant's resources are increasingly the cause of security incidents and compliance failures. CSPM (Cloud Security Posture Management) tools provide functionality to address this challenge. This report provides a review of Oracle Cloud Guard which strongly matches KuppingerCole's recommended functionality for CSPM within the Oracle Cloud Infrastructure


GIMLY

The EOSIO Identity Working Group - Kickoff April 12th!


Gimly is excited to start the EOSIO identity working group (Twitter #eosio_id)! This open working group (WG) will create and foster identity solutions using EOSIO technology, by creating open, W3C-compliant self-sovereign identity standards, interoperability, and ecosystem development for EOSIO-based identities.

Gimly has been working in the SSI ecosystem for over two years, closely with the Decentralized Identity Foundation and Sphereon, and has been the core development partner of Europechain for the creation and implementation of the MyD sovereign identity portal. We are working with the European Self-Sovereign Identity Framework Lab (eSSIF-Lab) and Ontochain as part of the European Next Generation Internet initiative, and building an SSI pilot with KLM/DHL to improve cargo supply chains.

From our experience with Europechain and knowledge of the EOSIO ecosystem, we are excited to bring our experiences and allow EOSIO identity systems to participate in the larger SSI ecosystem. We see this as a pivotal technology to allow enterprises and governments to safely and ethically manage identities and data exchange, and an enabling tool for blockchain and non-blockchain applications.

Please join us for the kickoff event on Monday 12th April, 2pm GMT / 10am EST, open to all interested in identity on EOSIO!

Please fill out your details in the sign-up form to get the WG calendar and event invite.

Agenda:
WG and Gimly intro
Intro of members
What joining the WG means

Please retweet to show your attendance and interest in this working group!

We are proud to announce the EOSIO identity working group #eosio_id - open Kickoff event 12th April!

This WG between the EOSIO community and @block_one_ will bring self-sovereign identity #SSI to #eosio 🔑🔒💪.@europechain_ @WAX_io @HelloTelos #EOS https://t.co/NtEiuXJ9YY

— Gimly Blockchain Projects (@gimly_io) March 30, 2021

The first piece of work we will propose for the working group is the finalisation of the EOSIO DID method (the first piece of the SSI tech stack). This can be used by #EOS, #Telos, #WAX, #Europechain, voice.com and more to create self-sovereign identities. The second meeting of the WG will start planning the DID method. To progress this work item, we have already created a full draft of the EOSIO DID method, which we will release next week. As part of this work, we have submitted to W3C a new cryptographic material type, called Verification Condition, needed for EOSIO DIDs.
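As a rough illustration of what such identifiers could look like: the draft method specification has not been released yet, so the `did:eosio:<chain>:<account>` pattern sketched below, and the hypothetical `parse_eosio_did` helper, are assumptions based on standard EOSIO account naming, not the finalised spec.

```python
import re

# Hypothetical did:eosio identifier pattern, assuming the method combines a
# chain name with an EOSIO account name (1-12 chars drawn from a-z, 1-5, '.').
# The finalised EOSIO DID method may define this differently.
EOSIO_DID = re.compile(
    r"^did:eosio:(?P<chain>[a-z0-9\-]+):(?P<account>[a-z1-5.]{1,12})$"
)

def parse_eosio_did(did: str) -> dict:
    """Split a did:eosio identifier into its chain and account parts."""
    match = EOSIO_DID.match(did)
    if match is None:
        raise ValueError(f"not a valid did:eosio identifier: {did!r}")
    return match.groupdict()

print(parse_eosio_did("did:eosio:telos:exampleacct"))
# {'chain': 'telos', 'account': 'exampleacct'}
```

Under these assumptions, resolving such a DID would then mean looking up the named account on the named chain and mapping its on-chain permission structure into a DID document.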

Stay tuned to the Gimly Twitter and Linkedin where we will announce the EOSIO DID method and the following WG meetings!

Gimly is hiring!

Gimly will be participating in the eSSIF-Lab second infrastructure call with our connector for using NFC smartcards in SSI solutions, and we will be contributing to the NGI-Ontochain project with a self-sovereign data vault and identity management panel.

For these projects and ongoing work with clients, we are looking for a new mid-level full-stack developer as well as a senior product development manager. See the jobs below, and if you know someone interested, we will be happy to hear from them!

See the jobs sign-up form:

Infocert (IT)

“Technological sovereignty and the security of Digital Trust are increasingly strategic factors for Europe”


This is the Tinexta Group’s vision, presented by Danilo Cattaneo, CEO of InfoCert, during the Centro Economia Digitale event on Technological Sovereignty

Rome, 30 March 2021 – Tinexta Group, one of Italy’s leading operators in the Digital Trust, Credit Information & Management, and Innovation & Marketing Services business areas, is taking part in the event organised by the Centro Economia Digitale to present a new position paper on Technological Sovereignty, to which it contributed through the expertise of its subsidiary InfoCert, the largest Certification Authority in Europe.

In front of a high-profile institutional and business audience, Danilo Cattaneo, CEO of InfoCert, presented Tinexta’s vision on this strategic topic: “The pandemic emergency has accelerated the digitisation process for citizens and companies, and their use of digital identity, electronic signature, PEC (certified email) and document preservation solutions. Digital Trust is becoming one of the foundations of how all modern societies function. For Europe it will be strategic, I would even say vital, to secure and maintain solid technological sovereignty in an area so decisive for the entire economic, political and social system. It is essential to keep in mind that Digital Trust implies control of digital identities and digital signatures and, therefore, the handling of sensitive data. Moreover, it enables very critical processes: signing contracts, opening bank accounts, accessing pension records, medical prescriptions and much more. Consequently, maintaining technological sovereignty is indispensable for protecting citizens’ data from possible intrusion or manipulation and ensuring the continuity of fundamental services.”

Billions of digital transactions per day move personal and commercial data across the web and the cloud: a total or significant loss of sovereignty would translate into threats to highly sensitive areas such as defense, industrial secrecy, privacy, and citizens' security. Consider, for example, that Europe accounts for only 4% of the world's digital business platforms, with the rest of the data transiting infrastructure in the hands of non-EU operators.

"It is important to maintain control over the management and residency of digital data, and to prevent the risk that commercial choices by foreign companies, or even political decisions by foreign states, have devastating impacts on a country's activities and stability, for example by suspending services such as digital signature processes for their European users. In this sense, it would be appropriate to have governance tools for companies of high strategic relevance, whether for the data they process or for their role in the value chain. And, not least, it is essential that Digital Trust solutions offer adequate levels of cybersecurity by design," Cattaneo emphasized.

For years Tinexta has made targeted investments in research and innovation that have earned it the status of European Digital Champion, offering the market the option of relying on a continental player aligned with EU regulations on data privacy and digital trust. Along the same lines, it has chosen to invest in cybersecurity as well, recently acquiring three qualified companies in the sector with around 700 employees: the goal is to integrate their experience and expertise into the Group and thus further strengthen its Digital Trust offering, which already fully complies with the strict regulatory requirements of the European eIDAS Regulation.

"The decision to integrate trust certification and data security represents a decisive step in a rapid path toward an exclusive leadership position in the creation of secure digital infrastructures with full legal validity," Danilo Cattaneo commented on the sidelines of his speech.

InfoCert SpA

InfoCert, Tinexta Group, is the largest European Certification Authority, active in over twenty countries. The company provides digitalization, eDelivery, digital signature, and digital document preservation services, and is an AgID-accredited digital identity provider within SPID (the Italian Public Digital Identity System). InfoCert invests significantly in research and development and in quality: it holds a substantial number of patents, while its ISO 9001, 27001, and 20000 certifications attest to its commitment to the highest standards in service delivery and security management. InfoCert's Information Security Management System is ISO/IEC 27001:2013 certified for EA:33-35 activities. InfoCert is a European leader in Digital Trust services fully compliant with the requirements of the eIDAS Regulation (EU Regulation 910/2014) and the ETSI EN 319 401 standards, and aims to keep growing internationally, including through acquisitions: it holds 51% of Camerfirma, one of the leading Spanish certification authorities, and 16.7% of Authada, a cutting-edge German identity provider. Finally, InfoCert owns 80% of the shares of Sixtema SpA, the technology partner of the CNA network, which provides technology solutions and consulting services to SMEs, trade associations, financial intermediaries, professional firms, and other organizations.

Tinexta Group

Tinexta, listed on the STAR segment of the Milan Stock Exchange, reported the following consolidated results as of 31 December 2020: revenues of EUR 269.1 million, EBITDA of EUR 77.9 million, and net profit of EUR 37.9 million. Tinexta Group is among the leading operators in Italy in four business areas: Digital Trust, Cybersecurity, Credit Information & Management, and Innovation & Marketing Services. Through InfoCert S.p.A., Visura S.p.A., Sixtema S.p.A., and the Spanish company Camerfirma S.A., the Digital Trust Business Unit provides products and solutions for digitalization: digital signatures, digital identity, customer onboarding, electronic invoicing, and certified email (PEC) for large companies, banks, insurance and financial companies, SMEs, associations, and professionals. The Cybersecurity Business Unit operates through the companies Yoroi, Swascan, and Corvallis and is one of the national hubs for research into, and delivery of, the most advanced data protection and security solutions. In the Credit Information & Management Business Unit, Innolva S.p.A. and its subsidiaries offer services supporting decision-making processes (chamber of commerce and real estate information, aggregated reports, synthetic ratings, decision models, credit assessment and recovery), while RE Valuta S.p.A. offers real estate services (appraisals and valuations). In the Innovation & Marketing Services Business Unit, Warrant Hub S.p.A. is a leader in subsidized finance and industrial innovation consulting, while Co.Mark S.p.A. provides Temporary Export Management consulting to SMEs to support their commercial expansion. As of 31 December 2020 the Group had 1,403 employees.

* * *

For more information:

InfoCert
Press Relations Advisor: BMP Comunicazione for InfoCert, team.infocert@bmpcomunicazione.it
Pietro Barrile +39 320 7008732 – Michela Mantegazza +39 328 1225838 – Francesco Petrella +39 345 2731667
www.infocert.it

Tinexta S.p.A.
Corporate & Financial Communications: Carla Piro Mander, Tel. +39 06 42 01 26 31, carla.piro@tinexta.com
Media Advisor: Barabino & Partners S.p.A., Foro Buonaparte, 22 – 20121 Milano, Tel. +39 02 7202 3535
Stefania Bassi: +39 335 6282 667, s.bassi@barabino.it
Specialist: Intermonte SIM S.p.A., Corso V. Emanuele II, 9 – 20122 Milano, Tel. +39 02 771151

The post "Technological sovereignty and the security of Digital Trust are increasingly strategic factors for Europe" appeared first on InfoCert.digital.


KuppingerCole

Jul 21, 2021: The Access Management Playbook: Securing Today's Organizations

Managing the granting and revoking of access based on user workflows is paramount to enabling effective risk management. Enforcing distinct access control requires an interconnected access management system that aligns with company policies and regulations.

Shyft Network

The Shyft Network Token Distribution and Economics, Part 2

Providing further clarity on circulating supply during the first month, and supply schedules. From the Shyft Network Team:

Hi everyone, as we begin to distribute and hand over the core protocol to the ecosystem and community, we wanted to quickly put out a high-level breakdown to provide more context, as well as a better high-level view of how we see economic circulation and productive supply across the Shyft Network.

We also want to notify anyone inspecting the contract’s values moving here and there, that we will be shortly moving assets to Cold Storage.

The way to look at Shyft Network's economics is through its streams of productivity: how the tranches are formulated, what they are intended to achieve, and what role they play in the macroeconomics as well as the micro-operational needs of each ecosystem or use case.

Circulating supply and total supply are often treated as black and white, and the numbers are meant to be a macro snapshot of the entire universe. Within this universe are a variety of asset tranches that are not meant to be part of the active circulating supply, but are instead regulating pools of liquidity designed for specific use cases and network-effect-specific outcomes.

Below is a high-level breakdown that should help our community understand the distinction between Active Circulating Supply, Inactive Supply, and Total Supply, and how self-regulating asset pools play into this.
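As a rough illustration of the distinction described above, the relationship between these categories can be sketched in a few lines of Python. The tranche names and amounts below are hypothetical placeholders, not Shyft Network's actual allocations:

```python
# Hypothetical tranche allocations (illustrative numbers only,
# not Shyft Network's actual token economics).
tranches = {
    "public_distribution":    {"amount": 100_000_000, "active": True},
    "network_treasury":       {"amount": 150_000_000, "active": False},
    "lp_pool_rewards":        {"amount": 50_000_000,  "active": False},  # self-regulating pool
    "synthetic_wrap_rewards": {"amount": 40_000_000,  "active": False},  # self-regulating pool
}

# Total supply is the sum of every tranche in the universe.
total_supply = sum(t["amount"] for t in tranches.values())

# Active circulating supply counts only tranches currently deployed
# and circulating; self-regulating pools and undeployed tranches
# fall into inactive supply.
active_circulating = sum(t["amount"] for t in tranches.values() if t["active"])
inactive_supply = total_supply - active_circulating

print(total_supply, active_circulating, inactive_supply)
# → 340000000 100000000 240000000
```

The point of the sketch is simply that total supply is a fixed macro snapshot, while the active/inactive split shifts as tranches are deployed or locked into self-regulating pools.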

The sum of all liquidity in the universe

Let's start by describing what self-regulating asset pools are. These pools are primarily use-case-controlled circulating reserves that count toward a percentage of circulating supply, but are pre-programmed into smart contracts with specific use cases and operate as autonomous applications (pre-built, in-system use cases).

These tranches are designed to regulate and accelerate economic activity as cross-protocol total supply balancers. They are not meant for operational disbursement and are not allocated to participants of the ecosystem; rather, they are liquidity pools that act autonomously to expand, contract, accelerate, or slow the total usage and supply of the overall ecosystem. These asset pools were designed as mechanisms to help regulate and keep the ecosystem functional, while incentivizing stability and longevity. Here are some examples of these self-regulating pools (for more information on these tranches, please see our Token Economics blog post):

1. Synthetic Asset Wrap Rewards
2. LP Pool & Liquidity Rewards
3. Economic Metagame
4. Next-Gen DEFI

What goes into the Active Circulating Supply? The following buckets and tranches are considered active or can be active when deployed (some will remain inactive for a long period of time), and can be counted towards the Active Circulating Supply when deployed on the network.

Operational Liquidity.

These tranches and allocations are defined across distribution schedules covering the network's operational activities over the next 7 years. They are asset pools that become active when they enter circulation. They are meant to provide liquidity to new projects, such as compliant DeFi, strategic partnerships, marketing activities, and community DAOs that incentivize the growth of our developer community. Here are some specific tranches that work within this bucket:

Ecosystem Development (Support): To be used on operations, liquidity, partnerships, federation bootstrapping, marketing, foundation & DAO creation.
1. Public Distribution
2. Shyft Network Treasury
3. VASP Initializer

Purchaser, Team & Advisors

These are the incredible people who have made this global ecosystem possible by supporting it, advising, or directly working on it for the last 4 years, and who are committed to continue working alongside all of you as an open community over the coming years.

1. Ongoing Technical Partnerships
2. Purchasers (earlier purchaser private sale)
3. Strategic Partnerships (The latest strategic purchaser private sale)
4.