Last Update 5:51 AM June 24, 2021 (UTC)

Company Feeds | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Thursday, 24. June 2021

Blockchain Commons

Gordian QR Tool Supports Vaccine Records, 2FAs, Cryptoseeds, and More


The state of California recently announced their Digital COVID-19 Vaccine Record system. It allows Californians to access a digital copy of their vaccine records, rather than having to depend entirely on their physical copy. The main entry point is a QR code containing a rudimentary Verifiable Claim (VC), which makes for great ease of use. However, Blockchain Commons has concerns over how the user experience (UX) design might negatively affect both privacy and security; we are working to address these with our new Gordian QR Tool for the iPhone, Blockchain Commons’ first Apple App Store release.

Read More

For a while now, increasingly personal information has been getting encoded as QRs. During the pandemic, COVID tests were often stored as QRs. There has also been extensive use of QR codes to transmit secure information across an airgap. Many people first saw this use of QRs when they read a QR code for a two-factor authentication seed into their phone’s Authenticator app. At Blockchain Commons, we similarly use QR codes as an encoding method to transmit seeds and keys.
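As a concrete illustration of what such an authenticator QR encodes: it typically carries an otpauth:// URI whose base32 secret seeds the RFC 6238 TOTP algorithm. The following sketch (plain Python, standard library only, using the RFC's published test seed) derives a one-time code from such a seed:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Compute an RFC 6238 TOTP code from a base32 seed of the kind
    found in otpauth:// QR codes."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((for_time if for_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: the standard 20-byte ASCII seed, at Unix time 59.
seed = base64.b32encode(b"12345678901234567890").decode()
print(totp(seed, for_time=59))  # → 287082
```

Anyone who photographs that QR holds the seed itself and can generate every future code, which is exactly why these images deserve better storage than a camera roll.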

Some possible architectural issues arise from using QR codes for confidential data, such as the fact that you’re actually transmitting the data (not a proof of the data), that the QRs tend to contain all of the data (not just a selection), and that there’s no way to rescind a QR or expire it. Those issues will have to be dealt with at a foundational level as we figure out what can safely be encoded as a QR — and more importantly how to offer restricted proofs rather than complete information.
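One direction for such restricted proofs is salted hash commitments. The sketch below is illustrative only (it is not how California's record or any particular VC format works): the holder publishes a salted digest of each field up front, and later reveals only the single field a verifier asks for, plus its salt.

```python
import hashlib
import secrets

def commit_fields(record):
    """Commit to each field with a salted hash; only the digests are shared up front."""
    salts = {k: secrets.token_hex(16) for k in record}
    digests = {k: hashlib.sha256(f"{k}:{record[k]}:{salts[k]}".encode()).hexdigest()
               for k in record}
    return digests, salts  # digests are public; salts stay with the holder

def reveal(record, salts, field):
    """Disclose a single field plus its salt -- nothing else leaves the holder."""
    return {"field": field, "value": record[field], "salt": salts[field]}

def verify(digests, disclosure):
    """Recompute the digest for the disclosed field and compare."""
    d = hashlib.sha256(
        f"{disclosure['field']}:{disclosure['value']}:{disclosure['salt']}".encode()
    ).hexdigest()
    return d == digests[disclosure["field"]]

record = {"name": "Alice", "vaccinated": "yes", "birthdate": "1980-01-01"}
digests, salts = commit_fields(record)
proof = reveal(record, salts, "vaccinated")
print(verify(digests, proof))  # True
```

A verifier learns the vaccination status and nothing about the birthdate, while any tampering with the disclosed value breaks the digest check.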

However, there are also security & privacy UX issues related to the storage of QR codes that need to be resolved no matter what is encoded. The state of California itself demonstrated the bad habits that we’re already developing when they said in their official press release: “individuals are encouraged to screenshot the information and save it to their phone files or camera roll”.

Now, storing QRs as photos on a phone does meet some minimal security requirements. Most phones are closely held and protected by a PIN, a thumbprint, or a faceprint. They’re often backed up to a cloud service and that cloud data is mostly secured as well. Still, there are numerous opportunities for problems.

That’s in large part due to the fact that photos are built to be shared. It’s very easy to send a photo to a friend or a social network, which is great for photos, but less so for your QR codes. In addition, photos are built to be synced. Though your data might be relatively safe if it’s synced to the cloud, it’s not that secure when synced to your desktop computer, which is vulnerable to malware and other attacks.

Photos also aren’t built to be searched and their categorization systems are usually rudimentary. Even if your QR code was safe, you might not be able to easily find it.

That’s where Blockchain Commons’ Gordian QR Tool for the iPhone comes in. Though we can only make sure that our own uses of QR codes are architecturally secure, we can definitely provide better ways to store QR codes, and that’s what QR Tool does.

QR Tool lets you scan any QR code into an encrypted vault. It then protects your codes with two-factor authentication: you initially log in with your Apple login and password, and then anytime you want to view your codes, you must additionally use biometric authentication, such as a fingerprint. Unlike your semi-porous camera roll, QR Tool was built to keep confidential information secure.

QR Tool also provides proof that your codes weren’t changed. Using Lifehash technology, QR Tool links each code to a unique, evocative, and unchanging graphical color representation. This will prove particularly useful when the Lifehash specification is adopted by other companies, so that you can transfer a QR code between applications and verify immediately that it’s the same — something that’s difficult with the QR code alone.
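The idea can be illustrated with a toy deterministic visual hash. (This is NOT the actual LifeHash algorithm, just the underlying principle: the same input always yields the same color grid, and any change to the input yields a different one.)

```python
import hashlib

def toy_visual_hash(data, cells=3):
    """Derive a small deterministic grid of RGB triples from input bytes.
    Not the real LifeHash algorithm -- just shows why a visual hash is stable."""
    digest = hashlib.sha256(data).digest()           # 32 bytes, deterministic
    return [tuple(digest[i:i + 3]) for i in range(0, 3 * cells * cells, 3)]

a = toy_visual_hash(b"otpauth://totp/example?secret=ABC")
b = toy_visual_hash(b"otpauth://totp/example?secret=ABC")
c = toy_visual_hash(b"otpauth://totp/example?secret=ABD")
print(a == b, a == c)  # True False
```

Because the grid is derived from the content, two applications rendering the same QR code independently produce the same image, which is what makes cross-application visual verification possible.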

Finally, QR Tool resolves the problem of categorization by automatically recognizing a wide variety of QR codes and giving you the opportunity to change those categorizations as you see fit.

In the future we hope to see more fully featured Verifiable Credentials, but there’s no doubt that QRs are with us for the moment, and we need to make sure that they remain secure and easy to use. This is why we released QR Tool. It’s a reference implementation for the Gordian Principles, demonstrating how to store confidential data in a way that’s independent, private, resilient, and open: QR Tool ensures that you hold your own QRs, that they’re safe, and that you can easily transmit them to other services.

If you’ve got QRs of your own to secure, you can get QR Tool from the Apple App Store or you can compile it directly from the source code found at the QR Tool repo on GitHub.

Wednesday, 23. June 2021

Indicio

Decentralized Identity opens the doors for safe travel and tourism

The post Decentralized Identity opens the doors for safe travel and tourism appeared first on Indicio Tech.

Learn how Indicio and SITA worked together using privacy-preserving technology to reshape contactless health information sharing.

Proof of testing or vaccination has become central to how we reopen travel to countries, admit visitors, and bring tourism economies back to life. Providing privacy and control for people is the key to establishing public confidence in a system for proving one’s health status.

A digital proof of a Covid negative test or vaccination must be designed to protect individual privacy. It should enable a medical or test center to directly provide that information to an individual—and involve no one else storing or managing their data. It should be tamper proof and incapable of being faked. And it should be easy to download and quick to use.

This is why Indicio.tech, a public benefit corporation that provides decentralized identity software solutions, and SITA, the leading global technology provider for the air transport industry, have used open source, privacy-by-design technology to build a solution that allows airports, airlines, and all elements of the tourist economy to use verifiable digital credentials to safely and securely return to life.

How to reopen travel and tourism and preserve privacy
Decentralized identity uses distributed ledger technology, cryptography, and a new way to provide an individual with control of their digital information. This means identity credentials that contain health information are issued directly to that person’s digital wallet, without any handoff to or management by third-parties. Trusted organizations can quickly issue millions of credentials without any of the information they contain being collected and stored in a third-party database.

Then, when the person decides they want to share all or just part of the information, such as the specific details of their test status, the authenticity and original source of that information can be definitively proven. This makes the digital credential compliant with health and data privacy laws (HIPAA, GDPR).
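In miniature, this "share part, prove all of it" property can look like the following toy. (This is my illustration, not Indicio's or Cardea's actual protocol; real systems use ledger-anchored public-key signatures and ZKP-capable credential formats, not a shared HMAC key.)

```python
import hashlib
import hmac

ISSUER_KEY = b"demo-issuer-key"  # stands in for the issuer's ledger-anchored signing key

def issue_credential(fields):
    """Issuer signs each field separately, so the holder can later disclose any subset."""
    return {k: {"value": v,
                "sig": hmac.new(ISSUER_KEY, f"{k}={v}".encode(),
                                hashlib.sha256).hexdigest()}
            for k, v in fields.items()}

def present(credential, fields):
    """Holder discloses only the chosen fields -- e.g. test status, no medical details."""
    return {k: credential[k] for k in fields}

def verify(presentation):
    """Verifier checks the issuer's signature on exactly what was shared, nothing more."""
    return all(hmac.compare_digest(
                   entry["sig"],
                   hmac.new(ISSUER_KEY, f"{k}={entry['value']}".encode(),
                            hashlib.sha256).hexdigest())
               for k, entry in presentation.items())

cred = issue_credential({"name": "Alice",
                         "test_status": "negative",
                         "test_date": "2021-06-20"})
shared = present(cred, ["test_status"])  # only the status leaves the wallet
print(verify(shared))  # True
```

The issuer never sees the presentation, and the verifier never sees the undisclosed fields, which mirrors the issuer/holder/verifier separation described below.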

The advantages of this decentralized identity ecosystem are that it can:

- Replace paper cards with fully digitized identity information
- Increase efficiency by automating many of the tasks involved in presenting personal health status
- Ensure consent and control when sharing personal data
- Allow a user to select which information they want to disclose while obscuring the rest
- Enhance security through privacy-by-design, user-friendly digital records, and tamper-evident distributed ledger technology
- Prevent fraudulent health cards or paper forms from being presented
- Scale to include millions of participants, including employees, travelers, and residents, with just a few writes to a public ledger and an inexpensive mobile application
- Speed the recovery of reopening venues and countries

Open and manage public spaces
Indicio’s identity ecosystem is built using Cardea, a complete ecosystem for the exchange of privacy-preserving digital credentials, open sourced as a project in Linux Foundation Public Health. Based on Hyperledger Indy and Aries open source technology, its flexible design means it can be easily adapted and deployed by any company, government, or organization that needs a privacy-preserving digital credential for managing access.

Indicio’s implementation of Cardea for SITA and the Government of Aruba features a mobile interface in the form of a mobile app for users and a second mobile app for use by venues to receive and verify credentials from users. Software called mediator agents and enterprise agents allow for scaling and automation of the credential issuing and verification processes. Distributed ledger technology provides cryptographic assurance that the data within any given credential has not been tampered with or altered.

Cardea’s architecture protects privacy and aids compliance by separating issuers, holders, and verifiers of credentials. Issuers cannot know where credentials were used by those holding them, and verifiers (receivers) of credentials are able to limit the amount of data they receive and retain.

Successful test deployment in Aruba
The island of Aruba and global air transport technology provider SITA came to Indicio to create a trusted traveler system that makes it easy for visitors to share their health status privately and securely using their mobile device.

Aruba is focused on finding innovative ways to strengthen its tourism industry while minimizing the risk of Covid-19 infection from visitors. Unlike immunity passports, the verifiable digital credential system from Indicio allows visitors to share a trusted proof of their health status. This trust is possible because the traveler has shared their health status and had it verified by a public health agency.

Once a test result is approved, the traveler is issued a second credential by the public health agency to confirm that they have tested negative. This credential contains no medical data whatsoever and is used only to prove the person’s test status. The Happy Traveler Card, as this credential is called in Aruba, is verified by hotels, restaurants, and entertainment venues that the traveler visits. It is an easy way for even the smallest businesses to ensure the safety and health of their guests.

The Happy Traveler Card in action

 

Machine readable governance enabled businesses and venues to trust that tourists had been tested on arrival by Aruba’s health department. Visitors using the digital Aruba Happy Traveler Card could be swiftly and reliably verified with a phone app. This freed both businesses and the government from the burden of mechanically collecting data with the attendant risk of error or fraud.

The Cardea ecosystem enables Aruba to move toward a privacy-first approach to supporting their tourism industry, which in 2019 accounted for 98.3% of Aruba’s GDP and supported 47,000 jobs—99% of all employment on the island.

Build on our experience for your solution
The tourism and hospitality identity solution for SITA is highly replicable for use cases in any industry and easy to integrate with existing systems. With a professionally staffed global network for verifiable digital credentials supported by many of the leading companies in this space, Indicio is building the future of identity. Open source and interoperable software means your solution is scalable and sustainable. Our expert team of architects and engineers can quickly create and customize a solution for businesses, governments, and organizations that need a privacy-first identity solution, and deploy it in weeks.

To learn more or schedule a free one-on-one consultation to find out how you can benefit from this solution, contact us.

 

 



Finicity

Finicity’s Mortgage Verification Service is now live on SimpleNexus


SimpleNexus announced an integration with Finicity’s Mortgage Verification Service (MVS) that allows lenders to streamline the verification of applicants’ assets, income and employment using a single embedded service.

Finicity launched MVS in February. The service leverages consumer-permissioned bank and payroll data to provide accurate, real-time insight into a borrower’s current assets, income and employment in minutes, without any paperwork. MVS has helped lenders shave up to 12 days off the origination process and is accepted by both Freddie Mac and Fannie Mae, making loans eligible for rep and warrant relief.

SimpleNexus is the first mortgage point-of-sale (POS) platform to offer Finicity’s MVS as an integrated solution. Without ever leaving the SimpleNexus mobile app, borrowers can use MVS to complete asset, income and employment verification in a few simple steps that take just minutes to complete. Lenders receive validated payroll, paystub and bank account data in real time and can refresh the data within 10 days of the loan closing as needed to fulfill investor requirements.

Read the full release here.

The post Finicity’s Mortgage Verification Service is now live on SimpleNexus appeared first on Finicity.


Meeco

Support Centre for Data Sharing interview with Meeco

The Support Centre for Data Sharing (SCDS) initiative focuses on researching, documenting, and reporting about the data sharing practices, EU legal frameworks, access and distribution technology that are relevant to organisations, and that imply novel models, and legal or technological challenges.
“Whilst privacy is paramount, you can’t have a digital economy if everyone is locking their entire digital footprint away”

Katryna Dow, CEO & Founder Meeco
This is one of the many topics touched on in the latest Support Centre for Data Sharing interview. Raymonde Weyzen and Esther Huyer interview Meeco’s CEO & Founder Katryna Dow and Chief Commercial Officer Jason Smith. Some of the challenges that come with data sharing are data privacy and data control. However, the paradox of data sharing is that the recipient generally needs the data in order to fulfil an obligation, such as delivering a service, validating identity, delivering a package or customising an experience.

So the issues are often not about sharing, but about trust and transparency. Helping customers understand why the data is required, and providing evidence that it is being used as intended, is a great way to establish trust. Another way to boost trust is to design services that minimise the amount of data collected whilst maximising the value created. It might be as simple as only holding the data for as long as needed to complete the service and then sharing evidence that it has been deleted.

In this thought-provoking interview Raymonde asks about Meeco’s inception, its work so far and recent growth. Some of the exciting projects discussed include mIKs-it, the safe multimedia app for children; the development of a decentralised identity and verifiable credentials wallet; and how innovators like VELA Solutions are transforming workforce management and My Life Capsule are helping their customers be prepared for a family emergency.

Other questions and topics covered include:

+ How the idea of Meeco was conceived
+ Why is there a need for data sharing?
+ What is the data sharing lifecycle at Meeco?
+ Examples of use cases at Meeco
+ What specific licensing or standards to share data are used at Meeco?
+ In order to share data properly going forward, do we need more or less regulation?
+ Where would you see Meeco, and ultimately the digital world, in 10 years from now?

And finally, the most difficult question we weren’t prepared for: “What would be the working title of a movie starring Meeco?” To find the answer to this and more, click below to listen or watch the interview.

Watch the interview

Huge thanks to the Support Centre for Data Sharing team for all the great work they are doing to help people understand the value of data sharing. We very much appreciated the opportunity to share Meeco’s perspective.

The post Support Centre for Data Sharing interview with Meeco appeared first on The Meeco Blog.


Anonym

Google’s Safety Section Doesn’t Really Keep Pace With Apple


Google has pre-announced plans to introduce a “safety section” in the Google Play store to help people understand how Android apps, including its own, use the personal data they collect about their users. 

If this sounds like Google’s answer to Apple’s privacy labels, introduced in December 2020, it is — kind of. Critics say Google’s new feature isn’t a privacy move so much as a justification for why a user can trust an app with their data. As Google puts it, the new section “will help people understand the data an app collects or shares, if that data is secured, and additional details that impact privacy and security”. 

Google’s not really saying apps shouldn’t collect the data in the first place — which is increasingly Apple’s position, particularly with its new App Tracking Transparency (ATT) feature.

So what can we expect from the Google Play store safety section? At a glance, app developers will be asked to declare:

- What personal data they collect, including users’ names and email addresses, and any information from the device, such as location data, contacts, and photos and videos
- What they’re using that personal data for (a better user experience or personalized ads, for example)
- Any good security and privacy practices they already have in place, such as data encryption or protecting children under Google’s Families policy
- Whether the app needs the data it collects to function, or whether users have a choice about their data being shared (as they do on iOS apps through Apple’s new ATT)
- Whether the app’s safety section is verified by an independent third party
- Whether they will honor a user’s data deletion request if the user chooses to uninstall the app.

As we said, Google’s safety section and Apple’s privacy labels are not really a case of comparing apples with apples (pardon the pun). Critics say Apple’s privacy labels focus on the data apps collect for tracking purposes and what’s linked to the end user, whereas Google’s safety section is more about an app’s best practices and trustworthiness in collecting and handling a user’s personal data. Google’s labelling does include independent verification, which Apple’s does not, but that verification is optional, and it remains to be seen how rigorous it will be.

To us, this is all part of Google’s predictable misunderstanding of what safety means to many people. Yes, it’s about security of data, but Google’s repeated ploy of bait-and-switching a privacy concern into a security solution misses a big aspect: what companies themselves do with our personal data and how they share it with other parties. No amount of security-only thinking results in a privacy solution in this case. What’s more, Google’s warnings about the evils of data sharing ring hollow, since Google is the biggest data hoarder there is.

Like Apple did when it announced the ATT feature, Google is giving developers plenty of time to get used to the idea. The safety section in the Google Play store won’t become mandatory for all apps until Q2 of 2022. In fact, this feels like Google is giving its app developer community longer than Apple did from announcement to required implementation. Perhaps that’s an indirect acknowledgement by Google that the data handling practices of its huge app developer community need more time?

Once the requirements are mandatory, app developers who don’t comply will face sanctions.

Like all the privacy changes we’re seeing in the ad tech space lately, consumers will ultimately determine whether this new move from Google goes far enough in protecting their personal data. As iOS apps have seen with the low opt-in rates for cross-app tracking from Apple’s new ATT, Android apps might be in for a rude shock.

Photo By: Bloomicon

The post Google’s Safety Section Doesn’t Really Keep Pace With Apple appeared first on Anonyome Labs.


auth0

Protocol Types in Python 3.8

A quick introduction to the new Protocol class in Python 3.8 and how it enables structural typing
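For readers who haven’t seen it, `typing.Protocol` lets a class satisfy an interface purely by its shape, with no inheritance. A minimal sketch:

```python
from typing import List, Protocol, runtime_checkable

@runtime_checkable
class SupportsArea(Protocol):
    """Anything with an `area() -> float` method matches -- no subclassing needed."""
    def area(self) -> float: ...

class Circle:  # note: does NOT inherit from SupportsArea
    def __init__(self, r: float) -> None:
        self.r = r
    def area(self) -> float:
        return 3.14159 * self.r ** 2

def total_area(shapes: List[SupportsArea]) -> float:
    # A static checker (e.g. mypy) verifies structurally that each element has .area()
    return sum(s.area() for s in shapes)

print(isinstance(Circle(1.0), SupportsArea))             # runtime structural check: True
print(round(total_area([Circle(1.0), Circle(2.0)]), 2))  # → 15.71
```

This is the structural typing the post refers to: `Circle` is accepted because it has the right methods, not because it declares any relationship to `SupportsArea`.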

IBM Blockchain

A legal perspective on blockchain-anchored business networks


Business law, legal considerations and regulatory compliance are important to setting up and running an enterprise business. The blockchain business network enables enterprises to interact among their stakeholders across geography with trust and traceability. Hence, it is important that the blockchain business network comply with a required set of business laws, geography-specific standards, a global […]

The post A legal perspective on blockchain-anchored business networks appeared first on Blockchain Pulse: IBM Blockchain Blog.


Technology in blockchain-anchored business networks


Technology choices and considerations are foundational to construct, deploy and operate a blockchain-based business network. The overall technical architecture of a blockchain solution is driven by the functional and non-functional requirements of all stakeholders, and the core business. The solution consists of four layers — data, network, application services, and apps or interaction layer. But […]

The post Technology in blockchain-anchored business networks appeared first on Blockchain Pulse: IBM Blockchain Blog.


Ontology

Ontology Weekly Report (June 16–22, 2021)

Highlights

It’s been another exciting week for Ontology, as we continue the development of Ontology’s integrated EVM, which is now 60% complete. In other news, Ontology’s Chief of Ecosystem Partnership, Gloria Wu, spoke with Neil Hughes on the Tech Talks Daily Podcast.

Latest Developments

Development Progress

- Ontology’s integrated EVM design is fully complete, and development is 60% complete. It is designed to be fully compatible with the Ethereum contract ecosystem.
- ETH RPC support is 95% complete.
- Ontology’s new Ethereum account system is now 70% complete.

Product Development

- Activities co-organized with BabySwap, ApeSwap, CafeSwap and HyperJump are in progress.
- ONTO launched a user experience research report on Twitter, encouraging users to give feedback on their experience. In return, users receive ONG rewards, which will be issued within two weeks.

On-Chain Activity

- 116 dApps have been launched on MainNet; the total dApp transaction volume is 6,616,413, an increase of 4,329 from last week.
- 15,808,311 transactions have been completed on MainNet, an increase of 30,926 from last week.

Community Growth

- 263 new members joined our global community. We are very excited to see the Ontology community continue to grow, and we encourage anyone who is curious about what we do to join us.
- We held our weekly DeID Summit on Clubhouse and Community Call on Discord, led by our Head of Community, Humpty Calderon. Humpty gave a brief recap of the burgeoning Ontology ecosystem, an update on development progress, and a discussion on decentralized identity topics.
- As always, we’re active on Twitter and Telegram, where you can keep up with all our latest developments and community updates.

Global News

- Ontology was invited to The Tech Talks Daily Podcast: Gloria Wu, Chief of Ecosystem Partnerships, was invited to chat with Neil Hughes on The Tech Talks Daily Podcast. They discussed digital identity, the importance of privacy and data, and Ontology’s mission to solve these issues and change how data is managed and transferred. Listen here.
- ONT listed on WazirX: ONT is now listed on WazirX, one of India’s biggest and fastest-growing cryptocurrency exchanges, with over 900,000 users. ONT holders can easily trade ONT for $INR or $USDT with WazirX.
- Chainstack recommends Ontology for decentralized identity management: Chainstack, a blockchain technology facility service provider, recommended Ontology’s decentralized identity management service for its ability to help you take back control of your data and verify your digital identities, data and assets, while protecting your privacy.
- Ontology held special events for Father’s Day on Weibo and Twitter: To spread the love to every dad, Ontology launched a Weibo activity, “Repost one thing that impresses you about your father”, which received great traction from the community. We also ran a Twitter campaign, “Dad needs DIDs”, calling on dads everywhere to register an ONT ID and establish a decentralized digital identity. Both campaigns received some great feedback from our community.

Ontology In The Media

City Telegraph — Complete Guide: Best Altcoin & How to mine Alternative cryptocurrencies

An altcoin is an “alternative to bitcoin”, that is any cryptocurrency other than bitcoin itself. Each of the alternative crypto-units is trying not only to copy, but also to improve upon the characteristics of bitcoin. Already, there are more than 800 altcoins, most of which differ from the original slightly. City Telegraph shares several things about altcoins users should know and lists the best proof of stake (PoS) Coin List 2021.

Ontology is recommended as one of the top 10 best proof of stake (PoS) coins in 2021. Ontology is a blockchain network built to support other blockchains, both public and private so that businesses can take advantage of their technological advancements. Currently, Guarda wallet and Ledger Live wallet are supporting ONT/ONG with an annual reward of 7.3%.

Bloomberg — Fox Creates $100 Million Fund for the Nonfungible Token Market

Fox Corp. is joining the list of investors and businesses pumping $100 million into nonfungible tokens (NFTs), a type of digital file (e.g. image, audio, video) that can be bought and sold. The fund is part of a broader effort announced in May that includes a new business unit, Blockchain Creative Labs, that will sell and mint NFTs and other digital goods.

Ontology is committed to blockchain technology and the NFT industry. Ontology is proud to collaborate with ROCKI, a next-generation music streaming service and music NFT platform built on Binance Smart Chain. Ontology will provide decentralized identity (DID) solutions to ROCKI’s music creators and users to ensure they are authentic and secure.

Want more Ontology?

You can find more information about our decentralized solutions across identity and data, on our website or simply follow us on Twitter. Our Telegram is for discussion, whereas the Telegram Announcement is designed for news and updates if you missed Twitter!

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

Ontology Weekly Report (June 16–22, 2021) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


IDnow

AML-compliant identification available around the clock for the first time at IDnow


As one of the first providers, the IDnow platform will enable a 24-hour service for identity verification

 

Munich, June 23rd 2021, IDnow, a leading platform-as-a-service provider for identity verification, announces the launch of AML-compliant video identifications around the clock. Users can now perform non-automated identity verification procedures at any time thanks to the overnight operation.

“With the extended service hours and 24-hour availability, we give our users even more freedom, and we offer our business customers tailored support, better accessibility, higher customer satisfaction and, of course, the possibility to identify more customers per day,” says Bettina Pauck, COO of IDnow. “No matter what industry you are in or what your use case is – opening a bank account, foreign exchange trading or player verification – we are here for you.”

The new service offering, which is available in English and German, completes IDnow’s strategy of focusing on the user needs and providing an easy and quick identification – whenever and wherever a customer wants. “For a convenient process that fits into the fast-paced, digital-centric everyday life of users, such a factor is essential,” adds Bettina Pauck.

Furthermore, IDnow offers a highly secure and reliable structure – with a diversification to a total of 20 geographically distributed ident center locations. The sites are located in various cities across Germany and Europe and also guarantee a high level of data protection. The highly qualified and multilingual fraud and identification specialists, who can also offer the service in French, Italian and Spanish, are trained on the comprehensive BaFin process, know the data protection regulations as well as the common forgery and fraud possibilities.

IDnow has expanded its role in recent years far beyond offering individual ident procedures and has become the overarching platform for digital identities with several million transactions per year. In early 2021, IDnow announced the acquisition of identity Trust Management AG, one of the leading international providers of online and offline verification. This was the second acquisition within six months for IDnow and represents an important milestone on the way to becoming the leading identity platform in Europe. The acquisition of identity Trust Management AG enables IDnow to expand into new industries and offer its services to a broader customer base in Germany and beyond.


Elliptic

Crypto Regulatory Affairs: Texas Hold 'Em: The Lone Star State Goes All-In on Crypto Custody

Regulators in Texas have given state-chartered banks the green light to custody crypto.  



Okta

Use Okta and Oso to Secure a FastAPI + SQLAlchemy App


FastAPI is really fast and SQLAlchemy is really…SQL-y. But what good is a fast and SQL-y application if it isn’t secure?

In this post, we’re going to show you how to secure a fast and SQL-y app!

First we will need some authentication, which is how we identify who the user is. We’ll use Okta for this.

Next, we’ll want to perform authorization, which controls what the user can do in our application. We’ll be using Oso for that, which is a batteries-included library for authorization.
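To make the authentication/authorization split concrete, here is a toy sketch in plain Python (illustrative only — the tokens, names, and permission store are all made up; in this app Okta handles the first job and Oso the second):

```python
def authenticate(token: str) -> str:
    """Authentication: map a credential to an identity ("who is this?")."""
    known_tokens = {"abc123": "alice@example.com"}  # hypothetical token store
    user = known_tokens.get(token)
    if user is None:
        raise PermissionError("401: unknown token")
    return user

def authorize(user: str, action: str, resource: str) -> bool:
    """Authorization: decide what an identified user may do ("is this allowed?")."""
    permissions = {("alice@example.com", "create", "bear")}  # hypothetical rules
    return (user, action, resource) in permissions

user = authenticate("abc123")
assert authorize(user, "create", "bear")
assert not authorize(user, "delete", "bear")
```

The key point: authorization only makes sense after authentication has established who the user is, which is why the app wires up Okta first.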

This post is intended for people who have some familiarity with both FastAPI and SQLAlchemy. By the end of the post, you will know how to make sure users have access to the things they need - and only the things they need.

The full example is available on GitHub. Clone the repo and follow along!

Table of Contents

Bear Management Service Architecture
Set Up Okta
Start the React Front End
Start the FastAPI Back End
ABAC, As Easy as 1-2-3 🕺
The Bear Necessities
Add Your First allow() Rule
Deal with List Endpoints
Flesh Out Your Authorization Policy
Final diff of securing the app with Oso
Learn More About Oso, FastAPI, and Python

Bear Management Service Architecture

What, you expected TodoMVC?

Our app allows authenticated users to register their own bears and view the bear population at large. It consists of two separate services: (1) a front end through which users authenticate with Okta, and (2) a back end that exposes an extensive API for creating and retrieving bears.

Set Up Okta

Before you begin, you’ll need a free Okta developer account. Install the Okta CLI and run okta register to sign up for a new account. If you already have an account, run okta login. Then, run okta apps create. Select the default app name, or change it as you see fit. Choose Single-Page App and press Enter.

Use http://localhost:8080/login/callback for the Redirect URI and set the Logout Redirect URI to http://localhost:8080.

What does the Okta CLI do?

The Okta CLI will create an OIDC Single-Page App in your Okta Org. It will add the redirect URIs you specified and grant access to the Everyone group. It will also add a trusted origin for http://localhost:8080. You will see output like the following when it’s finished:

Okta application configuration:
Issuer: https://dev-133337.okta.com/oauth2/default
Client ID: 0oab8eb55Kb9jdMIr5d6

NOTE: You can also use the Okta Admin Console to create your app. See Create a React App for more information.

Once your new Okta application is created, the Okta CLI will print out its Issuer and Client ID properties:

$ okta apps create
...
Okta application configuration:
Client Id: <YOUR CLIENT ID>
Issuer: https://<YOUR OKTA DOMAIN>/oauth2/default

Create a file named .env in the project’s root directory with the following contents:

CLIENT_ID=<YOUR CLIENT ID>
ISSUER=https://<YOUR OKTA DOMAIN>/oauth2/default
AUDIENCE=api://default

After creating the .env file in the project root, symlink it into the okta-hosted-login subdirectory so that both the front end and back end projects have access to the same configuration parameters:

$ ln -s ../.env okta-hosted-login/.env

Start the React Front End

In the okta-hosted-login directory, run npm install to install dependencies and then npm start to fire up the React front end.

Once the app is up and running, navigate to http://localhost:8080 in your browser. Click the Login button and enter your Okta credentials when prompted. Successfully signing in should redirect you back to your front end, where you’ll be greeted with your name (courtesy of Okta), a ‘Create a new bear’ form, and a list of bears:

The list will be empty because our back end service isn’t running. Let’s change that now.

Start the FastAPI Back End

While the React app runs happily in the background, open a new terminal and cd to the project’s root directory. In the root, create and activate a new virtual environment, and then install dependencies:

python3 -m venv venv && source venv/bin/activate
pip3 install -r requirements.txt

Finally, start the FastAPI server:

uvicorn app.main:app --reload --reload-dir=app

If you reload http://localhost:8080, you should see the list of bears populated with a number of very good bears owned by various members of Example.com, Inc:

Go ahead and create a few new bears of your own.

Our app is now open for business. In fact, it’s too open. Every authenticated user can see everyone’s bears — even users who have been banned for trying to create koalas. It’s time to put a stop to this madness.

ABAC, As Easy as 1-2-3 🕺

Oso is an open source authorization system that we’ll use to secure our app. Oso is flexible enough to support any access control pattern your heart desires, but for this example we’ll focus on attribute-based access control (ABAC).

ABAC is all about representing fine-grained or dynamic permissions based on who the user is and their relation to the resource they want to access.
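A toy version of that idea in plain Python, separate from Oso (the classes and rules here are illustrative, not the app's models): permissions derive from attributes of the user and from the user's relationship to the resource.

```python
from dataclasses import dataclass

@dataclass
class User:
    email: str
    is_banned: bool = False

@dataclass
class Bear:
    name: str
    owner: User

def allowed(user: User, action: str, bear: Bear) -> bool:
    if action == "create":
        return not user.is_banned   # decision based on a user attribute
    if action == "delete":
        return bear.owner is user   # decision based on user-resource relation
    return False

alice = User("alice@example.com")
mallory = User("mallory@example.com", is_banned=True)
smokey = Bear("Smokey", owner=alice)

assert allowed(alice, "create", smokey)
assert not allowed(mallory, "create", smokey)
assert not allowed(mallory, "delete", smokey)
```

Oso lets you express exactly these kinds of checks declaratively in Polar instead of scattering them through your handlers.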

The Bear Necessities

First, after pressing Ctrl+C to exit FastAPI, we’re going to install the oso and sqlalchemy-oso packages:

pip3 install oso sqlalchemy-oso

Once pip finishes pipping, open up app/main.py using your favorite text editor and make the changes that you see below. The following is a Git diff of the change — if you’re following along, add the lines that start with a single + symbol to your local copy of app/main.py:

diff --git a/app/main.py b/app/main.py
index 397037e..7dd57d0 100644
--- a/app/main.py
+++ b/app/main.py
@@ -6,4 +6,5 @@ from okta_jwt.jwt import validate_token
 from sqlalchemy.orm import Session, sessionmaker
 from starlette.config import Config
+from oso import Oso
 from app.crud import create_bear, get_or_create_user_by_email, list_bears
@@ -17,4 +18,7 @@ conf = Config(".env")
 issuer, audience, client_id = conf("ISSUER"), conf("AUDIENCE"), conf("CLIENT_ID")
+# Initialize Oso.
+oso = Oso()
+
 def get_db():

Next, create an empty Oso policy file in the app directory:

touch app/policy.polar

This is where we are going to write all of our authorization logic using Oso’s declarative policy language called Polar.

Load the new policy file into our Oso instance in app/main.py:

diff --git a/app/main.py b/app/main.py
index 7dd57d0..a1e166f 100644
--- a/app/main.py
+++ b/app/main.py
@@ -20,4 +20,5 @@ issuer, audience, client_id = conf("ISSUER"), conf("AUDIENCE"), conf("CLIENT_ID"
 # Initialize Oso.
 oso = Oso()
+oso.load_file("app/policy.polar")

Before we start filling that policy file with authorization rules, we first need to register with Oso the application data types that we’re going to use in our authorization policy. Registering the application types allows us to reference them in our Polar policy as specializers.

We’re going to use the register_models() helper from the sqlalchemy-oso library to register our SQLAlchemy models with Oso in bulk. register_models() registers all descendants of a SQLAlchemy base class with Oso; otherwise, we would have to call oso.register_class() for each individual class that we wanted to register:

diff --git a/app/main.py b/app/main.py
index a1e166f..796ab1e 100644
--- a/app/main.py
+++ b/app/main.py
@@ -7,7 +7,8 @@ from sqlalchemy.orm import Session, sessionmaker
 from starlette.config import Config
 from oso import Oso
+from sqlalchemy_oso import register_models
 from app.crud import create_bear, get_or_create_user_by_email, list_bears
-from app.db import engine, setup_db
+from app.db import engine, setup_db, Base
 from app.models import User
 from app.schemas import Bear, BearBase
@@ -20,4 +21,5 @@ issuer, audience, client_id = conf("ISSUER"), conf("AUDIENCE"), conf("CLIENT_ID"
 # Initialize Oso.
 oso = Oso()
+register_models(oso, Base)
 oso.load_file("app/policy.polar")

In addition, let’s register the BearBase Pydantic model that’s used in the create() handler. Again, registering the class allows us to refer to it in our Polar policy as a specializer:

diff --git a/app/main.py b/app/main.py
index adfc2c8..f2a17d0 100644
--- a/app/main.py
+++ b/app/main.py
@@ -23,2 +23,3 @@ oso = Oso()
 register_models(oso, Base)
+oso.register_class(BearBase)
 oso.load_file("app/policy.polar")

Finally, let’s enforce authorization in the create() handler so that those fanatics from down under stop trying to sully our ursine haven with koalas:

diff --git a/app/main.py b/app/main.py
index 796ab1e..adfc2c8 100644
--- a/app/main.py
+++ b/app/main.py
@@ -74,3 +74,5 @@ def index(db: Session = Depends(get_db)):
 @app.post("/bears", response_model=Bear)
 def create(request: Request, bear: BearBase, db: Session = Depends(get_db)):
+    if not oso.is_allowed(request.state.user, "create", bear):
+        raise HTTPException(403)
     return create_bear(db, bear, request.state.user)

If you save app/main.py and then try to create a new bear, the POST request will return a 403 Forbidden.

Oso is deny-by-default, and we currently have an empty policy file. In the next section, we’ll write our first authorization rule to allow real bear lovers to create real bears.

Add Your First allow() Rule

For our first foray into writing a policy, we’re going to use Polar to add a rule that prevents banned users from creating new bears.

Open up app/policy.polar and add the following rule:

allow(user: User, "create", _bear: BearBase) if not user.is_banned;

The rule works by matching the inputs provided by the application:

user - an instance of the User class.
The action is the string literal "create".
_bear - an instance of the BearBase class.

And then checking that the provided user’s is_banned field is false. We don’t yet need to check anything further about the bear resource, so we prefix it with an underscore to indicate that it won’t be referenced in the body of the rule.
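Conceptually, the rule behaves like the following Python predicate. This is a rough analogy only — it is not how Oso evaluates Polar internally, and the stub classes are stand-ins for the app's models:

```python
class User:
    def __init__(self, is_banned: bool = False):
        self.is_banned = is_banned

class BearBase:
    pass

def allow(user, action, _bear) -> bool:
    """Python analogy of: allow(user: User, "create", _bear: BearBase) if not user.is_banned;"""
    return (
        isinstance(user, User)           # specializer: actor must be a User
        and action == "create"           # literal match on the action string
        and isinstance(_bear, BearBase)  # specializer: resource must be a BearBase
        and not user.is_banned           # the rule body
    )

assert allow(User(), "create", BearBase())
assert not allow(User(is_banned=True), "create", BearBase())
assert not allow(User(), "delete", BearBase())
```

If no rule matches the inputs, Oso denies the request, which is why the empty policy file rejected everything.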

NOTE: To learn more about Polar syntax, head on over to the Writing Policies guide in the Oso documentation.

Save the file, flip back to localhost:8080, and you should once again be able to create new bears (assuming you haven’t set your own user’s is_banned property to True). All law-abiding bear enthusiasts have had their access restored, and the koala lovers are left out in the cold. (Does it get cold in Australia? We’ll investigate in a future blog.)

Deal with List Endpoints

oso.is_allowed() worked perfectly for securing the create() endpoint, but it’s not the best tool for the job when it comes to securing index(). The difference is that create() operates over a single record, while index() operates over a potentially very large collection. If performance weren’t an issue, we could load the collection from the database and iterate over it, calling oso.is_allowed() on every record to filter out unauthorized entries. However, we all know that Zoomers lose interest and click away from your website if it takes longer than a few Planck time units to load, so we need a better solution.

And now, presenting… a better solution. Data filtering! The cure to all of life’s performance issues.

The sqlalchemy-oso library is built on top of Oso’s data filtering feature. In a nutshell, the logic engine at the core of the Oso library can turn your authorization policy into a set of constraints similar to the WHERE clauses used to filter records when querying a relational database. The sqlalchemy-oso library then applies those constraints directly to your app’s SQLAlchemy queries. In this way, only authorized records are loaded from the database in an efficient operation instead of loading all records and then iterating over the collection to determine the authorized subset.
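Here is a toy contrast between the two approaches. The "database" is just a Python list and every name is illustrative; in the real app, sqlalchemy-oso compiles the policy into SQL WHERE clauses rather than a Python predicate:

```python
bears = [
    {"name": "Smokey", "owner": "alice"},
    {"name": "Iorek", "owner": "bob"},
]

def is_allowed(user: str, action: str, bear: dict) -> bool:
    """Per-record check, analogous to calling oso.is_allowed() in a loop."""
    return action == "index" and bear["owner"] == user

# Naive: load every record, then check each one individually.
naive = [b for b in bears if is_allowed("alice", "index", b)]

# Data filtering: derive a constraint from the policy up front ...
def constraint_for(user: str):
    return lambda bear: bear["owner"] == user

# ... and let the "database" apply it, so unauthorized rows are never loaded.
filtered = list(filter(constraint_for("alice"), bears))

assert naive == filtered == [{"name": "Smokey", "owner": "alice"}]
```

Both paths return the same records, but the second does the filtering where the data lives, which is what keeps index() fast as the bear population grows.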

Without further ado, let’s wire up our app so that we can efficiently filter some bears.

First, we need to modify app/main.py to import authorized_sessionmaker() from the sqlalchemy-oso library:

diff --git a/app/main.py b/app/main.py
index f2a17d0..1fa9573 100644
--- a/app/main.py
+++ b/app/main.py
@@ -9 +9 @@ from oso import Oso
-from sqlalchemy_oso import register_models
+from sqlalchemy_oso import register_models, authorized_sessionmaker

Next, we’ll create a new FastAPI dependable that mirrors our existing get_db() function but yields an Oso AuthorizedSession instead of a regular SQLAlchemy Session. To do this, add the following code below the get_db() function in app/main.py:

diff --git a/app/main.py b/app/main.py
index f2a17d0..1fa9573 100644
--- a/app/main.py
+++ b/app/main.py
@@ -54,0 +55,11 @@ def current_user(
+def get_authorized_db(request: Request):
+    get_oso = lambda: oso
+    get_user = lambda: request.state.user
+    get_action = lambda: request.scope["endpoint"].__name__
+    db = authorized_sessionmaker(get_oso, get_user, get_action, bind=engine)()
+    try:
+        yield db
+    finally:
+        db.close()
+
+

And finally, update the index() handler so that it depends on our new get_authorized_db() dependable:

diff --git a/app/main.py b/app/main.py
index f2a17d0..1fa9573 100644
--- a/app/main.py
+++ b/app/main.py
@@ -71 +82 @@ app.add_middleware(
-def index(db: Session = Depends(get_db)):
+def index(db: Session = Depends(get_authorized_db)):

Save the file, reload localhost:8080, and… no bears. They’re still happily growling away in the database, but we haven’t added any Oso rules permitting access. Let’s change that.

Flesh Out Your Authorization Policy

To start off, let’s add a rule to app/policy.polar that permits all users to list all bears:

allow(_: User, "index", _: Bear);

Save the file, reload localhost:8080, and you should see every bear again. But something else is missing. The Owner column is empty! When we serialize bear records in the back end, we include the email address for each bear’s owner — a piece of data that comes from our SQLAlchemy-backed User model. Access to User data is now protected by Oso because User is a subclass of the Base SQLAlchemy class we registered via sqlalchemy-oso’s register_models() method. Let’s add one more blanket allow() rule, this time permitting users to view user data at the index() endpoint:

allow(_: User, "index", _: User);

Do the save-and-reload dance, and the Owner column should once again be populated.

Next, let’s add some constraints to our "index" rule for bears so we can see Data Filtering in Action (coming soon from Manning, probably).

Perhaps users should only be allowed to see their own bears:

diff --git a/app/policy.polar b/app/policy.polar
index ff90780..0a1a77d 100644
--- a/app/policy.polar
+++ b/app/policy.polar
@@ -6 +6,2 @@ allow(_: User, "index", _: User);
-allow(_: User, "index", _: Bear);
+allow(user: User, "index", bear: Bear) if
+    bear.owner = user;

Save, reload, and confirm you can no longer see anyone else’s bears. If you want to see some bears, you’ll need to create them for yourself.

The index view is now private, but it feels wrong to prevent our fellow bear enthusiasts from viewing polar bears, the sweetest and most mild-mannered of all bears. To right that wrong, we can register the Species enum as a constant so that we can reference it in our policy:

diff --git a/app/main.py b/app/main.py
index 1fa9573..6abb06d 100644
--- a/app/main.py
+++ b/app/main.py
@@ -13 +13 @@ from app.db import engine, setup_db, Base
-from app.models import User
+from app.models import User, Species
@@ -24,0 +25 @@ oso.register_class(BearBase)
+oso.register_constant(Species, "Species")

Then, update said policy:

diff --git a/app/policy.polar b/app/policy.polar
index f5bc0a5..2883949 100644
--- a/app/policy.polar
+++ b/app/policy.polar
@@ -6,2 +6,3 @@ allow(_: User, "index", _: User);
 allow(user: User, "index", bear: Bear) if
-    bear.owner = user;
+    bear.owner = user or
+    bear.species = Species.polar;

While we’re at it, who doesn’t like pandas or bears named “Smokey”:

diff --git a/app/policy.polar b/app/policy.polar
index 2883949..b458ddc 100644
--- a/app/policy.polar
+++ b/app/policy.polar
@@ -6,3 +6,4 @@ allow(_: User, "index", _: User);
 allow(user: User, "index", bear: Bear) if
     bear.owner = user or
-    bear.species = Species.polar;
+    bear.species in [Species.panda, Species.polar] or
+    bear.name = "Smokey";

Well, that’s about all we can bear (sorry) for one blog. Let’s take stock and wrap up.

Final diff of securing the app with Oso

diff --git a/app/main.py b/app/main.py
index 397037e..6abb06d 100644
--- a/app/main.py
+++ b/app/main.py
@@ -7,13 +7,22 @@ from sqlalchemy.orm import Session, sessionmaker
 from starlette.config import Config
+from oso import Oso
+from sqlalchemy_oso import register_models, authorized_sessionmaker
 from app.crud import create_bear, get_or_create_user_by_email, list_bears
-from app.db import engine, setup_db
-from app.models import User
+from app.db import engine, setup_db, Base
+from app.models import User, Species
 from app.schemas import Bear, BearBase
 from app.seed import seed_db

 # Load environment variables.
 conf = Config(".env")
 issuer, audience, client_id = conf("ISSUER"), conf("AUDIENCE"), conf("CLIENT_ID")

+# Initialize Oso.
+oso = Oso()
+register_models(oso, Base)
+oso.register_class(BearBase)
+oso.register_constant(Species, "Species")
+oso.load_file("app/policy.polar")
+
@@ -46,2 +55,13 @@ def current_user(
+def get_authorized_db(request: Request):
+    get_oso = lambda: oso
+    get_user = lambda: request.state.user
+    get_action = lambda: request.scope["endpoint"].__name__
+    db = authorized_sessionmaker(get_oso, get_user, get_action, bind=engine)()
+    try:
+        yield db
+    finally:
+        db.close()
+
+
 # Reset and seed database.
@@ -62,8 +82,10 @@ app.add_middleware(
 @app.get("/bears", response_model=List[Bear])
-def index(db: Session = Depends(get_db)):
+def index(db: Session = Depends(get_authorized_db)):
     return list_bears(db)

 @app.post("/bears", response_model=Bear)
 def create(request: Request, bear: BearBase, db: Session = Depends(get_db)):
+    if not oso.is_allowed(request.state.user, "create", bear):
+        raise HTTPException(403)
     return create_bear(db, bear, request.state.user)
diff --git a/app/policy.polar b/app/policy.polar
new file mode 100644
index 0000000..b458ddc
--- /dev/null
+++ b/app/policy.polar
@@ -0,0 +1,9 @@
+allow(user: User, "create", _: BearBase) if
+    not user.is_banned;
+
+allow(_: User, "index", _: User);
+
+allow(user: User, "index", bear: Bear) if
+    bear.owner = user or
+    bear.species in [Species.panda, Species.polar] or
+    bear.name = "Smokey";

Learn More About Oso, FastAPI, and Python

In this post, we started out with a very fast and SQL-y application built on FastAPI and SQLAlchemy. We created and configured a new Okta application to handle identity management and authentication for our app. Then we used Oso to add efficient, fine-grained authorization to our back end API.

The full example is available on GitHub.

If you’re up for it, here are a couple exercises to try:

Add roles to the app using sqlalchemy-oso’s built-in roles feature. Perhaps every bear lives in a sanctuary, and a user can have a particular role in each sanctuary, e.g., “Visitor”, “Friend”, or “Shepherd”.
Add a “Delete” button next to every bear in the list, and wire up each button to send a DELETE request to the back end. Secure your new delete() handler with Oso, and add a rule to the policy that only allows users to delete their own bears.
Join our Slack and let us know what your favorite bear is and why it’s the polar bear.
Learn more about other use cases of Oso, other languages it supports, and more in the Oso documentation.

If you liked this post, chances are you’ll like these others:

Build and Secure an API in Python with FastAPI
The Definitive Guide to WSGI
Build a Simple CRUD App with Python, Flask, and React
Build a CRUD App with Python, Flask, and Angular

Don’t forget to follow @oktadev on Twitter and subscribe to their YouTube channel for more excellent tutorials. You can follow Oso on Twitter too.

Tuesday, 22. June 2021

Aergo

Digitized Voting Democratized: The Aergo Blockchain Voting Paradigm

Immutable, Trustless And Immune To Fraud: The Blockchain Expansion Of Democratic Republicanism

Introduction: The Great Fraud: Blockchain’s Rectification

Can We Trust Traditional Paradigms Of Ballot Counting For Fair And Equitable Elections Without Blockchain?

It has become evident that democratic republicanism is under threat from tyranny, despotism and elusive election tampering. Throughout the globe, voter fraud has become a prevalent topic of discussion, from major political entities to the citizenry; ballots don’t make results, the counters do. The last three elections within the United States have seen political parties accusing one another of voter fraud, through substantiated and unsubstantiated claims. How can we rectify such a broken paradigm that threatens republicanism? Blockchain is the answer.

Blockchain technology is an immutable system that is immune to fraud and provides users with an innovation that is highly secure, decentralized and anonymized, with auditable chains of records. Blockchain technology has revolutionized and transformed numerous industries, including supply chains, escrow, invoicing, healthcare, crowdfunding and cloud storage, and now it is time for blockchain to perpetuate the ethos of democracy as the catalyst for the demise of voter fraud. Blockchain, in its purest form and most simplistic terms, is a peer-to-peer digital ledger that mitigates the avaricious intent of centralized entities. Encryption makes these records incorruptible and easy to verify. Voters can utilize blockchain and be assured that their vote will not be tampered with while, simultaneously, their vote stays anonymous; this is the true ethos and incorruptible paradigm of voting that should become the foundational bedrock of republicanism. This dynamic change in transparency surrounding voting is essential for democracy to expand and flourish into the 21st century, especially with ongoing investigations surrounding voter fraud, not just in the United States but throughout the world.

Aergo’s Understanding And Desire To Decimate Voter Fraud: The Pikkle Construction

Pikkle Is The First Blockchain Voting Mechanism On The Aergo Platform: Enhancing Global Participation Of Transparent Blockchain Voting

Pikkle is the first of its kind. Built on the Aergo platform by Blocko, Aergo’s technology partner, and utilized by the Korean Broadcasters Association, Pikkle aims to resolve a variety of dilemmas that plague traditional voting paradigms and mechanisms. Pikkle addresses cost inefficiencies in voter processes, including the substantial costs that coincide with targeting voters, identity verification, ballot counts and checklists. Pikkle aims to mend public trust in voting paradigms; throughout the world, many voters have lost confidence in voting mechanisms because of the continuous onslaught of voter-fraud investigations. Pikkle ensures that trust is consolidated back into the consciousness of the populace. Unfortunately, since the advent of social media, it has become evident that votes can be easily manipulated and influenced for a platform’s own self-interest. We saw this occur consistently among some of the most prominent social media platforms during the 2016 and 2020 presidential elections within the United States. With voter fraud corrupting democracy, one must ask: are we in a democracy, or a democracy masked by corruptible despotism? It is not power that corrupts, but power is magnetic to the corruptible, and Pikkle aims to change this notion.

The Korean Broadcasters Association aims to utilize Pikkle as the standardized go-to platform for reliable voting services for all users involved. Pikkle collects users’ information and views and assists in decision making through an application service rather than traditional cold-calling and text-voting spam. Pikkle resolves voter fraud by storing voter data components, such as voter identification and voting results, using blockchain technology: an immutable digital ledger that cannot be hacked, tampered with or corrupted, unlike traditional voting systems. Convenience has become a crucial component of almost every part of our daily lives, so Pikkle encourages the seamless use of QR codes to participate in voting actions, ensured by the public and private trust of blockchain technology. Centralized entities have attempted similar approaches, but since public trust in these entities has waned over the last fifteen years, it is evident that an app like Pikkle can replace traditional voting mechanisms. Through the utilization of blockchain technology, stakeholders, companies, governmental organizations, criminal enterprises and political parties can no longer shake the foundational bedrock of pure democracy, and Pikkle emphasizes and embodies this notion. The app’s identity verification service, which eliminates the need for a physical ticket, enables efficient and easy entry into cultural events and festivals, a hallmark of South Korean tradition. Fun fact: the word Pikkle derives from the two words “pick” and “people” — the people pick the electoral incumbents, not the despots and politicians.

The Korean Broadcasters Association: Blocko And Aergo: Unified Under The Consciousness Of Mending Public Voter Trust

Mending Public Trust Through Blockchain Voting Platforms: The Pikkle Paradigm

On April 23rd, the Korean Broadcasters Association, Blocko and Aergo signed an MOU to ensure the practical application and construction of a blockchain-based audience participation application and voting system. The Korean Broadcasters Association is an organization almost half a century old that aims to unify South Korea through domestic terrestrial broadcasts. The Korean Broadcasters Association has worked with the largest broadcasting stations in South Korea, including joint exit-poll counts alongside KBS, MBC and SBS. Through various forms of cooperation, the KBA continuously works to provide its 40 members with high-quality and diverse broadcasting services. The KBA intends to utilize Pikkle for a variety of services which include, but are not limited to, media content and voting systems to select winners for the Korea Broadcasting Prizes and the Seoul International Drama Awards on the Aergo platform. Blocko also plays a crucial role in the development and expansion of Pikkle.

Blocko, established in 2014, is Aergo’s main technology partner and has worked with some of the largest institutions in South Korea, including the Bank of Korea, the Financial Security Agency, Shinhan Financial Group, Hyundai Motor Group and many more. It is worth noting that Blocko is South Korea’s largest blockchain infrastructure provider, having built blockchain-based services for financial, political and public institutions throughout the nation. Blocko has commercialized enterprise blockchain platforms based on Aergo, an open-source hybrid blockchain protocol powered by the native cryptocurrency AERGO and optimized for hybrid deployments by public and private institutions.

Conclusion: Mending Democracy: The Pikkle Mission Confirmed

Pikkle Aims To Integrate With Broadcasting Systems And Festivals By Q3 Of This Year And Intends To Launch NFTs Next Year

It is evident that public trust in traditional voting mechanisms has all but collapsed over the last decade. Aergo understood this and seeks to rectify public trust through the utilization of Pikkle, on a unified front between Aergo, Blocko and the Korean Broadcasters Association. Pikkle is the first of its kind in blockchain voting, enabling immutable and incorruptible data to be stored on the blockchain, which has the ability to rectify our globally broken electoral system. Blockchain technology can transform democracy, and Pikkle can perpetuate its transcendence.

Digitized Voting Democratized: The Aergo Blockchain Voting Paradigm was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

A Fresh Look at the Business Value of PAM in the Work-From-Anywhere World

As businesses continue their digital transformation journey, managing privileged users has taken on a new and greater importance. Privileged Access Management (PAM) has never been more important, with increased remote working and employees increasingly getting privileged access to data and services.





auth0

What Is Low Code? How Low Code Can Speed Digital Transformation

Learn how a low-code approach can help organizations save time and money on software development.

Global ID

GiD Report#165 — Everything you need to know about new FTC chief Lina Khan

GiD Report#165 — Everything you need to know about new FTC chief Lina Khan

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

This week:

The 20-something Yale student that made antitrust cool
What people are saying
What it means for Big Tech
Everything Lina Khan + Big Tech
What people think about Big Tech platforms
Chart of the week
BONUS: Every platform is Clubhouse now
Stuff happens

1. The 20-something Yale student that made antitrust cool

Photo: Lexey Swall / linamkhan.com

Lina Khan shot to unlikely national recognition after the publication of her seminal work while still at Yale, Amazon’s Antitrust Paradox. An overnight sensation, the paper helped reframe decades of consensus around antitrust frameworks — which at the time were formulated around pricing. Instead, we should focus on structural power, Lina argued.

Here’s the NYTimes back in 2018:

Over 93 heavily footnoted pages, [Lina Khan] presented the case that the company should not get a pass on anticompetitive behavior just because it makes customers happy. Once-robust monopoly laws have been marginalized, Ms. Khan wrote, and consequently Amazon is amassing structural power that lets it exert increasing control over many parts of the economy.
Amazon has so much data on so many customers, it is so willing to forgo profits, it is so aggressive and has so many advantages from its shipping and warehouse infrastructure that it exerts an influence much broader than its market share. It resembles the all-powerful railroads of the Progressive Era, Ms. Khan wrote: “The thousands of retailers and independent businesses that must ride Amazon’s rails to reach market are increasingly dependent on their biggest competitor.”

Lina, at the time:

“As consumers, as users, we love these tech companies,” she said. “But as citizens, as workers, and as entrepreneurs, we recognize that their power is troubling. We need a new framework, a new vocabulary for how to assess and address their dominance.”

Thanks to Lina’s paper and her ability to bring newfound attention to an old challenge, new frameworks and vocabularies have been bubbling below the surface over the last few years.

In that time, our stance on Big Tech has rapidly shifted. From being darlings of capitalism and innovation, we now view the corporate titans that control our digital world with a heightened sense of skepticism.

Our political leaders have taken the hint. The issue of monopoly power is, today, a bipartisan issue — though sometimes for different reasons.

2. What people are saying

Now, 32, Lina Khan is the new chair of the FTC.

Axios:

Why it matters: Khan, an antitrust expert well-known for her ideas for applying competition law to the tech industry, is sure to spook tech platforms.
Khan, 32 years old, is a hero to critics of tech who want to see the government act more aggressively against what they see as anti-competitive behavior from companies like Amazon, Google, Apple and Facebook.

BBC:

Worryingly for Big Tech too, a number of bipartisan bills have also been put forward in Congress to rein in the power of Big Tech. That there are Democrats and Republicans who support antitrust action makes her position even more powerful.
Ms Khan’s appointment is the clearest sign yet that President Biden means business when it comes to clipping the wings of companies like Amazon and Apple.

Wired in a piece titled, The US Government Is Finally Moving at the Speed of Tech:

The result is that we find ourselves living in a world that looks very different from the one we were living in just a few years ago. New antitrust cases against tech giants are popping up left and right, keeping the issue firmly in the public consciousness. The companies are devoting unprecedented sums toward lobbying, advocacy, and advertising to try to avert a crackdown. And in the sharpest break with the past, Congress and the White House are taking concrete steps to restructure markets that have been left to their own devices for two and a half decades.
It’s all so much, so fast, that it’s hard to keep track of the various subplots. The introduction of the five House antitrust bills and the elevation of Khan to FTC chair, for example, look like two separate stories. But they’re really two parts of the same story: Khan was herself the key investigator behind the House antitrust subcommittee’s investigation of Apple, Amazon, Facebook, and Google, begun in 2019. The bills introduced last week are the fruits of that investigation. (While the time between the start of the investigation and the release of legislative proposals has felt like an eternity to those of us who follow this closely, it wouldn’t be bad for a Silicon Valley product launch. It took Amazon three years to bring the Kindle to market.)

Matt Stoller, The Antitrust Revolution Has Found Its Leader:

Khan is something rare in progressive politics, someone with academic credentials and mastery over a dense technical subject, but also connected with a broad-based populist social movement that crosses partisan lines. I can’t tell you how many people I’ve spoken to in business, Republicans as well as Democrats, who talk in reverential tones about Khan. It’s not just that she is an important thinker, it’s that she *understands* what they are going through, the coercive power they are up against. And that’s because she got her start understanding the economy not in a classroom or at a law firm, but as a business journalist, listening to business people and workers facing monopolists.
To call this appointment remarkable understates the point. That Khan is on the commission, with Republican votes, is surprising enough, but for her to be Chair is downright shocking. It’s too soon to know what Khan is going to do in her new role, but her appointment is already sending shock waves in the enforcement community globally. Antitrust policy is run by a small yet international community of lawyers and economists who know each other. In every country, some of them are cheering this move, while others are horrified. But they all know it matters, because as goes the U.S. on antitrust, so goes Europe.
3. What it means for Big Tech

For one, they’re on high alert. Here’s The Information:

And on Tuesday, tech stocks barely flinched at the news. But Khan’s arrival is likely to put Google, Apple, Facebook, Amazon and Microsoft on the defensive in a way the past few years of heated anti-tech rhetoric have not.
That’s because Khan’s appointment to lead the agency, one of two antitrust enforcers in the U.S. along with the Justice Department, poses perhaps the most serious threat yet. Although she is just one person on a five-member commission, as chair she will set the enforcement and policy priorities for the agency. That could very well mean an antitrust case against Amazon, given her law school paper outlining the antitrust problems with the online retailer. A Khan-led FTC also is likely to bring enforcement actions to rein in not just the technology sector, but corporate America writ large.

We’ll be watching to see how Lina Khan intends to leverage her new position of power. But it’s worth noting that antitrust is about a lot more than just trust busting — say, forcing Facebook to break up Instagram and WhatsApp. As Greg has mentioned in the past, the cat’s out of the bag.

Instead, we could see new rules around key issues such as interoperability and portability — including this recently introduced bill (via /m):

This post assesses H.R. 3849, the “Augmenting Compatibility and Competition by Enabling Service Switching (ACCESS) Act of 2021” introduced by Representatives Scanlon (D-PA) and Owens (R-UT). The bill would impose sweeping requirements on specific covered platforms to build new technical interfaces to transfer data to competitors and potential competitors and to enable interoperability between their services.

Here’s Mitja on the subject, comparing it to when telco companies finally allowed phone number portability:

This bill might end Facebook’s monopoly, but one of the interesting and important things to note is interoperability which enables users to switch from one social network to another.
In this case, “switch” could be defined similarly to when the US government forced wireless companies to let you take your phone number with you when you switch carriers — a similar move may be called for on social networks to let you take your identity and social content and transfer it elsewhere.

With Lina, anything’s on the table. But these are early days and she still has her work cut out for her, as WIRED reminds us:

None of this is to say that the movement to break up Big Tech, or rein it in, or whichever bumper sticker you prefer, has achieved victory. The legislative agenda in particular remains a long shot until proven otherwise. While it’s noteworthy that the bills have bipartisan support within the House antitrust subcommittee, they might not have enough to clear the 60-vote threshold in the Senate, even in altered form. House minority leader Kevin McCarthy has already come out in opposition, and even those Senate Republicans who might be sympathetic have yet to publicly support the agenda. (Even if the bills become law, there’s no guarantee they will achieve the desired results.) Ted Cruz, who appears to revel in browbeating social media executives as much as anyone, voted against Khan’s confirmation. So did Mike Lee, the ranking Republican member of the Senate antitrust subcommittee.
But even if legislation stalls, Khan’s overwhelming confirmation — she received support from 21 Republican senators — means change is coming to the FTC. Several experts have argued that the agency has enormous, mostly untapped power to structure markets. Two of those experts now make up two-thirds of the commission’s Democratic majority: Khan and her former boss, Rohit Chopra, who last year published an article arguing for the agency to use its authority to issue rules about unfair competition, rather than relying exclusively on litigation. They now have an opportunity to put that theory into practice.
4. Everything Lina Khan and Big Tech

Lina Khan will chair Federal Trade Commission
Lina Khan: The 32-year-old taking on Big Tech
How to Fight Amazon (Before You Turn 29)
Amazon’s Antitrust Antagonist Has a Breakthrough Idea
Trustbusting Kicks Into Higher Gear
Big Tech’s Uneasy Future
The Antitrust Revolution Has Found Its Leader
Via /m — Tech Regulatory Overhaul Series: The Excess of ACCESS
Big Tech moves in on the creator wars
Exclusive poll: Broad public support for new tech regulations
The Guy at the Center of Facebook’s Misinformation Mess
Lobbyists for Silicon Valley Giants Like Facebook Find Glory Days Are Over

5. What people think about Big Tech platforms

Axios:

By the numbers: A survey of 1,203 likely voters taken May 14–17 finds that 82% of respondents are somewhat or very concerned about children being radicalized by online content and 76% somewhat or very concerned about becoming addicted to online platforms.
Nine in 10 voters are very or somewhat concerned about data breaches, and the poll results showed broad support for new rules for social media firms, with equal support from Republicans and Democrats.
Most notably, the poll shows that a majority of voters, evenly by party, support breaking up tech companies into smaller entities — 57% of polled Democrats (22% “strongly supporting” and 35% “somewhat supporting”) and 57% of Republicans (34% “strongly supporting” and 23% “somewhat supporting”).
6. Chart of the week

WSJ:

“In the last four or five years, the pendulum has swung in an overly dramatic fashion from ‘tech can do no wrong’ to ‘tech can do no right,’ ” said Adam Kovacevich, who spent 12 years as one of Google’s top lobbyists. He now leads a new tech group called Chamber of Progress aimed at wooing back Democrats.
As five House tech bills came together this month, lobbyists for the companies that would be most affected said they were frozen out of the process and that the lawmakers did everything they could to keep drafts of the proposals out of their hands.
“The industry was very much treated as the enemy, and one to be isolated,” one lobbyist said. As a result, the bills are tougher on the industry than if lawmakers had sought their input to craft something that they could support.
7. BONUS: Every social media platform is Clubhouse now

8. Stuff happens

Capturing The Digital Identity Evolution Through A Layered Approach — One World Identity
Mattel joins the NFT frenzy with Hot Wheels digital art.
Via /vs — California unveils system to provide digital COVID-19 vaccine records
Via /carolyn — Stripe goes beyond payments with Stripe Identity to provide AI-based ID verification for transactions and much more — TechCrunch
Money, Monetary Policy, and Bitcoin | Ray Dalio at Consensus 2021
The Bitcoin Industry Responds to ESG Concerns — CoinDesk
“DeFi Protocol Risks: the Paradox of DeFi” by Nic Carter, Linda Jeng :: SSRN
SEC Delays Approval of VanEck Bitcoin ETF, Again — Blockworks
OnlyFans Seeks New Funding at Valuation Above $1 Billion
PayPal Alums Launch Decentralized Payment Network that Connects Fiat and Digital Currencies — Blockworks
Banks Edge Closer to Ethereum 2.0 Staking — CoinDesk
NFT Newsmakers: Fox, Tiger King, NFT Genius and the World Wide Web — Blockworks
The Week Social Audio Went Mainstream (Table); An Interview With YouTubers Colin & Samir
Sports NFTs Are No Slam Dunk
Murky Regulation Leaves Opening for Other Validators
Solana’s Bid to Take On Ethereum
Facebook Previews Its Clubhouse Competitor; Q&A with Gumroad’s Sahil Lavingia
Mark Cuban ‘Hit’ by Titan Crypto Crash As Coin’s Price Falls to Near Zero

GiD Report#165 — Everything you need to know about new FTC chief Lina Khan was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Auth0 Recognized as a 2021 Gartner Peer Insights Customers’ Choice for Access Management

Gartner Peer Insights Customers’ Choice distinction based on feedback and ratings from enterprise users

Transmute TechTalk

Encoding Trust that Travels with Data — A New Product Introduction Case Study Powered by Solutions…

This case study describes supply chain innovation work completed by GS1, GS1 US, GS1 Canada, and Transmute during the second half of 2021.

About GS1: GS1 is a neutral, not-for-profit organisation that develops and maintains the most widely used global standards for efficient business communication. GS1 is best known for the barcode, named by the BBC as one of “the 50 things that made the world economy”. GS1 standards improve the efficiency, safety and visibility of supply chains across physical and digital channels in 25 sectors. GS1’s scale and reach — local Member Organisations in 115 countries, 2 million user companies and 6 billion transactions every day — help ensure that GS1 standards create a common language that supports systems and processes across the globe. Find out more at www.gs1.org

About Transmute: Transmute is a US-based technology company that uses next generation identity and credentialing technologies to provide comprehensive solution design and software to define, digitize, and share verifiable data at scale. Transmute partners with innovative enterprises and governments to build digital trust ecosystems where this data can be securely exchanged across diverse stakeholders and technology vendors. Transmute specializes in trusted information about global supply chain products, actors, and transactions.

Project Overview

In 2021, GS1 and Transmute worked together to test the business and technical feasibility of combining GS1 standards with Verifiable Credentials for product-related claims. There was mutual interest in moving from theory to practice to illustrate the tangible value of leveraging Verifiable Credentials for creating trust that travels with data.

This project was guided by these key questions:

How can GS1 create trust that travels with data by issuing authoritative information about companies and products?
How might GS1 Digital Link, Verifiable Credential, and Decentralized Identifier standards be leveraged together to create business value related to product claims?

Through our collaboration we discovered particular value in the area of new product introduction — or the process of procuring new products as a retailer. As consumers we rarely think about how the things we buy arrived on the physical (or virtual) shelf. Behind the scenes it can take months for retailers and brands to establish relationships, validate products, complete distribution, and ensure quality before making those goods available to consumers. We found multiple opportunities to improve the speed and security of these processes by leveraging Verifiable Credentials in conjunction with GS1 standards and identifiers.

Solutions Design Process: GS1 + Transmute Collaboration

Our companies worked through a step-wise approach to move from broad theory to a targeted application of our combined technologies for new product introduction.

Collaborative working sessions including GS1 experts and the Transmute team were conducted over several months to identify the most feasible and impactful possibilities. We drew from the international standards, supply chain, and technical expertise of GS1 team members. Transmute combined this information with our decentralized technology expertise to propose specific credentials, workflows, and business narratives to address the product claims focus. These solutions are technically supported by the Transmute platform and open-source libraries.

Our group went through multiple iterations as part of early solution design work. This stage included detailed business narratives for a variety of supply chain use cases, workflow mapping, brand interviews, sample credentials, and early prototypes, culminating in a Solution Design Report. This phase of work ultimately drove the narrowed focus to the new product introduction use case as the highest-value opportunity for illustrating immediate industry value.

Next we built a Proof-of-Concept demonstration — including an interactive prototype and technical libraries — to show how these credentials can be issued, exchanged and verified seamlessly across brands. You can see the prototype in action here:

New Product Introduction Description

The following written description provides further detail about the business story portrayed in the video above. If you have already watched the video, you can skip to the subsequent section.

Brand Healthy Tots wants to sell its natural baby food products for the first time with retailer Sell Anything & Everything (SA&E). SA&E will start by selling the Healthy Tots product in its online store, with the future possibility of also selling it in brick and mortar locations.

Typically new product onboarding is a lengthy process, requiring submission of PDF documentation and requesting data from a variety of parties, often leading to duplicated costs and effort. These documents and sites are also very easy to fake, giving false confidence in product claims.

Healthy Tots provides SA&E with verifiable evidence of its GLN and the product GTINs, and the process kicks off instantly. SA&E uses the GS1 Digital Link resolver to immediately discover locations of all related product information. SA&E’s system can then request relevant information from the associated locations, providing evidence where needed that it is authorized to do so. For example, it is able to request third party issued organic certification, allergen information, and product images.
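The resolver step above works because GS1 Digital Link encodes identifiers such as GTINs directly into an ordinary web URI, with path segments keyed by GS1 application identifiers ("01" is the AI for a GTIN). As a rough sketch of reading one of these URIs, here is an illustrative helper (the host is GS1's public resolver and the GTIN is a common documentation example; this is not GS1 reference code):

```python
from typing import Optional
from urllib.parse import urlparse

# "01" is the GS1 application identifier (AI) for a GTIN.
AI_GTIN = "01"

def extract_gtin(digital_link_uri: str) -> Optional[str]:
    """Return the GTIN from a GS1 Digital Link URI, if present."""
    segments = [s for s in urlparse(digital_link_uri).path.split("/") if s]
    # Path segments alternate AI / value: .../01/<gtin>/...
    for ai, value in zip(segments[::2], segments[1::2]):
        if ai == AI_GTIN:
            return value
    return None

print(extract_gtin("https://id.gs1.org/01/09506000134352"))  # 09506000134352
```

A real system would then dereference the resolved links to fetch the associated product information and credentials.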

SA&E receives this information in the form of Verifiable Credentials signed by the original issuing parties, and is able to confidently populate the new product listing. SA&E can do so automatically because the issuer of the information can be identified and confirmed, allowing data they issue to be passed through relatively unknown or untrusted parties like Healthy Tots systems. The SA&E system can even automatically check that the issuing parties possess appropriate accreditations and/or GLNs to make sure such statements about the product are authoritative. This includes going all the way back to GS1 Global as a root of trust.
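The accreditation check described above can be pictured as a walk up a chain of issuers until GS1 Global, the root of trust, is reached. A toy sketch, with invented DIDs and a lookup table standing in for signed accreditation credentials:

```python
# Toy model of the trust chain described above. The DIDs and the
# accreditation table are invented for illustration; a real verifier
# would check credential signatures, not a dictionary.
ROOT = "did:example:gs1-global"

# issuer -> the party that accredited it
ACCREDITED_BY = {
    "did:example:gs1-us": ROOT,
    "did:example:organic-certifier": "did:example:gs1-us",
}

def chains_to_root(issuer: str) -> bool:
    """Walk accreditations upward until GS1 Global (or a dead end)."""
    seen = set()
    while issuer != ROOT:
        if issuer in seen or issuer not in ACCREDITED_BY:
            return False  # cycle or unaccredited issuer
        seen.add(issuer)
        issuer = ACCREDITED_BY[issuer]
    return True

print(chains_to_root("did:example:organic-certifier"))  # True
print(chains_to_root("did:example:unknown-party"))      # False
```

In practice each hop would be a signature verification over an accreditation Verifiable Credential rather than a dictionary lookup.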

Healthy Tots can fill in any additional product details that SA&E needs, and the product can be listed for sale within minutes. Relevant pieces of verifiable information such as certified organic and allergen information can also be passed through to the consumer’s view, providing deeper confidence in the legitimacy of the product and associated accreditations.

Testing Business Benefits

This novel approach to sharing product data can lead to the following benefits:

Reduced administrative burden and a faster timeline from engagement to product on shelves.
Automation for the retailer, working with hundreds or even thousands of vendors and tens of thousands of products. This protects brand reputation and allows elevation to human review when risk is detected.
Unambiguous documentation requirements that meet regulatory needs and further speed up processes.
More broadly, an encoded process for establishing trust in data that persists across systems, data pools, and unknown intermediaries.

One of the next steps for this work is deeper testing of these benefits to measure cost savings, efficiency gains and strategic insight.

Building Trusted Ecosystems with GS1

The demo scenario is underpinned by GS1 as a root of trust in the network — continuing a rich history for GS1 in this role. GS1 licenses and identifiers are, and will continue to be, foundational to trusting products and companies. Combining current practices with the Verifiable Credential, Decentralized Identifier, and GS1 Digital Link standards disambiguates products, builds business reputation for just-in-time engagement, and keeps information up to date.

We believe that institutions like GS1 are critical in paving the foundations for trusted data to move across ecosystems in the future. Ultimately, trust has to flow from one or more sources whose reputation allows businesses to act with confidence. The advantage of the approach taken in this POC is that the distance between verifiers and root-of-trust organizations can be much greater without losing confidence in the information being presented. As long as the chain of trust retains integrity, the possibilities are endless.

Next Steps

This new product introduction demonstration is now being used to test, validate, break, and refine assumptions about how the involved standards and technologies can best be leveraged to generate value for individual businesses and the broader ecosystem.

“GS1 is moving forward with work to extend existing standards in order to better support the evolving needs of our member companies and create a consistent, common path forward for the scale of digital credentials.” — Melanie Nuce, Senior Vice President, Innovation and Partnerships, GS1 US
“Verifiable Credentials are an exciting opportunity for GS1 to extend its identification system, improve data quality, and secure the supply chain in ways never before possible.” — Kevin Dean, Special Projects Consultant, GS1 Canada

Transmute is working with customers to continue designing, building, and growing digital trust ecosystems — from Solution Design work to bringing businesses onto the Transmute Platform to experience issuance, exchange, and verification of Verifiable Credentials integrated with existing business infrastructure. You can reach out to learn more about working with Transmute here.

Our companies look forward to continued collaboration in the future, and expansion of this important work.

Encoding Trust that Travels with Data — A New Product Introduction Case Study Powered by Solutions… was originally published in Transmute on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Ontology and the Coming Metaverse

The once fictional concept of a “Metaverse” is fast becoming a reality — and Ontology is set to be an integral part of this evolving ecosystem.

The term metaverse combines the prefix “meta” (meaning “beyond”) with “universe”. The Metaverse refers to a collective, virtual space encompassing the totality of virtual worlds, augmented realities, and the internet. The concept originates in Snow Crash, a science-fiction novel by Neal Stephenson depicting a futuristic world in which people operate virtual avatars in an online world known as the Metaverse.

The kind of metaverse imagined by Stephenson is yet to be realized in the real world. However, the building blocks of a potential future Metaverse are already beginning to appear. Considering recent developments in science and technology, we can roughly define the Metaverse as: the next generation internet based on XR (including VR, AR, and MR), computer technology, internet communication technology, and social networks. More broadly, the Metaverse is a virtual reality world representing ‘real’ world experiences, such as working, studying, and investing. Users can transition freely between the offline and online realm.

A widely accepted definition of the Metaverse comes from Dave Baszucki, CEO of Roblox. He outlines eight characteristics of the Metaverse, which include: identity, friends, immersion, low latency, diversity, anytime and anywhere, economy and civilization. Based on this outline, we explore the contributions Ontology expects to make to the Metaverse of the future.

Identity

In the era of Big Data, unprecedented amounts of information can be stored and shared. This has enabled the growth of online encyclopedias, video streaming services, and social networks. Such technologies allow information to be accessed globally. But due to their scale, they are vulnerable to attack from bad actors aiming to obtain private information, a commodity that has become increasingly valuable in the digital age.

Centralized organizations such as Facebook and Google control the identities, data, and assets of their users. The accumulated data are used to create targeted advertising campaigns. Platforms owned by the likes of Facebook are ostensibly free to use, with users sacrificing their privacy for the convenience that comes with using them. Most services require users to register a separate account for each platform, meaning each user may operate many accounts simultaneously.

To return to the concept of a Metaverse, how can we develop a system allowing users to migrate between parts of the Metaverse with a consistent identity, whilst ensuring their data and privacy are respected? Ontology’s “ONT ID”, a decentralized identity (DID) framework, and “OScore”, a self-management scoring system, aim to do just that.

ONT ID is a decentralized identity framework based on the W3C decentralized identifier specification, blockchain and cryptography. It can quickly identify and connect people, assets, objects, and events, with the characteristics of decentralization, self-management, privacy protection, security, and usability. In short, ONT ID is not governed by any centralized authority, and allows users to securely manage their identities and data, independently.
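Since ONT ID is built on the W3C decentralized identifier specification, an ONT ID resolves to a DID document listing the keys its owner controls. A minimal, hypothetical example follows (the identifier and key material are placeholders, and Ontology's actual documents may include additional fields):

```python
import json

# Placeholder ONT ID; real identifiers take the form did:ont:<address>,
# but this value and the key material below are made up for illustration.
did = "did:ont:ExampleAddr1111111111111111111"
did_document = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": did,
    "verificationMethod": [{
        "id": did + "#keys-1",
        "type": "EcdsaSecp256r1VerificationKey2019",
        "controller": did,
        "publicKeyHex": "03abc...",  # placeholder, not a real key
    }],
    "authentication": [did + "#keys-1"],
}

print(json.dumps(did_document, indent=2))
```

Any relying party can resolve the DID, fetch this document, and verify a presented signature against a listed key, with no central account provider involved.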

At the same time, Ontology’s DDXF (Distributed Data Exchange Framework), provides a decentralized way to exchange data, which collaborates with ONT ID to improve users’ privacy protection. In addition, OScore generates scores for users based on their on-chain data, which they can authorize third parties to access. Users can also bind their OScore data and off-chain data using ONT ID, in order to seamlessly transition between the on-chain world and the off-chain world.

ONT ID, DDXF and OScore, are all central features of Ontology’s decentralized solutions, designed to allow users to manage their identity, data and assets, independently, in a decentralized way. Based on the interoperability and verifiability of decentralized identity, our solutions not only solve the pain points of social trust, but also enrich the whole trust ecosystem.

Ontology’s ONT ID is equipped for the Metaverse of the future, allowing users to easily traverse different environments within the Metaverse, in addition to enabling migration in and out of the virtual world. Whether online or offline, trust is an integral feature of a functioning society. OScore, the credit review system, bolsters trust and reliability when transacting, benefitting both individuals and organizations alike.

Economy

In the Metaverse economy, users can earn wealth through work, education, investment, and trade. By doing so, the whole ecosystem expands. However, it’s worth noting that this kind of ecosystem is not closed-loop. The Metaverse is not a purely virtual world. It must be closely connected with the real world, in order to realize its potential. Users can earn money not only by working in the real world, but also by playing games in the virtual world. These two kinds of assets are highly liquid and contribute to the overarching ecosystem.

Ontology is not only engaged in research and technological development. It is also dedicated to launching blockchain-based products that build connections between online and offline ecosystems. In the service sector, Ontology has partnered with Kai Yun Delivery, a comprehensive logistics service provider focusing on urban transportation and terminal-end deliveries. Kai Yun provides data attestation solutions, implementing the business scenario of “online consumption + instant delivery”, and empowers the new energy vehicle industry by solving the problem of low efficiency and high cost. In the field of digital assets, Ontology provides technology support for OGQ, a leading digital assets and content platform in South Korea, helping content creators protect their copyrighted material. In the mobile and travel industry, Ontology and Daimler Mobility AG Blockchain Factory, which was established to bring the benefits of blockchain innovation and application to the automotive finance and mobility industry, jointly developed the “Welcome Home” application. The app integrates cutting-edge technologies such as digital identity to bring a new driving experience to car drivers.

Finally, in the field of job recruitment, Ontology has successfully integrated its cross-chain wallet, ONTO Wallet, into global freelancing marketplace leader, Microworkers, providing more economic opportunities to freelancers. Based on information such as resume matching and job intention, freelancers establish labor relations with recruiting companies. After they complete the work, companies can compensate freelancers with ONT or other tokens, according to the contract. Candidates can use ONTO to transfer the received digital assets to their wallet address or personal assets terminals. During this process, ONG serves as the gas fee. This is Ontology’s dual token model. Such a model can be applied online and offline, acting as a circular token in the economic system of Metaverse, helping to enrich the ecosystem.
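The dual-token flow just described can be reduced to a toy model: ONT moves as the payment asset, while each on-chain transfer consumes a small amount of ONG as gas. The flat 0.05 ONG figure comes from this article; the function itself is purely illustrative and not part of any Ontology SDK:

```python
# Toy illustration of Ontology's dual-token model: freelancers are paid
# in ONT, and each transfer burns ONG as the gas fee.
GAS_PER_TX_ONG = 0.05  # per-transaction gas figure quoted in this post

def settle(payments_ont):
    """Total ONT paid out and ONG consumed as gas for those transfers."""
    return {
        "ont_paid": sum(payments_ont),
        "ong_gas": round(GAS_PER_TX_ONG * len(payments_ont), 8),
    }

print(settle([10.0, 25.0, 5.0]))  # {'ont_paid': 40.0, 'ong_gas': 0.15}
```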

In addition to the economic and ID-based solutions mentioned above, the Metaverse can also benefit from blockchain technology on other levels.

Low latency: Relative to the high fees and congestion that have become the norm on Ethereum, transaction confirmation on Ontology is very fast. On Ethereum, the average transaction costs tens of US dollars, with some transactions costing over $100. On the Ontology chain, each transaction consumes only 0.05 ONG (equivalent to $0.05). At the same time, Ontology has been continuously researching the latest Layer 2 technology. Ontology is exploring the possibility of integrating with Ethereum Layer 2 solutions and is committed to overcoming the technical barriers between Ontology and Ethereum. Ontology hopes Ethereum users can enjoy the same high efficiency and low fees as Ontology users. This coincides with the low-latency attribute of the Metaverse. Whether in the real world or the virtual world, transaction speed greatly affects the user experience.

Diversity: On the Ontology chain, users can create multiple ONT IDs to log in on different terminals, based on their specific needs. For example, you can use different ONT IDs to independently manage ONTO and Wing or, if you prefer, you can use the same ONT ID for both. Whether in real life or in the Metaverse, you can generate multiple identities based on your own needs. The diversity of ONT ID is consistent with the diverse nature of the Metaverse. A variety of identities enriches the virtual world’s ecosystem and greatly enhances the sense of immersion.

Anytime and Anywhere: ONTO is a decentralized identity mobile terminal. It provides an all-in-one service for management of decentralized identities and data. Wherever you are, you can manage your assets, data, and identities in the palm of your hand. This kind of portability is indispensable in the Metaverse.

Civilization: The Metaverse originates in the human imagination. Blockchain is the underlying technology that will enable the Metaverse to become reality.
Since its inception, Ontology has been developing blockchain technologies such as DID and data management tools, aiming to contribute to the global advancement of science and technology. At the same time, Ontology also hopes to inspire more people to participate in the blockchain revolution by developing new technologies and fostering a global community.

Ontology is committed to accelerating the arrival of the coming Metaverse via the research and development of blockchain and further upgrades to its decentralized identity and data solutions.

Want more Ontology?

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

Ontology and the Coming Metaverse was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Affinidi

Affinidi Partners With Sudan’s Tech for Hack the Mountains 2.0


Innovation is the cornerstone of Affinidi’s Global Developer Ecosystem (GDE) unit and we constantly strive to nurture, develop, and empower new ideas.

We partner with developers, solopreneurs, startups, non-profit organizations, and others, as a part of our endeavor to build new applications in the Self-Sovereign Identity (SSI) space. Through such partnerships, we also strive to educate people on the benefits of SSI and enhance its overall adoption world over.

In keeping with this commitment, Affinidi is proud to be the Diamond Partner for “Hack the Mountains 2.0”, a 36-hour virtual hackathon hosted on the 26th and 27th of June 2021 by SUDAN’S TECH, the first-ever tech community from Jammu and Kashmir.

What Do We Offer for the Participants?

The top 10 projects will be awarded a cash prize worth 10,000 INR, paid in ETH. Submitted projects will also be featured in our blog posts and social media announcements, and a Winner Verifiable Credential will be awarded as well.

All other submissions will have the opportunity to apply for internships and full-time roles at Affinidi. We also provide mentorship support for teams that would like to take their idea further.

What’s Your Challenge?

Using Affinidi’s technology stack, build a Proof of Concept (PoC) or integrate it into an application or idea you’re already working on. Your application should identify the Issuer(s), Verifier(s), and Holder(s) within its framework.

Your submission should be a Proof of Concept (PoC) application demonstrating a Verifiable Credential-based use case that uses Affinidi’s APIs and/or SDK. It should cover the issuance and/or verification of the VC and must include a working demo.
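As a concrete illustration of what a VC-based PoC works with, here is a minimal sketch of a credential payload following the public W3C Verifiable Credentials Data Model. The DIDs and degree claim below are hypothetical placeholders, and the `proof` section is omitted entirely: in a real submission the credential would be issued and signed through Affinidi’s APIs or SDK rather than constructed by hand.

```python
# Minimal Verifiable Credential payload (W3C VC Data Model shape).
# All identifiers and claims here are hypothetical placeholders;
# a real VC would also carry a cryptographic "proof" section added
# at issuance time (here via Affinidi's tooling).
import json

def build_credential(issuer_did: str, holder_did: str, claim: dict) -> dict:
    return {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential"],
        "issuer": issuer_did,
        "issuanceDate": "2021-06-26T00:00:00Z",
        "credentialSubject": {"id": holder_did, **claim},
    }

vc = build_credential(
    "did:example:issuer-university",   # hypothetical Issuer DID
    "did:example:holder-alice",        # hypothetical Holder DID
    {"degree": "BSc Computer Science"},
)
print(json.dumps(vc, indent=2))
```

Whatever use case you pick, the Issuer, Holder, and Verifier roles from the challenge map onto the `issuer`, `credentialSubject.id`, and the party that later checks the proof, respectively.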

Resources

Here are some resources that can come in handy.

101 Articles on Verifiable Credentials
Verifiable Credentials?
FAQs on Self-Sovereign Identity and Verifiable Credentials
Decentralized Identifiers

Use Cases and Ideas

Here’s a bunch of ideas you can use as a starting point for this challenge.

25+ PoCs for Verifiable Credentials
25+ Use Cases for VCs

Technical Documentation

To learn more about Affinidi’s APIs, visit the Affinidi API page. For SDKs, visit our GitHub repository.

And here’s a YouTube video that shows how you can build a verifiable credential-based application in just one evening!

If you have any questions, reach out to us on Discord; we’re happy to help.

Follow us on LinkedIn, Facebook, or Twitter. You can also join our mailing list to stay on top of interesting developments in this space.

Good luck to all the participants!

Affinidi Partners With Sudan’s Tech for Hack the Mountains 2.0 was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Access Management Trends Towards a Zero-Trust Paradigm


Organizations around the world have been rapidly modernizing their access management infrastructures in response to increased cyber-attacks and data breaches, enactment of security and privacy regulations, and a shift to remote working.

Access management modernization is a rapid shift away from insecure passwords, which time and again allow criminals to gain access to corporate networks and data. A recent survey on the Psychology of Passwords found that 59% of respondents reuse the same password at home and in the office. The main reason behind this dangerous habit is the fear of forgetting login information.

It is therefore time for businesses to evolve their approach to identity proofing to mitigate the risks of an increasingly remote workforce. As the corporate boundaries disintegrate, organizations require a new defensive playbook to address emerging business risk and remain secure.

A Zero-Trust approach to risk mitigation

When the concept of Zero-Trust was first conceived, it was about zero-trust networks. Today, however, corporate users and partners access data through a plethora of networks: corporate, private, and mobile. It is therefore no longer effective to talk about securing access networks; in fact, it is almost impossible to do so.

The security perimeter of corporate assets had to shift to another frontline. The evolution of Zero-Trust is about securing the access points to apps and data. Since everyone is literally an outsider in cloud environments, there is a need to “never trust, always verify” the identity of the individual or the device requesting access to data and apps.

While Zero-Trust is not a single technology solution, it is a paradigm that can help businesses securely support access to a “constellation” of apps and data from anywhere. By validating the identity of the individual or device at every access point, organizations can mitigate the threats introduced by the proliferation of multi-cloud and hybrid computing environments and secure their digital transformation initiatives.

Access management trends towards a Zero-Trust future

As technology evolves, organizations can evolve their access security to meet the increasing demands of cloud computing and pave their path towards a Zero-Trust culture. Businesses need to be aware of the various emerging trends in access management, evaluate them, and tailor them to their business needs.

According to many security experts, the following trends will shape the access management and security ecosystem:

Identity-first security - Now that organizations operate fully (or mostly) remotely, this trend has become vital to address. The result of these technical and cultural shifts is that “identity-first security” now represents the way all information workers will function, regardless of whether they are remote or at their office.

Managing machine identities as a critical security capability - As digital transformation progresses, organizations are experiencing increased numbers of non-human entities: containers, apps, services, mobile devices, and IoT devices. Managing machine identities has become a vital part of the security strategy. As the number of devices increases, establishing an enterprise-wide strategy for managing machine identities, certificates, and secrets will enable the organization to better secure digital transformation.

Shared security model in the cloud will be the defining factor for managing access requests. The old security model of “inside means trusted” and “outside means untrusted” has been broken since most digital assets and devices are outside the enterprise, as are most identities. The shared security model provides a more integrated, scalable, flexible and reliable approach to digital asset access control than traditional security perimeter controls.

Organizations are already opting for these integrated, cloud agnostic access security platforms which embed strong access management controls, such as verification and authentication of users for privileged account access via single sign-on (SSO) and multi-factor authentication (MFA). These features add a multitude of risk mitigation benefits and create sources for contextual adaptive access controls.

The Thales SafeNet Trusted Access platform can help businesses today reach a Zero-Trust access management future. SafeNet Trusted Access has been recognized by KuppingerCole as “a market leading Enterprise Authentication and access management solution. It offers a variety of hard and soft token solutions as well as FIDO compliant authenticators. It is highly scalable and built for environments that require the highest security levels.”

The report highlights, as strong points and benefits of SafeNet Trusted Access, the compatibility with FIDO 2.0 protocol, a good out-of-the-box selection of connectors for a variety of apps, integrations with most IAM and IDaaS products, strong anti-tampering mechanisms, and machine learning risk detection models.

You can find out how a Zero-Trust approach and SafeNet Trusted Access can help you establish strong access security by watching this podcast.


Ontology

Ontology Weekly Report (July 1–7)


This week marked the official start of the second half of the year — where does the time go? We hope the remainder of the year will provide as much success and prosperity as the previous six months have. This month we are excited to unveil our new ONTO campaign. You will find more details on this below.

This week’s topics:

MainNet 2-year Anniversary / FBG financial product / ont.io / Update from bloXroute

Back-end
- New token model code upgrade complete
- Decentralized governance transition complete

Product Development
ONTO v3.0.0
- 50 BTC have been collected for the first FBG financial product on ONTO within the space of two hours.
- The special NFT campaign for Ontology’s MainNet 2-year anniversary & July NFT campaign is underway, providing millions of NFTs for Ontology’s global users.

Explorer
- v2.3.2 launched
- Official website ont.io v4.0 launched

dApp
- 62 dApps live on Ontology
- 5,447,383 transactions since genesis block
- 26,558 transactions in the past week

Bounty Program
- Seeking Python SDK community developers
- 1 new application for Technical Documentation Translation

Community Growth
- We welcomed 1,035 new members across Ontology’s Vietnamese, Bengali, Russian, and Dutch communities.

Newly Released
- The Ontology 2.0 FAQ has been published, answering all of your questions about Ontology’s New Token Model, SAGA, Add-on Store, and Open Knowledge Graphs (OpenKG). Read it here for a selection of the most interesting questions you asked, along with our answers.
- Ontology’s integration with bloXroute is under special performance testing in a cloud environment. bloXroute has completed work on the Cloud-API transaction stream service. The Cloud-API transaction stream service and account registration will be our main goal for July.

Global Events
- Ontology’s 2-year anniversary AMA was a great success. Jun LI, Founder of Ontology, answered questions from the Ontology Global Community on Ontology 2.0 and our future plans. Looking to the future, Ontology will focus on digital identity and data, as well as finding opportunities to boost its B2B and B2C business. Meanwhile, the NFT campaign is ongoing; users who have previously created an ONT ID will receive the whole series of NFTs.
- Jun LI, Founder of Ontology, was invited to the second Chain Plus Blockchain New Finance Summit 2020, where he delivered a speech on blockchain-based trust data and social environment, as well as introducing ONT ID, ONTO, and SAGA in detail.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (July 1–7) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

Personal Identity and the Future of Digital Interactions


Now is the perfect time to rethink how we engage with others digitally. Online interactions are increasingly prevalent and our personal data is being spread across numerous companies’ applications and services like breadcrumbs dropped on a hike. All the while, regulations and technical requirements associated with collecting and storing that data are growing, causing businesses to reexamine their identity and access management systems and evolve to keep pace. The game is changing—and it’s time people have more control over their own digital selves.
 

Monday, 21. June 2021

1Kosmos BlockID

Introducing the 1Kosmos Appless Feature

The Old Way: Friction Filled Authentication

Bloom

Bloom donates WACI to the Decentralized Identity Foundation (DIF)


Bloom makes its first donation to the Decentralized Identity Foundation (DIF), Claims and Credentials Working Group

The DIF and Working Groups

The DIF is a Joint Development Foundation project that facilitates IPR-protected co-development among large and small industry players in the decentralized identity space. It produces open-source code, pre-standard specifications, market collaboration, and non-technical documentation.

Working Groups (WG) are at the core of DIF. These groups are scoped by functional areas and are designed to drive emerging standard specifications backed up by open-source code.

Bloom is a member of the Claims and Credentials WG, which focuses on standards and technology that create, exchange, and verify claims and credentials in a decentralized identity ecosystem. The Claims and Credentials WG hosts specifications for projects like the Presentation Exchange and Credential Manifest. Although these specifications outline the data models for requesting and issuing Claims, they don't specify how the data is sent between the holder and issuer/verifier. Enter WACI!

Wallet And Credential Interactions (WACI)

WACI was introduced at the annual IIW32 workshop to a very warm response. Its goal is to specify how interactions happen between a wallet and a Relying Party (RP), such as an issuer or a verifier. At its core, WACI is a handshake of JWTs: the RP signs a JWT that is given to the wallet, and the wallet signs another JWT containing the initial token as a "challenge" claim. This allows the wallet to prove ownership of its DID.
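The handshake described above can be sketched in a few lines. To keep the sketch self-contained and runnable, the stand-in below uses HMAC-signed tokens with shared secrets instead of real JWTs signed by DID-bound keys, and all the DIDs and secrets are hypothetical; the flow itself (RP token in, wallet token echoing it as a "challenge" claim, RP verification) mirrors the WACI handshake.

```python
# Sketch of the WACI challenge/response flow (NOT real JWTs: HMAC-signed
# tokens over shared secrets stand in for DID-keyed JWT signatures).
import base64, hashlib, hmac, json

def sign_token(payload: dict, key: bytes) -> str:
    body = base64.urlsafe_b64encode(json.dumps(payload, sort_keys=True).encode())
    sig = hmac.new(key, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def verify_token(token: str, key: bytes) -> dict:
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(key, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    return json.loads(base64.urlsafe_b64decode(body))

rp_key, wallet_key = b"rp-secret", b"wallet-secret"  # hypothetical keys

# 1. The Relying Party signs a token and hands it to the wallet.
rp_token = sign_token({"iss": "did:example:rp", "nonce": "abc123"}, rp_key)

# 2. The wallet signs its own token embedding the RP token as a
#    "challenge" claim, proving control of its key (in real WACI, its DID).
wallet_token = sign_token(
    {"iss": "did:example:wallet", "challenge": rp_token}, wallet_key
)

# 3. The RP verifies the wallet token, then checks the echoed challenge.
response = verify_token(wallet_token, wallet_key)
assert verify_token(response["challenge"], rp_key)["nonce"] == "abc123"
print("handshake verified for", response["iss"])
```

In the actual specification, each token is a JWT and verification resolves the counterparty's DID document to find the signing key, rather than relying on pre-shared secrets.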

The easiest way to see the benefit of WACI is passwordless login: a way to log into an application with Verifiable Credential (VC) based authentication that cannot be faked.

A picture (demo video in our case) is worth a thousand words:

Next Steps for Bloom and C&C WG

1. Achieve a v1 release of the WACI-Presentation-Exchange specification
2. Fold back in lessons and extensions from #1, then flesh out the remaining sections to achieve feature-completeness for v1 of the WACI specification and sample implementation
3. Register a custom URI scheme (waci://)

Elliptic

Crypto Enforcement Actions by US Regulators Reach $2.5 Billion

Contrary to the widely-held belief that the cryptoasset industry is unregulated, US regulators are increasingly imposing significant financial penalties on crypto businesses - for fraud, breaches of AML regulations, offering unregistered securities and sanctions violations.



Dark Matter Labs

DM Note #3

Lessons from place-based strategic innovation in Europe, Asia and Africa
This is the third in a series of DM notes that we will write about the insights from our work on the ground, which follows internal learning sessions called the DM Downloads that are organized every two weeks or so. The aim is to make our practice more legible, for us as well as for you.
DM Note #3 is a preview of our international portfolio around place-based strategic innovation, from the city to country levels, across Europe, Asia and Africa.
In theory — What do we mean by place-based strategic innovation?

Innovation involves something new coming into existence. We see place-based strategic innovation as something new coming into existence with the purpose of interacting with, and actively changing, the systems that shape a place.

In theory, we understand place-based strategic innovation to manifest with a focus on:

Learning / Capability & Capacity Building / Potential for Systems Change
Grand Challenge / Complex Problem
Traction / Tangibility / Visibility
Connectivity / Cross-cutting
Systemic Nature / Transformational
Catalytic / Multiple Horizons / Multiple Impacts
Stakeholder Diversity / Deep Engagement / Movement Building

What does it look like at the city level?

To pursue transitions on the scale and depth required to address deep-rooted challenges faced by cities, we must recognise that urban environments are made up of a complex web of dynamic and adaptive systems and act upon them accordingly.

From single-point solutions > to systems innovation and transformation
From portfolio of projects > to portfolios of strategic experiments
From single-lever experimentation > to multi-lever ‘full-stack’ experimentation

An urban transition portfolio works to accelerate and deepen change in a city by marking out new pathways at the forefront of action. The insights and actionable intelligence that emerge help inform the strategic decision making and investment required.

Strategic innovation in cities

There is no single, unifying theory of change that guides us. Our approach combines at least the following:

Product innovation — identifying discrete problems and developing products as solutions to test out.

Capability building — identifying outdated approaches / ways of working that aren’t conducive to strategic innovation and seeding alternative habits, rituals & norms in strategically significant people and organisations.

Movement-building — developing new narratives that galvanise ‘coalitions of the willing’ and form the bedrock of legitimacy for change.

Overview of a few initiatives in that domain

Healthy, Clean Cities — Deep Demonstration across Europe

While political leaders across the continent are setting ambitious targets to achieve net-zero emissions, how to deliver on this at the pace and scale necessary remains unknown.

Dark Matter Labs has been supporting EIT Climate-KIC to work with the most ambitious mayors and municipalities in Europe to co-create a portfolio of joined-up strategic experiments that provide them with new knowledge, innovations and capabilities that can aid them in their Climate Action Plans — from mobility to waste to energy to health to the built environment — and help other cities to navigate the great transitions required for ‘Transformation in Time’. For more information on this initiative, click here.

NextGenCities Program: Zimbabwe + Angola

As part of the 2019 Harare Innovation Days (HID), Dark Matter Labs worked with UNDP country office staff, ‘strategic urban risk holders’ and edge experimenters from 23 countries across Africa under the banner of #NextGenCities to build shared understanding of the interdependencies and complexities facing Africa’s diverse urban realities, with clear clusters of common challenges — such as urban economies and informality, water management, the jobs deficit and waste services — you can read more about the context of HID in a blog published before the event.

We then developed a #NextGenCities pilot program that sought to build capabilities for designing mission-based strategic portfolios of interventions to tackle wicked issues and dynamics that COVID merely surfaced and accelerated. As Edgar Pieterse of the African Centre for Cities in Cape Town puts it, African cities desperately need “a more coordinated, sequenced and integrated approach to infrastructure planning, investment, delivery, maintenance and repair.” Two countries, Zimbabwe and Angola, qualified for the program, and we spent four months with teams from UNDP’s Accelerator Labs working on food security and urban markets in Zimbabwe and Angola respectively. For more information on this initiative, click here, here and here.

North Macedonia National Development Strategy

Dark Matter Labs were commissioned to develop the methodology for the 2021–2041 National Development Strategy (NDS) for North Macedonia.

We used a methodological and governance framework that could provide a long-term vision and roadmap for the country’s economic, social, cultural, and environmental development, as well as for a human, machine, and ecological transition at a national scale.

A primary focus has been to create the capabilities and societal resilience to deal with increasing uncertainty and risk in our interconnected world, as well as deep trends such as climate change and digital transformation. The NDS is also being prepared against a backdrop where European countries are dealing with the health, economic and social crisis presented by the Covid-19 pandemic to lay the foundations to “build back better”, in a way that delivers a healthy future for everyone.

Building System Intelligence in Bhutan, Philippines and India

Dark Matter Labs has worked with the UNDP Asia-Pacific Regional Innovation Centre and the UNDP offices in Bhutan, the Philippines, and India, looking at youth unemployment and food security issues. Work on these two systemic challenges carries a clear intention of building system intelligence: the sense-making and transitional capacity to understand the interdependent nature of local/national/global problems and to create governance opportunities for shared learning beyond the boundaries of the ‘nation’. The project aims to create a shared, cross-cutting portfolio of interventions and transnational governance innovation for both India and the Philippines, rather than making comparisons, as food systems and crises cannot be solved by one nation in isolation. You can read more about one of these initiatives in the following blog: Breaking the Silos Through Participatory Systems Mapping.

In practice — Key lessons

Strategic innovation cannot be the starting point, it needs to sit within a context. People and organisations are more willing to ‘lean into’ strategic innovation if it builds upon local identities, pre-existing strategies, strength and assets.

Strategic innovation cannot be held by only one stakeholder. Designing and deploying strategic innovation in a meaningful way requires deliberate ecosystems of technical expertise and local partners / brokers. This entails working with a consortium of partners, which presents its own challenges. The sheer amount of coordination and management of politics and personalities cannot be underestimated.

Strategic innovation is political. Political realities dictate the permission for strategic innovation, the power dynamics that shape strategic innovation and the pace by which strategic innovation can be pursued in a place. It dictates the pace and orientation of change.

For innovation to be strategic, we need to understand the broader picture. Reframing the problem / opportunity space can help unleash strategic innovation — sowing the seeds of new mental models that flourish further down the line. It is no longer about a single problem or solution.

For problem owners, strategic innovation is as much about organisation & culture change as it is about problems and solutions. Understanding how each organisation needs to change in order to be more systemic & strategic in the way they approach innovation is a real journey that takes time.

Strategic innovation at a national level can help test many approaches against machineries of policy making and programme delivery across the whole spectrum of national priorities, cascaded down across regions to municipalities. It helps create various adaptations of the same idea at different levels.

Strategic innovation at the national scale means we need to be much more strategic, political, and adaptable in our own language, and more practical and proactive in our strategies. For instance, we had to create a glossary defining our terms, so that when these new concepts were translated into local languages the value of the innovation would not be lost.

In strategic innovation, there is a delicate balance between preserving the purity of ideas and gaining the greater societal and political acceptance that brings a framework for implementation (and often funding to do the work). On the one hand, we want to be precise and creative in the language we use to open up new ways of seeing; at the same time, we want to ensure that the words we use don’t alienate the people we work with, or the people they serve.

Get in touch

If you enjoyed this DM Note, please also read DM Note #1 and #2 here, follow us on Medium for more to come and “clap” the article to show appreciation. And please feel free to reach out and share your thoughts on this as we continue to grow a community of interest / practice / impact around the world.

Tom Beresford
Dark Matter Labs UK
tom@darkmatterlabs.org

Eunji Kang
Dark Matter Labs Korea
eunji@darkmatterlabs.org

Zehra Zaidi
Dark Matter Labs UK
zehra@darkmatterlabs.org

DM Note #3 was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ocean Protocol

Ocean Market will introduce ratings & reviews thanks to UTU’s trust infrastructure


UTU’s AI-driven trust infrastructure to build comprehensive profiles on data providers, data consumers and data sets in Ocean Market

Ocean Protocol is teaming up with UTU to launch their Web3 trust infrastructure for a safer internet on Ocean Market and to build mechanisms to share data between the two platforms.

UTU’s vision is to make the internet a safer, more trusted place to gather, share, work, and trade. Its AI-driven trust infrastructure can ingest the activity of Ocean Market data providers, data consumers, datasets, and other entities to build comprehensive profiles. Additionally, Ocean Market users will be able to rate and review each of these entities. All of this data will be dynamically analyzed to create a trust score for each entity. It uses blockchain to reward trustworthy actors and to ensure reviews and ratings cannot be manipulated.

There are four components that UTU will build with Ocean, on both Ethereum Mainnet and Polygon:

A standalone web app that rates Ocean data assets and accounts;
A widget that incorporates ratings directly into the Ocean Market;
A curation engine that recommends the most relevant data assets and accounts to Ocean Market users;
Updates to all of the above once Ocean v4 is released.

This partnership will lead to increased platform trust and transparency, which will reduce friction in transactions, help users more easily discover quality datasets from trusted data providers to either use for data analysis or stake, avoid pool rug pulls and other malicious activity, and more. Fees for software development and UTU API usage will be funded by the Ocean Protocol Foundation, and the bulk of the work will be completed by Q4 2021. We will also explore ways to share data between our platforms in a privacy-preserving manner.

Ocean Protocol’s mission is all about creating a trustless system for sharing data, so we’re very excited about our joint venture with UTU. It’ll allow good actors on Ocean Market to display the trust they’ve earnt in the marketplace, which in turn will incentivize others to trust those actors again and again — a virtuous cycle of transparency and trust without centralized intermediaries.
– Manan Patel, Growth Accelerator at Ocean Protocol

The UTU protocol compensates users for providing accurate ratings and reviews, and for sharing their data with the UTU platform. The collaboration will make it easier for UTU users to further monetize their data on Ocean Market by becoming a data provider; users will retain full control over what data to share with either platform, what data they would like to monetize, and how it should be used.

UTU and Ocean Protocol share the beliefs that 1) data is a new asset class and 2) data privacy is of utmost importance. We’ve studied their data models in the past and are impressed by the ecosystem that they’ve built. This is why we’re so excited to partner with them to increase trust in their data marketplace and to facilitate data sharing between our platforms in a privacy-preserving way. This partnership is particularly exciting for us as I remember Trent giving a small talk to our incubator cohort at Zeroth AI back in 2018 and thinking that an UTU-Ocean collaboration would be a massive validation and use case for our model — 3 years to come full circle on this.
– Jason Eisen, Co-Founder and CEO of UTU

We will explore additional ways to share data between our protocols:

Putting publicly-available, on-chain UTU data (such as endorsements, transactions, and more) on Ocean Market.
Purchasing datasets on Ocean Market that may be helpful to the UTU protocol.
Analysing who provided and consumed which datasets, and who provided liquidity on which datasets on Ocean (using the staking/curation mechanism) to derive UTU recommendations and feedback.

About Ocean Protocol

Ocean Protocol’s mission is to kickstart a new Data Economy that reaches the world, giving power back to data owners and enabling people to capture value from data to better our world.

Data is a new asset class; Ocean Protocol unlocks its value. Data owners and consumers use the Ocean Market app to publish, discover, and consume data assets in a secure, privacy-preserving fashion.

Ocean datatokens turn data into data assets. This enables data wallets, data exchanges, and data co-ops by leveraging crypto wallets, exchanges, and other DeFi tools. Projects use Ocean libraries and OCEAN in their own apps to help drive the new Data Economy.

The Ocean token is used to stake on data, to govern Ocean Protocol’s community funding, and to buy & sell data. Its supply is disbursed over time to drive near-term growth and long-term sustainability. OCEAN is designed to increase with a rise in usage volume.

Visit oceanprotocol.com to find out more.

Twitter | LinkedIn | Blockfolio | Blog | YouTube | Reddit | Telegram Chat | Discord

About UTU

UTU is building the trust infrastructure of the internet to help businesses and consumers engage and transact in an easier, safer, and more trustworthy way.

Our AI-based API products collect and analyze data to create trust signals and personalized recommendations that help consumers and businesses make the best decisions for their situation. And the UTU blockchain protocol rewards users for trustworthy actions and compensates them for sharing their data while protecting their privacy.

UTU changes the economics of trust, ensures trust can’t be bought or manipulated, and leverages data to help people make better decisions.

We are proudly based in Nairobi, Kenya.

Learn more about us on our website, Twitter, Telegram, LinkedIn, Reddit, YouTube, and Facebook.

Ocean Market will introduce ratings & reviews thanks to UTU’s trust infrastructure was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


ValidatedID

Digital signatures, a fast track to digital transformation in the real estate sector

The latest real estate trend reports show how the pandemic has accelerated the use of technology and the implementation of trends such as teleworking and digitisation of processes. Find out how digital signatures are revolutionising the industry.

PingTalk

Passwordless Authentication - What does passwordless really mean?


As a security professional, you probably hear a lot of talk about the need to maximize security while minimizing user friction. You may also be familiar with the term “passwordless authentication.” Passwordless authentication goes hand in hand with the concept of balancing security and user experience and is increasingly mentioned as a solution to achieving this balance. 

Yet, the idea of passwordless authentication creates more questions than answers for many. What does it really mean to authenticate without passwords? Also, why should you want to and how does it work? Read on to gain a deeper understanding of the what, why and how of removing passwords to improve both user experience and your overall security.

 


Identosphere Identity Highlights

Identosphere #37 • Apple going Passwordless • Hashgraph joins W3C • Open Badges as VCs

A curated collection of upcoming events, news, updates, standards work, tweets and blog posts in decentralized identity and SSI
Welcome and Thanks to our Patrons! Support this publication at Patreon.com — Get Patron only content!!

Our quarterly WG mailing list review is underway and will be delivered to patrons in the next few days!!!

Read previous issues and Subscribe @ https://newsletter.identosphere.net

Coming up

Identiverse 2021 • 6/21-23 (Denver)

    Session that looks good: Why Isn’t Identity Easy?

Digital Twins and Self-Sovereign Identity: Build the Next Generation of Simulation with Privacy Preservation • 6/24, Jim St.Clair

1) The challenges of digital identity and ICAM in IoT and digital twins

2) How to apply SSI and decentralized identity with IoT and digital twins

3) How the Sovrin Foundation is advancing SSI in IoT for industry use

An introduction to the Global COVID Certificate Network • 6/23

SSI/Identity Hackathon in Latin America • 6/28-7/16

EEMA Annual Conference • 6/29-7/1

The 34th annual conference of the European Association for e-Identity and Security focuses on ‘Securing Trust in the New Digital Reality’. (Kaliya is speaking)

Identity Report, Highlighting the Most Pervasive Threats to Digital Identities • June 24

The complete Auth0 State of Secure Identity Report, which includes additional key findings and recommendations on how organizations can improve their identity security posture, can be downloaded here.  

Blockchain, Self-Sovereign Identity (SSI) and its Use in the Real Estate Industry John Dean Markunas with Mr. Jacques Bikoundou • 6/24

Meme of the Month @erwin tweets:

Self Sovereign Identity is an integral part of WEB3.0 Be free, own your identity! #KILT #YourWildestMemes #WEB3

Good Health Pass Blueprint for a Digital Health Pass Kuppinger Cole

Binding an identity to a Verifiable Credential remains valid beyond the point of verification when a real-time biometric data point can be matched with one that was logged at the point of verification

Good Health Pass Blueprint and the Global COVID Credentials Initiative by LFPH, presented at the DIF Interop Working Group (High Level)

5 Reasons to Use an Identity Wallet Affinidi

Identity wallets are an integral part of the future, as they come with a ton of benefits geared toward next-gen online security.

DID 101: A Brief Introduction to What Makes Ontology Special

OScore is an independent on-chain reputation system, generated using on-chain data including your ONT ID, engagements, assets, and credentials.

Digital Identity Wallet: A place for your self-sovereign identity SSI Ambassador

This article explains what a wallet is, how it works and how you can use it for managing your digital identity

New to the topic of self-sovereign identity? SSIAmbassador

No problem, there are several beginner #guides, which you can use to get familiar with the new standard for digital #identity.

To Succeed In Decentralizing Digital Identity, Focus On Relationships First Forbes

2020 forced the world of identity to step up, and it has. There's much more innovation on the horizon to look forward to.

^^^ @UbikomProject tweets: “Nice overview, somewhat questionable assumptions”

Standards Work

Verifiable Claim Protocol Ontology

This isn’t new, but it’s new to us, and we thought our readers might appreciate it, in case you have also wondered about the nuts and bolts behind OntID

Open Badges as Verifiable Credentials

In the W3C VC-EDU call on June 7, 2021 we discussed Open Badges asserted as W3C Verifiable Credentials (VCs). This call began the public discussion of Open Badges as Native VCs (potentially as Open Badges 3.0) to inform the IMS Open Badges Working Group. Why are we discussing this? Why does it matter? How will it work?  

Hedera Hashgraph Joins World Wide Web Consortium (W3C)

We welcome Hedera as a contributing member to the W3C DID Working Group and congratulate their team for reaching this milestone of a published implementation of the latest W3C DID Identifiers v1.0 draft

Verifiable Credentials Aren’t Credentials. And They’re Not Verifiable In the Way You Might Think Timothy Ruff

VCs can carry any sort of data payload, and that isn’t just a good thing, it’s a great one. Part two of my container series covers how such fluid data portability could economically affect cyberspace to a degree comparable to how shipping containers affected global trade.

DIF FAQ

This is a general-purpose collection of frequently-asked questions, initiated/straw-manned by some regular contributors to the DIF Interoperability Working Group, DIF Staff, and volunteers organized on DIF’s member Slack.

Literature

@BISconf:

supposed to be published this week

Exploring Potential Impacts of Self-Sovereign Identity on #SmartService Systems: An Analysis of #ElectricVehicle Charging Services by Daniel Richter and Jürgen Anke from @dresdenhtw

MyData  Revolutionising healthy ageing

Mydex’s role will be to provide the data sharing infrastructure to enable individuals and service providers to safely and efficiently share the right data at the right times, in ways that protect individuals’ privacy, put them in control of their data at all times, and enable two-way engagement and feedback throughout the project.

Company News

Evernym Selected as a 2021 Technology Pioneer by the World Economic Forum

The 2021 cohort of Tech Pioneers includes many future headline-makers at the forefront of their industries. These companies show great potential to not only shake up their industries but offer real solutions to global problems.

Introducing Veramo

how it evolved from the challenges faced with uPort’s libraries. In this next series of articles we will give Veramo a proper introduction and answer some of the basics: why it exists and what it does, followed by articles describing the architecture in more detail, and how to build applications using Veramo. 

Verifiable Credentials with Auth0 and MATTR

How to issue Verifiable Credentials from Auth0 user data using MATTR's Auth0 Marketplace Integration

The Verification Sessions API

Use the Verification Session API to securely collect information and perform verification checks. This API tracks a verification, from initial creation through the entire verification process, and shows verification results upon completion.

MemberPass Digital ID can help Reduce Expenses and Build Member Trust

It’s never been easy to run a credit union, especially when you get whacked from all sides. Regulations change, members always seem to want access to another shiny new technology gadget, and financial fraud continues to be a threat.

Certs: Now Available as a Self-Guided Demo Dock

Curious about what you can build on Dock? Try for yourself!

Public Sector

The LEI: A Swiss Army Knife for the World’s Digital Economy

The Global LEI System is the only open, commercially neutral, and regulatory endorsed system capable of establishing digitized trust between all legal entities everywhere. It was established as a public good

Recognizing Digital Identity as a National Issue Ping

we dove into creating a centralized and holistic approach to protecting and regulating identity in the United States and the specifics of why digital identity and cybersecurity are national issues that the private sectors simply cannot tackle on their own.

The Trust Economy in a Future New Zealand DigitalID NZ

My interest was first piqued when I came across three videos on YouTube from Rachel Botsman, Jordan Peterson and Philipp Kristian Diekhöner.

Trust has always been at the centre of society overall and commerce in particular

A Collaborative Approach to Meeting the Challenges in President Biden’s Executive Order on Improving US Cybersecurity

One key aspect outlined in Section 4 of the Executive Order (EO) is securing the software supply chain.

eSSIF-Lab Glossary

The keystone foundation companion to the Blockchain is Digital Identity DigitalScot

the core mechanics for forming these ecosystems through sharing Identity data between collaborating partners, exemplified by initiatives such as the EU’s recent announcement and explained here by the OIX Identity forum.

RaonSecure builds a blockchain-based digital wallet service with a public institution

Selection of RaonSecure as the final operator of the ‘blockchain-based digital wallet project’ that is part of the ‘2021 Blockchain Pilot Project’ program promoted by the Korean government

Identity not SSI

Announcing the FIDO Developer Challenge for Developers Across the Globe

create and demonstrate compelling and innovative applications leveraging FIDO standards and technologies.

What Is Zero Trust? Ping

The network is always assumed to be hostile.

External and internal threats exist on the network at all times.

Network locality is not sufficient for deciding trust in a network.

Every device, user and network flow is authenticated and authorized.

Policies must be dynamic and calculated from as many sources of data as possible.
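The principles above can be condensed into a toy access-control gate. The following Python sketch is purely illustrative (the `Request` fields, threshold, and function names are assumptions, not Ping's product logic): no request is trusted by default, network locality is never consulted, and the final decision combines authentication, authorization, and a dynamic risk signal.

```python
from dataclasses import dataclass

@dataclass
class Request:
    user_authenticated: bool   # every user is authenticated, always
    device_trusted: bool       # every device is verified, always
    flow_authorized: bool      # every network flow is authorized
    risk_score: float          # 0.0 (benign) .. 1.0 (hostile), derived from many signals

def allow(request: Request, risk_threshold: float = 0.5) -> bool:
    """Zero-trust gate: the network is assumed hostile, so being
    'inside' counts for nothing. A request passes only if it is
    authenticated, authorized, AND clears a dynamic risk policy."""
    return (request.user_authenticated
            and request.device_trusted
            and request.flow_authorized
            and request.risk_score < risk_threshold)
```

Note that there is deliberately no `if request.came_from_internal_network` branch anywhere: that is the whole point of the model.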

Introducing: The OAuth 2 Game

It features two dice, one for grants and another for application types. Throw the dice and consult the instructions to discover whether the combination of grant and application type you obtained happens to be a good one!
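The game's dice can be simulated in a few lines. This Python sketch is a guess at the spirit of the game, not its actual contents: the grant faces, application-type faces, and the compatibility table below are illustrative assumptions, loosely following common OAuth 2 guidance.

```python
import random

# Hypothetical dice faces -- the real game's faces may differ.
GRANTS = ["authorization_code", "client_credentials", "device_code",
          "refresh_token", "implicit", "password"]
APP_TYPES = ["server-side web app", "single-page app", "native mobile app",
             "CLI tool", "IoT device", "machine-to-machine service"]

# A very rough compatibility table, for illustration only.
GOOD_COMBOS = {
    ("authorization_code", "server-side web app"),
    ("authorization_code", "single-page app"),
    ("authorization_code", "native mobile app"),
    ("client_credentials", "machine-to-machine service"),
    ("device_code", "IoT device"),
    ("device_code", "CLI tool"),
}

def roll():
    """Throw both dice and report whether the combo is sensible."""
    grant = random.choice(GRANTS)
    app = random.choice(APP_TYPES)
    verdict = "good match" if (grant, app) in GOOD_COMBOS else "think twice"
    return grant, app, verdict
```

Rolling repeatedly makes the game's lesson concrete: most random grant/app-type pairings are poor choices, which is why picking the right grant for your application type matters.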

Social KYC, an Alternate Form of Identity Verification for Web3, with Ingo Rübe @KILTProtocol. SSIOrbit Podcast

Use cases in the Gaming industry and why gaming is poised to become early adopters of SSI

Not Identity (Business Processes)

The Document Culture of Amazon Justin Garrison

Meetings start with reading. Depending on the length of the document, we’ll read anywhere from ten minutes to half an hour. If the meeting has a long document (six-pagers are the longest) and many attendees, the meeting will be scheduled for enough time to read and discuss.

Do Consumers Even Want Personalized Ads? Anonyme

The YouGov poll of consumers in France and Germany we mentioned earlier says it’s the behind-the-scenes or back-door nature of personalization that gives people the creeps.

Decentralized Business Model Stepan Gershuni

How reductions in transaction costs influence the evolution of digital business models

Innovative concepts and software for managing digital master data and certificates IDUnion

Companies today manage and maintain master data from business partners in multiple instances in various in-house IT systems — and do the same with their own master data in third-party systems.

Apple’s move beyond passwords

Discover in this technology preview how Apple is approaching this standard in iOS 15 and macOS Monterey.

What Apple’s WWDC PassKeys Announcement Means for Enterprise IAM

Apple’s approach to passwordless is not particularly unique, since it adheres to the FIDO standard; however, their implementation and approach to the credential recovery problem is unique and relevant to enterprises.

Apple continues to move into the identity space. 

Apple’s story is more about individual convenience in service of Apple. When it comes to the Big in Big Tech, Apple’s as Big as they come — all while being renowned control freaks. Their top-down approach to digital identity isn’t about portability or interoperability, it’s about strengthening their platform moat, where Apple takes a 30 percent cut on all sales.

What US states will support Apple Wallet digital identity cards? CNet

Now Apple wants to store your driver’s license on Apple Wallet Mashable

Thanks for Reading!

Read more \ Subscribe @ newsletter.identosphere.net

Support this publication @ patreon.com/identosphere

Sunday, 20. June 2021

KuppingerCole

Analyst Chat #81: Fraud Reduction Intelligence Platforms Revisited

In episode seven of this podcast, John Tolbert and Matthias first looked at Fraud Reduction Intelligence Platforms more than a year ago. Much has happened in this market segment since then, and on the occasion of the release of the updated Leadership Compass, they look at the latest innovations.




Friday, 18. June 2021

Holochain

A Developer Ecosystem In Full Bloom

Holochain Dev Pulse 98

In Canada, where I live, spring is coming to an end and summer is getting started. I love this time of year; the progression of flowers bursting like a well-funded fireworks show, the lettuce unfolding and gracing the dinner table every evening, the strawberries that are not quite ripe enough but you stuff your face with them anyway.

I see the Holochain developer ecosystem going through a similar flush of growth. I’ll be honest: we’re still young and small, but that means the developers generally know each other and support each other well, and we’re able to support them well too. This seems to be creating a lot of vitality, sort of like those two small garden boxes in your backyard that thrive because they’re just enough for you to take care of.

Here are a few things that have caught my eye lately. If you’re a developer, I bet you’ll find them useful.

Holochain Gym: exercise those hApp dev muscles

The Holochain Gym is a wonderful set of exercises that take you through the basics of Holochain. There are two sections:

Concepts gives you a grounding in the fundamentals of Holochain. I love the interactive visualisations using Guillem Córdoba’s Holochain Playground simulator; they turn very abstract concepts into things you can see and touch. I especially love the one that shows how Holochain handles the dreaded 51% attack (or worse) without miners or consensus.

Developers introduces you to the building blocks of hApp development — data structures, host API calls, built-in patterns. There are code-along exercises that give you experience writing DNA code.

Holochain Playground simulator showing six bad actors who try to stage a 60% attack. Through a few rounds of validation and ‘warranting’ (publishing evidence of malicious activity), the four good actors isolate themselves from the bad actors.

This project is a collaboration between a few active community members, and it’s still in its early stages. If you’re already familiar with hApp development and would like to contribute, I’m sure they’d be super grateful.

Holochain Open Dev: libraries for use in your own hApps

I’ve reported on the Holochain Open Dev GitHub org since the Holochain Redux days. Now with a rapidly maturing Holochain RSM and a bunch of hApps being built, it’s gaining new modules while the existing ones are getting refactored. Here are a few that looked interesting:

https://github.com/holochain-open-dev/profiles lets users choose a nickname (AKA screen name, username, or handle) and save profile information as key/value pairs. It also comes with a sample UI to show you how to use it on the front end. The interesting thing about this module is that nicknames are not exclusive — just like in real life, it’s perfectly okay for two people to have the nickname stina_nilsson if that’s what their name actually is. This demonstrates the beauty and weirdness of an agent-centric, consensus-free system.

https://github.com/holochain-open-dev/contacts, from the Kizuna team, lets users store lists of friends (contact list) and enemies (block list), as in a social media or communication app. I didn’t see any documentation, but the module’s API looks pretty self-documenting.

https://github.com/holochain-open-dev/holochain-time-index, from the Perspect3vism and Junto teams (more on them below), is one solution for retrieving data by time range. This is good for social media, blogs, financial records, and other time-based data. It also eliminates DHT hotspots that would normally come from putting a zillion links on one anchor. This repo is exquisitely documented and gives rationales for the design.

https://github.com/holochain-open-dev/calendar-events implements a basic calendar. This looks like it would be great for shared personal calendars, events, or resource booking. As with profiles, it includes a sample UI.

https://github.com/holochain-open-dev/file-storage is a generic blob-storage module. It ‘chunks’ files behind the scenes — this prevents DHT hot spots (imagine some poor peer with a 32GB smartphone having to store your hour-long holiday video in their DHT shard) but also may make it possible to implement streaming videos. (And besides, there’s a 16MB limit on DHT entries anyway.)

https://github.com/holochain-open-dev/reusable-module-template is just what it sounds like — a template that lets you scaffold modules that you want other developers to use.

Holochain In Action: a community of co-learning

I’ve been following the work of the Holochain in Action group for a little while, and I’m excited about what they’re doing. This group meets every Tuesday to learn, teach, share, and develop together. There’s a big focus on exploring and implementing design patterns that are appropriate for Holochain, which I think is important work.

If you’re a developer and want to get involved, jump to the forum and fill out the application form. You can also watch their past videos on YouTube — looks like there are a lot of good presentations.

Finally, you can check out the app they’re working on, called Peer Share, which lets people share content of all sorts with each other. They’re using it to explore various patterns, including an interesting one called the JSON-Schema pattern. Instead of having to define entry types at compile time, DHT administrators can create new types on the fly using the JSON-Schema vocabulary to validate content that’s supposed to conform to those types.
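The runtime-typing idea behind that pattern can be illustrated without Holochain at all. The Python sketch below is a toy, not Peer Share's API: the function names and the type registry are hypothetical, and the validator covers only a tiny sliver of the JSON-Schema vocabulary (real code would use a full validator library).

```python
def validate(instance, schema) -> bool:
    """Check `instance` against a tiny subset of JSON-Schema:
    `type` (object/string/number), plus `required` and `properties`
    for objects. Unknown keywords are ignored."""
    t = schema.get("type")
    if t == "string":
        return isinstance(instance, str)
    if t == "number":
        return isinstance(instance, (int, float)) and not isinstance(instance, bool)
    if t == "object":
        if not isinstance(instance, dict):
            return False
        for key in schema.get("required", []):
            if key not in instance:
                return False
        for key, sub in schema.get("properties", {}).items():
            if key in instance and not validate(instance[key], sub):
                return False
        return True
    return True  # no/unknown type constraint: accept

# A 'DHT administrator' registers a new entry type on the fly --
# no recompilation, just a schema added to shared state.
TYPES = {}

def register_type(name, schema):
    TYPES[name] = schema

def accept_entry(type_name, entry) -> bool:
    """Validation callback: admit an entry only if a schema is
    registered for its declared type and the entry conforms to it."""
    schema = TYPES.get(type_name)
    return schema is not None and validate(entry, schema)
```

The key move is that `TYPES` is data rather than code: adding an entry type is just another write, which is exactly what lets administrators define types after the app has shipped.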

Ad4m and Perspect3vism: a unique take on application development

I find this a very intriguing project. I’m still wrapping my head around it; it feels like it could have profound implications for the way we construct our online experiences. And when I say that, I mean users, not developers. It appears to be an agent-centric approach to building an online life, consisting of ‘languages’ (ways of expressing oneself, such as a tweet, a to-do list, or a chess move) and ‘perspectives’ (spaces in which one wants to participate) that a person can combine in ways that suit them best.

The architectural approach reminds me of Holoscape, and the vocabulary reminds me of the Junto social app, which shouldn’t be surprising: it’s a collaboration between Holo team member Nicolas Luck and Junto lead developer Josh Parkin.

Holo-REA: a toolkit for cooperative economics

I get pretty passionate about non-coercive, appropriately sized, regenerative economies. It’s possibly the biggest reason I got involved with Holochain — because I saw that it had the potential to be a platform for these sorts of initiatives.

So I’m thrilled that Holo-REA is nearing maturity. Holo-REA is a toolkit for economic applications, particularly ones in which the players do their accounting out in the open (though it doesn’t exclude a bit of privacy either). The folks behind Holo-REA are building it to support a big vision of true-cost accounting, or considering all the costs of producing a good or delivering a service. This is something the current economic system isn’t great at.

Holo-REA implements a business ontology called ValueFlows. It looks complicated, but it’s really just a formalisation of things that people do all the time. This gives me hope for Holo-REA and the applications that’ll be built on it.

You can find the source code for Holo-REA, along with lots of good literature on the why and how of the project, at https://github.com/holo-rea/holo-rea. Lots of people who are building on Holochain are already getting excited about it and planning to use it in their own projects.
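To make the "formalisation of things people do all the time" point concrete, here is a toy sketch of the REA (Resources, Events, Agents) core in Python. It is an illustration of the idea only, not Holo-REA's or ValueFlows' actual data model; all the names below are made up for this example.

```python
from collections import defaultdict

class Ledger:
    """Toy REA ledger: economic *events* move *resources* between
    *agents*, and both sides of each event live in one shared record --
    accounting done out in the open, as described above."""

    def __init__(self):
        self.balances = defaultdict(lambda: defaultdict(float))
        self.events = []

    def record(self, provider, receiver, resource, quantity):
        """One economic event: provider gives `quantity` of
        `resource` to receiver."""
        self.events.append((provider, receiver, resource, quantity))
        self.balances[provider][resource] -= quantity
        self.balances[receiver][resource] += quantity

    def balance(self, agent, resource):
        return self.balances[agent][resource]
```

Because every event names both parties, totals always net to zero across the system, which is what makes shared, open accounting (and eventually true-cost accounting) auditable.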

Dev Camp postponed until September; lots of opportunities before then

The response to the next community-run dev camp has been overwhelmingly positive — and overwhelming!

More than 700 people have registered, which has resulted in the organisers changing their plans.

It’s moving from June to September, and they’ll be working with us at Holo to provide more learning opportunities — both during and before.

You can still register, and the organisers will send you updates by email so you know what’s going on.

The theme of the Dev Camp sounds really cool:

During DevCamp 8 we’re going to be building a multiplayer game: “Tragedy of the Holocommons”

We will bring every economist’s favorite game to life as a Holochain app! The heart of this game is a demonstration of consensus-seeking in an eventually consistent environment - a crucial part of building any decentralized application!

Of course, we’ll also cover all the other essential building blocks of a Holochain application, with little adjustments to make it more fun and cool!

Reaching agreement without relying on global consensus is a challenging thing to understand, but it’s at the core of what makes Holochain Holochain. This will be a great opportunity to understand more about that in a hands-on way.
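One classic, Holochain-independent way to see agreement without global consensus is a CRDT (conflict-free replicated data type). The grow-only counter below is a standard textbook example, not anything from the Dev Camp's game: replicas accept writes independently and converge simply by merging, with no miners, leader, or voting round.

```python
class GCounter:
    """Grow-only counter CRDT: each agent increments only its own slot,
    and merging takes the per-agent maximum. Replicas therefore converge
    to the same total no matter the order messages arrive in --
    eventual consistency without any global consensus step."""

    def __init__(self):
        self.slots = {}

    def increment(self, agent, n=1):
        self.slots[agent] = self.slots.get(agent, 0) + n

    def merge(self, other):
        for agent, count in other.slots.items():
            self.slots[agent] = max(self.slots.get(agent, 0), count)

    def value(self):
        return sum(self.slots.values())
```

The trick is that merge is commutative, associative, and idempotent, so gossiping state in any order (or repeatedly) always lands every replica on the same answer.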

Register for the Dev Camp today! And if you have any programmer friends who are disenchanted with blockchain, share it with them too. Maybe they’re ready for something fresh and new.

Can we have some updates on Holochain and Holo too?

Of course! Here are some of the most recent updates:

Holochain has been having some serious performance issues in stress-testing, and it looks like it’s caused by a memory leak in Wasmer, the virtual machine that runs hApp DNAs. The Wasmer team have been great at tracking down the bug and sharing workarounds, and we expect that they’ll fix the bug soon as well.

DHT sharding is in active development. There’s been a lot of work to make gossip efficient in a DHT that includes lots of multi-user machines, as well as the previous work of moving the storage backend from LMDB to SQLite, which made performant gossip possible in the first place.

Both dev teams are focusing on improving performance everywhere. Some of these improvements (like batching database writes) will stick around, while others will become obsolete once DHTs can be sharded.

There’s a new standalone bootstrap server that doesn’t require you to use Holo’s or deploy your own on CloudFlare. This was created for testing, but it could be useful for applications that want to run their own infrastructure. Note that it doesn’t persist any peer information on restart; it’s all in-memory. (If you’re not familiar with the bootstrap server, it makes introductions between peers so they can start communicating with each other.)

The Holochain dev team is getting ready for the second public release of Holochain RSM, which will include all the changes since the first release in February. Many developers have been tracking the changes in the develop branch; this release will support developers who prefer something a bit more stable, as well as ones who prefer not to work in Holonix.

BREAKING CHANGE: all the DHT retrieval functions are disabled in validation callbacks except three new ones: must_get_entry, must_get_header, and must_get_valid_element. This serves two purposes: first, validation functions should always be deterministic, and this removes the DHT as a source of non-determinism; and second, it removes a lot of boilerplate because you don’t have to manually handle all the different error types when a piece of DHT data can’t be retrieved. All you need to do is use ? at the end of your DHT retrieval function, and the validation will automatically return the right sort of error to the host. Note: this change is not in develop and won’t be until after the next version of Holochain is released.

The Holo team has been working on dev-ops infrastructure that makes it easy for them to switch their HoloPorts to testing channels and back. This is because Holochain development is progressing rapidly and they need to be able to reliably reproduce environments to test apps against different commits. This won’t matter much to you, dear reader, except that it’ll speed up testing and get us to the next alpha milestone a bit sooner!

Cover photo by Sven Brandsma on Unsplash


Evernym

Evernym Selected as a 2021 Technology Pioneer by the World Economic Forum


We’re excited to announce that Evernym has been named a 2021 Technology Pioneer by the World Economic Forum (WEF).

This honor is awarded to a select group of early and growth-stage companies each year that are pioneering new innovations poised to have a significant impact on business and society. Past honorees include Google, Mozilla, Twitter, Kickstarter, and Airbnb.

The post Evernym Selected as a 2021 Technology Pioneer by the World Economic Forum appeared first on Evernym.


Infocert (IT)

Danilo Cattaneo, InfoCert CEO, partecipa al talk “Il Domicilio digitale e i trusted services” organizzato da AgID per il Forum PA 2021

Il prossimo 22 giugno, Danilo Cattaneo – InfoCert CEO – sarà ospite di un importante talk organizzato da AgID (Agenzia per l’Italia Digitale) all’edizione 2021 del Forum PA in programma online dal 21 al 25 giugno; l’evento è organizzato da FPA, società di servizi e consulenza del Gruppo Digital360, che dal 1990 organizza il più […] The post Danilo Cattaneo, InfoCert CEO, partecipa al talk “Il Do

On June 22, Danilo Cattaneo, InfoCert CEO, will be a guest at a major talk organized by AgID (Agenzia per l'Italia Digitale) at the 2021 edition of Forum PA, held online from June 21 to 25. The event is organized by FPA, a services and consulting company of the Digital360 Group, which since 1990 has run the most important national event dedicated to the modernization of the public administration.

Tuesday, June 22, starting at 09:30 — live-streamed event. Learn more

InfoCert will be among the protagonists of the talk "Il Domicilio digitale e i trusted services" ("The Digital Domicile and Trust Services"), a discussion that aims to provide a frame of reference and the implementation phases of the digital domicile for citizens and non-obligated entities. At the same time, it intends to take stock of, and reflect on the future of, the adoption in Italy of trust services as provided for by the eIDAS regulation, together with some of the leading players on the European and national scene, also in light of the evolving European regulatory landscape.

The talk will be introduced by Massimiliano Roma, Head of Communication and Institutional Relations with Public Bodies at FPA, and moderated by Francesco Tortorelli, Director of the Public Administration and Supervision Directorate at AgID. Speakers include Claudio Petrucci, Head of Infrastructure Services Management at AgID; Riccardo Genghini of Studio Genghini & Associati; and Andrea Sassetti, CEO of Aruba PEC and Trust Services Governance and General Affairs Director at Aruba Enterprise.

In particular, Danilo Cattaneo will present InfoCert's strategic vision as a leading European QTSP, analyzing the international scenario taking shape after the latest evolutions of the eIDAS regulation, as well as the national one in view of the opportunities opened up by the approval of the PNRR.

The event confirms its role as a meeting point for public administration, business, research, and citizens, and as an opportunity to explore the most important topics related to innovation and the introduction of new technologies in the public administration. AgID will be present with two sessions devoted to major digitization projects of great interest to administrations and citizens.

For more details, visit the event page.

Forum PA 2021

Sources:

https://www.agid.gov.it/it/agenzia/stampa-e-comunicazione/notizie/2021/06/14/agid-presente-forum-pa-2021

The post Danilo Cattaneo, InfoCert CEO, participates in the talk "Il Domicilio digitale e i trusted services" organized by AgID for Forum PA 2021 appeared first on InfoCert.


auth0

Quarkus and Auth0 Integration

Learn how to create a natively compiled Quarkus HTTP API secured with Auth0.

KuppingerCole

Thales SafeNet Trusted Access Platform

by John Tolbert

Thales is a major player in the global aerospace, defense, and security arena. Thales has a respected line of products and services in cybersecurity and identity protection. Thales SafeNet Trusted Access Platform is a market leading Enterprise Authentication and access management solution. It offers a variety of hard and soft token solutions as well as FIDO compliant authenticators. It is highly scalable and built for environments that require the highest security levels.


Strivacity Fusion

by John Tolbert

Strivacity Fusion is a multi-instance SaaS-based Consumer Identity and Access Management (CIAM) solution. Strivacity Fusion was built in the cloud using the modern micro-services architecture for maximum flexibility and scalability. Strivacity Fusion offers customers MFA and consent management options to help meet differing global requirements.


Providers of Verified Identity

by Anne Bailey

It is no longer enough to create and secure user identities. Verifying that the identity does indeed describe the individual it was created for is a valuable capability in many use cases, including authentication, remote onboarding, and enabling high value transactions. This Buyer's Compass will provide you with questions to ask vendors, criteria to select your vendor, prepare your organization to conduct RFIs and RFPs, and determine requirements for determining a successful Provider of Verified Identity.


NRI SecureTechnologies: Uni-ID Libra 2.4

by Richard Hill

Consumer Identity and Access Management (CIAM) continues to be a growing market, offering a better user experience for the consumer and new challenges for the organization. NRI Secure's Uni-ID Libra, with a focus on the Japanese market, continues to provide innovative features within its CIAM solution.


IaaS Tenant Security Controls

by Mike Small

IT Organizations now commonly use multiple cloud services as well as on premises IT. This KuppingerCole Buyer's Compass focusses on the capabilities IaaS services provide to manage the common business risks such as loss of business continuity, data breaches and regulatory compliance failure when using cloud services as part of a hybrid IT delivery model. It will provide you with questions to ask vendors, criteria to select your vendor, and the requirements for successful deployments. This report will prepare your organization to conduct RFIs and RFPs for IaaS as part of a Hybrid IT service delivery model.


Database and Big Data Security

by Alexei Balaganski

As more and more companies are embracing digital transformation, the challenges of securely storing, processing, and exchanging digital data continue to multiply. This KuppingerCole Buyer's Compass will provide you with questions to ask vendors, criteria to select your vendor, and requirements for implementing the necessary security and compliance controls to protect your sensitive corporate data against a multitude of risks.


Accenture Memority

by Martin Kuppinger

Accenture Memority is an IDaaS solution supporting both Identity Lifecycle Management and Access Management use cases. It comes with full API support and positions itself as the foundation for a customer's Identity Fabric. Its strength stems from leading-edge support for digital transformation use cases such as IoT and connected devices, which is also powered by close interaction with Accenture consulting services.


IT Service Management

by Warwick Ashford

IT Service Management (ITSM) refers to comprehensive solutions that support IT management capabilities to help organizations optimize the design, delivery, support, use and governance of IT. This KuppingerCole Buyer's Compass will provide you with questions to ask vendors, criteria to select your vendor, and requirements for successful deployments. This document will also help prepare your organization to conduct RFIs and RFPs for ITSM solutions.


uPort

Introducing Veramo

In our last post we briefly presented Veramo and how it evolved from the challenges faced with uPort’s libraries. In this next series of articles we will give Veramo a proper introduction and answer some of the basics: why it exists and what it does, followed by articles describing the architecture in more detail, and how to build applications using Veramo. While our name may have changed, the original vision of allowing individuals and organizations to own their own data and maintain privacy, has not.

The Self-Sovereign Identity (SSI) space has been moving fast, establishing standards for decentralized identifiers (DIDs) and verifiable credentials (VCs) that simply did not exist when uPort set out five years ago. What began as a niche has caught wider attention. At Apple's WWDC last week, its verifiable health records appeared to be using W3C VCs under the hood. The European Commission recently proposed a new electronic identity regulation that aims to make identity wallets mandatory for various public- and private-sector service providers. This progress validates our theory that managing DIDs and VCs is important work.

Enter Veramo

Veramo is built with several goals in mind. First, we aim to simplify creating and managing identifiers, and issuing and receiving credentials, through straightforward APIs that run across backend, frontend, and mobile. We aim to do this in a spec-compliant way with interoperability in mind. The framework should also be flexible enough to allow for many configuration options (multiple environments, hybrid deployment models, data storage, key management, etc.) and to accommodate future advances in the space. With Veramo, we took feedback to heart and combined these features into a single framework, replacing the outwardly confusing set of uPort libraries: uport-connect, uport-credentials, uport-transports, and uport-mobile.

The Veramo Agent

The entry point to managing identifiers, credentials, messages, keys and more is the Veramo Agent. The Agent provides a common interface for core and custom plugins to operate and orchestrates them through an event system. An Agent can be run as a CLI, backend service, within a mobile app, or in the browser and we have tutorials for each on veramo.io/docs. Agents can also work remotely, so you can have multiple agents with specific capabilities that work together to provide tailored functionality as one.
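The pattern of an agent that exposes plugin-contributed methods and orchestrates them through events can be sketched in miniature. This is an illustrative toy under invented names (MiniAgent, Plugin, didManagerCreate), not Veramo's actual API; see veramo.io/docs for the real interfaces:

```typescript
// Toy sketch of a plugin-orchestrating agent (invented names, not Veramo's API).
type Method = (args: any) => any;

interface Plugin {
  methods: Record<string, Method>;
}

class MiniAgent {
  private methods: Record<string, Method> = {};
  private listeners: Record<string, Array<(data: any) => void>> = {};

  constructor(plugins: Plugin[]) {
    // Each plugin contributes methods to the agent's common interface.
    for (const p of plugins) Object.assign(this.methods, p.methods);
  }

  // Invoke a plugin method by name and notify any listeners (the event system).
  execute(method: string, args: any): any {
    const fn = this.methods[method];
    if (!fn) throw new Error(`unknown method: ${method}`);
    const result = fn(args);
    for (const cb of this.listeners[method] || []) cb(result);
    return result;
  }

  on(event: string, cb: (data: any) => void): void {
    if (!this.listeners[event]) this.listeners[event] = [];
    this.listeners[event].push(cb);
  }
}

// A toy "did-manager" plugin exposing a single identifier-creation method.
const didManagerPlugin: Plugin = {
  methods: {
    didManagerCreate: ({ alias }: { alias: string }) => ({
      did: `did:example:${alias}`,
    }),
  },
};

const agent = new MiniAgent([didManagerPlugin]);
agent.on("didManagerCreate", (id) => console.log("created", id.did));
const identifier = agent.execute("didManagerCreate", { alias: "alice" });
console.log(identifier.did); // did:example:alice
```

The same idea of a uniform method interface that plugins fill in is what lets multiple agents, local or remote, compose their capabilities and appear as one.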

The Veramo Agent takes care of low-level details so you can focus on building your app. Once instantiated, it exposes methods for creating and managing identifiers through the did-manager plugin. Out-of-the-box support is available for ethr-did (Ethereum address), web-did (DNS domain), and did-key (simple public/private key pair).

For credentials, the credentials-w3c plugin combined with the messaging plugin handle issuing, receiving, signing, and sending W3C VCs from one DID to another. The data-storage and key-manager plugins handle storing the identifiers, credentials, messages and the keys for signing and encryption.

Running an agent will depend on your use-case and environment. To get started, take a look at our documentation and tutorials on veramo.io. The CLI tool is the quickest way to see how it works, and we also have tutorials for Node, React, and React-Native. Hit us up at hello@veramo.io or GitHub Discussions with feedback or questions.

Coming soon: Part 2 will go deeper on architecture details and Part 3 will show how to build applications using Veramo.

Introducing Veramo was originally published in uPort on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ocean Protocol

How NFTs are disrupting the Gaming industry in developing nations

Jeff Zirlin, Head of Growth at Sky Mavis, on democratizing play-to-earn in the Gaming industry

We’ve kicked off season two of Voices of the Data Economy with Jeff Zirlin, Head of Growth at Sky Mavis, the creator of Axie Infinity, a game that uses non-fungible tokens (NFTs) to reward players financially the more they play. During our conversation, Jeff describes the disruptive business model behind Axie Infinity, democratizing Gaming rights through NFTs, and how developing nations are getting involved in play-to-earn. Here are edited excerpts from the podcast.

The Business Model of Axie Infinity

The business model of Axie Infinity is different from a traditional game studio or Gaming company. Rather than relying on selling copies of a game or in-game resources, they have created an in-game economy, where players can sell their NFTs and in-game resources to anyone, anywhere in the world.

“We just take 4.25% of all marketplace activity as a fee. For example, if today, there was around $3 million of NFT volume on our marketplace; 4.25% of that gets taken as a fee for running the game. Right now, that payment goes to Sky Mavis, the developers of Axie Infinity. In the future, we’ll even be sharing that fee with all the owners of our governance token,” says Jeff.
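As a rough, hypothetical illustration of the fee arithmetic in that quote (using basis points to keep the math exact):

```typescript
// Sketch of the 4.25% marketplace fee described above (illustrative only).
const FEE_BPS = 425; // 4.25% expressed in basis points

function marketplaceFee(volumeUsd: number): number {
  return (volumeUsd * FEE_BPS) / 10_000;
}

// On a day with ~$3M of NFT marketplace volume, the protocol's cut would be:
console.log(marketplaceFee(3_000_000)); // 127500
```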

How are NFTs changing the Gaming industry?

“I think the fundamental innovation that NFTs bring to the Gaming market is allowing for user-owned, user-operated games and economies. It creates much more interesting aligned incentives where the players and the developers are both incentivized to grow the ecosystem around the game.”

Players who are new to Axie Infinity need 3 Axies to get started, costing approximately $150. The majority of players are from developing nations and economies, where many cannot afford the starting cost on their own. That's where organizations like Yield Guild come in: they own massive numbers of Axies, the game's native NFTs, and offer scholarships to cover these costs. New players participating in the scheme are aptly called scholars.

But why is play-to-earn NFT Gaming interesting for developing nations? A large percentage of Axie Infinity’s player base has been coming from the developing world — Indonesia, the Philippines, Brazil, and Venezuela.

“Around 80% of our player base are from developing nations. That’s important because these are the types of people that need to be using blockchain. It can’t just be for people in Silicon Valley or Berlin. We need people who are coming for the benefits. One of the problems that the industry has is that people think about how we can make it easier to use the technology, but they don’t think enough about why people would use the technology in the first place.”

Scaling NFT Gaming by reducing barriers to entry

Recently, Sky Mavis raised $7.5M to scale Axie Infinity. Jeff mentions that part of it will go into reducing barriers to entry. People will be able to play for free to get started: they'll receive three starter Axies that are not blockchain assets, but they'll learn how to play the game and may get a taste for earning. They might be able to earn 5% of the in-game resources that a blockchain user could.

“I think that the market for NFTs will be a trillion-dollar industry in 10 years. At present, the in-game market is only $50 billion which is a small size. Why? Because people don’t want to spend money on stuff that they don’t earn, or that they don’t own. It can be quickly banned or confiscated by the game developers. Once they own their game assets, they are going to be spending trillions of dollars,” he says.

Jeff concludes that Gamers deserve to own their game assets, their game profile, their game identity — this is all property and information they should own. We just have never even thought that people should own this stuff. “At a mass scale, as a society, we are still grasping why people would want to own their data online but more people are starting to demand these types of rights. It’s only going to increase over time.”

Here is a list of selected time stamps for the topics discussed during the podcast:
01:46–04:00: Business Model of Axie Infinity
04:00–05:20: How NFTs are changing the Gaming industry
05:20–06:25: Jeff's relationship with Gaming
06:25–11:42: How to get started with play-to-earn in a developing nation
11:42–15:00: Managing a global community of gamers
15:00–18:20: The growth projections for NFT Gaming
18:20–23:18: Deploying $7.5 million to scale Axie Infinity
23:18–25:35: The EU and US vs developing nations as a user base for NFT Gaming
25:35–27:45: How to choose a credible NFT project
27:45–End: The Future of NFT Gaming

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

How NFTs are disrupting the Gaming industry in developing nations was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


uPort

Veramo: uPort’s Open Source Evolution

When uPort began at ConsenSys in 2015, the self-sovereign identity space was also in its infancy. Early concepts existed as little more than academic theories with few attempts at implementation. Despite the lack of standards or the guidance of a marketplace, uPort began experimenting with our first architecture using smart-contract based identities.

Over time, the technical limitations of on-chain identities began to pile up, which led to uPort's 1.0 architecture and to pioneering the use of decentralized identifiers (DIDs) with our open-source libraries. At the time, DIDs and verifiable credentials (VCs) were proposed W3C standards; they are now nearing official status. Dozens of projects are still using several of our popular libraries: uport-connect, uport-credentials, uport-mobile, did-jwt, and did-resolver, to name a few.

While successful, some of these libraries also presented challenges of their own. Maintaining libraries across different tech stacks (web, mobile, backend) became an impediment to the rapid iteration so critical in a space where standards evolve quickly. Around 18 months ago, we began work on the next iteration of our architecture, a project internally known as DAF (DID Agent Framework).

The primary goal of DAF was to create a flexible, modular architecture based around a library of core functionality that would run on each primary stack (web, mobile and backend), and could be easily extended through add-on packages. This approach allows us and the developer community to easily add the application specific functionality they need, whether it be additional DID methods, key management functions, new protocols, and many more possibilities without needing to modify the core library.

Earlier this year, we released the culmination of these efforts, informed by years of open-source work, learning, and iteration, as Veramo.

This next major iteration also makes it necessary to begin deprecating the previous uPort mobile app, libraries, and services. The following is an outline of the schedule and what will be affected. If you currently depend on any of these, or simply have a question or two please send us an email at hello@veramo.io.

uPort Mobile

uPort Mobile will be removed from the Apple AppStore and Google PlayStore as of June 1st, 2021. The code is already open-source and will remain available in an archived github repo.

Libraries and Services

First, we will be archiving the deprecated github repos. Open-source code will still be available for reference, but no further changes/updates will be made. In addition, we will be shutting down hosted services, some of which will cause issues in libraries and/or the mobile app.

Chasqui: a server that allows communication between dApps, mobile apps (like uPort Mobile) and servers. Shutting down Chasqui will cause breaking changes in uPort Mobile and uport-transports. Shutdown date: June 1st, 2021.

Caleuche: an event hub service which allows the backup and sync of uPort mobile app events. Shutdown date: June 1st, 2021.

Pututu: a server that allows dApps and servers to send push notification messages to any uPort Mobile App. Shutdown date: June 1st, 2021.

The services below are related to proxy-contract identities and will be shut down a couple of weeks earlier.

Sensui: uPort Transaction Funding Service. Shutdown date: May 15th, 2021.

Nisaba: provides user verification for the uPort ecosystem. Shutdown date: May 15th, 2021.

Unnu: phone verifier and creator of identities. Shutdown date: May 15th, 2021.

Although we had to make some tough decisions to clear the road ahead, none of these resources have gone to waste. Each one has played a role in our understanding of the digital identity space, which directly led to the creation of Veramo. We're proud of the result. To dig deeper, check out our documentation and experiments at Veramo Labs.

Veramo: uPort’s Open Source Evolution was originally published in uPort on Medium, where people are continuing the conversation by highlighting and responding to this story.


ShareRing

Real World Benefits of Digital Verification

Are digitally encrypted COVID-19 vaccine certificates the tech-way forward to social living again?

COVID-19 has brought an unprecedented shift in our ability to meet, socialize and interact throughout the world. Almost overnight, we became subject to contact tracing and the need to keep social distance from everyone including family and friends. Since contact of any kind carried the risk of COVID-19 transmission, especially as variants multiplied, there seemed little hope of communities, countries and the world being able to reopen. 

As we take the first steps towards the safe opening of different regions, an increasing number of governments are requiring people to provide validated proof of COVID-19 vaccinations or test results. These certifications must be verifiable and linked to official identity documents to travel, gain access to local venues or anything involving social gatherings.

The good news is there’s been significant advances in the ability to digitally link verified vaccination certificates and COVID-19 test results to an individual’s identity. Many organizations and governments are now looking to create a contactless verification process as part of this digital transformation. Additionally, they need something that will prioritize data privacy. As these solutions are developed and rolled out, several important issues have hit the headlines such as:

Being able to trust how the data will be used and stored
How to protect against fraud when certificates are often hard copies only and partly handwritten
How to prove the certificate is tied to a specific individual

In this series, we’ll look at the real-world benefits of digitally verifying credentials. We’ll discuss problems surrounding proof of identity and keeping data safe. We’ll highlight how your customers can have peace of mind knowing their data is encrypted, never stored anywhere other than their own personal device and can only ever be shared with their explicit instructions and consent. 

An elegant, safe, and user-focused smart application that people can use to confidently gain access to venues and travel is the answer. People need the security of knowing that, despite any requirements to provide proof of vaccination or test results, they remain in complete control of their own data and can start enjoying life again while allowing businesses to thrive.

After months of lockdowns and restrictions, governments are looking for safe ways to gradually open the doors and encourage people to come out of their homes and be social again. However, allowing people to travel and attend events still carries big risks, particularly as new variants spread rapidly. Governments want to require documentary evidence that all travelers and visitors are fully vaccinated or have recent negative test results as a condition of entry. This in itself brings new challenges. If verification remains a manual process, the individual will need to carry and produce several documents that authorities must scrutinize for authenticity and manually check against the individual's official identity documents.

The ability to minimize physical contact, including the handling of passports and vaccination cards, is a critical element in containing a pandemic. If the existing verification processes also include fingerprint scanning and the need to type in a secure PIN on a publicly used device, then this further increases the risks of virus transmission. 

Contactless technology is the key to creating an effective solution. Companies are starting to recognize the need to evolve and offer streamlined processes that embrace digital verification. Digital verification is not a new concept: the World Wide Web Consortium (W3C), the main international standards organization for the Web, published the Verifiable Credentials Data Model back in 2019, which covers digital certification and best practices. The need for safe and secure digital verification of personal identity and certification documents becomes even more critical if we are to address and contain the pandemic.

ShareRing's solution to the current challenges is both innovative and simple. It leverages the immutability of blockchain technology to build the highest levels of security, privacy, and consumer trust into a flexible digital verification system.

ShareRing has created a complete ecosystem for the issuance, ownership, and consumption of documents. For both the issuer and the consumer of the document/data, the ShareRing system provides fundamentally greater levels of confidence in the validity of the document. It eliminates the opportunity for fraudulent manipulation of data that’s been uploaded onto the ShareRing blockchain platform. For the user, our independent custom-built blockchain platform is available to anyone with a mobile device. And its no-nonsense interface makes it easy for anyone to use. 

The ShareRing app includes many unique features, providing users with a Personal Identification Vault and health document access. This allows users to safely store and easily share multiple documents, including their COVID-19 vaccination, passport, national ID card and more, in an encrypted format. None of these documents are held centrally; they remain only on the individual's personal device. Our transformative products offer the highest level of security, providing reassurance that documents can't be accessed by anyone without the owner's expressed consent. Once the documents are uploaded to their personal device, a digital "fingerprint" of those documents is stored on the ShareRing blockchain. This means they are not open to hacking and/or personal data theft, as has been the case with many database-dependent systems in the past.

For businesses and operators who wish to consume the data, we have developed an easy contactless digital journey that allows them to use QR code scans to request and check the information they require from the individual. This not only streamlines and speeds up the journey, it also significantly increases the reliability of their checks, since the documents will have been uploaded at source and/or verified using face matching and OCR checks.

The issuer of a document or certificate can be confident that the certificates they are issuing to an individual will not be tampered with or misused by a third party.

Using COVID-19 vaccinations certificates is one example of how our ecosystem works. It also includes actions by: 

A vaccination center issues the COVID-19 certificate at the point of vaccination. It then sends an encrypted digital copy to the user, which only the user can access using a private key. At the same time, a digital "fingerprint" (hash) of that document is uploaded to the blockchain.
The user receives the digital copy of the vaccination certificate from the issuer and uploads it to the Personal Information Vault on their mobile device, alongside the OCR- and face-matched identity document they uploaded when creating their ShareRing account.
A consumer of the data, for example a customs official at an airport or a ticket attendant at a venue, scans a QR code from the user's mobile phone, which, with the user's explicit real-time consent, lets them see the verified COVID-19 vaccination certificate alongside the user's ID. The process automatically checks the digital fingerprint on the blockchain; if even a single pixel of the image has been altered or the document on the user's device has been tampered with in any way, it is rejected by the system.
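The fingerprint check described above amounts to hashing the presented document and comparing the result against the value anchored on-chain. A minimal sketch (the helper names are ours; ShareRing's actual on-chain format is not described in this post):

```typescript
import { createHash } from "node:crypto";

// Compute a SHA-256 "fingerprint" of a document as a hex string.
function fingerprint(document: string | Buffer): string {
  return createHash("sha256").update(document).digest("hex");
}

// A verifier recomputes the hash and compares it with the anchored value;
// any change to the document, however small, produces a mismatch.
function verifyDocument(document: string | Buffer, onChainHash: string): boolean {
  return fingerprint(document) === onChainHash;
}

const cert = "COVID-19 vaccination certificate: dose 2, 2021-06-01";
const anchored = fingerprint(cert); // stored on-chain at issuance

console.log(verifyDocument(cert, anchored));       // true
console.log(verifyDocument(cert + " ", anchored)); // false: tampered copy rejected
```

Because only the hash is anchored, the blockchain never holds the document itself, which is consistent with keeping the certificate solely on the user's device.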

Additional, optional features include the ability for an issuer to allow a user to find their nearest COVID-19 testing or vaccination center, book an appointment and prove health status in a private and trusted way. 

With the ShareRing app, you can have confidence that your customer: 

ID and accompanying information are valid
Will have the security to travel and access venues and events
Will feel safe knowing their information is always secure

ShareRing is their digital passport to freedom, allowing them to unlock the world. For more information on how you can obtain the ShareRing app to upgrade your business and customer experience, visit www.sharering.network. And make sure you follow us on our Twitter, LinkedIn, Facebook, and blog for all the latest updates and news.

The post Real World Benefits of Digital Verification appeared first on ShareRing.Network.


Aergo

The Transformation of NFTs: The Unfolding

The Inception Of NFTs Has Created A Craze: Will It Last And Have NFTs Progressed Beyond Just Hype?

NFTs Have Sold For Millions: So Did ICOs In 2017: Can We Draw A Fair Comparison?

Introduction: Hype vs. Utility: Which Foundational Ethos Causes The Transcendence of Blockchain Ecosystems?

Throughout the history of blockchain, it has become more and more evident that hype is a cornerstone of a blockchain ecosystem's success, specifically around token price and awareness of a project. Ethereum's smart-contract capabilities transformed the zeitgeist of blockchain, enabling different projects to be built on Ethereum. This gave rise to the "Great ICO Bubble of 2017," which saw new projects skyrocket in valuation without real-world utility. When the "Great Bear Market of 2018" commenced, investors succumbed to digital shell shock, coming to the clear realization that many of these projects were extremely hyped but lacked utility. Fast forward to 2021: although hype remains fundamental to the blockchain sphere, utility is becoming the superior force. Will NFTs meet the same fate as the ICO bubble of 2017, or is there something deeper to analyze around real-world utility? Let's delve into the intricacies of NFTs and how projects like AERGO have created revolutionary innovations around NFT expansion.

What Are Non-Fungible Tokens? The Modern-Day Tulip Craze, Or Real-World Value?

NFTs, Amalgamated With DeFi, Have Been The Cornerstone Of The 2021 Bull Run, But Why NFTs?

NFTs have exploded during the great bull run of 2021, and it is important to understand what they are. A non-fungible token is a digital asset that represents a real-world object such as art, music, an in-game item, a video, a collectible card, an essay, or a unique pair of sneakers. NFTs are similar to cryptocurrencies in a variety of ways, most notably in their underlying cryptographic code. NFTs began in 2014, but their popularity has soared over the last two years, and since 2017 over 174 million dollars has been spent on them. But where do NFTs garner their value, and why?

The main value proposition of NFTs is digital scarcity. In the physical world, rarity breeds value, while surplus often leads to hollowness (see every monetary currency that has succumbed to runaway inflation since the dawn of humanity). Within the digital world, rarity breeds value too, especially on a web where most creations exist in effectively infinite supply. It is evident that NFTs hold a value proposition, but how far have they evolved since the NFT craze of 2020 began?

Value In Non-Fungibility: The NFT Paradigm Of Evolution

Beeple’s “EVERYDAYS: The First 5000 Days” Sold At Auction For 69.3 Million Dollars

Since their inception, many NFT artworks have been anything but unique. Many NFTs began as digital copies of creations that already existed somewhere else: clips from iconic sporting events, or securitized versions of artwork already posted to Instagram, Facebook, or Twitter, all amalgamated under the umbrella term NFT. So if NFTs can be seen, screenshotted, downloaded, and printed by anyone, where does the value come from? The answer is ownership. NFTs allow the buyer to own the digital item through digitized authentication, which can give owners a sense of uniqueness about holding such a distinct piece of data. Since the creation of NFTs in 2014, the function and maxims of owning these pieces of data have evolved substantially, and we're going to find out why and how.

Digitized Authentication = Legitimate Unique Ownership

NFTs have created numerous opportunities for celebrities, musicians, and artists to expand their business models. Artists, for example, can add stipulations to an NFT to ensure they receive a portion of the proceeds every time that NFT is resold. Because NFTs provide digital ownership rights through a unique paradigm, they eliminate the need to track an asset's progress and to enforce entitlements associated with each sale. NFTs can also benefit musicians, who can use them to provide special perks to fans. Any collector knows it can be extremely difficult to tell a genuine collector's item from a reproduction; NFTs with clear transaction histories back to the original creator would end that dilemma, which has plagued collectors for far too long. NFTs have the potential to transform the paradigm of digital ownership, and I believe they have evolved into the "vision and real-world application" stage, rather than blatant hype without a valid sense of utility.
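The resale-royalty idea above is, at bottom, simple arithmetic enforced by contract code. Here is a toy sketch of the math; the class name and the basis-point convention are my own illustration, not taken from any specific NFT standard or from the article:

```java
// Sketch of the royalty idea described above: each resale automatically
// routes a fixed percentage back to the original creator. In practice this
// is enforced by smart-contract code; this class just shows the split.
public class RoyaltySplit {

    // royaltyBps is in basis points: 500 = 5% of the sale price.
    public static long creatorCut(long salePrice, int royaltyBps) {
        return salePrice * royaltyBps / 10_000;
    }

    public static long sellerProceeds(long salePrice, int royaltyBps) {
        return salePrice - creatorCut(salePrice, royaltyBps);
    }

    public static void main(String[] args) {
        long price = 1_000_000;               // sale price in smallest units
        System.out.println("creator:  " + creatorCut(price, 500));
        System.out.println("seller:   " + sellerProceeds(price, 500));
    }
}
```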

The Dilemma Of NFTs: The Great Expense: Different Blockchains Look To Solve The Issue

NFTs Provide Real-World Value And Utility: But Their Biggest Dilemma Is Minting Fees

As of 2021, the largest breeding ground for minting NFTs is Ethereum. Unfortunately, just like Ethereum's transaction fees, minting costs have become expensive. Although issuing an NFT on Ethereum comes with a strong guarantee of decentralized digital ownership, scalability problems and high costs plague the network. Many blockchains have attempted to rectify the high cost of minting: Tezos, for example, has created scalable solutions that allow an NFT to be minted in as little as two minutes for approximately $0.50 per mint.

Blockchains such as Aergo and ICON have also created their own NFT platforms, where individuals can mint and move their NFTs for a fraction of what it would cost on Ethereum. Scalability and high transaction costs have plagued blockchain networks since the inception of Bitcoin, and different projects have been working tirelessly to create a seamless, affordable, and efficient digital paradigm for NFTs to be minted and sold on.

Future Use Cases Envisioned: The Practicality of NFTs

Changing The Dynamics Of Ownership

Ownership has been in the hands of middlemen since long before the birth of NFTs, but their future potential can rectify the anxieties around changing hands. Similarly to smart contracts, NFTs enable a reliable transfer of information between two or more parties. NFTs let assets move reliably within a blockchain system, and networks such as Aergo are creating bridges through which NFTs can be transferred from a side-chain to the main network. NFTs can also resolve issues that coincide with land and vehicle ownership. Did you know that less than 40% of the global population has their land registered to the rightful property owner? NFTs can fix this. Individuals who lack verifiable, defined rights find it much more difficult to access financial services and credit; with NFTs, individuals can verify their rights through original ownership attached to their property via digitized, verifiable authentication.

Aergo’s Merkle Bridge: Bridging The Gap: Rectifying Dilemmas

The Merkle Bridge Provides An Interoperable And Cost-Effective Approach To Data Transfer And NFTs

Ask yourself a focal question about interoperability from a primal, human standpoint: if tribes can't communicate through a shared language, how can civilization evolve as a whole? The answer is simple: it couldn't have. In the 21st century, one of the main dilemmas blockchains face (as digital tribes) is the inability to communicate with one another. Interoperability is blockchain's next step, its destiny as a digital innovation.

The Merkle Bridge is designed to connect different blockchains and side-chains, enabling them to talk with one another and exchange value in a cost-effective manner. Multi-sig operators are the predominant mechanism for data transfers between blockchain systems; unfortunately, a lack of security and increased costs plague that approach, and small transfers cannot exist under it because costs are so high. Aergo addresses this through a Merkle proof tree, which is less costly and more efficient than Ethereum's Patricia tree, and Aergo intends to develop bridges between Aergo and Ethereum. Now how can the Merkle Bridge benefit NFTs?
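To make the Merkle proof idea concrete: a verifier holding only a tree's root can check a leaf's membership from a short path of sibling hashes, which is what makes bridge transfers cheap compared to replaying transactions through multi-sig operators. The following is a generic sketch of that technique in Java, not Aergo's actual Merkle Bridge code; all names here are illustrative:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// A minimal Merkle proof: hash a leaf, then fold in sibling hashes level by
// level; if the result equals the published root, the leaf is in the tree.
public class MerkleProofSketch {

    static byte[] sha256(byte[]... chunks) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            for (byte[] c : chunks) md.update(c);
            return md.digest();
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    static byte[] leaf(String data) {
        return sha256(data.getBytes(StandardCharsets.UTF_8));
    }

    // Recompute the root from a leaf hash and its sibling path.
    // isRight[i] is true when the sibling at level i sits to the right.
    public static byte[] rootFromProof(byte[] leafHash, byte[][] siblings, boolean[] isRight) {
        byte[] h = leafHash;
        for (int i = 0; i < siblings.length; i++) {
            h = isRight[i] ? sha256(h, siblings[i]) : sha256(siblings[i], h);
        }
        return h;
    }

    public static void main(String[] args) {
        byte[] l0 = leaf("transfer#1");
        byte[] l1 = leaf("transfer#2");
        byte[] root = sha256(l0, l1);
        byte[] recomputed = rootFromProof(l0, new byte[][]{l1}, new boolean[]{true});
        System.out.println(java.util.Arrays.equals(root, recomputed));
    }
}
```

The proof is logarithmic in the number of leaves, which is why a bridge can verify individual transfers without importing the other chain's full state.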

The Merkle Bridge's technology can benefit not only individuals transferring tokens, but also data and information such as NFTs. The Merkle Bridge can be seen as a type of oracle feed between the networks it interconnects. Many centralized institutions and blockchain projects have issued their own NFTs without cross-chain functionality. With the Merkle Bridge, NFTs can be transferred to wallets that connect to Aergo and Ethereum and, in the future, to other blockchain projects connected to the bridge. Interoperable protocols are inevitable, and the future of NFTs is fated to be consolidated with the interoperable paradigm.

AERGO Incubated: An NFT Use Case Utilizing CCCV

Aergo Has Incubated A Project That Creates Member Vaccine Verifications

Since February 2020, COVID-19 has mounted an onslaught against modern civilization of a kind not seen for at least 100 years. Nations closed and economic regimes were brought to their knees by an ancient paradigm humanity has never truly conquered: The Great Biological War. Blockchain enterprises realized that immutability and transparency could assist in tracking the virus through vaccine verification, and Aergo understood this to the fullest extent. Aergo and Blocko XYZ incubated a project known as CCCV, which enables individuals who have had their COVID vaccination to identify themselves as vaccinated by adding a real-time badge and sticker that can be shown to other users of the platform. In South Korea, COVID badges are only given to individuals over the age of 65, leaving a massive age demographic in which identifying who is vaccinated is incredibly difficult. CCCV solves this dilemma. Although other companies have created COVID vaccination tracking apps, security and standardization have been focal issues; with CCCV, users can request a badge and sticker, and once their verification document is authenticated, it is uploaded to their account. Now how does this tie into NFTs? I'll tell you.

CCCV has garnered over 2 million subscribers (roughly 1.3 times the population of the island of Manhattan) and offers numerous services, including NFT services. CCCV is both a social networking platform and a professional identity validator, and NFTs can be implemented across the platform and its 2 million-plus subscribers. Fake online impersonations, false influencer credentials, and fraud occur incredibly often in cyberspace, and NFTs can address these through unique, original digital ownership on CCCV. NFTs serve as an authenticity technology that prevents users from succumbing to fraudulent SNS linkage and enables individuals to create uniquely designed badges that emphasize personal ownership and the authenticity of one's profile. The platform is blockchain based: immutable, trustless, and reliable. NFTs also enable individuals to control ownership of their own credentials, and CCCV provides such a service using this technology. CCCV's use cases amalgamate digital identity verification, COVID-19 vaccination status, and authenticating digital ownership through NFTs.

Conclusion: Envisioning An Evolved NFT Paradigm

NFTs Have Come So Far, Yet We Have A Long Way To Go

It is evident that since their inception in 2014, the concept of non-fungible tokens has expanded exponentially over the last seven years. We have seen over 170 million dollars spent on NFTs since 2017, different blockchain projects creating their own NFT marketplaces and effectively competing with Ethereum through low-cost, scalable alternatives, and the beginnings of cross-chain NFT technology. The overall future of NFTs is yet to be seen, but from what we can empirically observe now, we can presume that NFTs have a bright future.

Disclaimer: Cryptocurrency and NFT investing carries substantial risk; do not invest more than you can afford to lose! I am not a financial adviser and I am not responsible for any of your trades. Always do your own research before investing in anything!

The Transformation of NFT’s: The Unfolding was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.


Okta

Build Native Java Apps with Micronaut, Quarkus, and Spring Boot


Java has been able to invoke native programs on an operating system for decades. Invoking native programs is often done using JNI (Java Native Interface) and JNA (Java Native Access). In the last few years, Java has also gained the ability to run JVM apps as native apps. That is, they’re binary executables that have no dependency on the Java runtime.

This is huge! Mostly because it gives Java apps the ability to start up in milliseconds (as opposed to seconds). If you’re scaling up to handle millions of requests and using a serverless environment to save costs, this is a game-changer. Developers have enjoyed using Node.js, Python, and Go on serverless environments for years. The ability to use Java (or Kotlin) opens this world up to a massive swath of the developer community.

This post will show you how to run a secure, OAuth 2.0-protected, Java REST API that allows JWT authentication. I’ll showcase the three leading Java frameworks: Micronaut, Quarkus, and Spring Boot. First, I’ll show you how to run them with Java and access their data. Then, I’ll show you how to build and test native images with each framework. I’ll mention a few gotchas I ran into along the way.

Prerequisites

Java 11 with GraalVM

HTTPie (a better version of cURL)

An Okta Developer Account

Table of Contents

- Get Started with Native Java Frameworks
- Install a JDK with GraalVM
- Launch a Micronaut Java API
- Generate an OAuth 2.0 Access Token
- Test Your Micronaut API with HTTPie
- Build a Native Micronaut App
- Make a Micronaut App from Scratch
- Run a Quarkus Java API
- Test Your Quarkus API with HTTPie
- Build a Native Quarkus App
- Create a Quarkus App from Scratch
- Start a Spring Boot Java API
- Test Your Spring Boot API with HTTPie
- Build a Native Spring Boot App
- Start a Spring Boot App from Scratch
- Build Native Images for Micronaut, Quarkus, and Spring Boot
- Startup Time Comparison
- Testing Native Images
- Learn More About Java and GraalVM

Get Started with Native Java Frameworks

I created a GitHub repository you can clone and run to get started with all three frameworks quickly.

git clone https://github.com/oktadev/native-java-examples.git

This project has directories with the latest versions of Micronaut, Quarkus, and Spring Boot (at the time of this writing). I’ll show you how I created them in individual sections below.

Open the native-java-examples directory in your favorite IDE, so you have easy access to each framework’s project files.

If you want to see how to build native images in each framework, skip to the build native images for Micronaut, Quarkus, and Spring Boot section.

Install a JDK with GraalVM

You will need a JDK with GraalVM and its native-image compiler. Using SDKMAN, run the following command and set it as the default:

sdk install java 21.1.0.r11-grl

Add the native extension to the JDK:

gu install native-image

Launch a Micronaut Java API

In a terminal window, cd into the micronaut directory and run mn:run to start it.

cd micronaut
./mvnw mn:run

If you open another terminal window and try to access it with HTTPie, you’ll get a 401 Unauthorized error.

$ http :8080/hello

HTTP/1.1 401 Unauthorized
connection: keep-alive
transfer-encoding: chunked

To make it so you can access this endpoint, you’ll need to generate an OAuth 2.0 access token and update the JWKS (JSON Web Key Sets) URL to yours (in this project’s application.yml).
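To see what those settings validate, it helps to peek inside a JWT: it is three Base64URL-encoded segments separated by dots, and the middle segment holds the claims, including the iss value that the issuer validator is matched against. A hedged sketch using only the JDK; the class name and the sample claims are made up for illustration:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Decodes a JWT's payload (the middle dot-separated segment) without
// verifying the signature -- enough to inspect the "iss" claim that the
// issuer claims-validator checks. Never use this in place of real
// signature verification.
public class JwtPayloadPeek {

    public static String decodePayload(String jwt) {
        String[] parts = jwt.split("\\.");
        if (parts.length < 2) {
            throw new IllegalArgumentException("Not a JWT: " + jwt);
        }
        byte[] json = Base64.getUrlDecoder().decode(parts[1]);
        return new String(json, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // A hypothetical, unsigned token built inline for illustration.
        Base64.Encoder enc = Base64.getUrlEncoder().withoutPadding();
        String header = enc.encodeToString("{\"alg\":\"none\"}".getBytes(StandardCharsets.UTF_8));
        String payload = enc.encodeToString(
                "{\"iss\":\"https://example.okta.com/oauth2/default\"}"
                        .getBytes(StandardCharsets.UTF_8));
        System.out.println(decodePayload(header + "." + payload + "."));
    }
}
```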

If you’re unsure what OIDC and OAuth 2.0 are, see our Illustrated Guide to OAuth and OpenID Connect.

Before you begin, you’ll need a free Okta developer account. Install the Okta CLI and run okta register to sign up for a new account. If you already have an account, run okta login. Then, run okta apps create. Select the default app name, or change it as you see fit. Choose Single-Page App and press Enter.

Use https://oidcdebugger.com/debug for the Redirect URI and set the Logout Redirect URI to https://oidcdebugger.com.

What does the Okta CLI do?

The Okta CLI will create an OIDC Single-Page App in your Okta Org. It will add the redirect URIs you specified and grant access to the Everyone group. It will also add a trusted origin for https://oidcdebugger.com. You will see output like the following when it’s finished:

Okta application configuration:
Issuer:    https://dev-133337.okta.com/oauth2/default
Client ID: 0oab8eb55Kb9jdMIr5d6

NOTE: You can also use the Okta Admin Console to create your app. See Create a Single-Page App for more information.

Take note of the clientId and issuer values. You’ll need those to get an access token and to configure each framework for JWT authentication.

Open micronaut/src/main/resources/application.yml and change the Okta URL to match yours.

micronaut:
  application:
    name: app
  security:
    enabled: true
    token:
      jwt:
        enabled: true
        claims-validators:
          issuer: /oauth2/default
        signatures:
          jwks:
            okta:
              url: /oauth2/default/v1/keys

Stop your Micronaut app with Ctrl+C and restart it with ⬆️+Return.

./mvnw mn:run

Generate an OAuth 2.0 Access Token

An easy way to get an access token is to generate one using OpenID Connect Debugger. First, you must configure your application on Okta to use OpenID Connect’s implicit flow.

Run okta login and open the resulting URL in your browser. Go to the Applications section and select the application you just created. Edit its General Settings and add Implicit (Hybrid) as an allowed grant type, with access token enabled. Then, make sure it has https://oidcdebugger.com/debug in its Login redirect URIs. Click Save and copy the client ID for the next step.

Now, navigate to the OpenID Connect Debugger website. Fill in your client ID, and use /oauth2/default/v1/authorize for the Authorize URI. The state field must be filled but can contain any characters. Select token for the response type.

Click Send Request to continue.

Once you have an access token, set it as a TOKEN environment variable in a terminal window.

TOKEN=eyJraWQiOiJYa2pXdjMzTDRBYU1ZSzNGM...

You might want to keep OpenID Connect <debugger/> open to copy your access tokens. It allows you to quickly start over and regenerate a new access token if it expires.

Test Your Micronaut API with HTTPie

Use HTTPie to pass the JWT in as a bearer token in the Authorization header.

http :8080/hello Authorization:"Bearer $TOKEN"

You should get a 200 response with your email in it.
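If you prefer staying in Java, the same request can be sketched with the JDK's built-in java.net.http client. This version only builds the request (nothing is sent, so no live server or real token is needed); the class name and placeholder token are illustrative:

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Builds the same call the HTTPie command makes: GET /hello with the JWT
// passed as a bearer token in the Authorization header.
public class BearerRequest {

    public static HttpRequest build(String baseUrl, String token) {
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/hello"))
                .header("Authorization", "Bearer " + token)
                .GET()
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = build("http://localhost:8080", "eyJraWQ...");
        System.out.println(req.headers().firstValue("Authorization").orElse(""));
    }
}
```

To actually send it, you would pass the request to `HttpClient.newHttpClient().send(...)` with a body handler.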

Build a Native Micronaut App

To compile this Micronaut app into a native binary, run:

./mvnw package -Dpackaging=native-image

This command will take a few minutes to complete. My 2019 MacBook Pro with a 2.4 GHz 8-Core Intel Core i9 processor and 64 GB of RAM took 1 min. 28 s. to finish.

Start it with ./target/app:

$ ./target/app
 __  __ _                                  _
|  \/  (_) ___ _ __ ___  _ __   __ _ _   _| |_
| |\/| | |/ __| '__/ _ \| '_ \ / _` | | | | __|
| |  | | | (__| | | (_) | | | | (_| | |_| | |_
|_|  |_|_|\___|_|  \___/|_| |_|\__,_|\__,_|\__|
  Micronaut (v2.5.6)

17:20:23.980 [main] INFO io.micronaut.runtime.Micronaut - Startup completed in 25ms. Server Running: http://localhost:8080

You can see it starts pretty darn quick (25ms)! Test it with HTTPie and an access token. You may have to generate a new JWT with oidcdebugger.com if yours has expired.

http :8080/hello Authorization:"Bearer $TOKEN"

Make a Micronaut App from Scratch

You might be wondering, "How did you build a secure Micronaut app?" Did I just hide the complexity? No, it only takes five steps to create the same app.

Use SDKMAN! to install Micronaut’s CLI:

sdk install micronaut

Create an app using the mn create-app command and rename the project’s directory:

mn create-app com.okta.rest.app --build maven
mv app micronaut

Add Micronaut’s libraries for JWT security:

<dependency>
    <groupId>io.micronaut.security</groupId>
    <artifactId>micronaut-security</artifactId>
</dependency>
<dependency>
    <groupId>io.micronaut.security</groupId>
    <artifactId>micronaut-security-jwt</artifactId>
</dependency>

Add a HelloController in src/main/java/com/okta/rest/controller:

package com.okta.rest.controller;

import io.micronaut.http.MediaType;
import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;
import io.micronaut.http.annotation.Produces;
import io.micronaut.security.annotation.Secured;
import io.micronaut.security.rules.SecurityRule;

import java.security.Principal;

@Controller("/hello")
public class HelloController {

    @Get
    @Secured(SecurityRule.IS_AUTHENTICATED)
    @Produces(MediaType.TEXT_PLAIN)
    public String hello(Principal principal) {
        return "Hello, " + principal.getName() + "!";
    }
}

Enable and configure JWT security in src/main/resources/application.yml:

micronaut:
  ...
  security:
    enabled: true
    token:
      jwt:
        enabled: true
        claims-validators:
          issuer: /oauth2/default
        signatures:
          jwks:
            okta:
              url: /oauth2/default/v1/keys

That’s it! Now you can start the app or build the native image as shown above.

Now let’s take a look at Quarkus.

Run a Quarkus Java API

Open a terminal, cd into the quarkus directory, and run quarkus:dev to start the app.

cd quarkus
./mvnw quarkus:dev

Update the URLs in quarkus/src/main/resources/application.properties to use your Okta domain.

mp.jwt.verify.publickey.location=/oauth2/default/v1/keys
mp.jwt.verify.issuer=/oauth2/default

Test Your Quarkus API with HTTPie

Generate or copy an access token from OpenID Connect <debugger/> and use it to test your Quarkus API.

http :8080/hello Authorization:"Bearer $TOKEN"

You should see your email in the response.

Did you notice that Quarkus hot-reloaded your application.properties file updates? Pretty slick, eh?!

Build a Native Quarkus App

To compile this Quarkus app into a native binary, run:

./mvnw package -Pnative

The native compilation step will take a bit to complete. On my 2019 MacBook Pro, it took 1 min. 9 s.

Start it with ./target/quarkus-1.0.0-SNAPSHOT-runner:

$ ./target/quarkus-1.0.0-SNAPSHOT-runner
__  ____  __  _____   ___  __ ____  ______
 --/ __ \/ / / / _ | / _ \/ //_/ / / / __/
 -/ /_/ / /_/ / __ |/ , _/ ,< / /_/ /\ \
--\___\_\____/_/ |_/_/|_/_/|_|\____/___/
2021-06-15 17:35:23,886 INFO [io.quarkus] (main) quarkus 1.0.0-SNAPSHOT native (powered by Quarkus 1.13.7.Final) started in 0.014s. Listening on: http://0.0.0.0:8080
2021-06-15 17:35:23,888 INFO [io.quarkus] (main) Profile prod activated.
2021-06-15 17:35:23,889 INFO [io.quarkus] (main) Installed features: [cdi, mutiny, resteasy, security, smallrye-context-propagation, smallrye-jwt, vertx, vertx-web]

Supersonic Subatomic Java (in 14ms)! Test it with HTTPie and an access token.

http :8080/hello Authorization:"Bearer $TOKEN"

Create a Quarkus App from Scratch

You can create the same Quarkus app used in this example in five steps.

Use Maven to generate a new Quarkus app with JWT support:

mvn io.quarkus:quarkus-maven-plugin:1.13.7.Final:create \
  -DprojectGroupId=com.okta.rest \
  -DprojectArtifactId=quarkus \
  -DclassName="com.okta.rest.quarkus.HelloResource" \
  -Dpath="/hello" \
  -Dextensions="smallrye-jwt"

Edit src/main/java/com/okta/rest/quarkus/HelloResource.java and add user information to the hello() method:

package com.okta.rest.quarkus;

import io.quarkus.security.Authenticated;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.SecurityContext;
import java.security.Principal;

@Path("/hello")
public class HelloResource {

    @GET
    @Path("/")
    @Authenticated
    @Produces(MediaType.TEXT_PLAIN)
    public String hello(@Context SecurityContext context) {
        Principal userPrincipal = context.getUserPrincipal();
        return "Hello, " + userPrincipal.getName() + "!";
    }
}

Add your Okta endpoints to src/main/resources/application.properties:

mp.jwt.verify.publickey.location=/oauth2/default/v1/keys mp.jwt.verify.issuer=/oauth2/default

Modify the HelloResourceTest to expect a 401 instead of a 200:

package com.okta.rest.quarkus;

import io.quarkus.test.junit.QuarkusTest;
import org.junit.jupiter.api.Test;

import static io.restassured.RestAssured.given;

@QuarkusTest
public class HelloResourceTest {

    @Test
    public void testHelloEndpoint() {
        given()
            .when().get("/hello")
            .then()
            .statusCode(401);
    }
}

Add HTTPS support to the native profile at the bottom of pom.xml with the quarkus.native.additional-build-args property:

<properties>
    <quarkus.package.type>native</quarkus.package.type>
    <quarkus.native.additional-build-args>
        --enable-url-protocols=https
    </quarkus.native.additional-build-args>
</properties>

The last step is not necessary if you’re running with Maven, but it is for the native image. Quarkus includes SSL for many extensions, but not for SmallRye JWT. The good news is Quarkus will enable SSL by default for SmallRye JWT in Quarkus 2.0.0.Final.

Last but certainly not least, let’s look at Spring Boot.

Start a Spring Boot Java API

In your IDE, update the issuer in spring-boot/src/main/resources/application.properties to use your Okta domain.

spring.security.oauth2.resourceserver.jwt.issuer-uri=/oauth2/default

Then, start your app from your IDE or using a terminal:

./mvnw spring-boot:run

Test Your Spring Boot API with HTTPie

Generate an access token using oidcdebugger.com and use it to test your Spring Boot API.

http :8080/hello Authorization:"Bearer $TOKEN"

You should see a response like the following.

But wait, doesn’t Okta have a Spring Boot starter? Yes, we do—however, it doesn’t work with GraalVM yet.

However, we’re calling in an expert to help us fix it! Join us next Tuesday, June 22, 2021, on twitch.tv/oktadev for a session with Josh Long. We’ll attempt to fix things live!

Build a Native Spring Boot App

To compile this Spring Boot app into a native executable, you can use the Spring Boot Maven plugin:

./mvnw spring-boot:build-image

The native compilation step will take a bit to complete. On my 2019 MacBook Pro, it took 2 min. 56 s.

Start it using Docker:

$ docker run -p 8080:8080 docker.io/library/demo:0.0.1-SNAPSHOT
2021-06-16 02:21:24.193  INFO 1 --- [           main] o.s.nativex.NativeListener               : This application is bootstrapped with code generated with Spring AOT

  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::                (v2.5.1)
...
2021-06-16 02:21:24.970  INFO 1 --- [           main] o.s.b.w.embedded.tomcat.TomcatWebServer  : Tomcat started on port(s): 8080 (http) with context path ''
2021-06-16 02:21:24.971  INFO 1 --- [           main] com.okta.rest.DemoApplication            : Started DemoApplication in 0.06 seconds (JVM running for 0.063)

Bootiful! Test your API with HTTPie and an access token.

http :8080/hello Authorization:"Bearer $TOKEN"

Start a Spring Boot App from Scratch

Creating the Spring Boot app used in this example takes just five steps.

Use HTTPie to generate a new Spring Boot app with OAuth 2.0 support:

http https://start.spring.io/starter.zip \
  bootVersion==2.5.1 \
  dependencies==web,oauth2-resource-server,native \
  packageName==com.okta.rest \
  name==spring-boot \
  type==maven-project \
  baseDir==spring-boot | tar -xzvf -

Add a HelloController class that returns the user’s information:

package com.okta.rest.controller;

import org.springframework.security.core.annotation.AuthenticationPrincipal;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import java.security.Principal;

@RestController
public class HelloController {

    @GetMapping("/hello")
    public String hello(Principal principal) {
        return "Hello, " + principal.getName() + "!";
    }
}

Configure the app to be an OAuth 2.0 resource server by adding an issuer to application.properties:

spring.security.oauth2.resourceserver.jwt.issuer-uri=/oauth2/default

Add a SecurityConfiguration class to configure JWT authentication:

package com.okta.rest;

import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter;
import org.springframework.security.config.annotation.web.configurers.oauth2.server.resource.OAuth2ResourceServerConfigurer;

@EnableWebSecurity
public class SecurityConfiguration extends WebSecurityConfigurerAdapter {

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http
            .authorizeRequests(request -> request.anyRequest().authenticated())
            .oauth2ResourceServer(OAuth2ResourceServerConfigurer::jwt);
    }
}

Enable HTTPS for native builds by adding a @NativeHint annotation to the DemoApplication class.

import org.springframework.nativex.hint.NativeHint;

@SpringBootApplication
@NativeHint(options = "--enable-url-protocols=https")

You can build and test a Spring Boot native image using the steps I outlined above.

Build Native Images for Micronaut, Quarkus, and Spring Boot

To recap, Micronaut, Quarkus, and Spring Boot all support building native executables with GraalVM. Yes, there are other frameworks, but these three seem to be the most popular.

The commands to build each app are similar but not quite the same.

Micronaut: ./mvnw package -Dpackaging=native-image

Quarkus: ./mvnw package -Pnative

Spring Boot: ./mvnw spring-boot:build-image

Of course, they all support Gradle too.

Startup Time Comparison

Performance comparisons are complex, but I’m going to do one anyway. Since this post is all about native Java, below is the data I gathered that shows the average milliseconds to start each native executable. I ran each image three times before I started recording the numbers. I then ran each command five times.

These numbers are from a 2019 MacBook Pro with a 2.4 GHz 8-Core Intel Core i9 processor and 64 GB of RAM. I think it’s important to note that my WiFi connection was 340 Mbps down and 246 Mbps up (according to the Speedtest app).
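The measurement loop described above (warm-up launches first, then timed runs averaged) can be sketched as a small harness. This is my approximation of the methodology, not the author's actual script, and it times process exit rather than watching for the "Startup completed" log line:

```java
import java.util.List;

// Launch a binary several times, discard warm-up runs, and average the
// wall-clock time of the remaining runs. In a real benchmark the command
// would be the native executable (e.g. ./target/app).
public class StartupTimer {

    public static double averageMillis(List<String> command, int warmups, int runs) {
        for (int i = 0; i < warmups; i++) runOnce(command);
        double total = 0;
        for (int i = 0; i < runs; i++) total += runOnce(command);
        return total / runs;
    }

    private static double runOnce(List<String> command) {
        try {
            long start = System.nanoTime();
            Process p = new ProcessBuilder(command).start();
            p.waitFor();
            return (System.nanoTime() - start) / 1_000_000.0;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // "true" is a stand-in for a short-lived native binary on Unix-likes.
        System.out.println(averageMillis(List.of("true"), 1, 3) + " ms");
    }
}
```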

Table 1. Native Java Startup times in milliseconds

| Framework   | Command executed                                              | Milliseconds to start |
|-------------|---------------------------------------------------------------|-----------------------|
| Micronaut   | ./target/app                                                  | 20.4                  |
| Quarkus     | ./target/quarkus-1.0.0-SNAPSHOT-runner                        | 13.4                  |
| Spring Boot | docker run -p 8080:8080 docker.io/library/demo:0.0.1-SNAPSHOT | 60.6                  |

The chart below should help you visualize this comparison.

The Spring Boot startup times looked a little long, so I contacted my friend Josh Long. We did a debugging session over Zoom and discovered the longer startup times are because Spring Security is doing OIDC discovery with the issuer.

Spring Boot’s "initialization completed" time seemed to be right around 30ms. The duration between that and the "Started in …​" time is the time it takes to make the call to Okta. We tried optimizing it by just using the JWKS URI. For example:

spring.security.oauth2.resourceserver.jwt.jwk-set-uri=https://dev-133337.okta.com/oauth2/default/v1/keys

This only improved the startup time by 1.6ms (59ms on average).

We also experimented with using ./mvnw package -Pnative for Spring Boot. This allows you to create a native binary and run it with ./target/demo. There’s no need for Docker with this command. Startup times are relatively the same in my and Josh’s experience.

Our hypothesis is Micronaut and Quarkus do the JWKS lookup on the first request rather than at startup. That’s how they achieve faster startup times.

We later confirmed this hypothesis with Jason Schindler (from Micronaut) and Sergey Beryozkin (from Quarkus).
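The lazy-versus-eager trade-off is easy to sketch: wrap the JWKS fetch in a memoizing supplier, so construction (startup) costs nothing and the first request pays for the network call. This is a simplified illustration of the idea, not the actual Micronaut or Quarkus internals; the fetcher here is a stand-in for an HTTP call:

```java
import java.util.function.Supplier;

// "Lazy" JWKS handling: nothing is fetched at construction time, so
// startup stays fast; the first request triggers the fetch, and the
// result is cached for subsequent requests.
public class LazyJwks {

    private final Supplier<String> fetcher;   // e.g. () -> httpGet(jwksUri)
    private String cachedKeys;

    public LazyJwks(Supplier<String> fetcher) {
        this.fetcher = fetcher;               // no fetch here: startup is free
    }

    public synchronized String keys() {       // first caller pays the cost
        if (cachedKeys == null) {
            cachedKeys = fetcher.get();
        }
        return cachedKeys;
    }

    public static void main(String[] args) {
        LazyJwks jwks = new LazyJwks(() -> "{\"keys\":[]}");
        System.out.println(jwks.keys());
    }
}
```

An "eager" resource server would call `fetcher.get()` in the constructor instead, moving that same cost into the measured startup time.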

If I just take the value of the "initialization completed" time from Spring Boot, the numbers look a little more even.

I also tested the startup times on a Linux laptop with 64 GB of RAM. I used the value the framework displays for startup time and did not make the "initialization complete" adjustment for Spring Boot.

If you disagree with these numbers and think X framework should be faster, I encourage you to clone the repo and run these tests yourself. If you get faster startup times for Spring Boot, do you get faster startup times for Micronaut and Quarkus too?

Testing Native Images

When building native images, it’s essential to test them as part of an integration testing process. This post is already long enough, so I won’t explain how to test native images in this post. We’ll publish a post in the future that covers this topic.

I do like how Quarkus generates a NativeHelloResourceIT that’s designed specifically for this, though.

package com.okta.rest.quarkus;

import io.quarkus.test.junit.NativeImageTest;

@NativeImageTest
public class NativeHelloResourceIT extends HelloResourceTest {
    // Execute the same tests but in native mode.
}

However, this test did not help me detect an issue with my Quarkus native image when writing this post. That’s because I was lazy when writing my test: I changed it to confirm a 401 instead of testing it with Quarkus' OIDC testing support.

In the meantime, see Gradle and Maven Plugins for Native Image with Initial JUnit 5 Testing Support.

Learn More About Java and GraalVM

In this post, you learned how to develop, build, and run native Java apps with Micronaut, Quarkus, and Spring Boot. You learned how to secure them with OpenID Connect and access them with a JWT access token.

If you’re a Spring Boot aficionado, I recommend you watch Josh Long’s Spring Tips: Spring Native 0.10.0 video.

You can find the source code for all the examples used in this post on GitHub in the native-java-examples repository.

Server-side apps that serve up REST APIs aren’t the only thing that’s gone native in Java. Gluon has done a lot of work in recent years to make JavaFX apps work on iOS and Android using GraalVM. See Gail Anderson's Creating Mobile Apps with JavaFX – Part 1 to learn more about this emerging technology.

In the beginning, I mentioned JNI and JNA. Baeldung has some tutorials about both:

Guide to JNI (Java Native Interface)

Using JNA to Access Native Dynamic Libraries

If you liked this post, chances are you’ll like others we’ve published:

Watch GraalVM Turn Your Java Into Binaries

Java REST API Showdown: Which is the Best Framework on the Market?

How to Docker with Spring Boot

Build a Secure Micronaut and Angular App with JHipster

Fast Java Made Easy with Quarkus and JHipster

Got questions? Leave them in the comments below! You can also hit us up on our social channels: @oktadev on Twitter, Okta for Developers on LinkedIn, Twitch, and YouTube.

Thursday, 17. June 2021

KuppingerCole

Technological Approaches to a Zero Trust Security Model

The traditional model of enforcing security at the network perimeter is no longer valid as employees, devices and workloads move outside the corporate network. A Zero Trust model offers an alternative that secures data while ensuring it is accessible from wherever employees are working. But finding the right technological approaches to Zero Trust is challenging.





Does Increased Security Still Mean Added Complexity?


We’re all accessing more goods and services online than we ever thought possible, which has presented a huge opportunity for cyber criminals. Rapid digital transformation has left some businesses exposed, and fraudsters are looking to exploit new weaknesses. Strong digital identity verification and authentication is essential, but has traditionally come with increased complexity at the expense of a good user experience. But is this still true?




Coinfirm

Anti Money Laundering and Countering the Financing of Terrorism for DeFi LPs

2021 has been seen as the year of the DeFi boom. Currently, the TVL (total value locked) into DeFi has surged to $60 billion, up from $1.2 billion from a year ago, attracting the attention of financial institutions (FIs) and enticing them to begin experimenting in the field.  As fiat interest rates have hit rock...

Civic

Civic Compliance: A Natural Evolution


The regulators are coming. The regulators are coming.

Whether it’s FATF, FinCEN, CFTC, or one of many other governing bodies, lawmakers have been incrementally shaping new legislation aimed at further regulating centralized finance and DeFi. Most recently, the World Economic Forum published a white paper to help policy makers understand DeFi, so that better guardrails may be designed for the ecosystem.

A confusing regulatory landscape

Companies in this new ecosystem, though, remain unclear on how to lawfully proceed. As we talk to companies about Civic age verification, log-in and KYC, we continue to hear a similar message from them: what should we do about compliance?

As an advisor offering identity services, it’s become clear to us that Civic can bring greater value to our existing and potential customers by offering more of what we do well.

That’s why we’re proud to share a window into the future of digital identity’s role in the DeFi ecosystem. 

Civic Compliance is coming soon

The new DeFi ecosystem makes it possible for anyone to trade using simply an internet connection and open-source code. The fact that these new marketplaces have proliferated illustrates the need for next generation financial services, but these solutions must begin to operate within the context of necessary compliance guardrails. Fortunately, digital identity with built-in compliance will help blockchain protocols, liquidity providers, and individuals create trust and comply with evolving regulations all at the same time.

More specifically, our newest offering, Civic Compliance, will help DeFi companies and individuals know their counterparties, so that all parties can ensure KYC and AML requirements are met.

Civic is uniquely positioned to create decentralized identities, putting users in control of their own personally identifiable information (PII) so they can access compliant DeFi services. Users of Civic Compliance may sign up once and then re-use their “attestation” again and again. This gives protocols and liquidity providers visibility into each and every user transaction. And, it gives users more control over their PII.

Partners in compliance

Whether it’s KYC, AML or regulations not yet in place, Civic is a partner in meeting DeFi regulatory requirements, all while putting users back in control over their unique digital identities. We’re looking forward to sharing more about Civic Compliance.

The post Civic Compliance: A Natural Evolution appeared first on Civic Technologies, Inc..


auth0

Secrets Access with Managed Identities in .NET Applications

How to securely access secrets stored in the Azure Key Vault service using the new Azure SDK and managed identities.

digi.me

Transparency and trust: what we should be discussing about plans to share GP records


There has been a lot of discussion – and rightly so – about plans by the NHS England to share data about 55m patients with academic and commercial third parties.

One of the biggest issues with the data collection project, which would include information about especially sensitive areas such as mental and sexual health, was the short time given for patients to opt out.

The plans were only widely revealed by the Financial Times on May 26, with patients initially needing to opt-out by June 23.

Continue reading Transparency and trust: what we should be discussing about plans to share GP records at Digi.me.


Ontology

DID 101: A Brief Introduction to What Makes Ontology Special

If you’re new to Ontology, or confused by terms like DID, ONT ID, and OScore, this brief introduction is for you.

DID, ONT ID, OScore… you might have heard us use these words before.

Yet, you’re still unsure what they all mean.

Here’s an overview of the key features that make Ontology special.

Decentralized Identity & ONT ID

With traditional ID systems, a central authority or tech platform controls the data (e.g. phone numbers, email addresses) provided to it by users.

The users (i.e. the data providers), therefore, have very little control over how their personal data is stored and who has access to it.

Ontology’s decentralized identity (DID) framework, ONT ID, is different. By providing a self-sovereign system of data authorization and ownership confirmation, ONT ID puts the power in the hands of users.

With ONT ID, ID generation, storage, and other critical operations are fully automated and decentralized, giving users operating within the ecosystem full control and ownership over any data associated with them.

Discrete IDs linked across various ecosystems can have multiple delegates and attributes. With verifiable credentials, entities (i.e. individuals, enterprises, institutions, devices) can make and verify claims related to data ownership, access rights, and validation.

Verifying Credentials with ONT ID

ONT ID allows you to verify credentials, whilst protecting your data and privacy.

You can verify many different kinds of credentials such as passports, national IDs, and social media accounts. Verifying credentials is one way to boost your OScore (more on this later).

The beauty of ONT ID is that you, the user, decide who has access to your verified data.

For example, imagine a third party required you to verify an aspect of your identity, such as your nationality. With ONT ID, you could share this single data point whilst keeping your name, date of birth, home address, and any other sensitive information, completely private.
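The idea of revealing one attribute while keeping the rest private can be sketched with a minimal hash-commitment scheme. To be clear, this is not Ontology’s actual protocol (real verifiable-credential systems use signed credentials and more sophisticated cryptography); it is only a toy illustration of selective disclosure:

```python
import hashlib
import os

def commit(value: str, salt: bytes) -> str:
    """Salted hash commitment to a single attribute value."""
    return hashlib.sha256(salt + value.encode()).hexdigest()

# An issuer commits to each attribute separately; random salts prevent
# a verifier from brute-forcing unrevealed values.
attributes = {"name": "Alice", "dob": "1990-01-01", "nationality": "FR"}
salts = {k: os.urandom(16) for k in attributes}
commitments = {k: commit(v, salts[k]) for k, v in attributes.items()}
# In a real system the issuer would sign `commitments`; the holder keeps the salts.

# The holder reveals only nationality: the value plus its salt.
disclosed = ("nationality", attributes["nationality"], salts["nationality"])

# The verifier checks the revealed pair against the committed value,
# learning nothing about the name or date of birth.
key, value, salt = disclosed
assert commit(value, salt) == commitments[key]
```

The key property is that the verifier can check the single disclosed attribute without ever seeing the others.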

You can create an ONT ID now by downloading ONTO Wallet, Ontology’s decentralized cross-chain wallet.

OScore

OScore is an independent on-chain reputation system, generated using on-chain data including your ONT ID, engagements, assets, and credentials.

Your OScore is based on data authorized by you and is updated to reflect changes in behavior and lending practices.

OScore is not associated with your off-chain identities. In other words, it respects user privacy and supports full anonymity.

Your OScore increases as you add more decentralized data to your ONT ID.

A higher OScore brings with it certain benefits such as better lending rates on platforms like Wing Finance, Ontology’s credit-based DeFi platform.

OScore also benefits credit grantors, helping them decide whether to open a credit line and how much credit to extend to a particular borrower.

Summary

Let’s put together everything we’ve learned so far.

- Ontology’s decentralized identity (DID) framework is called ONT ID.
- Create an ONT ID with ONTO Wallet to start verifying credentials.
- Verifying credentials using your ONT ID helps boost your decentralized credit rating, aka OScore.
- Having a higher OScore entitles you to certain benefits.

Want more Ontology?

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

DID 101: A Brief Introduction to What Makes Ontology Special was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Where Workflows, Service Management, Digital Identities, and Work From Home Meet


by Martin Kuppinger

Simplifying access to IT services by building on platforms for IT workflows and beyond

ServiceNow, over the past few years, has emerged as one of the leaders in the ITSM (IT Service Management) market and beyond to what today is named ESM (Enterprise Service Management). In fact, the evolution of that market has helped ITSM leave the bunker of IT, and become a strategic element for businesses.

ESM today is much bigger and more important than ITSM ever was

The reason is that ESM supports organizations in business process optimization, based on central platforms with centralized data management and workflow capabilities. ESM is way bigger and more essential than ITSM has ever been, because it is business-centric and user-centric, instead of the technical, IT-centric approach exemplified by ITIL (IT Infrastructure Library), one of the foundational principles of traditional ITSM. Yes, ITIL still plays a vital role for the IT core of ITSM and thus ESM, but it is just a small piece of the puzzle today.

For a seamless experience, Identity and Access Management must be integrated with ESM

Essentially, ESM combines the ability to automate and improve processes with the workflow capabilities on common platforms that allow customers as well as partners to extend the reach. While there is the potential risk of lock-in to such a platform, the advantages outweigh this risk for many buyers (or are buyers just ignoring the lock-in risk, e.g. to ServiceNow?). ESM provides (relatively) user-friendly access to IT services. This is of ever-increasing importance, driven by trends such as Cloud Computing and Cloud first, Work from Home, and Citizen Development.

Cloud Computing: With more and more services procured from the cloud, and a higher rate of change in the use of services, managing these services and making them accessible to users becomes ever more important. Managing access and the identities of users must integrate with the ESM platforms for a seamless experience.

Work from Home: We’ve all learned over the past 15 months that the delivery of new services must become seamless. Walking to the desk of the user and administering the local machine does not work anymore in the age of WfH. Seamless procurement of services is essential in today’s dispersed work environments.

Usability is not only a matter of User Interfaces but of ubiquitous digital identity thinking

Citizen Development: Low-code and no-code platforms are changing the way of developing solutions. Many solutions are built by the business. They still need management. Platforms that unify initiatives and that help in getting a grip on all services, become essential.

ESM is an approach that helps address many of the emerging needs of businesses in delivering their services, where IT is not IT anymore, but ubiquitous across all parts of the business. Strategic platforms can help here, despite the lock-in risk. But they will only work with the context of the user: Who can do what, who is allowed to access which service, etc. Usability is not only a matter of UI, but of ubiquitous digital identity thinking across these platforms – and the platforms might also serve as an element or even the foundation of the IAM infrastructure.


PingTalk

Zero Trust - A Complete Guide to Zero Trust Security | Ping Identity

Enterprises are accelerating digital transformation initiatives in response to the rapidly evolving business landscape and increasingly remote workforce. Successfully implementing business-critical digital transformation efforts requires an efficient, effective workforce that has access to every application and tool they need to get their job done.  



Urbit

Developer Call: Programming Bitcoin on Urbit Workshop


Programming Bitcoin is fun, particularly on Urbit, but the devil is in the details. Tim (~timluc-miptev) will impart the implicit knowledge he gained working with Bitcoin in the course of his projects, and give an open workshop on what you need to know, either to satisfy your curiosity or to jump in and help. Tim will give a structured presentation on low-level Bitcoin fundamentals, intended to make it easy to use Bitcoin resources for further study.

Tim is the technical director at the Urbit Foundation, and wrote the backend for the Urbit Bitcoin wallet. He is currently managing volt, a project to integrate a Lightning client into Urbit. This workshop will be interactive — Tim will take questions throughout, and will conduct another session later if material remains.

As always, there will be an informal call afterwards to hang out. We will also open a Bitcoin development channel on Urbit specifically to handle follow-ups from this and for those who can't make it live.


Apprenticeship: Contacts App

Contacts App

Urbit currently has ship metadata, but there is room for alternate products here. It would be useful to be able to attach arbitrary metadata to a user, both in textual format to describe them, graph-store links to materials they've written, automated lists of groups they're in, and programmatically-accessible metadata that can be used for filtering and permissioning.

These contact lists could then be shared between ships as a way to jumpstart people into your social network. Alternate interfaces to chats could give read/write access to them. While it's unlikely that this exact product would be what people finally use, it would go a long way towards feeling out the user-metadata design space.

This project would be great for someone who wanted to learn more about how Urbit data storage and sharing between apps works, as well as understanding how to interact with graph-store.

User Stories

As a user I can:

- add text data to another user
- add permissions/markers to another user
- create a collection of graph-store links with data about the user
- automatically collect known groups the user is in
- search the user's written materials in graph-store (using graph-query or similar)
- share metadata about users with other ships, along with ways of merging it

Bitcoin Wallet Maintenance

Overview

The Urbit Bitcoin Wallet has been integrated into the core distribution, and its maintenance is now passing back to the community via the Urbit Foundation.

Our goal is to maintain this product at a high level of usability and also add significant features to it over time. We want usage to become ubiquitous and easy within Urbit, while integrating with outside wallets/signing devices and internal applications.

If you have strong development skills and are looking to make an impact with your Urbit development, this is the project to be on. We'll likely recruit more than one contributor for this role.

Types of work

The below is a non-exhaustive list of the types of work we will need:

- hardware wallet support
- progress meters for wallet scanning
- performance improvements for Bitcoin providers
- mempool scanning
- shoe CLI interface for terminal usage in Urbit
- general maintenance as bugs arise

Requirements

Skills
- Experience with modern React, or Hoon experience (for backend maintenance and improvement)

Having both skills is great.

Time Commitment
- at least 10 hours/week
- 2-3 months minimum
- weekly check-in meeting with ~timluc-miptev (Technical Director at Urbit Foundation)

Compensation
- 1 star per month initially (assuming ~10 hours/week part-time)
- more stars can be negotiated if the developer has greater availability and can commit more time

JSON Parsing/Serialization Jet

Jet JSON Parsing/Serialization

We need to generally speed up common parsing and serialization operations. A large part of Urbit's UX is sending data in and out of Urbit for display in the UI, and the functions inside Urbit that handle this are all significant performance bottlenecks.

Urbit serializes JSON frequently when sending data to various frontends, and this would make that much more rapid.

Completing this bounty will have an immediate impact on the perceived speed of the system in frontends like Landscape.

Code
- JSON Encoders
- JSON Reparser

Requirements
- Knowledge of C
- Experience memory profiling to prevent leaks
- Hoon knowledge nice but not necessary

Resources
- Tlon engineer explanation/assistance as needed
- Check-in with a Foundation director as needed
- Writing Jets Guide
- Unofficial Jets Tutorial
- JSON Test Suite

Milestone: Completion, 1 star
- jet is implemented, passes all tests, producing same results in jetted and unjetted mode
- Tlon engineer gives final approval on merging jet into core

Nuck Jet

jet +nuck

We need to generally speed up common parsing and serialization operations. A large part of Urbit's UX is sending data in and out of Urbit for display in the UI, and the functions inside Urbit that handle this are all significant performance bottlenecks.

This is a function that parses strings and returns a @ta of their aura along with their atom value. It is invoked every time a scry happens, and currently takes 7ms per invocation, so there is potential for large performance gains.

Completing this bounty will have an immediate impact on the perceived speed of the system in frontends like Landscape.

Examples

Note the ~.p and ~.ux--those are the auras.

> (scan "~pillyt" nuck:so)
[% p=[p=~.p q=32.819]]
> (scan "0x12" nuck:so)
[% p=[p=~.ux q=18]]

Code

https://github.com/urbit/urbit/blob/master/pkg/arvo/sys/hoon.hoon#L5767
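The contract of +nuck (infer the aura of a literal and return it with the atom value) can be sketched outside Hoon. This Python analogue is a loose illustration covering only @ux, @ub, and @ud literals; the real parser also handles @p names like ~pillyt, dates, and many other forms:

```python
def nuck_sketch(text: str):
    """Return (aura, value) for a few Hoon literal forms.

    A loose Python analogue of +nuck for @ux, @ub and @ud literals only.
    Hoon groups digits with dots (e.g. 32.819), so dots are stripped first.
    """
    if text.startswith("0x"):
        return ("ux", int(text[2:].replace(".", ""), 16))
    if text.startswith("0b"):
        return ("ub", int(text[2:].replace(".", ""), 2))
    return ("ud", int(text.replace(".", "")))

assert nuck_sketch("0x12") == ("ux", 18)       # matches [% p=[p=~.ux q=18]]
assert nuck_sketch("32.819") == ("ud", 32819)  # dot-grouped decimal
```

A C jet would perform this same aura dispatch and digit parsing natively, which is where the 7ms-per-scry savings would come from.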

Requirements
- Knowledge of C
- Experience memory profiling to prevent leaks
- Hoon knowledge nice but not necessary

Resources
- Tlon engineer explanation/assistance as needed
- Check-in with a Foundation director as needed
- Writing Jets Guide
- Unofficial Jets Tutorial

Milestone: Completion, 1 star
- jet is implemented and passes all tests
- Tlon engineer gives final approval on merging jet into core

Tune slaw/scot Jets

slaw/scot

We need to generally speed up common parsing and serialization operations. A large part of Urbit's UX is sending data in and out of Urbit for display in the UI, and the functions inside Urbit that handle this are all significant performance bottlenecks.

slaw parses strings to atoms, and scot serializes atoms to strings. They both are used heavily throughout the codebase.

Completing this bounty will have an immediate impact on the perceived speed of the system in frontends like Landscape.

Examples

slaw

> `(unit @p)`(slaw %p '~pillyt')
[~ ~pillyt]
> `(unit @p)`(slaw %p '~pillam')
~
> `(unit @ux)`(slaw %ux '0x12')
[~ 0x12]
> `(unit @ux)`(slaw %ux '0b10')

scot

> `@t`(scot %p ~pillyt)
'~pillyt'
> (scot %ux 0x12)
~.0x12

Code
- slaw: https://github.com/urbit/urbit/blob/master/pkg/arvo/sys/hoon.hoon#L5913
- scot: https://github.com/urbit/urbit/blob/master/pkg/arvo/sys/hoon.hoon#L5905

Requirements
- Knowledge of C
- Experience memory profiling to prevent leaks
- Hoon knowledge nice but not necessary

Resources
- Tlon engineer explanation/assistance as needed
- Check-in with a Foundation director as needed
- Writing Jets Guide
- Unofficial Jets Tutorial

Milestone: Completion, 1 star
- jet is implemented and passes all tests
- Tlon engineer gives final approval on merging jet into core
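The slaw/scot pair is a parse/print round trip: slaw parses a string under a given aura (failing with ~ if the syntax doesn’t match), and scot prints an atom back to its literal form. This Python sketch is a loose analogue for @ud and @ux only, and skips details of the real functions (scot returns a @ta knot, printed like ~.0x12, and also dot-groups large hex values):

```python
def scot_sketch(aura: str, atom: int) -> str:
    """Loose Python analogue of +scot (atom -> string) for @ud and @ux only."""
    if aura == "ud":
        return f"{atom:,}".replace(",", ".")  # Hoon groups @ud digits with dots
    if aura == "ux":
        return "0x" + format(atom, "x")
    raise ValueError("aura not covered by this sketch")

def slaw_sketch(aura: str, text: str):
    """Loose analogue of +slaw (string -> atom); None stands in for Hoon's ~."""
    try:
        if aura == "ux":
            # '0b10' is not valid @ux syntax, so it fails to parse, like the
            # last slaw example above
            return int(text[2:].replace(".", ""), 16) if text.startswith("0x") else None
        if aura == "ud":
            return int(text.replace(".", ""))
    except ValueError:
        return None
    return None

assert slaw_sketch("ux", "0x12") == 18            # [~ 0x12]
assert slaw_sketch("ux", "0b10") is None          # wrong syntax for the aura
assert scot_sketch("ud", 32819) == "32.819"
assert slaw_sketch("ux", scot_sketch("ux", 255)) == 255  # round trip
```

Because both directions reduce to digit grouping and base conversion, they are natural candidates for C jets: the heavy string work can run natively while preserving exactly this parse/print contract.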

Wednesday, 16. June 2021

auth0

Introducing: The OAuth 2 Game

A fun and easy way to learn about OAuth

Verifiable Credentials with Auth0 and MATTR

How to issue Verifiable Credentials from Auth0 user data using MATTR's Auth0 Marketplace Integration

Anonym

Do Consumers Even Want Personalized Ads?


Do consumers even want personalized ads on their devices?  

The answer is yes and no.  

Yes, they want a personalized shopping experience for all the reasons marketers cite in defense of their practices: targeted ads matched to immediate needs and interests save shoppers time and money and can enhance their online experience.  

But also no: increasingly, consumers don’t want personalization when it comes at the expense of their privacy.

And that’s the kicker, isn’t it? As brands continue to embrace data-driven sales strategies and invest in technologies like artificial intelligence and machine learning to go beyond personalization into hyper-personalization (think web sites that already know your clothing size and color preferences, for example), consumers are growing louder in their demand for better data protection. 

It’s the business challenge of this decade: how to engage and convert customers in a fast-paced digital market while being good and compliant stewards of their personal data? There are two sides to this coin: personalization is improving the user experience, but data abuse is growing—and consumers are noticing. 

Those of us in the privacy space completely understand the problem and the fix and are helping brands rapidly bring it to market: building privacy-first products and services that give consumers power over their privacy, and companies an easy way to engage without collecting, managing and risking personal data. 

It’s a ‘middle ground’ solution that brands must rapidly engage or risk alienating their customers. The stats tell the story: 

- As far back as 2012, Pew Research found 68 percent of US consumers don’t like targeted ads and disapprove of the invasive data surveillance that drives them.
- Seven years later, in 2019, Pew Research found an even higher number of US consumers, 81 percent, believe the potential risks from data-driven products and services outweigh the benefits.
- This past February in Europe, YouGov polled 2,000 consumers in France and Germany and found 57 percent don’t want personalized ads on their devices and feel “deeply uncomfortable” about the granular categorization, based on highly personal information such as illness, pregnancy and religion, that drives it.

Media and ad experts get the point:  

Yashina Burns, Director, Data Privacy and Legal Affairs at Deep Intent, told Forbes: “While CPRA won’t become law until 2023, other states will likely create similar regulations in the interim and further push the ad industry to adopt targeting technologies that are more conscious of consumer privacy. Marketing leaders need to get ready now by focusing on privacy-friendly solutions that limit the use of sensitive personal information. Publishers and platforms that offer compliant data collection across platforms – especially in the healthcare space where privacy is of the utmost importance – will be well-positioned to continue the services they offer to marketers amid the coming regulatory change.” 

Mike Edmonds, CEO of global digital services giant Pactera EDGE, says something similar: “The challenge moving forward into 2021 will be the continued delivery of personalized experiences while maintaining compliance with existing privacy laws and those regulations appearing on the horizon. It’s a delicate balancing act marketers will have to pull off as consumers increasingly embrace the hyper-personalization experience, but also demand transparency when it comes to how their information is being used.”

The YouGov poll of consumers in France and Germany we mentioned earlier says it’s the behind-the-scenes, back-door nature of personalization that gives people the creeps. This perception isn’t helped by the fact that consent to data practices is often so convoluted, and tied to such long-winded terms and conditions and privacy policies, that consumers don’t know exactly what it is they’re agreeing to.

The YouGov research commissioning agency says, “The EU has a unique opportunity to tackle these issues by making sure existing privacy rules are enforced and by bringing in new rules via the Digital Services Act (DSA). We’re calling on Members of the European Parliament to amend the DSA to ensure users can see all the ways they’ve been targeted and prevent surveillance overreach.”  

We’re still waiting to see how US consumers might get similar, national data and privacy protections. In the meantime, we suggest brands recognize both the opportunity to deliver consumer applications with privacy at their core, and the responsibility to comply with tightening regulations. Discover how Sudo Platform, our complete business toolkit for rapidly developing branded privacy and cybersecurity solutions, can help. 

Photo by CardMapr.nl on Unsplash

The post Do Consumers Even Want Personalized Ads? appeared first on Anonyome Labs.


IBM Blockchain

Greening the blue economy


Our oceans sustain us. They give us oxygen and they capture carbon dioxide. They feed us and they provide a wage to 40 million people across the world. They bring us joy and they show us beauty. But we are not sustaining our oceans in return. We are taking more from them than can be […]

The post Greening the blue economy appeared first on Blockchain Pulse: IBM Blockchain Blog.


Spherity

Spherity successfully achieves SAP Certified Cloud Solution status

The Spherity Credentialing Service is now integrated and available on the SAP Information Collaboration Hub for Life Sciences

Spherity, the German digital identity specialist, has achieved SAP Certified Cloud Solution status. This certificate confirms the technical compliance of the Spherity Credentialing Service with SAP certification procedures.

SAP-certified Spherity Credentialing Service — Photo by Sasha Stories

Along with the certification, the Spherity Credentialing Service has now been integrated with the SAP Information Collaboration Hub for Life Sciences. The SAP Information Collaboration Hub is a cloud platform connecting pharmaceutical organizations and their supply chain partners on a safe and trusted network. With Spherity’s certified service offering, SAP customers can use the Spherity Credentialing Service to comply with the U.S. Drug Supply Chain Security Act (DSCSA) requirement for Authorized Trading Partners.

“SAP and Spherity collaborate to address the DSCSA compliance gap for Authorized Trading Partners. With our Spherity Credentialing Service integration in place, SAP customers can become an authorized trading partner within minutes,” says Georg Jürgens, Manager Industry Solutions at Spherity.

The Spherity Credentialing Service enables supply chain actors to verify in real time that they are exchanging information only with Authorized Trading Partners (ATPs), as per U.S. DSCSA requirements, even when they do not yet have a direct business relationship. Spherity’s service sets the benchmark for compliance solutions in the field of trading partner verification. Beyond U.S. DSCSA compliance, Spherity delivers process efficiencies when exchanging data with indirect business partners by avoiding manual, time-consuming due diligence processes, saving significant time and money for all participants in the ecosystem.

About Spherity

Spherity is a German decentralized digital identity software provider, bringing secure identities to enterprises, machines, products, data and even algorithms. We provide the enabling technology to digitize and automate compliance processes, primarily in highly regulated sectors like pharmaceuticals, automotive and logistics. Spherity’s decentralized cloud identity wallet empowers cybersecurity, efficiency and data interoperability among digital value chains. Spherity is certified according to the information security standard ISO 27001.

Stay sphered by joining Spherity’s Newsletter list and following us on LinkedIn. For press relations, contact communication@spherity.com.

Spherity successfully achieves SAP Certified Cloud Solution status was originally published in Spherity on Medium, where people are continuing the conversation by highlighting and responding to this story.


UComuny

Talk on “Federated Identities and Data Sovereignty” now online

Dominik Deimel spoke at bitkom’s eIDAS Summit 2021 about the advantages of the model for companies.

All talks, including this video, are available on YouTube. It is worth watching for anyone who wants to know how the political and societal demands for sovereign digital identities can already be met today, via mobile devices and without the use of blockchain. The healthcare market is leading the way: here, there is a keen awareness of the responsible handling of data. When companies become the identity provider for their own customers, they elegantly solve the regulatory challenges of trusted electronic business transactions and minimize dependencies for themselves and their customers.

Read more about comuny’s approach, with which any company can quickly and easily become the provider for the identities of its own customers and employees.

#Datensouveränität #SSI #Identität #Identitätenprovider #TrintityIDP #bitkom #eIDAS21 #comuny #Vertrauensdienste #DigitalIdentity #MyDataOperator #verifizierung #authentifizierung #operator #trinity


auth0

Identity Report, Highlighting the Most Pervasive Threats to Digital Identities

Inaugural report reveals insidious trends and provides mitigation strategies for security professionals

Ocean Protocol

Ocean dives into Quantitative Finance with Battle of the Quants


Engagement with asset allocators, quantitative-focused hedge funds and academia to learn more about Ocean’s core technologies

Ocean Protocol is collaborating with Battle of the Quants to drive growth within the quantitative hedge fund space and bring more data providers and buyers into the Ocean ecosystem. Battle of the Quants connects academia, asset allocators, and quantitative-focused hedge funds to explore and discuss the critical issues confronting the quantitative approach to finance.

In 2021, we will collaborate for a series of events targeting the quant hedge fund world to increase awareness of Ocean Protocol within the industry. The online and offline events, in conjunction with background business development activities, aim to:

Enroll new data providers to Ocean Market.
Increase data buyer usage on the Ocean Market.
Increase group usage of the Ocean Market.

Ocean Protocol Founder Bruce Pon said, “Quantitative finance uses extensive datasets and mathematical models to analyze financial markets and securities, helping investors evaluate investment opportunities and develop informed and profitable trading strategies. The insights provided by Battle of the Quants — Worldwide will advance Ocean’s goals to penetrate this industry and expand adoption of Ocean Market. We look forward to collaborating with Bartt and his team of seasoned professionals to expand the Ocean ecosystem.”

Bartt Kellermann, CEO and Founder of Battle of the Quants, added, “Ocean Protocol leverages the blockchain infrastructure to provide Quantitative hedge funds with innovative ways to find ideal data sets to generate alpha for specific trading strategies. Battle of the Quants — Worldwide is excited to partner with Ocean Protocol to help facilitate their growth within the quantitative hedge fund space.”

Tying into our 2021 Roadmap, more users of Ocean technology bring us closer to our overall goal of ubiquity: where Ocean is a key utility and global IT infrastructure. Ocean’s engagement with Battle of the Quants further develops our data consumers and provider traction goals, an essential part of #TheYearofScaling.

What’s Coming Next?

To kick off our collaboration, Bruce Pon was interviewed by Bartt Kellermann for the BattleFocus series about synergies between their network of Quantitative Hedge Funds and the Ocean ecosystem. Watch it on YouTube.
Our first live event will be the BattleBlitz Webinar, when we’ll tackle “Alternative Data Meets Blockchain: Which is Better?” Join us on July 13th at 10:00 AM EST / 4:00 PM CEST. More information to follow!
The Ocean Battle Data Competition will allow Data Providers to submit their data offerings for consideration. Coming Fall 2021.

About Ocean Protocol

Ocean Protocol’s mission is to kickstart a new Data Economy that reaches the world, giving power back to data owners and enabling people to capture value from data to better our world.

Data is a new asset class; Ocean Protocol unlocks its value. Data owners and consumers use the Ocean Market app to publish, discover, and consume data assets in a secure, privacy-preserving fashion.

Ocean datatokens turn data into data assets. This enables data wallets, data exchanges, and data co-ops by leveraging crypto wallets, exchanges, and other DeFi tools. Projects use Ocean libraries and OCEAN in their own apps to help drive the new Data Economy.

The Ocean token is used to stake on data, to govern Ocean Protocol’s community funding, and to buy & sell data. Its supply is disbursed over time to drive near-term growth and long-term sustainability. OCEAN is designed to increase with a rise in usage volume.

Visit oceanprotocol.com to find out more.

Twitter | LinkedIn | Blockfolio | Blog | YouTube | Reddit | Telegram | Discord

About Battle of the Quants

In our sixteenth year of hosting the leading quantitative event worldwide, the Battle of the Quants has become the definitive event in the quantitative space for investors, managers and data buyers looking for key industry influencers, decision makers and investment opportunities. The Battle of the Quants curates carefully selected systematic investors and managers leading the way into the new world of Artificial Intelligence, Machine Learning, Data Sets, Digital Assets, Blockchain and Quantum Computing. Attendees include HNWIs, Family Offices, FoFs, Institutional Investors, Systematic Hedge Fund Managers, Data Providers and Data Buyers.

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

Ocean dives into Quantitative Finance with Battle of the Quants was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Ontology Weekly Report (June 9-15, 2021)

Highlights

This week, ONTO announced support for Polygon, bringing the total number of supported chains to 14.

Latest Developments

Development Progress

100% of Ontology’s EVM-integrated design and 45% of its development is now complete.
90% of ETH RPC support is now complete.
60% of Ontology’s new Ethereum account system is now complete.

Product Development

ONTO App v3.8.6 has been released.
Support for Polygon was added to ONTO Wallet, becoming the 14th blockchain to be supported.
BabySwap, a DEX on Binance Smart Chain, is now available on ONTO.
Joint events with ApeSwap, Cafeswap, and HyperJump continue.
Recent joint AMAs with Rabbit Finance, bloXroute Labs, and Twinci were all successful.

dApps

116 dApps have been launched on MainNet; the total dApp transaction volume is 6,612,084, an increase of 6,123 from last week. 15,777,385 transactions have been completed on MainNet, an increase of 38,856 from last week.

Community Growth

481 new members were onboarded across our global community. We are very excited to see the Ontology community continue to grow and we encourage anyone who is curious about what we do to join us.
We held our weekly community call on Discord, led by our Head of Community, Humpty Calderon. Humpty gave a brief recap of the burgeoning Ontology ecosystem and development progress.
As always, we’re active on Twitter and Telegram where you can keep up with all our latest developments and community updates.

Global News

Ontology Partners with ZAICO

We signed an MOU with leading cloud inventory management software company, ZAICO. Ontology aims to provide solutions to ZAICO’s inventory management platform, helping to increase traceability, transparency, and trust.

Ontology Published Metaverse Articles

The Ontology Research Institute has published a number of articles about Ontology and the Metaverse. Describing the vision Ontology has for the Metaverse, they cover several attributes of the Metaverse such as economic systems, low latency, diversification, and civilization.

Ontology in the Media

CPO Magazine — GDPR-Compliant Blockchain: Personal Data Privacy in Blockchain

General Data Protection Regulation (GDPR) was introduced by the European Union (EU) on May 25, 2018. GDPR was enacted to protect personal data, and ensure it is stored when there is a lawful basis, such as when consent is given or when legally required. This regulation aims to protect citizens’ data and includes the right to access, right to rectification, right to erasure, right to restriction of processing, right to be informed, right to data portability, and right not to be subject to a decision based solely on automated processing (including profiling).

Ontology prides itself on protecting personal data privacy, and we are devoted to providing users with the rights to self-sovereign ID, DID, and data. Similarly, SAGA, our decentralized marketplace, is building distributed data capabilities for enterprises with blockchain, and is using blockchain to help protect consumers’ data privacy.

Want more Ontology?

You can find more information about our decentralized solutions across identity and data on our website, or simply follow us on Twitter. Our Telegram is for discussion, whereas our Telegram Announcement channel is for news and updates in case you missed them on Twitter!

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

Ontology Weekly Report (June 9-15, 2021) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


SWN Global

We are now a new member of OMFIF’s DMI (Digital Monetary Institute)


We are thrilled to announce that Sovereign Wallet Network is officially a member of OMFIF’s Digital Monetary Institute!

*OMFIF is an independent think tank for central banking, economic policy and public investment, providing a neutral platform for public and private sector engagement worldwide. With teams in London and the US, OMFIF focuses on global policy and investment themes relating to central banks, sovereign funds, pension funds, regulators and treasuries. Global Public Investors with investable assets of $39.5tn are at the heart of this network.

Looking forward to starting such an amazing collaboration.

#OMFIF #MetaMUI #CBDC #DSE #OMFIF_DMI

#MetaMUI Blockchain


Magic Labs

Plug and Play Passwordless Authentication with Magic and Gatsby.js

Plug-and-play Passwordless Authentication with Magic and Gatsby.js

If there’s one thing web developers know better than anyone else, it’s this: authentication sucks. Authentication is one of those things that seems simple yet oftentimes ends up being a frustrating, overly-complicated process. Even worse, your production app might be completely dependent on your auth server — if it goes down, you’re in trouble.

This is where Magic comes in. With only a few lines of code, you can integrate Magic into your application to enable passwordless authentication using magic links (similar to Slack and Medium).

✨ Why Magic Links?

In my opinion, passwordless is simply better than the typical username/password authentication. It provides developers with a seamless development experience, users love the idea of not having to keep track of a password, and best of all: it works like magic; passwords disappear and everyone’s lives are just made easier.

Passwordless authentication is secure. Over 59% of people reuse their passwords everywhere, and poor passwords account for 81% of all security breaches [source]. (Gentle reminder to update your passwords! I recommend rotating them every few months.)

Magic leverages blockchain-based, standardized public-key cryptography to achieve identity management. When a new user signs up via your Magic integration, a public-private key pair is automatically generated for them.

In order to authenticate requests, a user’s private keys are used to sign cryptographic proofs to verify their identity. Thus, your resource server will no longer need to store and manage (1) hashed + salted passwords or (2) user sessions in its database table.

To learn more about Magic’s enterprise-grade security thanks to the cutting-edge identity tech they use involving zero-knowledge proofs, delegated key management and decentralized ID tokens, visit their security page or read Magic’s Whitepaper.

💻 An Enhanced Developer Experience

Magic is a developer SDK that you can integrate into any application to enable blazing-fast, hardware-secured passwordless authentication with just a few lines of code (even if you already have an existing auth solution). The Magic SDK currently offers support for 32+ ecosystems — from Social Logins like Google, Facebook, and Github to Blockchains like Ethereum, Polkadot, and Solana.

Here’s how it works:

When a user wants to sign up or log in to your application:

1. User requests a magic link sent to their email address
2. User clicks on that magic link
3. User is securely logged into the application

In this tutorial, we’ll demonstrate just how easy it is to get started integrating with Magic in just a few lines of code.

👩🏻‍💻 Integrating Magic with Gatsby.js

Gatsby.js is a React-based static site generator powered by GraphQL. Gatsby essentially takes the best parts of React, Webpack, GraphQL, and various front-end tools to provide a seamless developer experience. If you’re new to Gatsby, you can learn more from their documentation here.

I’m personally a huge fan of Gatsby and have built dozens of websites with it. It’s great for building portfolio sites or blogs, both of which you can easily integrate with Magic — which I’ll demonstrate below.

🛠 What We’ll Build

In this tutorial we’ll be integrating Magic to a static Gatsby.js site and deploying it to Netlify. Feel free to view and try out the production application live on Netlify here.

Step 1: Clone the Starter Template & Install Dependencies

To get started, clone the starter template from Github here. We’ll be working out of the dev/ directory, but feel free to play around with the final code in the prod/ directory. I recommend following along with the tutorial before jumping into the production code.

To clone the starter template, run the following command from your terminal:
git clone https://github.com/morganrmarie/passwordless-authentication-tutorial.git

Change to the tutorial’s directory:
cd passwordless-authentication-tutorial

And install the project’s dependencies from the root directory:
cd dev && npm install
cd prod && npm install

If you’d like to try the final application, you can run the production application locally from the root folder:
npm run start:prod

Otherwise, to follow along with the tutorial, spin up the development server (also from the root folder):
npm run start:dev

Step 2: Initialize Magic

Before we get started integrating Magic into our application, I recommend you take a look at Magic’s documentation. They’ve got easy-to-follow documentation and helpful guides for other commonly used developer tools.

The starter template for this tutorial lies in the dev/ directory of this tutorial's codebase. To begin integrating Magic, we'll need to install the Magic Client SDK in the dev/ directory:
npm install --save magic-sdk

By default Gatsby supports two environments: development and production. To support both environments, we’ll create two files in the root of the dev/ and prod/ directories: .env.development and .env.production. Copy the contents (it's only one variable) of .env.example to both files.

Before we initialize Magic, you’ll need to sign up to the Magic Dashboard to get your own API keys. Replace GATSBY_MAGIC_PUBLISHABLE_API_KEY in .env.development and .env.production (in both directories) with your "Publishable API Key" from the Magic Dashboard:
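The contents of both files are a single KEY=VALUE line; assuming the variable name shown above, each file would look something like this (with a placeholder in place of your real key):

```
GATSBY_MAGIC_PUBLISHABLE_API_KEY=<your-publishable-api-key>
```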

Now that we’ve set our environment variable, we can create an SDK instance to initialize Magic.

Because Gatsby generates static HTML at build time when you run gatsby build, we need to ensure the global window object is defined before importing Magic.

We’ll initialize Magic in the src/lib/magic.js file like so:

import { Magic } from "magic-sdk"

let magic

if (typeof window !== `undefined`) {
  magic = new Magic(process.env.GATSBY_MAGIC_PUBLISHABLE_API_KEY)
}

export { magic }

Step 3: Handle Authentication

Navigate to the index.js file in src/pages and import your newly-created Magic instance.

import { magic } from "../lib/magic"

Next, we’ll implement our login & logout methods. Handling user sessions is made incredibly easy with Magic. Add these two methods to the same index.js file.

// Handle login with email
const handleEmailLogin = async (e) => {
  // Get the current value from the input field
  const email = userEmail.current.value
  if (!email) return

  try {
    // Login with Magic ✨
    await magic.auth.loginWithMagicLink({
      email
    })

    // Set user metadata
    let userMetadata = await magic.user.getMetadata()
    setUser(userMetadata)
  } catch (error) {
    console.error(error)
  }
}

// Handle logout
const handleLogout = async () => {
  // Logout with Magic
  await magic.user.logout()
  setUser(null)
}

In our useEffect hook, we'll check if the user is logged in and set their metadata if they are. If not, we'll set the user to null.

useEffect(() => {
  // Check if user is logged in
  magic.user.isLoggedIn().then((isLoggedIn) => {
    return isLoggedIn
      ? // If user is logged in, set metadata
        magic.user.getMetadata().then((userData) => setUser(userData))
      : setUser(null)
  })
}, [])

Step 4: Deploy

Deploying the application is completely optional, but I’m personally a big fan of Netlify. Once you register and connect your Netlify account to your Github, you can configure auto-deployments from the branch of your choice. If you choose to do this, set your build command to gatsby build and the publish directory to public.
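If you prefer keeping build settings in the repo, Netlify also reads a netlify.toml file at the repository root. A minimal sketch matching those settings (a hypothetical file, not part of the starter template):

```toml
[build]
  command = "gatsby build"
  publish = "public"
```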

Don’t forget to set the GATSBY_MAGIC_PUBLISHABLE_API_KEY environment variable in Netlify with your publishable API key from the Magic Dashboard.

🚀 Wrapping Up

So… ready to integrate with Magic? Magic is currently offering users the ability to earn free logins by referring friends. For every friend that signs up for Magic, you’ll both get 3,000 bonus logins — up to 90,000 total.

You saw how easy it was to plug passwordless auth into our Gatsby app with Magic and start playing with the rest of our app’s business logic, so grab your referral link and start sharing!

Follow Magic on Twitter and Instagram to stay up-to-date. Try Magic for free here. ✨

Plug and Play Passwordless Authentication with Magic and Gatsby.js was originally published in Magic on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Crypto Regulatory Affairs: DOJ Seizes $2.3M in Bitcoin Paid to the Ransomware Extortionists Darkside

The U.S. Department of Justice (DOJ) announced last week that it has seized 63.7 bitcoin, valued at roughly $2.3 million. This haul of bitcoin represents the proceeds derived from the May 8th ransomware attack on Colonial Pipeline by a criminal group known as DarkSide. 



Evan Network

Evan GmbH Begins Phase II of the Innovation Competition for Secure Digital Identities: Implementing its ID-Ideal Consortium Project

Consortium Partners, Jungheinrich AG and Evan Developing a Blockchain-based IoT Infrastructure Utilizing Self-sovereign Identities for Sharing Machines

Dresden, Germany, June 16, 2021 – Evan GmbH, a subsidiary of Blockchains, Inc., announced that it has begun the implementation phase of its ID-Ideal Consortium project in the Secure Digital Identities Innovation Competition. Implementation is the second part of the multi-month, two-phase competition sponsored by the Federal Ministry of Economic Affairs and Energy (BMWi). After being named a winner during the first phase of the competition in November 2020, the ID-Ideal Consortium project was awarded approximately 15 million euros in funding, to be paid out over the next three years.

The mission of the ID-Ideal Consortium is to create a blockchain-based trust network where digital data can be securely exchanged between companies, citizens, and administration. The consortium currently consists of 40 partners including evan, which also provides evan.network, the infrastructure where consortium partners can engage securely.

“We are honored to have been chosen and to be awarded this funding from the BMWi. It will enable us to drive innovation on secure digital identities together with the partners of the ID-Ideal Consortium and to illustrate their importance and value in real use cases,” says Thomas Müller, CEO of evan GmbH. “Within the consortium, we focus on building a digital infrastructure with which companies can securely and autonomously make physical resources digitally available and integrate them into multi-party processes. We believe that the true success of Industry 4.0 will be measured by how we can more easily and securely share resources and exchange data: human-to-human, machine-to-human, and machine-to-machine.”

One example of an ID-Ideal Consortium innovation that partners Jungheinrich AG and evan are working on is the digitalization of the rental process for industrial trucks and forklifts using blockchain-based digital identities. With a digital identity, the prospective renter can be attested and verified by the rental company as holding a valid driver’s license to rent and operate the machine. The prospective renter can also view important data about the rental machine in advance, such as operating hours or the last maintenance date. All of this can be done in a blockchain-based digital world, including processing the payment – further minimizing administrative work and eliminating the need for physical paper documents.

Evan is also helping the ID-Ideal Consortium to enable companies to prove compliance with regulatory requirements in multi-stage supply chains. Digital identities built on blockchain technology can be used to trace product lifecycles from manufacturing to usage to recycling. In addition to optimizing processes, this data can also be used to reduce carbon emissions. 

The ID-Ideal Consortium has a broad and diverse membership to better understand and be able to meet the variety of requirements of today’s businesses that want to enable digital identity-based exchanges. The consortium is led by the University of Applied Sciences Dresden. In addition to evan GmbH and Jungheinrich AG, members of the project team include HTW Dresden, the state capital of Dresden, the Leipzig city administration, and Fraunhofer FIT.

About evan GmbH

Every product is a digital product – that is the vision of evan GmbH. The software company offers solutions that enable companies to collaborate efficiently, sustainably, and securely with their partners in the digital space. At the company’s locations in Dresden and Eisenach, the evan team is shaping the future of the digital market economy based on decentralized technologies – thus ensuring more transparency and trust in digital business relationships. On April 1, 2021, evan GmbH became a wholly-owned subsidiary of Blockchains, Inc.

About Blockchains, Inc.

Blockchains is a company committed to protecting and empowering individuals through the development of a blockchain-based platform that will change the way people interact with technology, infrastructure, and each other. Blockchains plans for its solutions to serve as digital infrastructure to purpose-built smart cities in the United States and around the world, while working with innovative companies and communities to showcase how, when embracing innovative technologies with a commitment to sustainability enhancement, we can transform daily human life for the better.

The post Evan GmbH Begins Phase II of the Innovation Competition for Secure Digital Identities: Implementing its ID-Ideal Consortium Project first appeared on .

Okta

A Developer's Guide to Session Management in React


Sessions can be a challenging topic for developers of all skill levels. Many React developers never consider the internals of session management because so much of the work is abstracted away. But, it is important to understand what sessions are, how they work, and how best to manage and manipulate them.

There are several different strategies for session management in React. In this article, you will learn the basics about sessions, how to manage them in React, and see some examples using common packages.

Session Management Overview

The first thing you should know is what exactly a “session” is. In its simplest terms, a session is some data that is stored on the server. The server provides an ID to the client, which the client can use to make requests back to the server. For example, if you needed access to a user’s email address, you could store it against the session and return an ID to the client. The client could then request that an email be sent using that ID, rather than passing the email address back to the server. The ID is opaque, meaning the client knows nothing about what is stored against it. The ID can also be signed and encrypted by the server. And the server can return client data that the client frequently needs, typically in encrypted form.

In this article, I consider a range of server scenarios, because when working in React, developers often don’t have control over what the server does. For example, when passing session data, the server may include the data in a cookie, and it may expect that cookie to be present when you make a request. As a React developer, you will then be required to include the cookie in your requests. You may instead choose to store the data in localStorage, but then the server doesn’t have access to it. Local storage allows for more storage and persists across browsing sessions, making it ideal for situations where you want to remember user actions across multiple visits.

Another situation where you may receive a cookie from the server is when it is marked HttpOnly. When a cookie is marked HttpOnly, it cannot be read by client-side JavaScript. This helps minimize the risk of attacks against the cookie. For example, if your site has a cross-site scripting vulnerability, marking the cookie HttpOnly will protect its contents. Of course, you still need the data that the cookie represents, so you should make a request to the server for the resource you need and present the cookie with it.

Along these same lines, you should also understand cookie validation. The server you are attempting to access should validate the cookie before processing any request on it. There are many validation tools to help server-side developers, such as signing and expiring cookies. Many times, the server will provide a way for you to check the state of a cookie without requesting a resource.

Manage Sessions in React

There are many packages for helping manage sessions in React. If you are using Redux, redux-react-session is a popular choice. If you are not, react-session-api is another helpful package found on npm.

Focusing on redux-react-session, the first thing you need to do is add your session reducer.

import { combineReducers } from 'redux';
import { sessionReducer } from 'redux-react-session';

const reducers = {
  // ... your other reducers here ...
  session: sessionReducer
};

const reducer = combineReducers(reducers);

Next, you need to initialize your session service.

import { createStore } from 'redux';
import { sessionService } from 'redux-react-session';

const store = createStore(reducer);
sessionService.initSessionService(store);

Once you are set up, you have access to the full API through the session service. Several of its functions are worth highlighting.

First, you have initSessionService. As the name implies, this call will initiate the session service. Below you can see an example of a call:

import { createStore } from 'redux';
import { sessionService } from 'redux-react-session';

const store = createStore(reducer);

const validateSession = (session) => {
  // check if your session is still valid
  return true;
};

const options = {
  refreshOnCheckAuth: true,
  redirectPath: '/home',
  driver: 'COOKIES',
  validateSession
};

sessionService.initSessionService(store, options)
  .then(() => console.log('Redux React Session is ready and a session was refreshed from your storage'))
  .catch(() => console.log('Redux React Session is ready and there is no session in your storage'));

To understand this call, you should understand the options that are passed in. First is refreshOnCheckAuth. This option defaults to false, but if set to true, will refresh the Redux store in the checkAuth() function. The checkAuth() function is provided by the sessionService object from the redux-react-session.

redirectPath defaults to /login. This is the path used when the session is rejected or doesn’t exist. Suppose a new user attempts to access a secured page by browsing to the URL directly. Because there is no session, the user will be re-routed to /login by default, or /home in the example above.

Next is the driver option. The two you have already learned about are COOKIES and LOCALSTORAGE; however, redux-react-session also accepts INDEXEDDB or WEBSQL. IndexedDB is a database that is built into the browser. Applications that require a lot of client-side data storage should consider this option. Web SQL is also a browser-based database, but it was never standardized as part of HTML5 and is deprecated. IndexedDB is considered the default alternative to Web SQL.

Finally, there is the validateSession() function. This will pass the logic for session validation to the sessionService. As discussed before, this is largely dependent on your server functionality. If you can validate the session from the client-side, you can implement the logic here. Otherwise, you can use axios or fetch to make a call to the server to request session validation.
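For example, if validation can happen client-side, a validateSession implementation might check an expiry timestamp stored in your custom session object when it was saved (the expiresAt field is an assumption for this sketch, not part of the library):

```javascript
// Hypothetical validateSession for redux-react-session, assuming the
// custom session object carries an `expiresAt` timestamp (milliseconds
// since epoch) set when the session was saved.
const validateSession = (session) => {
  // Reject missing or malformed sessions outright.
  if (!session || typeof session.expiresAt !== 'number') return false;
  // Valid only while the expiry lies in the future.
  return session.expiresAt > Date.now();
};

console.log(validateSession({ expiresAt: Date.now() + 60000 })); // true
console.log(validateSession({ expiresAt: Date.now() - 1 }));     // false
console.log(validateSession(null));                              // false
```

If the server is the source of truth instead, the same function would return the result of a fetch or axios call to a validation endpoint.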

Two other useful functions are saveSession() and deleteSession(). It is a best practice to enforce some rules for deleting the session, though these rules will vary based on your use cases. These functions return promises, as does the entire API. To save the session, you need to pass your custom session object. Setting the session also changes the authenticated flag to true in the Redux store.

Learn More About Sessions and React

Managing sessions in React is an immense topic. In this article, you learned the basics of session management and how to apply them in React. You also learned how to use one of the most common React session management packages available. But this is just a start. I encourage you to look into more React session packages and continue learning how to properly manage sessions. Doing so will make your applications more secure and performant.

Why JWTs Suck as Session Tokens
Build a Secure CRUD App with ASP.NET Core and React
Build a Secure React Application with JWTs and Redux

Make sure you follow us on Twitter and subscribe to our YouTube channel. If you have any questions, or you want to share what tutorial you’d like to see next, please comment below.

Tuesday, 15. June 2021

IdRamp

Use Case: Zero Trust Webcasting – PricewaterhouseCoopers


Computer systems and virtual environments provide essential communication services for telework and education, in addition to conducting regular business. Cyber actors exploit vulnerabilities in these systems to steal sensitive information, target individuals and businesses performing financial transactions, and engage in extortion.

The post Use Case: Zero Trust Webcasting – PricewaterhouseCoopers first appeared on IdRamp | Decentralized Identity Evolution.

KuppingerCole

Privileged Access Management: Cloud Delivery Without Compromise

Privileged Access Management (PAM) solutions are critical cybersecurity and risk management tools for just about every business to address the security risks associated with privileged users and privileged access, but not everyone can afford expensive on-prem deployments.





auth0

Auth0 Statement on New Standard Contractual Clauses issued by European Commission

Auth0 is preparing new documentation for its customers with European users.

KYC Chain

Crypto Wallet Screening: KYC-Chain’s Automated Compliance Solution

KYC-Chain’s Wallet Screening tool allows crypto companies and exchanges to generate detailed AML/KYC Risk Reports that evaluate the money laundering risk of blockchain addresses and their owners. This article explores the tool's features and unique capabilities in detail. The post Crypto Wallet Screening: KYC-Chain’s Automated Compliance Solution appeared first on KYC-Chain.

HYPR

What Apple’s WWDC Passkeys Announcement Means for Enterprise IAM


Apple’s WWDC 21 had a great set of new announcements around security. The most exciting one for us Identity and Access Management (IAM) geeks is the update on Apple’s commitment towards moving beyond passwords.

In this post, I wanted to share some thoughts on this great announcement and what it means for enterprise identity and authentication.

The updates presented by Garrett Davidson from Apple build on Apple’s previous support for the FIDO2 and WebAuthn open standards in the Safari browser on both iOS and macOS. Previously, Apple provided support for passwordless authentication in the Safari browser by adding a FIDO2 authenticator to the underlying operating system. This was a step in the right direction and followed Google’s Android passwordless implementation, which has been available for nearly three years.


Screenshot: Apple

Apple’s approach to passwordless is not particularly unique, since it adheres to the FIDO standard; however, their implementation and approach to the credential recovery problem is unique and relevant to enterprises. One refreshing aspect of their messaging and stance on authentication is their dedication to eliminating shared secrets.

Statements such as “Each time that a secret is shared, there is risk,” and “Servers are less valuable targets for hackers because there are no shared secrets to steal” are encouraging to hear and reflect what we’ve been saying at HYPR for years. We put it this way: moving away from shared secrets takes an enterprise from an infrastructure that’s expensive to defend and easy to attack to one that’s expensive to attack and easy to defend.

Here’s what’s new in this announcement:

WebAuthn and FIDO2 support for native mobile apps. This means that users can enroll for passwordless authentication in a mobile app as well as in browsers. Credentials enrolled using the native app APIs can then be used in mobile browsers as well, without having to re-enroll.

Synchronization via Keychain. Apple’s new “Passkeys in iCloud Keychain” feature (Passkeys are what they call FIDO2 private key credentials) synchronizes credentials across your Apple devices using end-to-end encryption.

Thoughts on These Updates

Unsurprisingly, Apple’s keychain synchronization is more focused on consumer authentication than the enterprise.

Apple’s Passkeys approach is concerning. A Passkey is a private key, make no mistake about it. The best practice in security and cryptography is to not transfer or duplicate key material. I do believe that Apple’s approach to this is going to be world class, but that doesn’t change the fact that it’s a bad practice that adversaries will likely exploit.

Enterprises will not want to synchronize their users’ Passkeys in Bring Your Own Device (BYOD) scenarios.

Many enterprises allow BYOD. However, Passkeys are synchronized across a user’s Apple account and devices. That means that if I’m using my personal phone to authenticate to my corporate resources, the credential used to access those resources could be copied over to my iPad that is shared with my entire household. This could pose a significant liability risk, because it would be difficult to track user attribution tied to accessing corporate accounts.

I predict enterprises that want to leverage the Passkeys method of authentication will opt out of the synchronization offering from Apple and require more enterprise-friendly methods for credential recovery.

Developers and IAM vendors will need to leverage the strong association between a native app and website in order to leverage this functionality.

This functionality is seldom used in apps today. It will need to be provided as an out-of-the-box capability, since mobile app teams and web teams often manage projects in silos.

Conclusion

Overall the developments from Apple are highly encouraging and have an eye on the future of a passwordless world. Many of their approaches are consumer-centric which is understandable, but for those of us who want to leverage these powerful tools on the enterprise side, there are major security aspects to consider. It will be interesting to see how the MDM technologies in the market address and enforce the Passkeys replication capabilities within Apple’s products. It’s exciting to see future developments from Apple on this topic and we look forward to providing these additional capabilities to HYPR customers soon!


auth0

What Is Passwordless Authentication?

Learn how passwordless authentication can help enterprises reduce security risks and costs
Learn how passwordless authentication can help enterprises reduce security risks and costs

Global ID

GiD Report #164—What Apple’s missing about digital identity


Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

ICYMI — GlobaliD joins the Linux Foundation’s Cardea Project

This week:

When Apple owns your digital identity
Tweet of the week
Stuff happens

1. When Apple owns your digital identity

First the good. Apple is taking digital identity seriously — announcing last week that its latest mobile operating system will allow users to store digital copies of their driver’s license in their wallet.

Photo: YouTube

With a U.S. base of more than 43 million users on Apple Pay in 2021 — according to eMarketer for Business Insider — that’s a big deal. And with Apple’s brand and platform cachet, they’re able to drive adoption on both sides of the equation by working closely with local governments. According to CNET’s Russell Holly’s napkin math, there could be at least 12 states live at launch, ready to accept Apple ID.

As GlobaliD’s head of verifications /vs noted, “the digital government ID transformation is further along than you think.”

So in a lot of ways, Apple just made digital ID relevant — by helping to build the dream where we can finally ditch our chunky physical wallets with a much more robust and convenient digital offering. And because Apple’s business model isn’t based on farming data per se, their commitment to privacy should be commended.

Now for the not so good. The story of identity should be about individual empowerment. Apple’s story is more about individual convenience in service of Apple. When it comes to the Big in Big Tech, Apple’s as Big as they come — all while being renowned control freaks. Their top-down approach to digital identity isn’t about portability or interoperability, it’s about strengthening their platform moat, where Apple takes a 30 percent cut on all sales.

Which ultimately serves as a huge tax on creators and developers. Here’s The Information:

Cameo CEO Steven Galanis said more than 80% of the celebrity shoutout platform’s revenue comes from web vs. its app.
That’s notable because Galanis is joining a chorus of large and small app developers complaining that the 30% commissions Apple and Google app stores charge customers for digital purchases are holding back smaller businesses, including creators. As we reported last week, some sites that act as platforms for creators, such as Patreon and Medium, side-step those commissions by focusing primarily on purchases over web browsers. Count Cameo in that camp.
The app takes 25% of a customer’s purchase; the celebrity takes home the other 75%. But if the sale happens on Apple’s iOS, after paying the fees, “the creator is getting 52.5%,” Galanis said. “The big issue here is that if Apple continues these policies, it’s really putting a tax on creators.”

As we all know, Epic Games has taken Apple to court over that very issue. But that’s basically the point — why do we want a digital identity that doesn’t allow us to choose how we want to pay for things? That’s the end result when your digital identity is still controlled by one of the largest corporations on the planet.

Naturally, there are rumblings as well about privacy concerns when it comes to storing digital copies of your stuff — as Mashable’s Jack Morse highlighted. Those concerns are valid, but it’s the type of problem with plenty of potential solutions, whether it be encryption, user education, or UX failsafes. The world is getting more digital, and as more and more companies evolve toward that inevitability, those solutions will only get more refined and secure over time.

The big takeaway. Apple venturing deeper into the digital identity and wallet space can only be a net good, as well as strong validation of where things are going. The big challenge for smarter approaches to identity will always be building the infrastructure and ecosystem around those new identity tools, and Apple’s entrance into this space will serve as a catalyst for getting key puzzle pieces in place, such as local government support for digital offerings.

Moreover, the more people learn about their digital identity, the more they’ll also realize what they really need or what they’re missing when it comes to individual control and ownership. Having a credential is nice — in this case, your state ID in your digital wallet. But it’s really only the tip of the iceberg when it comes to a portable and interoperable identity’s true potential.

So we’re already seeing that with the various fights Apple is facing on the Apple Pay side of things. We’re also seeing a lot of initiatives such as the EU digital wallet that will compete head to head with Apple’s new offering, which, as GlobaliD co-founder and CEO Greg Kidd noted last week, was another step in the right direction for a truly portable identity.

Apple — whether they intend to or not — is getting us all closer to that goal.

Relevant:

Via /vs — What US states will support Apple Wallet digital identity cards?
Via /m — Now Apple wants to store your driver’s license on Apple Wallet
Privacy tech industry explodes
Zuckerberg Jabs Apple Over Creator Fees; Floyd Mayweather Joins OnlyFans

Plus, a couple of tweet reactions highlighted by GlobaliD product guru /jvs:
https://twitter.com/malonehedges/status/1401991977768341504?s=21
https://twitter.com/psb_dc/status/1401954735020810242?s=21

Great overview on the state of privacy regulations from Axios

2. Tweet of the week

Phil Windley:

Relevant:

Decentralized Business Model

3. Stuff happens

Via /m — This Encrypted Messaging App Used by Organized Crime Was Created by the FBI
A newsletter funded entirely by NFTs: Why the Dirt NFT Campaign Succeeded — Mirror
Twitch Turns 10, and the Creator Economy Is in Its Debt
Via /j — Netherlands must ban cryptocurrencies immediately: CPB head
The existential threat facing stablecoins
Central banks are headed toward digital currencies
Via /rcb — The recovery of Colonial Pipeline’s ransom payment is a major moment for cryptocurrency
/rcb: “Article on how the US Department of Justice retrieved millions of ransom paid to attackers of Colonial Pipeline. While the tone of the article is upbeat about the retrieval, DoJ essentially compromised an account to retrieve the funds.”
El Salvador to Become First Nation With Bitcoin as Legal Tender — Blockworks
Crypto’s China Crackdown Intensifies
Jack Dorsey Suggests Twitter Likely to Integrate Lightning Network — CoinDesk
BlockFi Retail Account Balance Increased Five-Fold in Past Year, CEO Says — CoinDesk
Texas State Regulator Greenlights Banks to Custody Crypto — CoinDesk

GiD Report #164—What Apple’s missing about digital identity was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


UNISOT

How blockchain can improve Supply Chain efficiency

The biggest obstacle to blockchain-based improvements in the global supply chain isn’t technology, it’s getting companies to trust one another. The post How blockchain can improve Supply Chain efficiency appeared first on UNISOT.

Nilsson observes of supply chain actors, “even if they’re working together they’re also competitors.” But, he says, a public ledger is able to allow for the smallest actors in a supply chain to get involved in the sending and receiving of data. This is in part because microtransactions and small fees reduce economic barriers to entry. As blockchain technology continues to be misunderstood, those stewarding supply chain companies into the future of data must devote their attention primarily to education in the coming years. When reshaping the foundations of global distribution networks, slow and steady wins the race.

There exists no such thing as a Permissioned/Private Blockchain. A Blockchain is per definition a public timestamped record. If it is not public, it simply is a Private Database with a fancy replication mechanism.

– Stephan Nilsson, CEO UNISOT

Full article by Miles Andrews – CoinGeek

The post How blockchain can improve Supply Chain efficiency appeared first on UNISOT.


1Kosmos BlockID

Authentication vs Authorization


Magic Labs

Magic Product Updates: June Edition


Welcome back to this month’s roundup of product updates!

Hot off some exciting launches, we’ve got big news to share including fresh features and new ways you can get even more out of Magic. Drumroll please… 🥁

⚡ Simpler pricing

No more lock-in from subscriptions, now you can pay as you go with credits similar to Twilio and AWS.

To start, you’ll get 10,000 free logins. After that, it’s just $0.0085 per login. And with fair pricing protection, you’ll only pay for up to 4 monthly logins per user — the rest are on us!

We’ve revamped our Pricing page to reflect this new model.
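As a back-of-the-envelope check of the new model, the rates above work out like this (the usage figures in this sketch are hypothetical; the rates are the ones stated in the announcement):

```javascript
// Rates from the announcement: 10,000 free logins, then $0.0085 per
// login, billed for at most 4 logins per user per month.
const PRICE_PER_LOGIN = 0.0085;
const MONTHLY_CAP_PER_USER = 4;

function monthlyCost(loginsPerUser, freeRemaining = 0) {
  // Cap each user's billable logins at 4, sum them up, then apply any
  // remaining free-login credit before pricing.
  const billable = loginsPerUser
    .map((n) => Math.min(n, MONTHLY_CAP_PER_USER))
    .reduce((a, b) => a + b, 0);
  return Math.max(0, billable - freeRemaining) * PRICE_PER_LOGIN;
}

// Three users logging in 6, 2, and 10 times: capped to 4 + 2 + 4 = 10
// billable logins.
console.log(monthlyCost([6, 2, 10]).toFixed(3)); // "0.085"
console.log(monthlyCost([6, 2, 10], 10));        // 0 (covered by free credit)
```

So a user who logs in dozens of times in a month still only ever costs four logins' worth, which is the "fair pricing protection" described above.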

🚀 Magic referrals

Sean, our CEO, recently announced our new Referrals program, along with our new pricing model, in this blog post. Dive in for a quick read and learn how your feedback shaped both initiatives.

Refer friends, coworkers, or community members to Magic and you’ll both get 3,000 bonus free logins. Invite up to 30 friends for a total of 90,000 bonus logins.

Share Magic with friends.

🌍 Our mission, in 60 seconds

If you’re curious “Why Magic?” we’ve got just the thing. Check out our new video which highlights our mission and the internet-wide problem we’re passionate about solving.

We worked in close collaboration with the incredibly talented team at Kurzgesagt (they have 15 million+ subscribers on YouTube, so you may already be familiar with their unique design and animation!). Huge thanks to them for their help in bringing this story to life.

Fun fact: there’s a secret message embedded in the video, and we’re inviting everyone to crack it! Send your submission to puzzle@magic.link to enter to win awesome prizes:

1 ETH goes to the first person to solve the puzzle
Magic swag for the first 10 people
Twitter shout-outs to anyone who solves it before July 16

🤖 Discord login

We’re super excited to release Discord OAuth support for Web, React Native, and iOS. For anyone with a Discord community or for everyone who wants to allow users to sign up and log in to your app via Discord, Magic makes that easy.

Play with a demo to experience the end-to-end flow.

Here’s how to set up Discord login.

If you want to build your own example code, you can run:

npx make-magic --template hello-world-social-login --social-login discord

⛓ Celo support

Now you can build decentralized applications (dApps) that run on a new blockchain: Celo.

Celo is an open-source platform that enables building mobile-first dApps, especially for open financial systems.

Head to this doc for the steps to interact with Celo.

The team is working hard on shipping new features and requests from developers and customers. If you have a request or want to share feedback, we’d love to hear from you. Share it here or join the discussion on our community forum.

Magic Product Updates: June Edition was originally published in Magic on Medium, where people are continuing the conversation by highlighting and responding to this story.


SWN Global

MetaMUI AMA with Cointelegraph!


MetaMUI AMA with Cointelegraph will be held on June 16th, at 15:00 CET.

Phantom Seokgu Yun, CEO of Sovereign Wallet, and Cizar Bachir Brahim, CSO, will be present and will talk about the features of the MetaMUI blockchain and how it will finally revolutionize and reform the current financial world.

LIVE AMA theme: Internet of Sovereign Digital Currencies : Identity-based MetaMUI blockchain

Guest: Phantom Seokgu Yun (Founder & CEO), Cizar Bachir Brahim (CSO)

Event Link: https://youtube.com/watch?v=_g-ATBkiC4A…


Ontology

Recap of the Ontology live session at Consensus 2021 by CoinDesk


Humpty Calderon
Alright, we’re live now, so welcome to everyone joining us here today! My name is Humpty Calderon, head of community at Ontology, and we have Raindy with us. Would you like to introduce yourself?

Raindy
Hi everyone, I’m Raindy Lou, head of marketing and communications at Ontology. Glad to be here.

Humpty Calderon
Great. And for those of you who aren’t familiar with Ontology: Ontology is a blockchain for self-sovereign identity and data. A great deal of technology has been built on it to enable sovereign identity, credit data, marketplaces, and DeFi. If you’re interested in a snapshot of what it offers, you can go to ont.io and see some of the different products and protocols that have been developed, such as ONT ID, OScore, and the inclusive pools within Wing, which is a proof of concept for decentralized credit and DeFi. There’s the ONTO wallet too, of course, which is no secret, my favorite multi-chain digital asset wallet and the easiest way to create your digital identity.

Once you’ve created it, you can see how easy it is to do a soft verification or a hard verification, whatever you need, and then of course SAGA, some of which we’re going to talk about today. Is there anything you’d like to add, Raindy, before we start the presentation?

Raindy
That’s all, go ahead please.

Humpty
Alright. We’ve shared a presentation here as a slide deck, and if anyone is interested in getting access to it after this call, just let us know in the messages here, through Ontology’s social media, or on Telegram; say “I saw this really cool pitch deck and I’d really like to get one for myself, because I want to learn more, I want to dive into any of these really complex topics and learn more about each of them.”

So, as I said, Ontology is a blockchain for self-sovereign identity and data. In three steps you can see what Ontology is, what it offers, and its ecosystem, and we’ll explore those step by step throughout the presentation. Ontology is a high-performance public blockchain and distributed collaboration platform, which basically means a really fast blockchain: consensus is reached with finality in, I believe, under a second, so it’s definitely one of the fastest blockchains out there. It’s efficient in terms of gas consumption and easy to build on. It supports Python, Java, Golang, and C++, so it’s definitely an on-ramp for developers, who can start working on blockchain development without necessarily having to learn a different programming language.

Ontology provides DID solutions for ROCKI, a music streaming platform

Entonces, si avanzamos por aquí, podemos ver algunos de los diferentes socios que Ontology ha recientemente incorporó a la cadena de bloques para su identidad, debería decir solución y eso es rocki, si no está familiarizado con quiénes son, son una plataforma de música que está construida para que músicos, artistas, creadores, compartan su música de manera descentralizada y, como se dice aquí, Ontology y rocki han desarrollado una asociación estratégica para integrar las soluciones de identidad descentralizadas desarrolladas por Ontology para verificar a sus artistas.

I certainly dug into that; in fact I did earlier in today's call and in other conversations I've posted on Twitter and Clubhouse, where we looked at why that matters, why identity matters when it comes to verifying the creator and verifying digital assets. Authenticity is key, of course, especially because the reason you collect something is that you want to support a particular creator. Raindy, would you like to add anything about ROCKI, in terms of why Ontology sees the project as important and valuable for expanding decentralized identity?

Raindy
Great. So we talked with ROCKI for a long time. Since the NFT market is booming now and they have requirements around KYC and AML registration, they were looking for partners with proven solutions in this area, and they compared different projects working on the same thing. After the continued efforts of our technical and business teams, they finally chose Ontology because they think Ontology is the leading expert in this area. The goal for both ROCKI and Ontology is to transform the traditional login system for every creator on the ROCKI platform from a centralized to a decentralized one, so that all users can have the power to control their own data and access without binding their information to the platform. That is very important, and it's also a form of innovation for every project that cares about protecting users' identity and data. So that's how we're going to build the whole system, and then we'll show our community members and users that it works.

I think there won't be much of a barrier between the traditional login flow and the decentralized one: users simply log in as they did before, but their identity is actually being protected by the DID solutions, which is really cool.

Humpty
Yes, and I think another thing worth noting is that ROCKI is built on Binance Smart Chain, so this is a platform, a product, that isn't operating on Ontology but on another blockchain. It's important to note that Ontology has deployed these identity solutions on other blockchains as well, so could you tell us briefly which blockchains those are?

Raindy
Yes, besides BSC we are also supporting solutions on Ethereum, Polkadot, Tron, and all the leading public chains, I would say. You can find the W3C registry on their official site to see the DID methods registered there by the Ontology Foundation. So I think neither ONT ID nor OScore is limited to the Ontology chain and ecosystem; they should extend to the whole blockchain world. We're pushing more applications, and ROCKI is a good start to encourage more projects built on BSC to consider how these solutions can empower their own projects.

Humpty
Absolutely. I really love that idea of interoperability and portability of identity, because if you're building identity solutions and locking them inside a particular protocol, whether a platform like this or a single blockchain, there's value for users, but that value is limited. What's great about what Ontology is building with ONT ID and OScore and all these other identity protocols is that interoperability, because that way people who, for example, use their identity on Binance Smart Chain could carry that identity to another blockchain like Ontology.

Raindy
Yes, that's the idea: everyone has their decentralized identity, but we don't want it to be siloed, so it should work wherever possible. That's what we do, and we make every effort to remove all the barriers and limits between different chains to achieve true cross-chain interaction.

Humpty
Yes, absolutely. I think over the last few years we've seen the value of interoperability, whether it's creating bridges between chains for digital assets or, within DeFi, the composability of these protocols creating a more dynamic ecosystem. I think the same will be true for identity: there will be a need for that interoperability and compatibility to create a richer, more robust, and more secure experience for people.

Raindy
Yes, definitely, great.

Humpty
Okay, this is actually a great slide; I really like this one, and I might actually make it my profile picture for a while, because I like that it shows the rich, deep ecosystem of ONT ID, the number of blockchains it's currently on, and you know this is growing.

I mean, as you said, we see some of the leading blockchains here: Ethereum, Binance Smart Chain, NEAR Protocol, Elrond, Tron, Klaytn, and obviously Ontology. It's really great to see the development covering the full range of blockchains; whatever your favorite token or favorite blockchain is, ONT ID is probably already there.

So, ONTO, my favorite wallet: go to Twitter and you'll see I talk about it plenty, and then there are the talks I host on Clubhouse. Is there anything we don't know about ONTO, or maybe another perspective on ONTO that you think is valuable for us, beyond the fact that it's multi-chain, supports multiple tokens, and is a simple way to create your decentralized identity?

Raindy
I think ONTO has three bright points. The first is that it's a decentralized wallet, which is different from centralized wallets that keep your crypto assets on their servers, where you still carry the risk of those assets coming to harm; with ONTO you can avoid that kind of risk.

The second is true cross-chain support. Once you have your own wallet, once you download and open it, you can link it to the other supported assets as listed (there should be around 14 in total), so you don't need to keep a copy of the mnemonic words for every chain; you only need one private key. That saves users time and effort in managing their assets, and even if they change devices, say they get a new phone, they just download ONTO on the new phone and easily import their assets there.

The third is that it's the only crypto wallet with identity built in: you get your ONT ID, and you can get your OScore with that ONT ID inside ONTO. The calculation is based on your on-chain behavior as well as the crypto assets you hold, so it's really different from the other leading crypto wallets, I'd say. And one more thing: it's not only a mobile wallet, because last month we released the web version of ONTO as a Chrome extension. So whether you're using your phone or your PC, you're actually using the same wallet, and it's easy to jump between the two; you have just one ONT ID and can manage all your assets with it.

Humpty
You know, I think having that flexibility, being able to go from your desktop wallet, or I should say your browser wallet, to your mobile wallet, and being able to use it not only for digital assets but for identity, is a rich experience. This dynamic utility you give people makes the wallet much more valuable; it's one of the things I say every time I talk to people about digital wallets. And honestly, I think digital wallets have come a long way from the beginning, but there's still room for evolution, still room for improvement, as identity starts to become an important element in helping blockchain and crypto mature.

I think we're going to see a ton of value in wallets that have these identity solutions built in, and while I'm not claiming to know what every wallet does, I don't think there's another wallet that has what ONTO has in terms of that flexibility and that rich utility, which goes beyond simply holding digital assets or simply working across multiple blockchains.

Why ONTO?

Humpty
So, you know a bit of what we just talked about in terms of why you'd want to use ONTO: it's a decentralized wallet, you own your assets, you own your keys, and it lets you carry your identity as part of your digital wallet too. I think all of these are critical elements in making these wallets more secure and more valuable to people as, like I said, blockchain continues to grow and evolve.

Beneficiaries of ONT ID

Humpty
So, I don't know if we're going to talk about OScore in one of these upcoming slides, maybe we've already gone past it since we talked about ONT ID, but I think we shouldn't go much further without talking about OScore. The two go hand in hand: ONT ID is your digital identity, and OScore is your reputation based on that identity. Could you give us a quick introduction to what OScore is and why it matters, from your perspective?

Raindy
Sure. Once everyone has an identity, it works as an identifier so people know who you are, but how can they judge your trustworthiness? That's where OScore comes in, which is why we have OScore, the scoring protocol in our ecosystem. Everyone can get a score from ratings that are based on their ONT ID and linked to all the behaviors they perform on-chain, but it can be customized to each party's own demands, so I'd say there are no fixed limits or guidelines telling people exactly how their OScore will be generated; it depends on different demands. The basic use case is Wing Finance, the DeFi platform developed on Ontology: it has an Inclusive Pool, which is a credit-based lending platform, so once you get your OScore, it allows you to borrow crypto assets while under-collateralized.

Let's say you have $1,000 USD as collateral, maybe in USDT, and you have a very high score; then it might allow you to borrow anywhere between $1,000 and $100,000 USD based on the trust you've built on-chain. So in a real application it lets people use the reputation behind their credentials and obtain real benefits from it. It's an important protocol for the blockchain world because it addresses the problem of trust, which, I would say, is what blockchain is for: its goal and its value.
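As a rough illustration of the mechanism Raindy describes, the sketch below maps a reputation score to a collateral multiplier. The score bands and multipliers are invented for illustration; they are not Wing's actual parameters or API.

```python
# Hypothetical credit-based borrowing limit, loosely inspired by the
# Inclusive Pool idea described above. All thresholds and multipliers
# here are made up for illustration only.

def borrow_limit(collateral_usd: float, score: int) -> float:
    """Map a reputation score to a maximum borrowable amount in USD."""
    if score >= 800:       # excellent on-chain reputation
        multiplier = 100.0
    elif score >= 600:     # good reputation
        multiplier = 10.0
    else:                  # little or no reputation: stay over-collateralized
        multiplier = 0.8
    return collateral_usd * multiplier

# $1,000 of USDT collateral with a very high score could unlock up to $100,000.
print(borrow_limit(1000, 850))  # 100000.0
```

The point is only that the ceiling scales with reputation rather than with collateral alone; a real protocol would derive the multiplier from on-chain behavior and enforce it in its lending contracts.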

Humpty
Yes, you know, I think there's a synergistic relationship between ONT ID and OScore, really in the way they're used, and as you said, there's a lot of flexibility in how that score can be built. I really like what Ontology has done with Wing because it really demonstrates the future of DeFi. It's still so early; it doesn't have to be over-collateralized. That's not how the real world works: in the real world you have a reputation, and that reputation is what allows you to borrow. So with OScore, and Wing as the example Raindy mentioned, just to reiterate because I think it's important, you can then get better rates, whether that's an under-collateralized loan or better incentives. That also depends on the protocol that integrates ONT ID and OScore; there can be different use cases.

For example… I see Alonso here in the chat asking whether consumer loyalty programs are a good use case. Absolutely. Imagine that right now, in the real world, you have all these loyalty cards. How many do you have? Or maybe you have to download the app, and every time you go you have to scan it. Is this really the best way? What if there were a way you could potentially carry that reputation to all the different stores and websites you visit? That might be a more accurate representation of who you are. Say you go to five markets and you're saying, well, I want to buy the same things and I go to all of them; well, with ONT ID and OScore,

potentially this information, and again, this is your identity, your data, so it's up to you how much of it you want to share, could work like this: I say I want to share this information with these five vendors, and from there they can gauge your loyalty and grant you rewards based on it, whether in the real world or on the web, because of course most of the world has gone digital and most of us buy everything online. "Definitely a very good use case for airdrops," Liu says.

Absolutely. In terms of airdrops, maybe, Raindy, you could talk a bit about how it's being used today?

Raindy
Yes, and actually the team is working on an airdrop use case with ONT ID, because in the crypto world, especially when a new project is launching its tokens, there will be massive airdrops. What a project team wants is to prevent bots from claiming the airdrops for free, which is harmful to the real users; real community members should get the rewards in a trustworthy way. So through the ONT ID system and the SDK solutions we provide to projects, we're looking at ways to assess everyone participating, by address, using things like credentials and their previous on-chain withdrawals, so it's a kind of solution to those problems and concerns. I'm not on that case, because Gloria, our head of ecosystem development, is leading those projects, so I think we can save those questions in our notes, get more detailed information from her, and give you the answers directly.

Humpty
Yes, I think that's great. One of the valuable things about events like this is that we get questions where, even though we're not the technical person or don't have all the information, we can use them to make sure we keep educating the community about all these particular use cases, including some that are already being explored. Maybe just to touch on airdrops a bit more: you know, on one of the calls I've hosted on Clubhouse,

someone gave a very good example. You know the UNI airdrop; I should say, Uniswap airdropped UNI to its entire user base. But one of the questions we ask about the way it was distributed, versus if they had used some kind of identity solution, is this: if one person used 10 wallets and another person used only one wallet, do you think the distribution was fair, given that one person got far more than the other? I mean, there's nothing wrong with using 10 different wallets, but then the distribution of that token's wealth isn't fair; it's not per person, it goes to wallets. So if we really want to look for ways to create a fairer distribution of these tokens,

potentially we can look at solutions like ONT ID to make sure we're sending them to real people. And how do we verify that these people are real? Well, if you download ONTO, you can see there are ways to do it.

There's social verification, and you can also do it through some strict verification methods that use the familiar KYC element. So there are certainly many different ways we can do this, and the really nice thing is how many areas can be explored with identity.

SAGA

Humpty
Well, we're rounding out nicely; we've been here almost an hour, Raindy, so this is fantastic. Let's see if we can finish this presentation.

I know SAGA is something your team is currently developing and exploring use cases for, so could you give us a very quick summary of what SAGA is and why it's important?

Raindy
Yes, SAGA is a decentralized data marketplace designed for both individuals and enterprises. I remember a community member describing SAGA as "the 1inch of data," which is really impressive, because they got the main idea. Recently, most use cases revolve around digital assets, which is why we have so many DeFi platforms, but in the long term the real requirements and questions are about data; that's why we accumulate data. And as we introduced before, Ontology is a blockchain not only for decentralized identity but also for data, and ONT ID is one of the important elements of the project.

So, you can log in on the SAGA website now; it has been live since last year.

You can try logging in with your ONT ID and start getting the data listed on the website, but in the long run we're going to build more than that, so it will be a truly decentralized data marketplace. It should be built on the Ontology chain, but it will be more interesting with other opportunities. That's the idea. To make this marketplace more useful, we're going to get more use cases built on the logic of the SAGA platform and do the official launch at a later stage, so that we're fully prepared, showing our professionalism and also our respect for early users. We don't want to rush; we'd rather pay more attention to the details to have it better prepared.

Humpty
Yes, that's fantastic. I think what Ontology is doing is creating a very complete ecosystem for identity and data, right? You have your digital identity, you have your digital credit score, I should say your decentralized credit score and decentralized identity; you have the DeFi integration, which is the next slide we'll get to with Wing; and you have the wallet that lets you use this whole ecosystem, from creating your identity to using these DeFi protocols and, potentially in the future, using SAGA, this decentralized marketplace.

WING

Humpty
So, Wing, the first credit-based cross-chain DeFi lending platform, which is a big deal. Basically, the lending platform has the traditional Flash Pool, and it's cross-chain because it's currently on Ontology, on Ethereum, and on OKChain, which launched recently, I think a week or two ago.

Time flies; I remember when we announced it, it was so exciting. And the interesting, unique element of this platform is that, because Ontology has developed these technologies focused on decentralized identity, they were able to create, I suppose, a testing ground for how that could work, and that's the Inclusive Pool.

We talked earlier about under-collateralized loans and better rewards, which is great, and I think that is the future of DeFi. The more I hear people talk about this topic on Twitter, on Clubhouse, on YouTube, across a variety of media channels, the more people are seeing that the way DeFi is currently built isn't fair; it only benefits those who already hold

a lot of wealth. A word I see used a lot is plutocracy, meaning that if you have a ton more money, your voice counts more. What blockchain and crypto are really trying to do is push back against many of these traditional ways the economy works and be fairer, and I think that's where the introduction of decentralized identities and credit scores comes in.

There's the possibility of leveraging your behavior, your reputation, to access these markets without necessarily holding a large amount of digital assets to back that loan.

The Ontology Community

Humpty
So, I think we're almost at the end of our presentation here. Raindy, would you like to make some closing remarks about Ontology and this rich ecosystem?

Raindy
Yes! So, I think through today's talk and presentation people already got the idea of what Ontology is doing: what we're focusing on is decentralized identity first, and data second. So, what have we listed here?

The infographic shows what we've achieved in this respect, but beyond that we're still doing the traditional work, like every crypto project: enriching the ecosystem with public chains, wallets, DeFi projects, exchanges, and the other protocols, with the goal of building a trusted world through everyone's efforts. As you can see, we already have communities in more than 29 languages, and we're growing our partners to around 200; I think it should be more than that by now. Also, speaking of governance, we have Ontology's decentralized global networks: as you can see, we have around 1,000 nodes in total, with 65 of them as candidate nodes.

So even individual users can stake their ONT on any of the nodes and earn ONG, the gas-fee token that gives them access to make transactions on the Ontology chain. What's amazing is that we now have so many ONT IDs created on the Ontology chain, and it's great to see so many people get a decentralized identity through Ontology. I hope more people will come to control their own identities and data in a decentralized way, and Ontology will keep working hard on these fronts and building a more trustworthy world for everyone.

Live session taken from: Ontology live session at Consensus 2021 by CoinDesk — YouTube

Translated by: aidonker

Find more at: Ontology | A blockchain for self-sovereign ID and DATA

A summary of the Ontology live session at Consensus 2021 by CoinDesk was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Urbit

Host Two Urbit Meetups (12)

Background

Urbit needs champions! The Urbit Foundation is looking for community members to host local or online meetups to spread the word.

Bounty Description

This bounty is to host an Urbit meetup in your city. The meetup should meet the following criteria:

The meetup must be public, and there must be a public web link for joining (like meetup.com).
The meetup must have a presentation or talk about Urbit.
The presentation or talk must be recorded and posted publicly online.
You must promote the event vigorously: social media, a website, etc. Email support@urbit.org if you want a signal boost.
You must host at least two meetups.

You should go into this bounty with the mindset of creating a regularly scheduled meetup, one that might become the nucleus of your city's Urbit community. Also, email support@tlon.io once you've hosted a meetup, and we will add you to our list of active meetups!

Resources

Check out our guide to best practices for Urbit meetups.

Contribution Guidelines

Do not begin work until your request to claim this bounty is accepted. When applying to work on the bounty, tell us a little about yourself and mention which city you'd be hosting the meetups in. Proof of the meetups happening will be determined by the recordings of the presentations. Email links to these recordings to grants@urbit.org. You have 100 days from the time of approval to complete this bounty.

Milestones

The first meetup is held

0 stars: The first meetup is held, and a recording of that meetup's talk or presentation is sent to grants@urbit.org.

The second meetup is held

1 star: The second meetup is held, and a recording of that meetup's talk or presentation is sent to grants@urbit.org.

Monday, 14. June 2021

UComuny

Federated Identities and Data Sovereignty: comuny expert talk at the bitkom Summit #eidas21

Don't miss it: everything revolves around digital identities and the practical use of trust services at bitkom's "eIDAS Summit." On the virtual main stage, Dominik Deimel, founder of comuny, speaks about "Federated Identities and Data Sovereignty: a model not only for the healthcare market" (Trust Services in Practice stage, 11:50–12:10, June 15, 2021).

Federated identity management is an established standard today, allowing users to access a wide variety of services with a single identity in a single-sign-on fashion. This approach means that an identity provider (IDP) can process a great deal of user information (keyword: profiling), while at the same time, in regulated markets, having to invest heavily in protecting personal data. This applies, for example, to health insurers that organize access to telematics-infrastructure services such as the ePA (electronic patient record) or the eRezept (e-prescription). Here, the participants in a federated network in the healthcare market need mutual control and acceptance. The challenge going forward: the user's sovereign handling of their identity data in an eID wallet on a mobile device.

The talk presents the architecture of an IDP that combines the advantages and requirements of OIDC and eIDAS, as established standards, with the principles of self-sovereign identity. This makes it possible to use the mobile device as a convenient authenticator with a mobile eID wallet immediately and without blockchain technology. Both sides benefit: users retain sovereignty over their sensitive data, and companies meet the strict requirements of data protection and eIDAS. This approach offers a business model that transfers very well to other markets and is open to business with additional technical solution partners and integrators.

#bitkom #eIDAS21 #comuny #Vertrauensdienste #DigitalIdentity #MyDataOperator #verifizierung #authentifizierung #operator #trinity

A virtual conference hosted by Germany's digital industry association, with cross-industry exchange, over 1,000 experts, #Tech-Trends, #Insights & #BestPractices for the secure digitalization of processes.

Registration and free tickets at https://www.eidas-summit.de/de, and more about comuny's approach.


Affinidi

5 Reasons to Use an Identity Wallet


The growing degree of digitization and the ever-increasing volume of online transactions have necessitated a strong and secure way to identify and authenticate digital entities. This has led to the emergence of Digital Identity Management, where digital documents that prove the identity of an entity are stored and shared digitally.

While there are many components in digital identity management, in this article, we will talk about a central component called identity wallets.

What’s an Identity Wallet?

An identity wallet is a digital wallet/storage where you can safely store your digitized documents and verifiable credentials, and share them easily with others when needed.

You own this wallet and its contents, so you have the power to decide how and with whom you want to share your credentials. This concept of always being in charge of your identity is called Self Sovereign Identity (SSI) and it is one of the pillars of web 3.0.

So, how can this identity wallet be beneficial for you and why should you embrace it?

Here are five reasons why we think identity wallets are the future.

Reason #1: Keeps your Data Safe

One of the biggest issues facing the digital age is cyberattacks and the resultant identity theft.

If you analyze identity thefts and their patterns, you'll notice that hackers steal millions of identities from a single storage system or database, and the owner of an identity often learns of the theft only much later, or sometimes not at all!

But what happens when you do away with a central repository where all the data is stored and replace it with a decentralized form of storage where each entity stores its data separately?

It becomes much harder for someone to steal data at scale, because it takes considerable effort to steal even a single record, which is not worthwhile for an attacker in the long run.

To top it all, the owner is in complete control of the data at all times, so identity theft may not even be possible in the first place.

In all, your data is safe.

Reason #2: Know Where Your Data is Used

Surveys and opinion polls overwhelmingly show that users are concerned about their privacy and want to know how and where their data is stored and used.

For example, a survey by KPMG shows that data privacy is a concern for 97% of U.S. consumers, and almost 54% worry about what companies do with their data. Likewise, the Eurobarometer survey states that 72% of users want to know how their data is processed.

These numbers clearly show that users will feel more comfortable when they know where their data is stored and how it is used.

The good news with identity wallets is that you control the entire data sharing process, so there’s no concern about how someone else can use your data.

You can choose to send only the relevant data to entities for processing. For example, if you have to prove your age, just send your date of birth credential without including other details such as your address or government ID. That’s how granular you can get with data sharing and its usage.
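This granular data sharing can be sketched in code. The wallet class and claim names below are purely illustrative, not a real SSI API: the point is that a presentation contains only the claims a verifier asked for, nothing more.

```python
from datetime import date

# Toy identity wallet; all names and structures here are illustrative.
class IdentityWallet:
    def __init__(self, claims):
        self.claims = claims

    def present(self, requested):
        """Build a presentation containing only the claims the verifier asked for."""
        return {k: v for k, v in self.claims.items() if k in requested}

wallet = IdentityWallet({
    "date_of_birth": date(1990, 5, 1),
    "address": "10 Example Street",
    "gov_id": "AB123456",
})

# A bar only needs proof of age, so only the date of birth is disclosed.
presentation = wallet.present({"date_of_birth"})
assert "address" not in presentation and "gov_id" not in presentation
```

In a real wallet the disclosed claims would also carry cryptographic proofs from the issuer; this sketch only shows the data-minimization idea.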

Reason #3: Meets Mandatory Compliance Requirements

The European Parliament and Council adopted the eIDAS (electronic IDentification, Authentication, and trust Services) regulation in 2014, paving the way for greater digital security. The more recent European Digital Identity framework further empowers users to store their identities in a digital wallet and use it seamlessly wherever needed, even across borders.

Such forward-looking regulations are likely to be adopted by other countries as well, so before long, having an identity wallet can become a way of life.

Why not jump on the bandwagon early and proactively meet the required compliance requirements?

This way, you can enjoy its benefits early on.

Reason #4: Facilitates Quick and Easy Authentication

Besides keeping your credentials safe, an identity wallet can facilitate quick, easy, and hassle-free authentication.

For example, opening a bank account in a different country doesn’t have to mean collecting and submitting a stack of documents that take weeks to process. With digital credentials stored in an identity wallet, the entire process can be completed in minutes, because all you have to do is share them electronically.

The bank can verify the data and open an account for you. That’s how easy it can be!

Reason #5: Perform a Wide Range of Tasks Electronically

As more businesses accept identity wallets, it won’t be long before you can share the pertinent information in your identity wallet with a wide range of online services.

Some possible tasks you can do with an identity wallet are:

Prove that you’re 18+ years of age to buy alcohol or enter a bar
Share your past employment and educational qualifications with prospective employers
Use it as a health passport for travel, or to buy prescribed medicines anywhere in the world

Here are some more real-world ideas where an identity wallet can come in handy.

Undoubtedly, identity wallets are an integral part of the future, as they come with a host of benefits geared toward next-generation online security.

So, are you ready to use an identity wallet and leverage the many benefits that come with it?

The European Union has already set things in motion with plans for a digital identity wallet that will securely store payment details, passwords, digital documents, and PII for individuals from all 27 member nations, for use across a wide range of public and private online services.

To leverage this trend, Affinidi offers APIs and SDKs to create a secure identity wallet to store your verifiable credentials. Contact us right away to get started.

To learn more about self-sovereign identity and verifiable credentials, check out some related FAQs, and join our mailing list to stay on top of interesting developments in this space.

5 Reasons to Use an Identity Wallet was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


MyDEX

Revolutionising healthy ageing


Mydex CIC is pleased to announce its involvement in a new £12.5m project designed to ‘revolutionise’ healthy ageing. The project — called Blackwood Neighbourhoods for Independent Living — will help people to stay well and physically active as they age and explore new products and services to support them.

Supported by £6m of UK Research and Innovation funding as part of its Healthy Ageing Challenge, and led by Blackwood Group, the project will work with residents and partners in three neighbourhoods to enable people to live independently. It covers new homes, a design guide for upgrading the accessibility of existing homes and adapting them, and future home design.

It will include accessible outdoor spaces so that people can sustain physical activity, supported by digital connectivity and infrastructure that helps security and ethical data sharing. Sustainable energy and transport will aim to reduce community carbon footprint and reduce transport costs. Individual coaching and support will help people maintain their health and wellbeing.

The long-term goal is to improve people’s lives as they age while reducing the costs of care provision.

Key role of personal data

Mydex’s role will be to provide the data-sharing infrastructure that enables individuals and service providers to safely and efficiently share the right data at the right times, in ways that protect individuals’ privacy, put them in control of their data at all times, and enable two-way engagement and feedback throughout the project.

Through every aspect of the project, all personal data relating to each individual will be delivered to and accessed from the individual’s personal data store. All parties collecting or using any personal data will send it to the individual’s personal data store via a secure API, and will have a data sharing agreement designed to achieve the highest standards of data protection, transparency and control for the citizen.
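The consent-gated flow described above can be sketched as a minimal in-memory model. Everything here is hypothetical: a real personal data store sits behind a secure API with authentication and signed data-sharing agreements, but the sketch shows the key property that only parties granted access by the owner can deposit or read data.

```python
# Minimal in-memory sketch of a consent-gated personal data store (PDS).
# All class and method names are illustrative, not Mydex's actual API.
class PersonalDataStore:
    def __init__(self, owner):
        self.owner = owner
        self._records = {}
        self._write_grants = set()   # parties the owner allows to deposit data
        self._read_grants = set()    # parties the owner allows to read data

    def grant(self, party, *, read=False, write=False):
        """Only the owner issues grants, keeping them in control of sharing."""
        if write:
            self._write_grants.add(party)
        if read:
            self._read_grants.add(party)

    def deposit(self, party, key, value):
        if party not in self._write_grants:
            raise PermissionError(f"{party} has no data-sharing agreement")
        self._records[key] = value

    def read(self, party, key):
        if party != self.owner and party not in self._read_grants:
            raise PermissionError(f"{party} may not read {key}")
        return self._records[key]

pds = PersonalDataStore(owner="resident")
pds.grant("care-provider", write=True)
pds.deposit("care-provider", "appointment", "2021-07-01 10:00")
assert pds.read("resident", "appointment") == "2021-07-01 10:00"
```

An ungranted party attempting to deposit or read raises `PermissionError`, mirroring the requirement that every data-collecting party hold a data-sharing agreement first.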

Connecting to Blackwood’s CleverCogs digital system, participating residents will be able to organise their services, care and medical appointments, stay in touch with family and friends via video calls, and listen to music and entertainment. For customers living in Blackwood Home, the system can also be used to control everything from lighting and heat to opening doors and blinds.

The three neighbourhoods chosen to take part are located in Dundee, Glasgow, and Moray. Other partner organisations, besides the lead Blackwood, are:

Canon Medical Research Europe
Carebuilder UK
CENSIS
Cisco International Ltd
Enterprise Rent-a-Car UK
Lewis & Hickey Architects
The DataLab
The University of Edinburgh

Revolutionising healthy ageing was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Blueprint for a Digital Health Pass


by Anne Bailey

The Good Health Pass Interoperability Blueprint was released mid-June to help shape the development of a cohesive but decentralized method of issuing, holding, and verifying the results of a COVID status test. The Good Health Pass Collaborative (GHPC) is an open and publicly funded collaborative; it was founded by ID2020 and the working group is managed by the Trust over IP Foundation. The blueprint, which is now available for public comment, lays out the most important and intentional design choices as well as recommendations to implement an interoperable and inclusive system.

Intentional design choices

The Good Health Pass outlines their recommended architecture aspects:

Individuals should be in control of their data: This describes the relationship between the issuer of a credential (a test center) and the holder of the credential (the individual being tested), as well as the relationship between the holder and a verifier (the entity that checks the credential before granting access to a service, such as airport staff or border control). It is explicitly stated that there is no room for a third party in these interactions: the holder should interact directly with anyone needing to issue or verify their credential. The blueprint recommends being restrictive by design, with privacy by default. To enable this, it adopts W3C Verifiable Credentials as the standard for interoperability, which also allows for selective disclosure of credentials.

Equity and inclusion: There are concerns, and rightly so, about tracking the health status of individuals and using it as a gatekeeper to private and public services. When equal access to basic rights is sidelined in favor of promoting mobility, some groups risk being marginalized. It must be remembered that vaccines are not universally available, so a negative test must also suffice for equal access to services. The name of a credentialing service therefore matters: it should not be called a vaccine passport, which is misleading and would clearly create a barrier to services for the large populations who do not have access to a vaccine or choose not to receive one. The recommended name is a “digital health pass”. Access also covers devices and offline use: although a smartphone is the preferred channel for issuing, holding, and verifying credentials, there must be an offline option for those who do not have one. QR codes on a hard-copy credential can fulfill this, with multiple QR codes for different selective-disclosure scenarios.

Decentralized for security and scalability: The blueprint argues that a decentralized approach boosts the security of holding and exchanging PII; this argument is made in more detail in other resources. On scalability, the GHPC explains that a decentralized digital health pass could reduce stress and traffic on healthcare information systems by shifting the work of issuing and verifying credentials to other parties, but the scalability of the decentralized ecosystem itself is not addressed. That remains a relatively open question, with widely varying results depending on the particular public, private, or consortium blockchain arrangement used.

Open standards for interoperability and participation: Without a doubt, open standards must be followed so that a credential issued at one test center can be accepted by any verifier. The GHPC insists on a pragmatic, phased approach to design and implementation. While that has produced a well-thought-out blueprint, it is already late to meet real-world needs. In many countries the initial rollback of lockdown measures has already occurred, and reopening economies means that systems to prove a negative COVID status have already been implemented in a disjointed way. The solutions out there often provide a verifiable certificate but can be missing other elements, such as decentralization, the ability to certify a self-test without a smartphone, or the ability to use a single app across different testing stations. Decentralized systems also face a challenge of unified communication, already a trend in regional vs. federal measures, where individuals may find it difficult to find reliable information that does not conflict with other sources. The systems in place are less than perfect, but they do function to a limited degree, which may disincentivize local governments from redesigning their initiatives around the GHPC’s recommendations.
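The issuer-holder-verifier triangle built on W3C Verifiable Credentials can be illustrated with a minimal credential shape. The DIDs, values, and the trivial check below are hypothetical; a real credential’s `proof` field carries an actual cryptographic signature that the verifier validates, abbreviated here as a placeholder.

```python
# Illustrative shape of a W3C Verifiable Credential for a negative test
# result. Issuer, holder, and all field values are hypothetical; the "jws"
# placeholder stands in for a real cryptographic signature.
test_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:test-center-123",      # the test center (issuer)
    "issuanceDate": "2021-06-20T09:00:00Z",
    "credentialSubject": {                        # the holder's claims
        "id": "did:example:holder-456",
        "testType": "PCR",
        "result": "negative",
    },
    "proof": {"type": "Ed25519Signature2018", "jws": "<signature>"},
}

# The holder presents the credential directly to a verifier (airport staff,
# border control); no third party sits between issuer, holder, and verifier.
def verifier_accepts(credential):
    # A real verifier would first validate the proof against the issuer's
    # public key; this toy check only inspects the claim.
    return credential["credentialSubject"]["result"] == "negative"

assert verifier_accepts(test_credential)
```

Because the claims live in `credentialSubject`, selective disclosure schemes can reveal individual fields (such as `result`) without exposing the rest.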

Identity Verification as a Foundation

Among the recommendations offered by the GHPC, binding the holder's identity to their credential is important to ensure that the credential does indeed belong to the holder. This lines up with the rising use of identity verification alongside onboarding and authentication for higher security. Offline testing and credential issuance may rely on a manual check of a photo ID, but that check does not last beyond the moment of verification. Binding an identity to a Verifiable Credential remains valid afterwards because a real-time biometric data point can be matched against one logged at issuance: checking that a fingerprint matches the one recorded when the credential was issued, for example. When using decentralized architectures, identity verification can still provide high and/or selective privacy for the holder while assuring the issuer or verifier that the credential can be trusted.

While the GHPC’s blueprint is well designed and offers concrete recommendations for a functional yet privacy-centric solution, with implementation milestones at 30-, 90-, and 180-day intervals, it comes late in the game and may face reluctance from public players. The GHPC may also have limited the potential for its own adoption by relying on a decentralized architecture only, instead of building interoperability with health passes already in circulation. However, this may be the chance that decentralized identity has needed to demonstrate its efficacy on a global level. KuppingerCole will continue researching this market segment, as it has high potential to positively disrupt digital identity management for individuals as well as enterprises.


IDunion

Innovative concepts and software for managing digital master data and certificates


Companies today manage and maintain master data from business partners in multiple instances across various in-house IT systems, and do the same with their own master data in third-party systems. Each company has to invest a great deal of time and money in ensuring this data is of good quality, because of the high number of data sets created in the process, which can run to several million in the case of major corporations. In addition, suppliers have to submit hundreds of certificates year after year to customers such as car manufacturers, and sustainability certificates have recently become increasingly important, too. In practice, this usually results in a considerable amount of manual work on both sides that is time-consuming and error-prone. The team responsible for the strategic advance engineering project “Economy of Things” (EoT) at Bosch Research is working with partners on an innovative concept that aims to tackle these very issues.

Read the full article here.


Tokeny Solutions

Are Banks Missing Out on the Digital Money Revolution?


Digitalization continues to accelerate our everyday life, affecting people, businesses, governments and financial institutions. The Covid-19 pandemic acted as a catalyst that revolutionized the way people work, purchase, and invest. The hype around non-fungible tokens (NFTs) and decentralized finance (DeFi) is evidence of the growing interest in investment opportunities brought by blockchain technology. According to a recent report, security tokens are seen as the next blockchain wave and are predicted to reach a market of €918 billion by 2026.

Naturally, the demand for these new means of investment has pushed digital money into the spotlight. Why? Because the cash leg must be on the same infrastructure as the securities in order to allow a real transition to a decentralized network. While most banks are still waiting for central bank digital currencies (CBDCs) to show the way, stablecoins issued by crypto players have reached a record-high trading volume at $766.02 billion in May, up 51.9% from the previous month, and nearly 15x greater than one year ago.

CBDC vs Commercial bank digital money vs Stablecoins:

First, let’s take a deeper look at the two forms of money in the current banking system: central bank money and commercial bank money. Central bank money is made available to the public in the form of cash, and to commercial banks in the form of central bank reserves. Private money is deposited into commercial banks, which are required to hold a certain percentage of these deposits at the central bank as a cushion. The rest of the deposits are used to grant loans to individuals and companies; this is the point at which commercial bank money is created. In traditional finance, therefore, commercial banks act as trusted entities that not only provide essential services to customers but also help create capital and liquidity in the market.

Similarly, stablecoins require trust and credibility, and they must be interchangeable with existing forms of money; in other words, they have to be anchored. Stablecoins therefore require users to have the same level of confidence in them as in commercial bank money. In on-chain finance, commercial banks should take on this role of trusted issuer of tokenized cash, as they are the most suitable entities to issue stablecoins anchored by fiat money. They have already ceded the e-commerce payments revolution to giants such as Stripe, Adyen, and Alipay, and have lost significant retail market share to neobanks. Will this happen again in the digital currency revolution?

Banks are at a critical juncture for catching up with this new wave. If they don’t act quickly, they will be seriously disrupted. With the emergence of crypto players such as Circle (which just raised $440M), Paxos (which is now a bank), and our partner Monerium (a fiat e-money payment gateway for security token investors), the race is already on to become the tokenized money provider for digital assets.

As a reminder, with Tokeny’s platform you can:

Tokenize fiat with automatic minting/burning;
Track token holders and automate AML;
Recover the funds in case of loss of a wallet;
Enable payments in less than 2 seconds for less than 0,001€/tx.

Tokeny Spotlight

Digital Identity, The Key For Security Token Custody

Digital identities lie at the heart of security tokens custody for investors, and secure the assets against loss and theft.

Read More

Why Have Binance’s Stock Tokens Concerned Regulators?

Binance has declared that holders of their stock tokens will qualify for economic returns on the underlying securities, including potential dividends, making the tokens securities.

Read More

Tokeny Solutions Brings the Digital Euro to Its Billboard Solution via Monerium

Read More

The T-REX Billboard Explainer Video

A white-label secondary market solution for security tokens

Watch The Video

Market Insights

The Security Token Report 2021

The 93-page report carefully curates industry expertise from 13 authors in six countries, including Tokeny Solutions, all of whom work at the forefront of this capital market evolution.

Cointelegraph

Read More

 

EcoWatt Becomes the First Green Asset-Backed Store of Value on the Blockchain

EcoWatt puts renewable energy assets on the blockchain to disrupt the climate change movement by making green assets accessible to the blockchain community.

The Tokenizer

Read More

 

Crypto Startup Circle Is Said to Evaluate Potential SPAC Deal After $440M Fundraising Round

Crypto startup Circle, the company behind USD Coin (USDC-USD), raised $440M in a funding round and the company is said to be considering a SPAC transaction.

Seeking Alpha

Read More

 

Paxos Becomes Third Federally Regulated Crypto ‘Bank’

Paxos earns a provisional trust charter through the U.S. Office of the Comptroller of the Currency.

Coindesk

Read More

 

Deutsche Bank and Singapore fintech STACS complete ‘bond in a box’ proof-of-concept on the use of DLT for digital assets and sustainability-linked bonds

Deutsche Bank Securities Services in Singapore and Hashstacs Pte Ltd (‘STACS’) announced the completion of their proof-of-concept (‘POC’) referred to as “Project Benja”.

Deutsche Bank 

Read More

 

Sweden’s Central Bank to Test Digital Currency With Handelsbanken

The Riksbank will partner with Handelsbanken to test how the e-krona might work in the real world.

Coindesk

Read More

 

ECB Says Lack of Official Digital Currency Risks Loss of Control

Countries that decide not to introduce digital versions of their currencies may face threats to their financial systems and monetary autonomy, the European Central Bank warned.

Bloomberg

Read More

 

Standard Chartered, OSL Parent in Pact to Create Digital Assets Platform

Initially targeting the European market, the U.K.-based company will seek to connect institutional traders to counterparties across markets.

Coindesk

Read More

 

Singapore Rolls Out World’s First Asset-Backed Digital Token Exchange

Cyberdyne Tech Exchange (CTX) of Singapore announced itself as the world’s first regulated digital exchange for tokens backed by real assets, fully live and open for business.

Investable Universe

Read More

Compliance In Focus

It’s Official: El Salvador’s Legislature Votes to Adopt Bitcoin as Legal Tender

A supermajority of the El Salvadoran legislature voted to adopt bitcoin as legal tender early Wednesday morning.

Coindesk

Read More

 

Security Token Offerings (STOs) for NFTs?

Thinking of selling NFTs? If so, you should be aware that the issuance of NFTs may, in some circumstances, constitute the sale of securities.

Dilendorf  Law Firm

Read More

 

Thailand SEC seeks to regulate decentralised finance

Activities related to decentralised finance (DeFi) projects which involve digital coin issuance may require a licence from the regulator in the near future, the SEC announced.

Bangkok Post

Read More

Subscribe Newsletter

A monthly newsletter designed to give you an overview of the key developments across the asset tokenization industry.


The post Are Banks Missing Out on the Digital Money Revolution? appeared first on Tokeny Solutions.


PingTalk

Recognizing Digital Identity as a National Issue


One of the great things about my work here at Ping is that I get to explore identity matters from many different perspectives. I’m fortunate enough to be at a company that champions identity by delivering identity solutions like MFA, access management and data governance that empower companies, governments and organizations to secure their employees, customers and citizens. But identity goes well beyond the technical bits, bytes and widgets. So I also feel fortunate, and humbled, to be a part of the Better Identity Coalition, a group that brings together leading companies to promote education and collaboration on protecting identities online. It has given me an opportunity to have a richer and much fuller notion and understanding of identity.
 


Identosphere Identity Highlights

Identosphere #36 • Review the Good Health Pass Blueprint! • EU embraces SSI • SecureKey Interop DID:Orb

Upcoming events and webinars, job opportunities, explainers, walk-throughs, and the latest news and updates from the companies, organizations, and people creating a system of identification online.
Welcome, and thanks to our patrons! Read previous issues and subscribe. Support this publication at Patreon.com to get patron-only content!

Up next for Patrons only we’ll be coming out with a working groups digest, since there hasn’t been room in the weekly to cover updates from mailing lists and newsletter subscriptions. New Patrons also get access to previously released patrons only content such as the IIW Digest and our first Quarterly issue.

Coming up

InfoCert at the eIDAS Summit • June 15

Carmine Auletta, our Chief Innovation and Strategy Officer, will present InfoCert’s perspective on the future of digital identity at 12:10. After that, he will hand the stage to Andreas Plies, CEO and Founder of Authada, who will present the digital signature solution for the German market developed together with InfoCert.

Digital Twins and Self-Sovereign Identity: Build the Next Generation of Simulation with Privacy Preservation • June 24th, Jim St.Clair

1) The challenges of digital identity and ICAM in IoT and digital twins

2) How to apply SSI and decentralized identity with IoT and digital twins

3) How the Sovrin Foundation is advancing SSI in IoT for industry use

EEMA Annual Conference • 6/29-7/1

The 34th European Association for e-Identity and Security (EEMA) Annual Conference focuses on ‘Securing Trust in the New Digital Reality’. (Kaliya is speaking)

Identiverse 2021 • June 21-23 (Denver)

Hiring

Communications Manager/Director position • DIF

managing and coordinating DIF’s communications channels and overall messaging strategy. 

Developer Evangelist • Affinidi

Get software developers excited about building Verifiable Credential-based trust ecosystems leveraging Affinidi's APIs.

Business Consultant Self-Sovereign Identity (SSI) The Hague • TNO

With clients and partners you will jointly develop projects geared to the use of SSI in automated administrative decision-making processes. 

COVID-19 Good Health Pass Collaborative Releases Draft Blueprint for Digital Health Passes in Advance of G7 Summit

The Blueprint — released today in draft form for a three-week period of stakeholder consultations and public comment — is intended to stimulate discussion at the G7 Summit, which will open Friday in Carbis Bay, Cornwall, UK.

Review the Good Health Pass Blueprint

Our co-editor Kaliya Young has spent the past three months working with the community to develop the Good Health Pass Interoperability Blueprint, which enables the use of Verifiable Credentials to share as little information as possible with parties who need to know your COVID health status (like airlines transporting you to a country that requires test results). The comment period is open until June 17th.

For a high-level view, check out the terminology deck or the slide deck that was shared on webinars with the travel industry.

Introducing the Global COVID Certificate Network (GCCN)

we are proud to launch the Global COVID Certificate Network (GCCN), an initiative to enable interoperable and trustworthy verification of COVID certificates between jurisdictions for safe border reopening. GCCN will include a global directory of trust registries to enable cross-border certificate verification, and be a home for toolkits and community-managed support for those building and managing COVID certificate systems.

Linux Foundation Public Health introduces the Global COVID Certificate Network to operationalize the Good Health Pass Interoperability Blueprint

Paul Knowles, Head of the Advisory Council at the Human Colossus Foundation, co-led the Standard Data Models and Elements drafting group, one of the nine interconnected GHPC drafting groups, to spearhead group recommendations on data elements, common models for data exchange, and semantic harmonization. The recommendations of that drafting group will help to enable data interoperability without putting any undue burden on existing health systems and workflows.

Explore Verifiable Health Records • Apple

Apple Announces Support for VCI credentials at WWDC (Almost proper JSON-JWT but not quite)

Implementing the Good Health Pass’s recommendations with Cardea

Cardea, a full, open-source ecosystem for verifiable health credentials developed by Indicio and now a community-led project at LFPH, meets the major recommendations of the Good Health Pass and facilitates the goals of the Global COVID Certificate Network.

IdRamp Joins Linux Foundation Public Health Cardea Project Steering Committee

The Cardea and GCCN projects are both excellent examples of breakthrough innovations that can take shape when companies and projects come together to solve real-world problems, using open source tools available to everyone

GlobaliD joins the Linux Foundation’s Cardea Project

Covid-19 Vaccination Passes Could Catalyze Self-Sovereign Identity Adoption

The EU previously announced fully vaccinated Americans could travel this summer and regional EU travellers could potentially use an EU Digital COVID Certificate as early as July 1.

New York’s Vaccine Passport Could Cost Taxpayers $17 Million

The state’s contract with IBM details a Phase 2 of the Excelsior Pass, which could include uses that some advocates say raise privacy concerns.

Europe Pivots to SSI

EU Announcement: European Digital Identity

The EU Announcement is the Biggest Ever in SSI • Credential Master

Timothy Ruff’s analysis and commentary on the EU’s announcement this week of its new digital identity strategy.

EU plans digital ID wallet for bloc’s post-pandemic life

The European Digital Identity Wallet proposed by the EU’s executive commission is a smartphone app that would let users store electronic forms of identification and other official documents, such as driver’s licenses, prescriptions and school diplomas.

EU decision on Identity Wallet: Starting signal for a seamless digital future

Last week, the EU Commission published a draft for the so-called digital identity wallet “EUid”. Under the draft, every EU state must provide its citizens with a digital wallet within 12 months of the law coming into force.

Blockchain-enabled Self-Sovereign Identity

Martin Schäffner, the initiator of the EuSSI Working Group of the European Blockchain Association and an expert in Self-Sovereign Identity, explains the concept of Self-Sovereign Identity and how it differs from conventional digital identities.

High-Level

What Are Self-Sovereign Identities (SSI)? • Europechain

In this context, ‘self-sovereignty’ refers to the ability of the individual or the organization in control of the identity to share it and present it to other agencies with no intermediaries.

The Power of Verifiable Credentials • Credential Master

Industries Adopting VCs

Government: EU Identity Wallet (all of European Union), German governmentCanadian GovernmentUS Dept. of Homeland SecurityUS Customs and Border PatrolBritish ColumbiaFinland/NordicsLatin America/CarribbeanNetherlands Ministry of Justice

Big Tech: MicrosoftIBMWorkdayDeloitteLGIEEE

Travel: International Air Travel Association (32 airlines so far), World Economic Forum

Education: T3 Innovation Network (US Chamber of Commerce Foundation), Digital Credentials Consortium (members include Harvard, MIT, Georgia Tech, more), ASU

Supply Chain: GS1 (bar codes, QR codes), World Economic Forum

Financial: Credit Unions, GLEIF (Global Legal Entity Identifier Foundation)

Healthcare: VCI (400 organizations), NHS, Lumedic

Analysts: Gartner, Forrester, Mercator

Blockchain, cyber, and Zero Trust (video)

Heather Dahl from Indicio.tech and I will discuss what the hell is the Blockchain and how is it being used for identity and in the future for the average user, and does this apply to #zerotrust?

Standards Work

SecureKey’s New Ledger-Agnostic did:orb

did:orb decouples DIDs from ledgers while maintaining trust and security. SecureKey is leveraging standard and open-source peer-to-peer protocols like ActivityPub, data structures like verifiable credentials, content-addressed storage like IPFS, and distributed trust services like the Google Trillian project to build a peer-to-peer trust network.

Use-Cases

Building an SSI Ecosystem: MemberPass and Credit Unions

Credit unions and their members face the threat of fraud on all sides. And credit unions employ lots of tools to fight it. But ultimately, the problem comes down to the member and credit union authenticating each other. The problem is that doing this securely annoys people. 

Simplify medical supply orders with SSI: Techruption innovation project

Participants in this co-creation use case were TNO, CZ, Rabobank and Accenture. The developed solution can be applied in other industries as well. For example in public services, which are often offered by a network of organisations that are all required to comply with high administrative standards.

Divitel & Ledger Leopard Team Up to Apply Blockchain & Self Sovereign Identity Technology to Video Distribution

Ready to market by the end of 2022, it will offer increased flexibility, control, ease of use and speed when managing access to video distribution ecosystem data, independent of the technology used. Divitel video carrier customers will be offered the option to include this blockchain module on top of their ecosystems.

Digital Identity: Enabling dignified access to humanitarian services in migration - PrepareCenter

The primary objective of the report is to inform humanitarian organizations working with migrants of the opportunities and risks in the use of digital identities in providing services throughout the migrants’ journeys.

Almost SSI?

You’ll soon be able to use your iPhone as ID at the airport: Apple Wallet is also getting support for hotel keys

Apple has announced a forthcoming update to its Wallet app that will allow you to use your iPhone as digital identification in select US airports. The company showed how you’ll be able to scan your driver’s license or state ID in participating US states, which will then be encrypted and stored in the iPhone’s secure enclave. The company says it’s working with the TSA to enable the iPhone to be used as identification at airport security checkpoints.

Thoughtful

Shedding Light on Dark Patterns

The Me2B Alliance announces: Digital Harms Dictionary 2.0

Can I trust you? (MyDex)

This is the second of two blogs on our new White Paper, Achieving Transformation At Scale. The first blog focused on the infrastructure challenge. This blog focuses on the parallel need for institutional innovation.

Identity Not SSI

OpenID: Public Review Period for Proposed Final OpenID Connect Client-Initiated Backchannel Authentication (CIBA) Core Specification

OpenID: Public Review Period for Two Proposed SSE Implementer’s Drafts

Matt Flynn: Information Security | Identity & Access Mgmt.

Decentralized Finance & Self-sovereign Identity: A tale of decentralization, a new paradigm of trust

Thanks for Reading!

Read more / Subscribe @ newsletter.identosphere.net

Support this publication @ patreon.com/identosphere


Urbit

A Topiary: Hypertext and Urbit

Under the trees of England I meditated on this lost and perhaps mythical labyrinth. I imagined it untouched and perfect on the secret summit of some mountain; I imagined it drowned under rice paddies or beneath the sea; I imagined it infinite, made not only of eight-sided pavilions and of twisting paths but also of rivers, provinces and kingdoms... I thought of a maze of mazes, of a sinuous, ever growing maze which would take in both past and future and would somehow involve the stars.

-- The Garden of Forking Paths, Jorge Luis Borges

The pre-technological world was already networked. No material object and no idea has ever stood alone—"no man is an island". Language itself is a network of associations shared between a group of people. The question of technology is: how do we bring our web of associations into the realm of computers? The legacy internet has focused on social networks almost exclusively—our mammalian brains readily took to this, we all love to signal. But before “social media” there was something called hypertext: a seemingly modest way to connect documents shared between computers.

A footnote in a book used to require a library, locating the referenced book, and finding the cited page (which might have footnotes itself)! Hypertext allows one to traverse a network quickly and easily with just mouse clicks, making information accessible to those without the time and resources to hunt down physical artifacts. Urbit’s “%graph-store” is not the same as hypertext but it does bear some familial resemblances. Fundamentally, %graph-store is a data structure that can accommodate disparate data types within a single edifice. It is based on graph databases and recent implementations such as GraphQL. To understand this data structure it is worth diving into the origins of hypertext—the original linked network.

The story of hypertext begins at the end of WWII with an article in The Atlantic that proposed to extend the human mind via machine. Vannevar Bush had seen first-hand the explosion of electronics manufacturing and scientific knowledge, recognizing the great peril and opportunity of such a revolution. His description of the Memex presaged the technological world of today, envisioning a personal device that would allow the user to draw upon a vast library of information, and link documents together arbitrarily—what he called "associative indexing"—a deliberate recreation of how the mind connects thoughts.

The 1960s saw the first implementations of these ideas. Ted Nelson's writing around Project Xanadu elaborated this linking between documents into what Nelson called hypertext. Nelson imagined "a text arranged in a graph structure", centered around a corpus arranged in nonlinear sequence, manipulable by author or audience. The concept was first demonstrated in 1968, at Douglas Engelbart's 'Mother of all Demos', where a vision of computers as a collaborative social environment was revealed to the world. "The Journal", presented alongside other revolutionary demos, exhibited collaboratively edited documents with hyperlinked connections. Any time-travellers in the audience would have immediately recognized the first webpage. From here, the idea of a linked network of documents would be developed through successive phases of computing, from the mainframe to the PC.

Experimentation with these ideas continued through the 1970s and 80s. A team directed by Andy Lippman at MIT extended the concept to threaded visual hypermedia with the Aspen Movie Map, a user-directed virtual tour of Aspen, Colorado. Hypertext and hypermedia gained a mass audience for the first time with HyperCard, a program that shipped with Mac computers. Users were able to collect or create 'stacks' of interlinked cards; a personal body of text, art, and knowledge could be accrued and modified, or shared with others—though not yet with native networking capabilities.

With the World Wide Web, hypertext achieved escape velocity. Initially a one-man project, the world wide web was similar to Bush’s vision of the Memex—a way to share, index, and connect information between individuals and groups. The web succeeded by utilizing the physical infrastructure already available: PCs with graphical interfaces, DNS, and the internet. Critically, its method of transmitting and traversing texts was a standardized protocol, allowing a free body of information to assemble atop the new infrastructure by anyone willing to learn the tools. A new frontier was cleared and cultivated with open standards, but it would gradually face enclosure.

In the following decade, after the first wave of social networking sites had crested but before phones became the primary web clients, Twitter was launched. Originally an awkward bridge between SMS and the web, its design eventually coalesced around static identity, posts with chronological sorting, and interlinked post threads. These simple primitives spawned a vast corpus of information, a graph-of-graphs intertwined and overlapped, mapping an entire platonic world. Twitter is fundamentally constrained by its very nature: it is the walled garden of an advertising company. The financial incentives of the server operators demanded seizure of the commons and the replacement of standardized protocols with proprietary databases.

As the first networked services withered, a new critique was implicitly articulated in the construction of a new kind of abstract machine. Urbit learned from the past: tightly defined protocols, baked-in cryptography, and decentralization were key. If power over the network was determined by those who ran the servers, then we would run the servers ourselves.

Urbit is not fundamentally a hypertext system; it is a new kind of computer, designed to participate in a peer-to-peer network that rejects the client-server model. All Urbit software takes its networking model for granted; it is not an ad-hoc layering of mismatched abstractions. Fundamental design decisions determine the trajectory of the future.

Early implementations of graph databases and projects like RDF hit on the same core idea: linked data needs its own data structure. In 2020 Urbit OS adopted an application called %graph-store that made use of a new, native data structure which would act as a database service for social applications. Chats, long-form text, and other types of data produced in the course of social computing now share the same tent, a single structure extensible to new types of content. Each item added to it is a node in a personal graph—a tree that grows branches as your computer communicates. These nodes can be linked arbitrarily to other graphs—Urbit's global immutable namespace grants both permanence and ownership and allows the graph’s branches to intertwine with others. Graphs can accommodate a variety of content, from simple text, to code and media yet to be conceived. You own your graph, and it is a record of your digital journey with others.
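A toy model can make the shape of such a graph concrete. The Python sketch below is purely illustrative, not Urbit's actual Hoon implementation; the class names, index-path addressing, and link field are all assumptions drawn from the description above:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class Node:
    """One item in a personal graph: a chat message, a post, a comment."""
    content: object                        # text, code, or a media reference
    author: str                            # the cryptographic identity that wrote it
    children: Dict[int, "Node"] = field(default_factory=dict)  # replies grow branches
    link: Optional[str] = None             # arbitrary link into another graph

@dataclass
class Graph:
    """A single extensible structure holding disparate content types."""
    owner: str
    roots: Dict[int, Node] = field(default_factory=dict)

    def add(self, path: Tuple[int, ...], node: Node) -> None:
        """Insert a node at an index path, growing the tree as conversation unfolds."""
        *parents, leaf = path
        level = self.roots
        for idx in parents:
            level = level[idx].children
        level[leaf] = node

# Usage: a message and a reply that links out to another identity's graph.
g = Graph(owner="~sampel-palnet")
g.add((1,), Node("hello, Mars", author="~sampel-palnet"))
g.add((1, 1), Node("hello back", author="~zod", link="~zod/my-notebook/42"))
```

The point of the sketch is the single tent: chats, notebooks, and future content types all fit the same node shape, and any node can reference a node in someone else's graph.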

Personal hypertext graphs are leaving behind proprietary platforms and gaining independence. Graphs until now have been used as tools of surveillance and marketing, but they are being reclaimed and sewn back into the network, each bound to the cryptographic identity of an individual, permanent computer. Where the map of your life was once splintered across a thousand systems and held in the custody of strangers, it is now possessed only by you and shared only as you see fit. On Urbit, an infinite forest of digital lives can take root, nesting their boughs and stalks.

Sunday, 13. June 2021

KuppingerCole

Analyst Chat #80: AI Service Clouds

Anne Bailey has just completed extensive research into the new market segment of AI Service Clouds. In this episode, she explains this innovative concept, which aims to overcome the lack of qualified personnel and bring artificial intelligence and machine learning to more companies.




Saturday, 12. June 2021

Europechain

What Are Self-Sovereign Identities (SSI)?

Self-Sovereign Identities are a relatively new concept, but have the potential to disrupt the way online identities work, by putting control of data back into the hands of the data owners. Learn more about them in this article!

Identity is everything. It permeates everything we do, every activity (both on and offline), and has a huge bearing on our relationships. We go through life sharing parts of our identity, with people, institutions, service providers, even with our pets. Whatever animal we choose to love knows us for who we are, and how we treat them. In the offline world, we can control who we are...

Source

Friday, 11. June 2021

Cognito

Keeping Pace With a Rapidly Changing Privacy Landscape

It’s been said that regulation can’t keep pace with technological innovation but regulators are doing their best to prove otherwise. New rules around data security, privacy, and consumer protection are continually being rolled out, sometimes without much notice. And while many businesses may not be directly impacted, financial service providers in particular need to put compliance front and center.

Source


Dock

Dock’s Proof of Stake Testnet Is Now Live

The team at Dock has taken another step towards providing secure, decentralized credential solutions by transitioning our testnet to a Nominated Proof of Stake (NPoS) algorithm. We have kicked off this testing phase by successfully onboarding a number of candidate validators to the network, and will transition our current PoA mainnet to PoS as well once the security audit and testing activities are successfully completed, which is expected to happen by the end of June 2021.

What is Proof of Stake?

Proof of Stake (PoS) refers to an algorithm in which the validators are selected on a blockchain based on the amount of tokens they have staked. Validators are entities who are charged with the important tasks of producing blocks (hence processing blockchain transactions) and validating the blocks that are produced by other validators.

In PoS, validators are selected based on how many Dock tokens have been staked (locked up) in support of each candidate. Before validators are selected, candidate validators can stake their own tokens, and/or be backed by other token holders (“nominators”) who stake their tokens on the candidates’ behalf, and receive rewards if the candidates get selected and start producing blocks. Once the nomination period is over, the algorithm automatically selects the candidates with the largest stakes to become validators. The validators are incentivized to perform their duties properly, as the staked tokens can be slashed (taken away) in case of validator misbehavior.
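The selection and slashing mechanics described above can be sketched in a few lines. This is a simplified illustration, not Dock's actual implementation; the seat count, stake figures, and slashing rule are assumptions:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Candidate:
    name: str
    own_stake: int
    nominations: Dict[str, int] = field(default_factory=dict)  # nominator -> tokens staked

    @property
    def total_stake(self) -> int:
        # A candidate's weight is their own stake plus all nominations backing them.
        return self.own_stake + sum(self.nominations.values())

def select_validators(candidates: List[Candidate], seats: int) -> List[Candidate]:
    """Once nomination closes, the candidates with the largest stakes win the seats."""
    return sorted(candidates, key=lambda c: c.total_stake, reverse=True)[:seats]

def slash(candidate: Candidate, fraction: float) -> None:
    """Misbehavior burns a fraction of the candidate's and every nominator's stake."""
    candidate.own_stake = int(candidate.own_stake * (1 - fraction))
    for nominator in candidate.nominations:
        candidate.nominations[nominator] = int(candidate.nominations[nominator] * (1 - fraction))

pool = [
    Candidate("alice", 1_000, {"nom1": 500}),   # total 1,500
    Candidate("bob", 2_000),                    # total 2,000
    Candidate("carol", 100, {"nom2": 3_000}),   # total 3,100
]
validators = select_validators(pool, seats=2)   # carol and bob take the two seats
```

Note how slashing hits nominators too: that shared downside is what makes nominators careful about which candidates they back.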

This is distinct from the approach taken in Proof of Authority (PoA) - the algorithm currently used on the Dock mainnet. In PoA, validators are trustworthy entities selected by the Dock Association, a non-profit organization dedicated to overseeing the Dock network.

The transition to PoS takes validator selection away from a single entity (the Dock Association) and distributes it across a network of nominators and validators, creating more decentralized power and greater security in the network.

An Open Network for Everyone

The transition to PoS also comes with further democratization of the network governance structure. Some of the governance for our current PoA mainnet is centralized to the Governing Council, as the Council members are the only ones who can vote on any changes to the network proposed by the Council or token holders. Once the mainnet transitions to PoS, however, Dock token holders will be able to vote on proposed changes to the network (including election of Council members) by locking up their tokens. This change, along with the ability for token holders to earn passive income by staking on behalf of successful validators, enhances the utility of the Dock tokens.

Participation by token holders as validators, nominators, and voters is made possible by the fact that the Dock network is a permissionless blockchain. This is unlike many of our competitors who leverage permissioned blockchains, and hence limit access to their blockchain governance to a select few. It is part of Dock’s vision to build a network that has maximum participation and widespread adoption, and we believe that this can only be achieved through the decentralization, democracy, and transparency offered by a public blockchain that is open to all token holders and free from the control of a single entity.

What’s Next?

We will be running the PoS testnet for the next few weeks to ensure that all network features work correctly. During this time we will also work with Solidified, a team of security experts currently performing an audit on the PoS network, to ensure that the network is safe and secure. Once these steps are completed we will transition our mainnet to PoS as well. More posts will follow to keep the community posted on our progress and further explain the changes introduced in PoS.


IdRamp

IdRamp Joins Linux Foundation Public Health Cardea Project Steering Committee

IdRamp will provide strategic leadership for Cardea and will also help the project's goal of growing decentralized identity adoption.

The post IdRamp Joins Linux Foundation Public Health Cardea Project Steering Committee first appeared on IdRamp | Decentralized Identity Evolution.

Infocert

Gartner declares InfoCert a leader among over 40 global players in its “Market Guide for Electronic Signature” report

In its latest Digital Signature report, Gartner identified InfoCert as a “Full-service, enterprise electronic and digital signature platform”, ranking it among the leaders of the Digital Signature industry.

The consulting firm has served as an authoritative IT advisor for over 40 years. Recently, it released its eSignature market guide, “Market Guide for Electronic Signature”, a true reference for IT and technology solutions professionals and companies. Gartner’s report put InfoCert among the top providers of Digital Signature solutions.

Download the report published by Gartner, free of charge, to see how the digital signature market has evolved and how the main global players like InfoCert have performed.

Gartner’s analysis is intended to guide professionals and businesses in assessing the solutions available on the market offered by providers in the field of Digital Signature and Trust Services. Gartner’s report divides Digital Signature and Trust Service providers into three categories. It assigned InfoCert to the “Full Service, enterprise electronic and Digital signature platform” category, which includes firms able to provide highly flexible technological and process services that offer all the technological characteristics required to meet the needs of large, medium and small companies, and even individual citizens.

The report is very useful because it classifies providers according to their specific capabilities and the levels of service they offer. It is an excellent source from which to analyse the latest B2B, B2C and B2E Use Cases. The report also provides a detailed view of the latest “Assurance” requirements (i.e., compliance with the standards established by European reference regulations) and technology requirements – aspects to which companies should pay special attention if they intend to embark on a path of Digital Transformation.

The “Full-service, enterprise electronic and Digital signature platform” segment, in which InfoCert has been recognised as a sector leader, includes Providers that adapt best to their clients’ every need. In fact, according to Gartner, these providers are able to support simple, advanced and qualified electronic signatures, they can include industry-specific processes in their Enterprise solutions, and they are able to offer “out-of-the-box” solutions that can be integrated with third-party platforms and software such as document, HCMs, CRMs and CLMs. They also support API integration, identity verification and knowledge-base verification tools.

The report also analyses the other two supplier categories: “Stand-alone, workflow-focused, click-to-sign electronic signature platform” and “Enterprise digital signature providers”.

The Gartner team of analysts that prepared the report highlights that clients must have access to services that match their needs, their geographical areas (often characterised by different regulatory requirements) and the use case they take as reference. This is why it is essential that clients be able to choose the vendor that best meets their functionality and compliance criteria at every stage of the process.

Download the Report

The post Gartner declares InfoCert a leader among over 40 global players in its “Market Guide for Electronic Signature” report appeared first on InfoCert.


ShareRing

ShareRing Gets a New Chief Operating Officer

Seasoned corporate executive with deep expertise in business growth and deploying new markets has been promoted to COO to strengthen ShareRing’s contribution to global digital transformation.

June 11, 2021 – ShareRing has announced the promotion of Jonathan Duncan to Chief Operating Officer.

In this role, Jonathan will assume overall responsibility for ShareRing’s business operations and ensure all departments and functions run efficiently and successfully.  He will have a seat in the Executive Leadership Team alongside CEO Tim Bos and newly appointed CFO Brian Norman.

ShareRing has been expanding globally and restructuring the business as a whole.  This appointment will help build on its value proposition to support businesses across a variety of industries such as travel, insurance, logistics and more.

Jonathan has over 10 years of experience building sales and expansion strategies as well as strong relationships between stakeholders. He previously held the Sales and Partnership Director position at ShareRing, and his skill set and expertise led him to be promoted within the company.

Commenting on the appointment, ShareRing CEO Tim Bos said, “I’m very proud that we have already started internal building and shifting, starting with Jonathan. He’s been with ShareRing as our Sales and Partnership Director. However, his strengths and ability to build long lasting relationships and effectively manage teams is exactly the type of leader we need heading up our operations. Jonathan will be key for our brand strategy and smooth production across the entire business.”

“ShareRing is a fantastic company where the sky’s the limit. I wholeheartedly believe in what we are building and how customer focused the products truly are. The contribution we’ll make toward a safer and more effective way of digital verification and identification is exciting and rewarding. I’m looking forward to my role shift and getting stuck in.” Jonathan said.

***End***

Notes to Editor:

About ShareRing

ShareRing is the most complete blockchain ecosystem for securely accessing and buying goods and services worldwide, serving travel, insurance, healthcare, logistics, education, cryptocurrency, charity and more.

ShareRing has pioneered an exceptionally secure and flexible identity verification system with a Personal Information Vault. Store verifiable documents including passports, national IDs, driver’s licenses, COVID-19 test results and vaccination certificates securely, knowing that only the owner of the data decides how and when their information is accessed or used.

For more information, contact Tomera Rodgers at tomera.rodgers@sharering.network or visit https://sharering.network/

Media contacts
Tomera Rodgers
ShareRing PR 
tomera.rodgers@sharering.network

Fran Puntyme
ShareRing Marketing  
Puntyme.a@sharering.network

SOURCE ShareRing Network

Related Links

www.Sharering.network

The post ShareRing Gets a New Chief Operating Officer appeared first on ShareRing.Network.


Safle Wallet — The Un-chaotic Smart Wallet


Over the years, the digital currency ecosystem has endured and escalated to its current shape and form, advocating the global acceptance of cryptocurrencies as a form of value exchange. For an existing crypto enthusiast or someone contemplating investing into the crypto world, the main challenge is not only navigating through different avenues in the cryptoverse, but also storing and managing digital assets with ease along the way.

At times, the decentralized web can pose an overwhelming maze of challenges and jargon that slows down the adoption rate in this industry, creating a demand for a safer and simpler approach.

Since any interaction with the decentralized network (blockchain) has to be initiated by a digital identity in the form of a wallet address, wallets play a central role in Web3 mass adoption. Safle understands that wallet security is paramount and that wallet providers must be transparent and trustless in nature to enable users to fully engage in the decentralized ecosystem.

While you look for the ‘right’ wallet, this checklist will help you ensure a flawless experience. The wallet should be:

● Safe for your digital assets

● Simple and easy to use

● Secure for your identity

Ticking all the desired checkboxes, Safle Wallet makes storing and managing digital assets in the decentralized world a simple, hassle-free task for the users.

No lengthy public addresses; your handle name is all you need to store and manage digital assets with your Safle Wallet. No private keys required either! You just need to sign in with a password or personalized biometrics. It’s that simple.

Better control with a unique identity

Safle Wallet is a composite blockchain identity wallet that allows users to store their information in a distributed and trustless manner, giving them complete power and ownership over their new digital identity. SafleID comes with a unique, legible name for the master account, which can store and manage 700+ digital assets such as cryptocurrencies, NFTs, and other critical holdings.
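As a rough illustration of the general idea of handle-based identity, here is a minimal, hypothetical registry sketch in Python. The names, API, and placeholder addresses are assumptions for illustration only, not Safle's actual implementation:

```python
from typing import Dict

class HandleRegistry:
    """Maps a human-readable handle to chain-specific addresses,
    so users can share a name instead of long alphanumeric strings."""

    def __init__(self) -> None:
        self._handles: Dict[str, Dict[str, str]] = {}

    def register(self, handle: str, addresses: Dict[str, str]) -> None:
        # Handles must be unique: one legible name per master account.
        if handle in self._handles:
            raise ValueError(f"{handle!r} is already taken")
        self._handles[handle] = dict(addresses)

    def resolve(self, handle: str, chain: str) -> str:
        """Look up the underlying address behind a handle for one chain."""
        return self._handles[handle][chain]

# Usage: a sender only needs the recipient's handle, not a raw address.
registry = HandleRegistry()
registry.register("alice", {"eth": "0xAbC...", "btc": "bc1q..."})
payee = registry.resolve("alice", "eth")  # the wallet sends to this address
```

In a real deployment such a mapping would live on-chain so that resolution is trustless; the sketch only shows why a name-to-address layer removes the "alphanumeric chaos" from everyday transfers.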

Not the alphanumeric chaos!

Since cryptocurrencies are one of the most talked-about subjects today, we have fundamentally simplified their storage and transfer through the Safle Wallet. You only need your ID name and Safle Wallet to store or access your cryptocurrency. Here, technology empowers the user.

Get Safle Wallet — A custom name for your digital identity on the blockchain

Better access, ease of use, and added benefits.

a) Own your assets

100% Non-custodial wallet. Forget third-party dependency to hold your digital assets. Your Wallet, Your Way, No Compromise.

b) Safety is our priority

Safle is a reliable and dependable management infrastructure — designed to keep availability, flexibility, and security as its pillars.

c) Share easy

You only need to use your handle names to transact seamlessly across the ecosystem using multiple access avenues available to the end-user.

While you access and transact with cryptocurrencies using your reliable and secure Safle Wallet, we also recommend that you consider the following:

Encrypt your wallet to ensure safety.

Encrypting your wallet or your phone lets you set a passcode that anyone attempting to withdraw your digital assets must enter. This protects against thieves, though it cannot protect against keylogging hardware or software.

Always remember your Passcode

Make sure you always remember your password, or your digital assets and funds will be lost forever. Unlike with your bank, there are very limited password recovery options with Bitcoin. You should be able to recall your password even after years, even if the account is not in use. It is recommended to keep a paper copy of your passcode in a safe place such as a vault.

Keep a robust Passcode

Any passcode that contains only letters or guessable words is weak and easy to break. A robust password contains letters, numbers, and special characters, and should be at least 16 characters long. The most secure passwords are those created by programs designed specifically for this purpose. Strong passwords are harder to remember, so take care to retain them safely.
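The advice above can be followed with a short generator script. This is a generic sketch using Python's standard `secrets` module, not a Safle tool:

```python
import secrets
import string

def generate_passcode(length: int = 16) -> str:
    """Generate a random passcode mixing letters, digits, and special characters."""
    if length < 16:
        raise ValueError("use at least 16 characters")
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:
        candidate = "".join(secrets.choice(alphabet) for _ in range(length))
        # Retry until every character class is represented.
        if (any(c.islower() for c in candidate)
                and any(c.isupper() for c in candidate)
                and any(c.isdigit() for c in candidate)
                and any(c in string.punctuation for c in candidate)):
            return candidate

passcode = generate_passcode(20)
```

`secrets` draws from the operating system's cryptographic random source, which is what distinguishes a generator like this from one based on the predictable `random` module.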

We are starting soon, and we won’t mind you stalking our social media for constant and swift updates.

Twitter — https://twitter.com/GetSafle

LinkedIn — https://www.linkedin.com/company/get-safle

Facebook — https://www.facebook.com/getsafleofficial

Instagram- https://www.instagram.com/getsafle/


Dark Matter Labs

Reconciliation with nature starts in our yards

Building a platform to facilitate socio-ecological transitions and community-based climate action in private yards

Summary

Photo: Rémi Müller, Unsplash

Lawns are a biodiversity desert, yet they are still the default landscaping approach across Canada. In the Montreal region alone, lawns cover a surface 317 times the size of Mount Royal Park. In the United States, lawn is the largest irrigated crop. Transforming the culture of perfectly manicured lawns therefore represents an enormous opportunity for collective climate action.

One of the most devastating causes and impacts of climate change is the global decline in biodiversity. Beyond the development of public policies, legislative strategies and financial incentives, it is essential to increase the capacity for on-the-ground action among individuals and communities through the development of ecological gardening on residential yards. This strategy aims to create spaces that can enrich and support biodiversity and, by doing so, transform our relationship with Nature right at home. This effort involves helping people see that the land they own is part of a larger ecosystem that requires nourishment and care; the land they own is a common responsibility.

To achieve this ambitious goal, there is a critical need to develop tools that bring together all the available knowledge about ecological gardening in order to inspire and help residents take action; to legitimize the adoption of this new residential landscape aesthetics of biodiversity by creating a community of interest and practice; and to document and measure the socio-environmental benefits of these local actions at different scales in order to sustain the movement and provide municipal authorities with necessary tools and information to incorporate biodiversity and climate mitigation into landscape planning decisions.

Pilot project, summer 2020 — Vivant & Nouveaux voisins

Nouveaux voisins / New Neighbors aims precisely at developing a platform bringing together different functions (pedagogical, actionable, evaluational, financial, regulatory, etc.) to support and accelerate this socio-ecological transition through the aggregation of residential and commercial yards. It is about establishing the foundations of a movement that proposes a new form of cohabitation between Nature and us, one yard at a time, across urban, suburban and rural divides. www.nouveauxvoisins.org is meant to be a platform for informing communities, transforming grass into habitats rich in biodiversity, and mapping and measuring the impacts of a movement already gaining momentum.

This blog summarizes the case studies and the paper prototype of a platform developed thanks to the generous contribution of The McConnell Foundation as part of the Transition Catalyst Fund, and was done in collaboration with Nouveaux voisins, Dark Matter Labs and S. Karthik Mukkavilli.

Context

While the impacts of climate change are increasingly disrupting our lives, the potential climate strategies that individuals and communities can deploy often remain abstract.

Beyond the actions themselves, the impacts of these practices on climate change at the local and global scales has also been unclear.

In this context, it is essential to support, document and measure the contribution of citizen actions in order to make them visible and encourage public authorities to support those initiatives.

How could a platform be designed to coordinate citizen engagement and scientific knowledge in order to inspire collective action? What functions and structures would such a platform need?

Case studies

The first step in this research was to identify certain rewilding initiatives that could inspire the development of the Nouveaux Voisins platform — and inform the relationship between citizen science and remote sensing tools.

These case studies were selected to represent different dimensions of the Nouveaux Voisins project (e.g., type of ecological actions, citizen involvement, use of technological tools, etc.).

An initial search for international initiatives yielded 8 selections, which a first analysis then classified into 4 main categories.

These case studies have highlighted certain key lessons:

Opportunities

Economies of care: While access to native plants and ecological gardening services has been identified as a barrier, encouraging a demand for rewilding resources can stimulate a respective shift in supply. This coordinated move towards a local market for native plants and ecological landscaping services can be the beginning of an economy of care and conservation.

Facilitating civic contracts between citizens, non-humans, and governments: Assembling small-scale citizen observations of local habitat at the landscape level can help gauge the impact of biodiversity on ecological services and public health. This insight can drive government and community programs that circle back to the needs of non-human beings (prioritizing habitat corridors) and citizens (funding for gardening projects, equal access to nature).

Fostering agency: Clarifying the links between small-scale practices and larger scale impacts can help reframe individual action as collective effort.

Challenges

Integrating equity and access: Ensuring environmental surveying is accessible to various publics and providing multiple channels for citizens to voice their experiences to their governments and communities. This challenge may call for citizen science toolkits, direct communication pathways between citizens and municipalities, community-building channels, and public events that value the stewardship of citizens unable to take part in gardening at home.

Respecting local specificity while pursuing common goals: Building in flexible strategies that respond to various ecological regions, community contexts, and individual constraints while navigating commitments and lessons at larger scales.

Deployment at scale: Assembling collective habitat from small-scale transformations and decentralized citizen observations depends on the engagement of many. Nevertheless, these case studies have highlighted the possibility for surprising collaborations to pursue overlapping goals such as public health and education.

Based on these lessons, how can we build a platform to facilitate socio-ecological transitions and community-based climate action in private yards?

From what we learned from the case studies and the Service Blueprint, we decided to organize the platform design around five categories of actions:

Informing
Transforming
Mapping & Measuring
Modeling & Visualizing
Coordinating and Influencing

Nouveaux voisins is intended to be a platform that consolidates various features; it provides information, assists in the concrete transformation of grassed areas, maps and measures the impacts of collective actions, and helps visualize possible futures.

Moreover, all the data collected will help mobilize communities wishing to promote biodiversity by helping them coordinate and influence different actors in their environment.

See below some early sketches of what it could look like:

Next steps

Currently, Nouveaux voisins is continuing its on-the-ground pilot projects with Vivant, while Dark Matter Labs continues to develop a full suite of deep code innovations in support of nature-based solutions with Trees as Infrastructure and other initiatives.

For this specific platform project, we are currently defining the minimum viable prototype and are looking for collaborators and investors around the world to build a first version of the platform.

If you have any interest or recommendations, please reach out!

Get in touch

Emile Forest
Dark Matter Labs + Nouveaux voisins
emile@darkmatterlabs.org

Jonathan Lapalme
Dark Matter Labs + Nouveaux voisins
jonathan@darkmatterlabs.org

Credits

Jonathan Lapalme — Co-Project Manager and strategist
Dark Matter Labs + Nouveaux voisins
jonathan@darkmatterlabs.org

Emile Forest — Co-Project Manager and researcher
Dark Matter Labs + Nouveaux voisins
emile@darkmatterlabs.org

S. Karthik Mukkavilli — Lead AI and remote sensing researcher

Marie-Ellen Houde-Hostland — Lead researcher
McGill University

Philippe Asselin — Biodiversity and landscape architecture advisor
Vivant

Aaron Gillett and Gurden Batra — Design and programming advisor
Dark Matter Labs

Other collaborators on Nouveaux voisins, besides Dark Matter Labs, include: Urban Landscape Ecology Lab of Concordia University, David Suzuki Foundation, Enclume, Vivant, MIS Maison de l’innovation sociale and many others

Reconciliation with nature starts in our yards was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.


Infocert

On June 15th join InfoCert at the eIDAS Summit

On June 15th, InfoCert, the largest QTSP in Europe, will take part in the international conference on the eIDAS Regulation hosted by Bitkom.

Carmine Auletta – our Chief Innovation and Strategy Officer – will present InfoCert's perspective on the future of digital identity at 12:10. He will then hand over to Andreas Plies – CEO and Founder of Authada – who will present the digital signature solution for the German market developed together with InfoCert.

Secure, fast, and efficient: improve the digitalization of your business processes with the eIDAS tools Register now

Remote transaction management, contract management and online identification are constantly increasing, making the digitization of operational and business processes more important than ever. Proving one's identity is a necessity not only in the physical world but also in a growing number of digital contexts. From banking to government to health services, the number of processes that are moving online and require a secure and safe digital identification solution is rapidly increasing.

The digitalization of processes can greatly simplify daily activities and offers countless benefits for businesses and citizens. However, it is important to maintain high standards of privacy and security to enable a sustainable and trust-based digital future. Therefore, digitalization and security must go hand-in-hand, and the EU eIDAS regulation makes this possible.

The eIDAS Summit is Germany′s leading conference on the practical application of digital trust and identity in business. On the 15th of June 2021, you will have the opportunity to listen to influential speakers and to network digitally with eIDAS experts, decision-makers, and solution architects. Experience best-practice presentations, interactive workshops, and exciting keynotes.

“Our digital identity is the very foundation of our existence in the digital world. Europe is envisioning a significant evolution of its citizens’ digital identity, moving from a patchwork of different national identity schemes to a standard scheme based on the Self-Sovereign Identity and Zero-Knowledge-Proof paradigms.”


Carmine Auletta – InfoCert Chief Innovation and Strategy Officer

Get inspired at #eidas21 and advance your business with new approaches to digitalization.

Register now

The post On June 15th join InfoCert at the eIDAS Summit appeared first on InfoCert.


Aergo

AERGO incubated project CCCV creates member vaccine verification

CCCV, a product of Blocko XYZ, is a project incubated by the AERGO foundation that anchors member data on the AERGO public mainnet.

Blocko XYZ has announced that from June 11th, anyone who has their COVID-19 vaccination status verified through the social DID service CCCV, will be able to receive a real badge and sticker that displays their status to others.

Starting in July, the Korean government is waiving the outdoor mask mandate for those who are vaccinated. Once this policy is implemented, it will be difficult to discern who has truly been vaccinated. As the government has only issued confirmation stickers for vaccinated people above the age of 65, there needs to be a method of distinguishing vaccinated people in younger age groups.

Although apps for digital vaccination verification have been developed and other similar services are being announced, there is continuing controversy over the issues of security and standardization, and all responsibility is being placed on the individual. Self-employed business owners who manage small scale businesses, such as health and fitness managers, pilates teachers and golf coaches, are particularly in urgent need of a way to distinguish unvaccinated clients who falsely claim to be vaccinated for the sake of remaining unmasked.

In an effort to solve this problem, Blocko XYZ has announced that it will award free stickers and badges to people who verify their vaccination status by signing up to social verification service CCCV.

Users can request a badge by clicking on the digital vaccination badge under the CCCV verification tab and uploading evidence of their vaccination status. The requested badge will be granted within 24 hours after the authenticity of the uploaded document has been confirmed by an administrator. After this badge has been granted, users will be able to request concrete vaccination confirmation stickers and badges by simply filling out the required information.

The digital verification badge can be selected as a top badge and shared with others in the form of a link. Users will be able to easily announce their vaccination status to their family, acquaintances and clients, and influencers may in this way encourage many others to also get vaccinated.

Kim Kyung-hoon, CEO of Blocko XYZ, announced, “We would like to actively offer our support in finding a solution to the various social problems that may arise following the loosening of the mask policy in July. As a social content business card service, CCCV is used by many self-employed people, from models and coaches to freelancers in various fields, for the sake of verifying their qualifications. We hurried to create the vaccination badge in response to the demand of users like this who required vaccination confirmation due to the frequent interaction with clients. However, consumers have complained that online proof is simply not convenient to apply to offline businesses. Since the Korea Centers for Disease Control and Prevention’s vaccination APIs are not open, services are only being provided manually. So, we decided to produce a real badge that could be used offline.”

Blocko XYZ’s CCCV, which has introduced the vaccination confirmation badge, was launched in November 2020. It is a new online content business card service that combines a badge system for expressing one’s academic background, assets and various abilities, with a link sharing service. Since its launch, within just 6 months, CCCV has gained over 2 million subscribers and offers a variety of blockchain-based services including DID and NFT.

AERGO incubated project CCCV creates member vaccine verification was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.


Okta

Implement Introspection Flow With Kong Konnect and Okta

In our third Kong and Okta tutorial, we’ll go through the introspection flow implementation. This series will show you how to implement service authentication and authorization for Kong Konnect and Okta using the OpenID Connect (OIDC) plugin. Parts 1, 2 and 4 cover:

Implement Client Credentials with Kong Konnect and Okta
Implement Introspection Flow With Kong Konnect and Okta
Access control based on Okta’s groups and planes (coming soon)

Table of Contents

Konnect and Okta Integration Topology
Introspection Flow
Set Up the Okta Application
Apply the OpenID Connect Plugin
Test the Introspection Flow With Insomnia
Deactivate the Okta Application
Protect Your Applications with Kong Konnect and Okta

You can also watch this tutorial as a screencast.

Konnect and Okta Integration Topology

In this example, I’m using the Konnect control plane to create new APIs and policies and publish them to my data plane running as a Docker container in an AWS EC2 instance.

Introspection Flow

The introspection flow is part of the token validation process. Kong Gateway evaluates the injected token at request processing time to check that it is still valid for the upstream services. The evaluation hits a specific Okta endpoint, passing the received token. Based on Okta's response, Kong Gateway accepts or rejects the request.

For production environments, the OIDC plugin provides caching capabilities for the Okta responses. However, for this tutorial, I’m going to disable caching to better view the flow.
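Under the hood, introspection is a form-encoded POST to the authorization server's introspection endpoint, standardized in RFC 7662. A rough sketch of the two halves of the flow — querying Okta, then making the gateway-style accept/reject decision — might look like this (the endpoint URL and function names are illustrative, not Kong's actual implementation):

```python
import json
import time
import urllib.parse
import urllib.request

def introspect(endpoint, token, client_id, client_secret):
    """POST the token to the introspection endpoint (RFC 7662) and return the JSON answer."""
    data = urllib.parse.urlencode({
        "token": token,
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    with urllib.request.urlopen(urllib.request.Request(endpoint, data=data)) as resp:
        return json.load(resp)

def is_token_acceptable(introspection, now=None):
    """Gateway-style decision: reject unless the server says the token is active and unexpired."""
    if not introspection.get("active", False):
        return False
    exp = introspection.get("exp")  # optional expiry, seconds since the epoch
    return exp is None or (now if now is not None else time.time()) < exp
```

Note that the only field RFC 7662 requires in the response is `active`; everything else (`exp`, `scope`, `sub`, …) is optional, which is why the decision function treats a missing `exp` as "no expiry information".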

Set Up the Okta Application

Regarding Okta's settings, I'm going to use the same client credentials application I created before, with its client ID and client secret. However, my OIDC plugin has to be set with specific parameters to implement introspection.

Apply the OpenID Connect Plugin

In the Konnect ServiceHub, I have an IntrospectionRoute OIDC plugin enabled.

The settings should be:

Config.Auth Methods

Config.Issuer

Config.Introspect Jwt Tokens

Config.Introspection Endpoint with a specific endpoint provided by Okta to implement introspection
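In Kong's declarative configuration, those four settings correspond roughly to the snippet below. This is a hypothetical sketch: the parameter names follow Kong's OpenID Connect plugin, but the issuer and introspection URLs are placeholders for your own Okta org.

```yaml
plugins:
  - name: openid-connect
    config:
      auth_methods:
        - introspection
      issuer: https://YOUR_OKTA_DOMAIN/oauth2/default
      introspect_jwt_tokens: true
      introspection_endpoint: https://YOUR_OKTA_DOMAIN/oauth2/default/v1/introspect
```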

Test the Introspection Flow With Insomnia

To better view the flow, I will use Insomnia, Kong’s API spec editor, to send requests to both Okta and Konnect. Below are my two requests.

The first one goes to Okta, passing the expected parameters to authenticate and receive a token.

For the second one to consume the route, I’m using a specific Insomnia capability called Request Chaining. With this, I’ll be able to extract values from the response of a given request to build new ones. In my case, I’m pulling the access token from Okta’s response to make the other request and then send it to Konnect.
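Outside Insomnia, the same chaining can be scripted: pull the access token out of the first response and feed it into the second request's Authorization header. A minimal sketch (the helper name is mine; the field names follow the standard OAuth 2.0 token response):

```python
def chain_token(token_response):
    """Build the follow-up request's headers from an OAuth 2.0 token response."""
    access_token = token_response["access_token"]  # standard OAuth 2.0 field
    return {"Authorization": "Bearer {}".format(access_token)}
```

For example, `chain_token({"access_token": "abc123", "token_type": "Bearer"})` yields `{"Authorization": "Bearer abc123"}`, which is exactly what the second request to Konnect needs.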

Next, let’s send a request to Okta to get our token. There it is.

This time, we can see that Kong’s request is ready to be sent since we got Okta’s token injected inside of it.

And here’s the Konnect response:

It’s important to note that Konnect is validating the token behind the scenes. Here’s one EC2 terminal where my data plane is running. Since I disabled introspection caching for the OIDC plugin, Konnect hits Okta for each request to validate the token.

Deactivate the Okta Application

Another way to see introspection is by deactivating the Okta application. All tokens related to it will be considered invalid and, as a consequence, will not be accepted by Kong again.

Let’s get back to Okta’s application and deactivate it. We should get a 401 error code from Kong.

Protect Your Applications with Kong Konnect and Okta

Start a free trial, or contact Kong if you have any questions as you’re getting set up.

Once you’ve set up Konnect and Okta introspection flow, you may find these other tutorials helpful:

Automating Your Developer Pipeline With APIOps (DevOps + GitOps)
Service Design Guidelines Part 2: API Versioning
“Gateway Mode” in Kuma and Kong Mesh
Use Kong Gateway to Centralize Authentication

If you have any questions about this post, please leave a comment below. To be notified when the OktaDev team posts new content, please follow @oktadev on Twitter, like us on LinkedIn, or subscribe to our YouTube channel.


Aergo

CRISPY the world!!

1. Introduction

CRISPY the world!!

This is CRISPY WHALES, the second company selected for the AERGO incubation fund program. With AERGO's support, we are building the online platform Banana Clips.

Banana Clips is a platform that lets users buy and sell short clips. It allows users to convert trendy video clips into NFTs and protects the copyrights of those clips through patented technologies. Users on Banana Clips can easily purchase a variety of short clips without any copyright concerns, and can trade their video clips and NFT assets. Please look forward to Banana Clips, a new paradigm for the short clip platform.

2. Ongoing Banana Clips

CRISPY WHALES is accelerating its development with the support of AERGO to open the Banana Clips service in July.

Not long before this report was written, new members with more than 10 years of experience joined us one after another to build and share the future of Banana Clips.

Among them, designer Jiwon Huh, who is leading the brand and UI/UX of our service, has over 10 years of experience as Chief Design Officer (CDO) at various companies and tremendous potential.

Jiwon Huh

ex-CDO of 3billion
ex-CDO of Casual Steps
ex-Design Director of Klleon

2–1. Creative

CRISPY WHALES is currently working hard to develop Banana Clips. In addition to the technical part, we are putting a lot of effort into developing designs that users can directly feel.

PANTONE LLC, a global company that leads colour trends across industries, selected Ultimate Gray and Illuminating, a shade of yellow, as its 2021 colours of the year; they carry warm and positive messages for a humanity wearied by COVID-19.

Illuminating is Banana Clips's primary colour, which naturally weaves the 2021 trend into our service. However, unlike other companies that use yellow, we also applied Baby Green, reminiscent of unripe bananas, to further sharpen Banana Clips' identity and express the process of a short clip ripening into a well-ripened banana.

Banana Yellow

This colour conveys pure and warm affection, sincere participation and ‘ego’.

We chose a persona who shares fun and amusing things with friends and blends in harmoniously from morning to night.

Baby Green

This colour represents something young and cute, linking uniqueness to fresh-looking baby bananas.

It embodies the future of short clips, which can become anything, with colours that hold infinite possibilities.

Banana Clips’ colors according to basic behaviors

2–2. Growth

CRISPY WHALES is also making great efforts externally so that Banana Clips can bring a new wave to the content business.

We have signed an MOU establishing a cooperative relationship between Banana Clips and Videocon (videocon.co.kr), a video contest platform provided by Slate Media. Starting with this, we are preparing similar agreements with various content producers and distribution channels. In addition, we completed a survey of popular works in the major NFT and short clip markets, and learned what the main target community of video producers and editors wants. Based on these data, we will fill the service with various trending videos before launch.

2–3. Tech

CRISPY WHALES is a technical partner of AERGO and maintains some of the AERGO modules. Many aspects have not yet been reflected in the AERGO source repositories because technical reviews are still in progress, but new features that suit the trend will be released one after another soon.

In addition, the technical review of the recently announced ARC2 has been completed, and actual use is planned for the service currently under development. This smart contract will mainly be used for video transactions through our service, but we are also carrying out various usability experiments, so we look forward to introducing other use cases soon.

Finally, all videos uploaded to our service will be uploaded to the AERGO network after various information is extracted by our internal system. Currently, we are developing a video extraction system and search platform, but a new smart contract for this will be announced and distributed in time for the service opening in July.

TODO

CRISPY WHALES is preparing several marketing plans to acquire early users and various contents for the Banana Clips launch in July. We also plan to launch a global service and a mobile app. With Banana Clips, you will always have access to videos featuring local cultures from around the world. In addition, we will actively use the SNS channels favoured by each generation to share information about video creation, editing and NFTs, so that community participants can adopt the service naturally as soon as Banana Clips launches.

In addition to ARC2, the development team of CRISPY WHALES is putting plans for further smart contracts and a layer 2 on the roadmap, to safely store NFTs, trading information and copyright data on the AERGO blockchain network. We hope to share extremely thrilling news with our community, such as a new business model that increases the use of AERGO tokens in trading, but we ask for your understanding that we are still cautious about revealing too much.

CRISPY WHALES is doing its best to meet the AERGO community as soon as possible. In a month or so, Banana Clips will be coming to you. Thanks to everyone who has waited, and we hope you will keep watching with interest.

CRISPY the world!! was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 10. June 2021

Secure Key

SecureKey’s New Ledger-Agnostic Solution, Orb, Helps Solve Decentralized Identifier Challenges

The post SecureKey’s New Ledger-Agnostic Solution, Orb, Helps Solve Decentralized Identifier Challenges appeared first on SecureKey Technologies Inc..

IBM Blockchain

Reinforcing IBM’s commitment to open source Hyperledger Fabric

It’s been an exciting year for blockchain development. From privacy-preserving digital health credentials to new, digitized supply chain infrastructure — some of the most interesting technology developments of late have drawn on the benefits of enterprise blockchain and distributed ledgers. That’s why today we are doubling down on our contributions to enterprise blockchain and Hyperledger, […]

The post Reinforcing IBM’s commitment to open source Hyperledger Fabric appeared first on Blockchain Pulse: IBM Blockchain Blog.


Magic Labs

Async Art is bringing NFTs to life with a little help from Magic

There’s nothing new about direct collaboration between artist and viewer. Participatory art has existed for decades, dating back to the Italian Futurists of the early 20th-century and resurfacing in various forms ever since. But what happens when that collaboration moves from a gallery or theatre on to the blockchain?

The answer is, well, just about anything.

Async Art is pioneering a new kind of digital art crafted as a collection of live, editable “Layers”. Artists can maintain control of those Layers, sell ownership of individual Layers to collectors, or program the Layers to adapt programmatically based on anything from time of day to fluctuating stock market prices. What results is a fascinating realm of art that morphs and evolves, reacting to its community and the world at large.

I sat down with Lisa Liang, Co-founder of Async Art, to learn more about how Magic helps make it all possible.

Desert Clockwork, Cyrus James Khan, 4-state autonomous NFT, 2021

Appealing to a wider audience

When Async launched, new users were required to log in using a crypto wallet. This seemed reasonable for a platform built on the Ethereum blockchain, but user onboarding quickly became a bottleneck for the company’s growth. As Async began branching out beyond the crypto art community, it was clear that using crypto wallets like MetaMask was a serious challenge for many artists.

“Before Magic, only a very small percentage of new artists actually ended up minting a piece.”

According to Lisa, “A lot of traditional artists aren’t even on Twitter. The concept of using MetaMask or buying ETH — these are huge user onboarding hurdles. Magic makes it a lot easier.”

Like any NFT platform that relies on wallets, the Async team regularly spent hours walking new artists and potential buyers through the wallet setup process. This meant that the startup production team had less time to spend on developing important new features, and only a certain percentage of artists ultimately minted and listed artwork. Something had to change.

Onboarding built for everyone

To appeal to a wider audience, the Async team knew they needed an onboarding flow that didn’t rely solely on a wallet integration. On the flip side, they had an existing user base of NFT collectors who were passionate about crypto and decentralization. Magic provided Async with the intuitive user experience they needed while maintaining the security and integrity that are table stakes for a Web3 application.

“Having Magic there eliminates 4 or 5 steps from our onboarding funnel.”

Async’s top priority is enabling groundbreaking new forms of art, not necessarily exposing users to the tech stack that makes it possible. Lisa said it best: “Our vision for Async Art is to make this accessible for everyone. This is not just about NFTs, this is about utilizing digital technology to create artwork that could never have existed before.”

With Magic, new users simply enter their email address, click a magic link, and then dive right into the bold new world of programmable art. This streamlined onboarding flow translated to a smoother, faster creation process for artists. After implementing Magic links along with other additional features, Async more than doubled the number of artworks created on their platform.

Focusing on fundamentals

Considering Async has invented an entirely new medium of art, newcomers already have quite a bit to wrap their heads around. Magic authentication means login is one less thing for new users to worry about.

Almost 40% of Async’s new users are now onboarded with Magic

The Async team has seen a growing number of new users opt for passwordless login since implementing Magic just a few months ago. As of May 2021, about 39% of all new users opt for magic link login instead of using a wallet. For Lisa and the rest of the team, this has been immediate validation that choosing Magic has had a positive impact on Async’s growth.

“Magic makes everyone’s lives easier.”

In addition to providing a smoother, more accessible onboarding experience for users, Magic has helped the Async team get back valuable time. Lisa told me login-related support requests are now practically nonexistent. That means the Async team can stop worrying about wallet creation and instead focus on trailblazing the future of digital art with new offerings like Async Music.

Whatever comes next, I can honestly say I’m thrilled to see where the journey takes them.

Async Art is bringing NFTs to life with a little help from Magic was originally published in Magic on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

A First End to the Schrems II Limbo

by Matthias Reinwarth

Writing about legal topics is always a challenge. I am not a lawyer, but as an analyst and advisor I almost inevitably encounter the implications of laws and current case law. Thus, this text describes only a personal assessment and is not intended to be, and should not be used as, legal advice (and in any case KuppingerCole Analysts do not provide legal advice). At the end of the day, any action should be clarified with colleagues properly qualified to do so, including the legal team.

The ruling of the European Court of Justice (ECJ), which has gained some notoriety under the name "Schrems II", has caused great uncertainty among many organizations. The finding that the data protection standard as currently practiced in the United States of America must be considered non-compliant with the requirements of the European Union and the General Data Protection Regulation has had a direct and significant impact on assessments of the processing of personal data abroad, especially, of course, in the USA. The mechanism previously known as "Privacy Shield" was rendered void overnight. This particularly impacts the use of cloud services, which had grown significantly over the past year and a half, not least due to the pandemic-induced migration to digital services.

The resulting state of limbo and lack of legal certainty required a comprehensive solution. This is now available in the form of the final version of the "standard contractual clauses for the transfer of personal data to third countries" (published on June 4, 2021). This document, which was published by the European Commission, and the measures it describes are intended to make it possible to continue transferring personal data to countries outside of the EU and to process it there.

More than just paper – tangible supplementary measures are needed.

However, merely amending the contractual basis is not enough: tangible technical and organizational measures that ensure the minimization, pseudonymization (as defined by the GDPR) and encryption of transferred data must be developed and implemented precisely as described. This not only puts the onus on the providers of such Internet services (and their supply chain, end-to-end) to support these measures, but also forces the users of cloud services to practice data hygiene and reduce the volume of data, which can only be beneficial in terms of improved data protection. The SCCs (standard contractual clauses) will only be considered effective provided these measures are in place.
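To make one of those measures concrete: pseudonymization before transfer can be as simple as replacing direct identifiers with keyed pseudonyms, with the key remaining with the EU data exporter. This is a minimal sketch under my own assumptions, not a compliance recipe or legal advice:

```python
# Keyed pseudonymization sketch: direct identifiers are replaced with
# HMAC-SHA256 pseudonyms before the record leaves the EU. The key stays
# with the EU data exporter, so the recipient cannot reverse the mapping.
import hashlib
import hmac


def pseudonymize(record, key, fields=("name", "email")):
    """Return a copy of `record` with the named fields pseudonymized."""
    out = dict(record)
    for field in fields:
        if field in out:
            out[field] = hmac.new(key, out[field].encode(),
                                  hashlib.sha256).hexdigest()[:16]
    return out
```

Because the same input always yields the same pseudonym, records remain linkable for processing abroad while the identifiers themselves never leave the exporter's control.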

Call for action

Considering this development as a layman in legal matters, I believe this document and the updated SCCs represent an important first step, even if the actual problem (the root cause), namely the possibility of American authorities viewing personal data on the systems of American providers, is not solved. Consideration must now be given to each individual case where personal data is transferred outside of the EU, in order to select and fully implement the correct tools for encryption, pseudonymization and minimization of data at rest, in transit and during processing. While this will involve considerable effort, it is certainly preferable to the previous legal uncertainty.

This is where we as analysts and advisors come into play. Concepts, technologies, and products that can help implement appropriate technical and organizational measures are evolving quickly. Information Protection and Secure Information Sharing will become increasingly important. These include, for example, Azure Information Protection or advanced tools that enable the transparent encryption and use of critical information, including personal data. Sensitive data discovery, data cataloging and encryption concepts will play a significant role in light of this development. This holds true for structured data and unstructured data alike. Data governance will become a key area to exercise appropriate control over data of varying sensitivity. Data catalogs and data management will facilitate transparency regarding the nature, location and flow of data.

Finally, the requirement for organizational and technical measures to protect transferred data will also have a significant impact on the spread and use of more sophisticated, emerging technologies for encrypting data during use. For example, providers of technologies for homomorphic encryption can make a substantial contribution to the definition and implementation of appropriate measures.
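To illustrate what "encrypting data during use" means: in an additively homomorphic scheme such as Paillier, anyone can combine ciphertexts so that decryption yields the sum of the plaintexts, without ever seeing the inputs. A toy sketch with insecure textbook parameters, for illustration only:

```python
# Textbook Paillier (toy primes -- NOT secure, illustration only).
import math
import random


def keygen(p, q):
    """Return (public, private) keys for toy primes p, q."""
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    # With g = n + 1, L(g^lam mod n^2) = lam mod n, so mu = lam^-1 mod n.
    return (n, n + 1), (lam, pow(lam, -1, n))


def encrypt(pub, m):
    """Encrypt m < n as c = g^m * r^n mod n^2 with random r coprime to n."""
    n, g = pub
    n2 = n * n
    r = random.choice([x for x in range(2, n) if math.gcd(x, n) == 1])
    return (pow(g, m, n2) * pow(r, n, n2)) % n2


def decrypt(pub, priv, c):
    """Recover m = L(c^lam mod n^2) * mu mod n, where L(x) = (x-1)/n."""
    n, _ = pub
    lam, mu = priv
    L = (pow(c, lam, n * n) - 1) // n
    return (L * mu) % n
```

The homomorphic property is that multiplying two ciphertexts modulo n squared produces an encryption of the sum of the two plaintexts, which is exactly the kind of computation-on-encrypted-data that such providers offer at production scale.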

KuppingerCole Analysts research – including our market compasses and leadership compasses – builds the foundation for an in-depth view of those markets and technologies. Our advisors will support you in defining and architecting real-life concepts and technologies to underpin the standard contractual clauses (SCCs) with tangible controls.


Ocean Protocol

OceanDAO: Round 6 Results

OceanDAO Grants

Hello, Ocean Community!

For those who are new, OceanDAO is a community-curated funding system directed towards projects building the Ocean Protocol ecosystem.

The Ocean ecosystem becomes self-sustainable as the builders of the Web3 data economy leverage Ocean Protocol to create products, services, and resources that the community finds valuable.

Grant Funding Categories:

- Building or improving applications or integrations to Ocean
- Community or developer outreach (grants don't need to be technical in nature)
- Unleashing data
- Building and/or improving core Ocean software
- Improvements to OceanDAO itself

For up-to-date information on getting started with OceanDAO, we invite you to get involved and learn more about Ocean’s community-curated funding on the OceanDAO website.

The goal is to grow the DAO each round. We encourage the Ocean ecosystem to apply or re-apply AND to vote! Thank you to all of the participants, voters, and proposers.

OceanDAO Round 6 Results

Round 6 Rules

Proposals with 50% or more "Yes" votes received a grant, in descending order of votes received, until the "Total Round Funding Available" was depleted.

35% of the "Total Round Funding Available" was earmarked for New Projects. Earmarked proposals were eligible for the entire "Total Round Funding Available"; returning (general) grants were eligible for 65%.

The grant proposals from the snapshot ballot that met these criteria were selected to receive their $OCEAN Amount Requested to foster positive value creation for the overall Ocean ecosystem.
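Setting aside the earmark split, the core allocation rule above can be sketched as follows. This is my own simplified reading of the published rules, not OceanDAO's actual implementation:

```python
def allocate(proposals, total_funding):
    """Fund proposals with >= 50% Yes votes, in descending Yes-vote order,
    until the round's funding is depleted (the last grant may be partial).

    Each proposal is a dict with "name", "yes", "no", and "requested" keys.
    """
    funded, remaining = {}, total_funding
    passing = [p for p in proposals
               if p["yes"] + p["no"] > 0
               and p["yes"] / (p["yes"] + p["no"]) >= 0.5]
    for p in sorted(passing, key=lambda p: p["yes"], reverse=True):
        if remaining <= 0:
            break
        grant = min(p["requested"], remaining)
        funded[p["name"]] = grant
        remaining -= grant
    return funded
```

This also explains the partial grant noted below: a proposal late in the vote ordering receives only whatever funding remains in the round.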

Voting opened on June 3rd at midnight GMT
Voting closed on June 7th at 12:00 GMT

Proposal Vote Results:

- 14 proposals submitted
- 13 funded or partially funded
- 67 unique wallets voted
- 259 voters across all proposals (the same wallet can vote on multiple proposals)
- 244 total Yes votes
- 15 total No votes
- 10,843,578.84 $OCEAN voted Yes on proposals
- 332,109.68 $OCEAN voted No on proposals
- 11,175,688.51 $OCEAN tokens voted across all proposals

Recipients

Congratulations to the grant recipients! These projects have received an OceanDAO grant in the form of $OCEAN tokens.

See all the expanded proposal details on the Round 6 Ocean Port Forum!

If your proposal was voted to receive a grant and you haven't already done so, please submit a Request Invoice to the Ocean Protocol Foundation (OPF) for the Ocean Granted amount.

Please note:

- Solipay received their full Ocean Requested amount of 27,200 (split between the New category and general).
- Longtail Financial had more than 50% No votes and therefore did not receive a grant.
- Fair Data: due to the total available funds remaining, they were granted only 1,710.00 OCEAN of the 20,000 OCEAN requested. They will adjust their plan to focus on website and brand development for the next month, to move the project along for the next round.
- All $OCEAN was granted, so there is no burn this round.

OceanDAO Ecosystem

Continue to support and track progress on all of the Grant Recipients here!

Much more to come — join our Town Halls to stay up to date and see you in Round 7. Thank you!

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

OceanDAO: Round 6 Results was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


SelfKey

Buy KEY Using the Moonpay Integrated SelfKey Wallet


This article will guide you through using the MoonPay-integrated SelfKey Wallet to buy KEY tokens.

The post Buy KEY Using the Moonpay Integrated SelfKey Wallet appeared first on SelfKey.


Coinfirm

World Economic Forum DeFi Policy Maker Toolkit

The World Economic Forum (WEF) has published a whitepaper on DeFi (Decentralized Finance) for policymakers this week. With DeFi being a gray area in the crypto-asset industry with few clear regulations, a policy paper from the WEF is welcome as a guideline for regulators to follow and for DApp creators to understand what shapes compliance...

Bitcoin Taproot Coming Soon

Coinfirm is closely watching the recently proposed changes on the Bitcoin network, which are aimed at the activation of Taproot. The chances that activation will take place in the coming days are very high, which is why Coinfirm is constantly monitoring the number of blocks mined with positive signaling of activation of Taproot by...

Aergo

Pikkle x AERGO


Pikkle is the first blockchain-based voting and global viewer participation service application built on the AERGO platform. Users can access voting as well as attendance results in real time without needing to understand blockchain technology. Pikkle will be used by the Korean Broadcasters Association and is developed by Blocko, AERGO's technology partner.

What problems is Pikkle trying to resolve with the traditional voting system?

1. Cost inefficiency

In general, the voting process is complicated and incurs substantial costs in voter targeting, identity verification, ballot counting, and the tables and checklists of the vote-opening process.

2. Trust in voting result

There have been trust issues with the electoral system for a long time. Many people do not trust the results of votes, such as resident votes, broadcast program votes and legislative elections. There is a prevailing belief that views and poll results collected via social network services can easily be manipulated and influenced in the platform provider's own interest.

Main Features:

Voting:

With the Korean Broadcasters Association, Pikkle aims to provide a reliable voting service for all users. Pikkle collects users' views and helps their decision-making process through an app service rather than traditional text voting or spam-like cold calls. Pikkle will solve the existing voting problems by storing key data from the voting process (voter identification, vote validation, voting results, etc.) using blockchain technology.

Many polls and surveys are still conducted by classic methods such as texting or phone calls despite the current rate of mobile phone penetration. Pikkle encourages users to simply scan a QR code to participate in festivals and voting actions. In this manner, we believe it can be a more convenient and fun service.

This method has been attempted by many big tech platforms. However, due to public-interest and platform-dependency issues, it previously struggled to gain social acceptance. Using blockchain technology, Pikkle aims to prevent platforms or stakeholders from interfering in decision-making processes and to promote two-way communication by introducing a fair voting system in many content-based industries.
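The tamper evidence that blockchain storage adds can be illustrated with a minimal hash chain. This is a generic sketch of the underlying technique, not Pikkle's actual implementation: each vote record commits to the hash of the previous record, so any later alteration breaks verification.

```python
# Toy hash-chained vote log: altering any earlier record invalidates the chain.
import hashlib
import json


def append_vote(chain, vote):
    """Append a vote record that commits to the previous record's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"vote": vote, "prev": prev}, sort_keys=True)
    chain.append({"vote": vote, "prev": prev,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})
    return chain


def verify_chain(chain):
    """Recompute every hash; any edited or reordered record is detected."""
    prev = "0" * 64
    for entry in chain:
        body = json.dumps({"vote": entry["vote"], "prev": prev}, sort_keys=True)
        if entry["prev"] != prev or \
           hashlib.sha256(body.encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

A real blockchain adds distributed consensus and replication on top of this, so no single platform operator can rewrite the log, which is the property the voting use case depends on.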

Event Participation:

In addition to participating in various cultural and artistic events, festivals and broadcasting programs, you can use Pikkle in ‘Metaverse’ events. Pikkle diversifies the participation experience by issuing NFT digital tickets built on the blockchain.

Mobile tickets with blockchain NFTs (non-fungible tokens) applied can be used to prove ownership in the offline world. Pikkle even provides features such as joining a queue and ticketing for concerts in the virtual world.

Pikkle’s vision and roadmap:

2021

Q2: Develop Pikkle Service ver 1.0
Q3: Release Pikkle Service ver 1.0 and integrate the system with broadcasting companies and festivals
Q4: Develop Pikkle Service ver 2.0 — Payment system

2022

Q1: Develop Pikkle Service ver 2.0 — issuing tickets on ARC2 NFT (AERGO Non-Fungible Token Contract)
Q2: Develop and release Pikkle Service ver 2.0 — integrate the system with Metaverse

Origins of Pikkle

Pikkle originates from the words ‘Pick’ and ‘People’.

Pikkle's logo was designed with rounded shapes to emphasize friendliness for every user. Purple is Pikkle's primary color, as it reflects credibility and creativity.

Pikkle’s symbol was inspired by ‘P’ from the word ‘Pikkle’ and the shape of a cell phone camera lens that is used to scan QR codes.

Pikkle x AERGO was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 09. June 2021

KuppingerCole

Why Privileged Access Management? Hackers No Longer Break Through Firewalls, They Log In




Why Zero Trust Is Not a Buzzword


You know the saying about the animal that gets driven through the village, a byword for the hype of the day. Zero Trust could once again be that animal.
But Zero Trust does not belong in this category: it is an absolutely necessary security concept at a time when companies face extremely costly attacks such as ransomware. In this talk we therefore examine current scenarios and what it takes to get them under control.




Zero Trust & SASE: Nothing Works Without IAM

Enabling collaboration and new ways of working, supporting IT modernization

IAM (Identity and Access Management) is a core element of any cybersecurity strategy. Context- and risk-based access control and adaptive authentication are core elements of any functioning security strategy. Especially for Zero Trust, with its principle of "don't trust, verify!", a good, modern IAM is essential in order to perform exactly this verification and to control access depending on risk.

In his talk, Martin Kuppinger will address the importance of IAM for Zero Trust strategies as well as for SASE (Secure Access Service Edge), but also for the opportunities to modernize IT as a whole. He will show how a modern IAM can support today's requirements and help create an IT in which collaboration models with partners and customers, as well as new ways of working for employees, are supported flexibly and securely, and which is ready for all variants of deployment models.




Customer Identity Management as the Foundation of Your Innovation


Customer Identity and Access Management (CIAM), the management and control of customer identities, helps companies use customer data securely and in compliance with data protection law, without losing sight of their business.

Learn from Auth0 how to integrate an identity platform simply, quickly and securely, and save valuable time to further develop and innovate your core business.

 

- Features of a modern CIAM solution and the advantages of standard solutions
- The added value of integrating identity management solutions into your business processes
- Customer examples of successful integrations and realized potential
- Gaining time for innovation


Security Challenges of a Multi-Cloud World – Privileged Access & Identities


Many companies today use multiple cloud services, with their end users regularly using dozens or even hundreds of different SaaS applications. This great cloud migration has successfully enabled an expansion of mobile working and is accelerating digital transformation initiatives. A growing number of cloud services, however, also means a growing number of IT security challenges. Beyond the fundamental cloud security aspects, additional complexity and interoperability problems arise from siloed identity stores, native toolsets and conflicts resulting from the shared responsibilities of the cloud providers. All of this creates an expanded attack surface that companies must address.

The identity challenge is the most important security challenge for companies to solve, and it is addressed primarily by standardizing management and security controls across the entire IT ecosystem. Join this session to learn more about:

- The most important cloud security risks
- Where native toolsets leave security gaps that you need to address
- How to implement cloud security best practices with Privileged Access Management (PAM) to significantly reduce the likelihood and extent of security breaches in the cloud


Secure Remote Work: Avoiding IT Risks, Protecting Passwords & Reducing Costs


In the "new normal" and with the rise of remote work, much has changed. At the top of the list: IT security. IT departments face the challenge of meeting increased security requirements while still ensuring a good user experience, including in the home office. The growing threat landscape for companies is, understandably, very often related to bottlenecks in IT. Here are some possible solutions:

- Introducing secure password management company-wide
- Increasing employee productivity while reducing costs
- Best-practice examples of successful identity management


AI to Overcome the Data Problem




IT'S ALL ABOUT ACCESS - Identity & Access Management




Anchoring Real Identities in a Digital World


In a world where you never meet customers or employees in person, it is crucial to anchor real identities to digital ones. Only then can you, as a company, securely grant access online, verify high-risk actions and offer a user experience that delights your customers. In this session, Olli Krebs (VP Central EMEA at Onfido) will explain how document and biometric verification can seamlessly enable trust across the entire identity lifecycle.

 




Zero Trust Use Cases




A Possible Evolution


This talk addresses the possible evolution that security organizations would have to undergo in order to keep pace with the modern world and the requirements that result from it. IAM methods are taken as the starting point, but important aspects of a security program are also played through. Digitalization is seen as the driving force behind these changes. Brief insights are given into modern approaches to IAM, risk management, the organization of security, security architecture and further automation.




Expert Talk with Dominik Achleitner


In this expert talk, Dominik Achleitner and Daniel Holzinger discuss passwords and password habits. They also address current questions, such as how password security can be measured and what the future of passwords looks like in general. The conversation is strongly practice-oriented and involves the participants.




Self-Sovereign Identity – How the Emerging Technology Is Accelerating Digitalization




Secure Identity for Open Cyber-Physical Systems




Interview with Goetz Walecki, Okta




Shyft Network

The Day the Veriscope Secretariat Responded to the FATF

The Day the Veriscope Secretariat Responded to the Financial Action Task Force (FATF)

Veriscope and Shyft Network’s response to FATF Draft Guidance on Virtual Assets and Virtual Asset Service Providers

Last March, the Financial Action Task Force (FATF) issued a public consultation on its draft guidance on a risk-based approach to virtual assets and virtual asset service providers. The FATF's high-level objective is simple: to update its Guidance on the risk-based approach to virtual assets (VAs) and virtual asset service providers (VASPs).

As long-time players in the crypto industry, the decision to participate in the consultation was an easy one. The Guidance is an extremely important piece of international policy for our industry; it sets the pace for all member states of FATF on how to treat and deal with anti-money laundering and terrorist financing issues when it comes to cryptocurrencies and VASPs.

… we support the FATF's overarching objectives of updating its pre-existing Guidance in a manner that maintains a level playing field for VASPs, minimizes opportunities for regulatory arbitrage, and preserves the intended technological neutrality of the FATF Standards. However, in our view, further revisions are required to ensure that the updated guidance achieves these objectives without going beyond the requirements of the FATF Standards or introducing elements that will have undesirable or unintended consequences.
Intro

The consultation presents five different areas of focus for participants to comment on. To keep this post simple, we will only summarize our responses.

In general terms, the Veriscope Secretariat sees several omissions or shortcomings in the proposed guidance. Should these remain in the updated guidance and become policy, the development, adoption and improvement of blockchain technology and cryptocurrencies as a nascent industry will be at risk.

If you want to read the complete document, please click on the following link: Response by the VERISCOPE Secretariat & Shyft Network.

The Response

I. Definition of VASP and VASP activities subject to the FATF Standards

The revised Guidance's expansive approach to the definition of a VASP appears to be inconsistent, and its scope could potentially lead regulators to consider, for example, key signers for Decentralized Autonomous Organizations (DAOs) as VASPs, regardless of their ability, or lack thereof, to participate in the decision-making process and custodial obligations.

We see unintended consequences here: such a rule could drive key signers out of these processes, which can lead to security vulnerabilities at the smart contract level (putting innocent users of these systems at risk), and could encourage novel obfuscation techniques that further eliminate governance methods that play important roles in transparency and functionality.

Here's an example of how this approach can create unnecessary complexity or, worse, roadblocks in the development of solutions for better financial inclusion: key-signing entities in community-run asset pools in DeFi protocols do not have the ability to verify the destination of fund transmissions, and, at times, these signing parties are involved in the blind transmission of assets through autonomous routes. We proposed a full review of the definition of control and of the role of key signers in the transmission process.

In addition, the expanded guidance touches on the role of developers, where the distinction between development companies and open-source software developers (a very common occurrence in our space) is unclear to the point of being blurred.

It should be noted that if this were the case, even public protocols that are being designed today to satisfy FATF guidance and travel rule requirements would themselves be VASPs. This would lead to a massive slowdown in current timelines and in the industry's ability to utilize smart-contract-native systems to enable effective methods of risk mitigation for regulatory purposes.

Our recommendation to the FATF is to redraft the areas that describe “developers” making a clear distinction between development companies and individuals paid on a fee basis. The blockchain space has grown, in many ways, thanks to open-source software and independent contractors; it’s imperative that we defend and promote this practice.

II. Mitigation of money laundering and terrorist financing (ML/TF) risks relating to peer-to-peer transactions

We think that the measures and controls in the draft guidance will not mitigate the ML/TF risks that might emerge when P2P transactions gain widespread acceptance. Most of the proposed measures will place additional obligations on VASPs and other obligated entities while having minimal impact on mitigating the ML/TF risks of P2P transactions from unhosted wallets.

We support the recommendation that countries would consider ways of mitigating ML/TF P2P transaction risks through blockchain analytics. We encourage the FATF to further explore how blockchain analytics and other innovative technological solutions can provide greater visibility over P2P transactions between unhosted wallets.

III. Travel Rule

We believe that further clarity is needed around the FATF’s expectations of VASPs when transacting with unhosted wallets since the recent drafting suggests that the transactions should be treated as higher risk without providing a supporting rationale.

Based on our reliance on VASPs and their role as the entities that verify the largest share of KYC'd users in the space, we proposed that the FATF allow time for travel rule solutions to work directly on unhosted wallet discovery as well as VASP discovery before determining risk profiles and mitigation methods that may inaccurately assume risk and hinder growth.

IV. FATF and Stablecoins

The revised FATF Guidance is generally helpful in confirming the applicability of VASP regulations to stablecoin issuers. However, when it comes to the specific details of comparing stablecoin issuers to other VASPs, particularly for the purpose of conducting an AML/CFT risk assessment, there are several aspects of the Guidance that appear to reflect a misunderstanding of how centrally administered stablecoins function. We believe that conducting an accurate risk assessment of stablecoins is contingent upon having a thorough understanding of how these products currently function, in practice.

Stablecoins have become an essential element of the crypto industry, with billions of US dollars in volume transacted through them. The FATF's understanding of stablecoins' key features is largely limited to their role as a volatility-avoidance instrument. We expanded on this definition, arguing that their true value lies in the speed, reliability, and low cost they bring to cross-border transactions. Their usefulness stems less from issues specific to virtual assets than from the limitations of cross-border banking.

Our position, in general terms, is that the risks of stablecoins and their issuers are analogous to the risks of VAs and other VASPs and that prior guidance was already sufficient to mitigate these potential risks. New recommendations by the FATF necessitate an updated risk assessment of this sector.

V. Effective implementation of FATF Standards

One important element we identified and discussed was the licensing or registration of VASPs in the applicable jurisdiction. We view this requirement, along with the suggestion that local authorities impose conditions on VASPs seeking a license so that they can be supervised, as an unfair adaptation of prudential and market-conduct requirements designed for traditional financial institutions, unfit for purpose in an AML/CFT context for the VASP sector.


Rather than imposing requirements, we recommended that the FATF follows a similar path as it did with money or value transfer services (MVTS), which, like VASPs, may have no physical presence in the country where a transaction is sent or received. In its risk-based approach guidance for MVTS, the FATF encourages competent authorities in the host and home jurisdictions to liaise as appropriate to ensure any ML/TF concerns are adequately addressed. We believe that this would be a more appropriate approach in the VASP context that would ensure a more level playing field among AML/CFT-obliged entities and would reinforce the FATF’s principles of information-sharing and cooperation amongst VASP supervisors.

We also commented on the risk assessment requirement the FATF seeks to implement. To this point, we asserted that the currently available framework is insufficient in helping the industry identify, assess and understand their ML/TF risks. The crypto industry is rapidly evolving and, unlike the traditional financial sector, hasn’t developed effective controls over decades of experience and operations. That said, industry stakeholders are already initiating a risk assessment exercise and welcome the FATF to an open dialogue.

Wrapping up

Regarding effective regulation, we believe the FATF will likely need a fundamentally different approach.

When it comes to decentralized systems and smart contracts that do not, and cannot, centralize the data collection and compliance processes that traditional intermediaries hold, we need to look at new approaches to compliance and KYC verification. This is especially true for Decentralized Finance (DeFi).

Systems are being built today that allow us to decentralize or passport the identities and KYC data sets of users across smart contracts and non-custodial wallets. We believe that the only way to effectively enable compliance in this new realm is to allow data-collecting centralized intermediaries to represent users and act as custodians of their data, while allowing users to passport across decentralized applications. These systems can give us source nexus points for user validation and onboarding, while still allowing those users (represented by the public addresses they use today to move assets) to utilize smart contract applications that rely on the source data stores and the validating onboarding entities. This will be the future of how compliant opt-in systems work across this ecosystem, and it can address many of the largest inherent risks and threats from an AML/CFT perspective.
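A minimal sketch of this "passporting" pattern, assuming a hypothetical registry in which an onboarding entity attests to a public address once and decentralized applications then rely on that attestation instead of collecting user data themselves. None of these names come from a real system.

```python
# Hypothetical sketch of passported KYC: a centralized onboarding entity
# attests to an address once; dApps check the attestation and never see
# the underlying personal data. All names here are illustrative.

class AttestationRegistry:
    def __init__(self):
        self._attestations = {}   # public address -> attesting entity

    def attest(self, address, entity):
        """The onboarding entity records that it has KYC'd this address."""
        self._attestations[address] = entity

    def is_verified(self, address, trusted_entities):
        """A dApp gates access on the attestation alone."""
        return self._attestations.get(address) in trusted_entities

registry = AttestationRegistry()
registry.attest("0xabc", "ExampleVASP")

# A smart-contract application checks the source nexus point, not the user:
print(registry.is_verified("0xabc", {"ExampleVASP"}))   # True
print(registry.is_verified("0xdef", {"ExampleVASP"}))   # False
```

The design choice this illustrates is that personal data stays with the validating onboarding entity, while downstream applications only ever consume a yes/no answer tied to a public address.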

While this infrastructure is currently being developed in systems like Shyft Network among others, we believe that users should not be required to take on compliance or sanctions obligations directly. These systems, when they are solely in the non-custodial realm, are extensions of bearer instruments like cash, and effective regulation needs to focus its efforts on the on-ramps and off-ramps (like that of the traditional financial system) without requiring innocent civilians to take on compliance obligations and the responsibility of sanctions requirements.

Decentralized systems should be looked at largely as public utilities and enhancements to the utilization of digital bearer instruments that are designed to invoke user freedom and the betterment of individual choice, while still ensuring law enforcement has the ability to effectively address illicit activity. Our ability to ensure these networks do not unintentionally transition to deeper levels of obfuscation is critical in this current time to ensure we can maintain visibility and transparency into how these networks publicly function. Regulations can help maintain this visibility in collaboration with this technology, or hinder it if we do not act collaboratively and cautiously to nurture its benefits.

Shyft Network aggregates trust and contextualizes data to build an authentic reputation, identity, and credibility framework for individuals and enterprises.

Join our Newsletter
Telegram (https://t.me/shyftnetwork)
Follow us on Twitter (https://twitter.com/shyftnetwork)
Check out our GitHub (https://github.com/ShyftNetwork)
Check out our website (https://www.shyft.network)
Check out our Discord (https://discord.gg/ZcBNW37t)

1Kosmos BlockID

Secure Authentication: Which Method is Best?

Choosing the best method of secure authentication for your system can be a determining factor in your ability to withstand cyberattacks.



Anonym

Digital Advertising May Present National Security Risks: US Senators

The global digital advertising spend is set to hit nearly USD 390 billion in 2021. But in yet another blow to the big tech companies reaping the rewards, a bipartisan group of US senators has raised national security concerns over the automated process that makes personalized ads possible.


Specifically, their concerns relate to what’s known as “real-time bidding”, the split-second automated auction process used to rapidly place personalized ads on web pages, and the “bid stream data”, comprising a user’s personal data, that is used in the bidding process.

There are three important things to understand here:  

1. What the senators want to know and why
2. How real-time bidding works
3. The privacy issues that potentially risk the safety of US citizens.

What the senators want to know and why 

On April 1, 2021, six US senators sent a letter to AT&T, Index Exchange, Google, Magnite, OpenX, PubMatic, Twitter and Verizon demanding to know, by May 4, 2021, every “foreign-headquartered or foreign-majority owned company” to whom the companies had given the personal data of US users over the past three years. In addition, they want to know:

- The specific data about users, their devices, and the websites and apps they’re using that these companies are sharing with ad auction participants
- Every foreign and domestic company that has received bid stream data in the past three years and isn’t contractually prohibited from using that data in any way unrelated to the bid process
- Any contractual restrictions in place prohibiting the sharing, sale, or secondary use of bid stream data, and all compliance audit efforts and results.

The senators fear that the easy access to bid stream data during the real time bidding process allows foreign governments to profile US citizens. To wrap your head around that risk, it’s important to understand real time bidding and the significant data privacy issues that result.   

How real time bidding works 

Real time bidding is an exchange that happens in the milliseconds before a web page loads. It automates the process of buying and selling ad space online and makes personalized ads possible. It’s often difficult for non-tech people to believe this data exchange and ad placement can happen in milliseconds of real time, but it does. 

Real-time bidding works like an auction in that advertisers bid on available space on web pages, and the space typically goes to the highest bidder. It’s a four-part process:

1. Once a user clicks a link to open a website, the site’s publisher sends the dimensions of its available ad space to what’s known as a supply-side platform (SSP), a technology platform (like WebFX) that automates the process of web publishers selling their ad space to advertisers.
2. The SSP then analyzes the user’s cookies to gather as much data as possible about the user. This is known as bid stream data and typically includes URL, device type, model, screen size, CPU, operating system and connection, web browsing activity and interests, IP address and ZIP code location, as well as age and gender. This data determines the most relevant ad for the user.
3. Next, a demand-side platform (e.g. Google Ad Manager) uses the bid stream data from the SSP to assign a dollar value to the user’s impression (the display of the ad on the user’s screen) and place bids from relevant advertisers on the ad space.
4. Finally, the SSP receives the bids and awards the ad space to the highest bidder. The web page then loads with the winning ad in the contested slot. The publisher has sold its ad space for profit, the advertiser has put its product in front of a highly targeted audience, and the user is none the wiser that their profile data was up for grabs only moments before.
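The four-step flow can be modeled in miniature. This is an illustrative sketch only: real exchanges speak the OpenRTB protocol over HTTP, and every platform name, targeting rule, and bid value below is hypothetical.

```python
# Toy model of the real-time bidding auction. Real systems are distributed
# services exchanging OpenRTB messages in milliseconds; here the whole flow
# is collapsed into in-process calls for clarity.

def build_bid_stream(cookies):
    """Step 2: the SSP assembles bid stream data from the user's cookies."""
    return {
        "url": cookies.get("url"),
        "device": cookies.get("device"),
        "ip": cookies.get("ip"),
        "interests": cookies.get("interests", []),
    }

def collect_bids(bid_stream, advertisers):
    """Step 3: demand-side platforms price the impression and submit bids.
    Note that every participant sees the bid stream, whether or not it bids."""
    bids = []
    for adv in advertisers:
        multiplier = 2.0 if adv["target"] in bid_stream["interests"] else 0.5
        bids.append((adv["name"], adv["base_bid"] * multiplier))
    return bids

def run_auction(cookies, advertisers):
    """Steps 1 and 4: the publisher offers the slot; the SSP awards it
    to the highest bidder and the page loads with the winning ad."""
    bid_stream = build_bid_stream(cookies)
    return max(collect_bids(bid_stream, advertisers), key=lambda bid: bid[1])

cookies = {"url": "example.com", "device": "phone", "ip": "203.0.113.7",
           "interests": ["travel"]}
advertisers = [
    {"name": "AirlineAds", "target": "travel", "base_bid": 1.00},
    {"name": "ShoeAds", "target": "sneakers", "base_bid": 1.50},
]
print(run_auction(cookies, advertisers))  # ('AirlineAds', 2.0)
```

Even in this toy version the privacy problem is visible: `collect_bids` hands the full bid stream to every advertiser before any money changes hands, which is exactly the leak the senators are asking about.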

5 significant privacy issues of the process   

You guessed it: the process is fraught with significant data privacy issues: 

- Hundreds of companies can participate in the real-time bidding process. Every auction participant gets access to the bid stream data, and they don’t even have to bid.
- Almost anyone can participate in the auction: barriers to entry are low. And while there are penalties for misusing bid stream data, parsing the data is still highly valuable to participants.
- Bid stream data can be harvested even without third-party cookies, so recent efforts by Apple and Google to ban third-party cookies do nothing to mitigate the privacy risks.
- The bid stream data is usually anonymized but, as we’ve recently covered, it’s relatively easy to match a user to their information.
- Data brokers readily package the bid stream data (particularly valuable location data) and sell it to other companies and even governments with little oversight, which is the key point of the senators’ concerns.

Sen. Ron Wyden, D-Ore., who led the senators in writing to the eight ad exchange firms, says: “Few Americans realize that some auction participants are siphoning off and storing ‘bid stream data’ to compile exhaustive dossiers about them. In turn, these dossiers are being openly sold to anyone with a credit card, including to hedge funds, political campaigns and even to governments.”  

The concern of course is that the information ends up with foreign governments who could create digital profiles of US citizens. “This information would be a goldmine for foreign intelligence services that could exploit it to inform and supercharge hacking, blackmail and influence campaigns,” the senators say. 

It’ll be interesting to see which companies reply, what they say, and what happens next. It’s definitely yet another reason to put a US national privacy law in place and to proactively protect your users’ personal data while we wait. 

The post Digital Advertising May Present National Security Risks: US Senators appeared first on Anonyome Labs.


Coinfirm

Jobs: Project Analyst

At Coinfirm, we are an international company centred around blockchain technology, providing AML and transaction security solutions for the financial and cryptoasset industries. Coinfirm is full of professionals with experience in compliance, finance and IT powering the mass adoption of blockchain. With actionable intelligence, we support our partners and clients around the globe, including industry...

Ontology

Ontology Weekly Report (June 1–8, 2021)

Highlights

This week, Ontology announced exciting new partnerships with Japanese consulting company, AP.LLC, and music streaming platform, ROCKI. We also released an updated visual summary of ONTO Wallet’s dApp ecosystem.

Latest Developments

Development Progress

- Ontology’s EVM-integrated design is complete and 30% of its development is finished, making Ontology compatible with the Ethereum smart contract ecosystem.

- 80% of ETH RPC support is now complete

- 50% of Ontology’s new Ethereum account system is now complete

Product Development

- The Solana chain was added to ONTO Wallet, making it the thirteenth blockchain supported by ONTO.

- The second-layer privacy protocol, Suterusu, is now available on ONTO Wallet, providing ONTO users with stronger privacy protection services.

- Joint events with ApeSwap, Cafeswap, and HyperJump continue. Rewards for the ONTO/HyperJump event were distributed.

dApps

- 116 dApps have been launched on MainNet; the total dApp transaction volume is 6,605,96, an increase of 4,532 from last week.

- 15,738,529 transactions have been completed on MainNet, an increase of 101,094 from last week.

Community Growth

- 239 new members were on-boarded across our global community. We are very excited to see the Ontology community continue to grow and we encourage anyone who is curious about Ontology to join us.

- Ontology launched a Spanish Twitter account, operated by our Spanish community admin. The introduction of our Spanish-speaking social channel adds to our multilingual online communities, which currently include Philippine, Persian, and Japanese Twitter accounts. Follow our Spanish-speaking account here.

- Remember, we’re active on Twitter and Telegram where you can keep up with all our latest developments and community updates.

Global News

Ontology Partners With AP.LLC

- Ontology is partnering with AP.LLC, a Japanese consulting company, to provide security and privacy protection support for AP.LLC’s asset management system through decentralized digital identities, further expanding our presence in the Japanese market.

Ontology Provides DID Solutions to Music Streaming Platform, ROCKI

- Ontology is proud to partner with ROCKI, a next-generation music streaming service and music NFT platform built on Binance Smart Chain. Ontology’s decentralized identity (DID) software will enable ROCKI to prevent bad actors from impersonating artists and help buyers avoid purchasing inauthentic NFTs. Using Ontology’s blockchain-based DID solutions, ROCKI will ensure each artist’s data remains immutable, secure and authentic.

Binance Smart Chain (BSC) dApps Supported on ONTO Wallet

- Ontology released an updated version of ONTO Wallet’s Binance Smart Chain dApp ecosystem landscape. As of the end of May, there are 85 dApps on ONTO Wallet, including liquidity protocols, Layer 2, NFTs, and games, each providing rich and diverse one-stop on-chain functions. This allows users to access cross-chain integrations between Binance Smart Chain and other blockchains, as well as self-sovereign identities and data.

Ontology’s DID 101

- Ontology published its “Ontology’s DID 101” mini-series on Twitter to help the community better understand the core features of Ontology i.e., DID, ONT ID, and OScore.

Clubhouse Event: “NFT Creators Identity and Reputation”

- Ontology was invited to attend a Clubhouse event entitled “NFT creators identity and reputation” to discuss how to establish the authenticity of NFTs by confirming the artists’ identities.

Ontology In The Media

Nasdaq — “OFAC Requests Chainalysis Subscription For Bitcoin Blockchain Surveillance”

Recently, as on-chain attacks against decentralized projects have intensified, the Office of Foreign Assets Control (OFAC), a regulatory agency under the U.S. Department of the Treasury, has decided to collaborate with the blockchain analysis company, Chainalysis, to crack down on-chain crime cases.

Ontology has been collaborating with Chainalysis since October of last year. We hope to enhance trust and security in the Ontology ecosystem using Chainalysis’s compliance and investigation tools, whilst respecting user privacy. Read the full article here.

Want more Ontology?

You can find more details on our website for all of our decentralized solutions across identity and data, or keep up with us on Twitter. Our Telegram is for discussion, whereas the Telegram Announcement is designed for news and updates if you missed Twitter!

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

Ontology Weekly Report (June 1–8, 2021) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology Partners with ZAICO to Empower its Inventory Management System, Increasing Traceability, Transparency, and Trust

ZAICO’s real-time cloud storage inventory management application will leverage the Ontology blockchain to improve its services.

Ontology, the public blockchain specializing in decentralized identity and data, has signed a Memorandum of Understanding (MOU) with leading cloud inventory management software company, ZAICO. Ontology aims to provide solutions to ZAICO’s inventory management platform to help increase traceability, transparency, and trust.

ZAICO makes inventory management of large amounts of stock faster and easier. The free cloud application for inventory management is accessible through web and smartphone applications for iPhone, iPad, and Android. ZAICO’s cloud storage system allows for inventory to be counted in real-time, allowing its employees to work anywhere at any time.

By adopting Ontology’s blockchain technology, specifically its attestation service, ZAICO will develop greater control over how its information is shared and utilized. Together, the companies are building an anti-falsification solution to promote accurate and traceable inventory management. As a result of the partnership, ZAICO’s Japanese clients will soon enjoy secure, trustworthy blockchain-based inventory management at a low cost.

Commenting on the partnership, Li Jun, Founder of Ontology, said, “Inventory management has historically been plagued with lengthy, unreliable processes and lists housed on excel sheets which are prone to error and leave businesses vulnerable to malpractice. Now, via the use of blockchain in its inventory management systems in partnership with Ontology, ZAICO will increase traceability, transparency, and trust, in its services. Our recent partnership with AP.LLC will help facilitate integration with ZAICO and other third parties, as we further promote decentralized identity solutions within inventory management.”

Tamura Toshihide, Founder and CEO of ZAICO Inc., said “ZAICO develops and sells innovative cloud inventory management software with the vision to improve societal efficiency by collecting, arranging, and providing information on goods around the world, with the power of technology. With this partnership, we are thrilled to be able to provide improved and enhanced features powered by blockchain-based solutions which benefit our clients.”

About Ontology

Ontology is a high performance, public blockchain specializing in decentralized identity and data. Ontology’s unique infrastructure supports robust cross-chain collaboration and Layer 2 scalability, offering businesses the flexibility to design a blockchain that suits their needs. With a suite of decentralized identity and data sharing protocols to enhance speed, security, and trust, Ontology’s features include ONT ID, a mobile digital ID application and DID used throughout the ecosystem, and DDXF, a decentralized data exchange and collaboration framework. For more, visit ont.io.

About ZAICO

ZAICO is a cloud inventory management software that has been supported by a total of 120,000 users. It is a free to start service that can be used on PCs, tablets, iPads, iPhones, and Android applications, and supports the smooth development of businesses by optimizing inventory management as well as reducing the number of personnel and costs involved in inventory management. A series of research support services, including ZAICO, were accredited by the Ministry of Education, Culture, Sports, Science and Technology (MEXT) as a “Research Support Service Partnership Accreditation System” on April 1, 2020. ZAICO continues to expand and enhance its services and functions to contribute to the improvement of Japan’s research environment, the promotion of science and technology, and the fostering of innovation.

Want more Ontology?

You can find more details on our website for all of our decentralized solutions across identity and data, or keep up with us on Twitter. Our Telegram is for discussion, whereas the Telegram Announcement is designed for news and updates if you missed Twitter!

Ontology Partners with ZAICO to Empower its Inventory Management System, Increasing Traceability… was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Tokeny Solutions

Digital Identity, The Key For Security Token Custody

The growing interest in security tokens

Capital markets are advancing into a digital-first world and financial institutions have never been so enthusiastic about investing in digital assets. According to a study conducted by the Frankfurt School of Finance and Management, Plutoneo, and Tangany, the European security token market is expected to grow approximately 81% annually over the next five years, and is projected to reach €918 billion by 2026. Its market volume could surpass the cryptocurrency industry in the next five years.

Identifying wallet owners is a regulatory obligation

Security tokens are becoming popular and are likely to become a major investment option in the future as the interest keeps growing. However, the change in market infrastructure does not change the fact that securities need to comply with financial regulations. In the tokenization world, investors hold their assets in blockchain wallet(s). Therefore, it is critical for issuers to be able to identify who the wallet owner is and whether this owner is eligible to own the security. Blockchain identities solve this legal problem for issuers. Digital identities not only allow issuers to address compliance issues, but also enable protection for investors through recovery processes.

Digital identities enable the recovery of security tokens

If a cryptocurrency holder loses their private key or wallet access credentials for their self-custody wallet, the assets (Bitcoin etc.) are lost and cannot be recovered as most cryptocurrencies are decentralized and controlled by algorithms. Unlike cryptocurrencies, security tokens are controlled by issuers who are liable to their investors by law. As each investor is identified on the blockchain by his/her self-sovereign digital identity instead of the wallet, issuers can recover security tokens to the investor’s new wallet after confirming his/her identity. The traceability of ownership is recorded on the blockchain.

For example, issuers of tokenized securities using the T-REX open source protocol are able to trigger a recovery function present by default in the suite of smart contracts. In practice, the security tokens can be recovered in just a few steps:

1. Declare the loss of the wallet: The investor submits a request to the issuer (or its agent) to recover their tokens.
2. Verify the digital identity (ONCHAINID): The issuer (or its agent) verifies the token holder’s identity with a standardized KYC process to validate the provenance of the request.
3. Recover the tokens in the new wallet: If the verification is positive, the issuer triggers the recovery function and the lost tokens are transferred to the new investor wallet. This new wallet is also added to the investor’s ONCHAINID.
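The recovery flow can be sketched conceptually. This is not the actual T-REX smart contract (which is written in Solidity); the ONCHAINID check is modeled as a simple lookup plus a pluggable KYC callback, and all names are illustrative.

```python
# Conceptual model of issuer-triggered security token recovery: because the
# holder is identified by a digital identity rather than by the wallet key,
# the issuer can re-bind tokens to a new wallet after verifying the identity.

class SecurityToken:
    def __init__(self):
        self.balances = {}      # wallet address -> token balance
        self.identities = {}    # wallet address -> digital identity

    def issue(self, wallet, identity, amount):
        self.balances[wallet] = amount
        self.identities[wallet] = identity

    def recover(self, lost_wallet, new_wallet, claimed_identity, kyc_check):
        """Issuer-triggered recovery: verify the identity, then move tokens."""
        # Step 2: validate the request against the recorded identity
        if (self.identities.get(lost_wallet) != claimed_identity
                or not kyc_check(claimed_identity)):
            raise PermissionError("identity verification failed")
        # Step 3: transfer the lost tokens and bind the new wallet
        self.balances[new_wallet] = self.balances.pop(lost_wallet)
        self.identities[new_wallet] = claimed_identity
        del self.identities[lost_wallet]

token = SecurityToken()
token.issue("0xOldWallet", "id:alice", 100)
token.recover("0xOldWallet", "0xNewWallet", "id:alice", kyc_check=lambda i: True)
print(token.balances)  # {'0xNewWallet': 100}
```

The key property this models is that the tokens follow the verified identity rather than the key material, which is precisely what makes security tokens recoverable where bearer cryptocurrencies are not.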

Therefore, issuers and securities custodians can fulfil their securities restitution obligations, whilst investors can participate in tokenized investment opportunities with greater confidence.

Custodial wallets vs self-custody wallets

As security tokens are recoverable, investors can choose to use custodial wallets or self-custody ones to store their security tokens with low risk. A self-custody wallet, such as Metamask, allows investors to self-custody their assets by managing the private keys themselves. However, in the case of private key loss, the issuer may charge a recovery operation fee.

While self-custody may seem like a good option, it can be difficult for the layperson due to the technical knowledge required. Some investors, such as institutions, may prefer that a regulated party be held accountable for custody of the assets. A custodial wallet is a more suitable option for them, because regulated entities like banks, issuers, or marketplace operators will then keep their private keys. A custodial wallet will obviously cost more, with fees sometimes based on assets under management (AuM).

Regardless of the type of wallet, digital identities lie at the heart of security token custody for investors, securing assets against loss and theft. As such, wallets are less important to tokenized securities holders than to cryptocurrency holders. With security tokens, an investor’s digital identity doubles as their blockchain account, with one or several wallets, where assets are controlled and managed, while the wallets themselves act more like blockchain browsers, where assets are viewed and actions are confirmed. Digital identity paves the way for the new era of asset custody.

The post Digital Identity, The Key For Security Token Custody appeared first on Tokeny Solutions.


IdRamp

Oracle + IdRamp at the Hyperledger Global Forum


IdRamp CEO Mike Vesey will be presenting at HGF with Mark Rakhmilevich, Senior Director of Blockchain Product Management at Oracle, in their session titled “Identity Proofing Solution Combining HL Indy and Fabric”.

The post Oracle + IdRamp at the Hyperledger Global Forum first appeared on IdRamp | Decentralized Identity Evolution.

Tuesday, 08. June 2021

KuppingerCole

Identity and Access in the Digital (Cloud) Workplace

Identity in the digital workplace, using Microsoft 365 as an example.



Distributed Identity, Using a Digital Vaccination Certificate as an Example


Distributed Identity is not widely known, and even less so in connection with the pandemic. Yet the concepts that DI provides are an excellent starting point for building a digital vaccination certificate. This talk explains why DI is a good idea in general and what a digital vaccination certificate based on it could look like. And if you ever want to give your family a practical explanation of what IAM, IGA, and PAM actually do: get vaccinated and (hopefully soon) apply for a digital vaccination certificate!




Access for Everyone While Still Keeping Control




Passwordless and Beyond - The Future of Identity Management




Smart IAM Services for the Modern Digital Enterprise

The identities of employees, appropriate authorizations in processes and systems, and a permanent control and monitoring of access to prove compliance are becoming increasingly important for organizations. However, the management of these things remains less than optimal.





Global ID

GlobaliD joins the Linux Foundation’s Cardea Project


GlobaliD is thrilled to announce that it has joined the Cardea Project’s steering committee as part of Linux Foundation Public Health (LFPH).

Our participation in the Cardea Project as a founding member highlights our ongoing commitment to provide users ownership and control over their identity and data — in this case, around health testing and vaccinations to facilitate international travel.

Cardea was launched by GlobaliD partner Indicio:

Cardea is a complete ecosystem for the exchange of privacy-preserving digital credentials, open sourced as a project in Linux Foundation Public Health. Launched by Indicio.Tech, Cardea provides an easily verifiable, trustworthy, unalterable proof of health tests or vaccination that can be shared in a privacy-preserving way. Cardea easily integrates with existing health systems to ensure trusted data sources for credentials and uses decentralized identity technology to enable better control of data for individuals. Cardea recently announced its first reference implementation in partnership with SITA for the island of Aruba.
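The core mechanic an ecosystem like Cardea builds on is an issuer-signed credential that any verifier can check without contacting the issuer at presentation time. The sketch below is illustrative only, not Cardea's actual API; it also stands in a shared-key HMAC for the asymmetric signatures (e.g. Ed25519 over standardized credential formats) that real deployments use:

```python
import hashlib
import hmac
import json

ISSUER_KEY = b"demo-issuer-secret"  # stand-in; real issuers sign with a private key


def issue(claims: dict) -> dict:
    """Issuer: sign the canonical claims and attach the proof."""
    payload = json.dumps(claims, sort_keys=True).encode()
    proof = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "proof": proof}


def verify(credential: dict) -> bool:
    """Verifier: recompute the proof over the claims; any tampering breaks it."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["proof"])


cred = issue({"subject": "traveler-123", "test": "COVID-19 PCR", "result": "negative"})
assert verify(cred)                       # untampered credential verifies
cred["claims"]["result"] = "positive"
assert not verify(cred)                   # any edit invalidates the proof
```

The design point this illustrates is why such credentials are "unalterable": the proof binds the issuer to the exact claims, so the verifier trusts the data source without a live connection to it.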

As part of this ongoing initiative, the Linux Foundation today announced the launch of the Global COVID Certificate Network (GCCN) with Indicio as an initial supporter, among others.

The goal of the GCCN is “to facilitate the safe and free movement of individuals globally during the COVID pandemic. GCCN will establish a global trust registry network that enables interoperable and trustworthy exchanges of COVID certificates among countries for safe reopening and provide related technology and guidance for implementation.”

More about the GCCN:

With the go-live of the EU Digital COVID Certificate (previously called the Digital Green Certificate), the lack of a global trust architecture and ready-to-deploy tools to build compatible systems in other countries could not be more clear. LFPH is launching GCCN to address this gap, with an initial focus on safely reopening borders between EU and non-EU countries. It follows and operationalizes the Interoperability Blueprint (the “Blueprint”) of the Good Health Pass Collaborative (GHPC), an industry coalition that has defined principles and standards for COVID certificates. LFPH has co-led the process of drafting the Blueprint, which was released on Monday for a period of public review.

So what does this all mean?

Between Cardea’s verifiable credentials platform and the GCCN registry to promote standardization and interoperability, we’re that much closer to having a portable identity that works (and facilitates travel) across borders — all while giving users ownership and control over their identity and data.

Learn more:

GlobaliD connects to the Indicio Network
Introducing the Global COVID Certificate Network (GCCN)
Indicio Announcement of the Cardea Project contribution to Linux Foundation Public Health
Linux Foundation Public Health announcement of the Cardea project
Cardea App webpage with community meeting information (as well as links to meeting archives in Github, Slack, Twitter)
Ledger Insights pre-announcement coverage of GCCN
Good Health Pass (and Blueprint)

GlobaliD joins the Linux Foundation’s Cardea Project was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Indicio

Implementing the Good Health Pass’s recommendations with Cardea

The post Implementing the Good Health Pass’s recommendations with Cardea appeared first on Indicio Tech.

Cardea is a fully open-source and field-tested verifiable digital credential that meets the major requirements of the Good Health Pass.

With the rollout of the EU Digital COVID Certificate (previously called the Digital Green
Certificate), digital health credentials for managing COVID-19 are now a reality. Political urgency has, unsurprisingly, driven the certificate’s development. A borderless political bloc needs a safe way to ensure travel within its member states. The question is how to make these certificates work—and evolve—in a way that maximizes privacy and security, while interoperating with other credential systems outside the EU.

Linux Foundation Public Health (LFPH) is leading two collaborative projects that aim to do this. The Global COVID Certificate Network, announced today, is a way for COVID-19 test certificates issued in one country to be recognized as authoritative in another. The second is the Good Health Pass Collaborative, an industry coalition that has defined principles and standards for COVID-19 certificates themselves so that they are privacy-preserving, secure, and interoperable. The Good Health Pass “blueprint” is currently in public draft review.

Cardea, a full, open-source ecosystem for verifiable health credentials developed by Indicio and now a community-led project at LFPH, meets the major recommendations of the Good Health Pass and facilitates the goals of the Global COVID Certificate Network.

The chart below shows the alignment between Cardea and the Good Health Pass recommendations—and where development efforts will be focused to enable full compliance.

The Good Health Pass recommendations and Cardea at-a-glance

Cardea has been successfully trialed on the island of Aruba by SITA, the leading global provider of technology to the air transport industry. The pilot project enabled travelers to prove their Covid-test status to restaurants and hospitality locations around the island without having to share any personal information. Dangui Oduber, Aruba’s minister of tourism, public health and sport, described the ability to trust important health information while preserving visitors’ privacy as “a revolutionary step forward.”

Cardea’s strength as a functioning, deployable ecosystem is that it is built using open source software and doesn’t require any proprietary technology for implementation. It is based on the Hyperledger Indy platform, and uses Hyperledger Aries agents and Hyperledger Ursa cryptography—each backed by dynamic developer communities.

The Cardea Community Group, which meets weekly, is led by co-chairs Ken Ebert, CTO of Indicio, and Keela Shatzkin of Shatzkin Systems. The Cardea Steering Committee, which will guide Cardea’s development, is currently composed of Trevor Butterworth, Indicio; Kristy Gale, Honor DRM; John Kindervag, ON2IT; RJ Reiser, Liquid Avatar; Adrien Sanglier, SITA; Mitja Simcic, GlobaliD; and Mike Vesey, IdRamp. The Cardea Steering Committee is committed to inclusion in developing this project and will actively engage in recruiting a diversity of voices to join the committee.

To learn more or get involved in the growing Cardea Community, go to Cardea.app or Linux Foundation Public Health, or contact us at Indicio.

 



Mythics

Oracle Cloud Infrastructure - A Cloud That Tells You How To Save Money!

When I talk to customers, their first reaction about OCI is the savings that Oracle's IaaS can offer them. Sometimes it can be as much…



Infocert (IT)

InfoCert speaks at the “Identità Digitale” event organized by ClubTI Triveneto and ClubTI Milano


On June 10, a new virtual event will take place on the theme “Digital Identity: opportunities for businesses, the European context, use cases, operational risks and future developments”, organized by ClubTI Milano and ClubTI Triveneto. Starting at 17:30, it will feature talks by several notable speakers, including Marta Gaia Castellan, Business Compliance Senior Consultant, and Luca Boldrin, Product Evolution Senior Consultant at InfoCert.

“Digital Identity: opportunities for businesses, the European context, use cases, operational risks and future developments”

Digital identities are creating a paradigm in which online services and electronic transactions can leverage new forms of digital identification to manage processes that require identity verification more quickly, effectively, and remotely. This event aims to provide as complete an overview as possible of: opportunities for businesses, the European context, use cases, operational risks, and developments currently under study.

Marta Gaia Castellan will give a talk focused on “The risk of digital identity theft and the anti-fraud activities of SPID Identity Providers”. Luca Boldrin, in turn, will address the topic “Self Sovereign Identity: a new model of decentralized identity”.

The event will be chaired by Claudia Sandei, PhD, lawyer and associate professor of Industrial and New Technologies Law at the University of Padua, and founder of Itll.it.

Thanks to the collaboration of ClubTI Milano and InfoCert, the speakers will include:

Alberto Zanini – Digital Identity Expert and European Commission Consultant
Andrea Danielli – ClubTI Milano
Fabrizio Lupone – ClubTI Milano
Gianluca Marcellino – Vice President, ClubTI Milano
Sergio Stefanoni – CIO, Bein
Marta Gaia Castellan – Business Compliance, Innovation & Strategy, InfoCert
Luca Boldrin – Innovation Portfolio, Innovation & Strategy, InfoCert

The knowledge and skills required of RAOs (Registration Authority Officers), identification officers, and all parties involved in operating a SPID Identity Provider are varied and complex: forensic analysis of identity documents (printing techniques, microprinting, tactile relief, stamps), anti-fraud checks, cross-referencing of ministerial databases, and elements of social engineering. The fundamental goal is to protect against the risk of identity theft during the issuance phase, and in particular against the use of false, stolen, or counterfeit identity documents, through rigorous identity proofing and vetting processes.

Marta Gaia Castellan – Business Compliance, Innovation & Strategy InfoCert

Alongside the evolution of the regulatory framework for a European digital identity, a new model for managing digital identity has been gaining importance in recent years, known as Self Sovereign Identity (SSI). The main difference from the most widespread identification and authentication models is that SSI does not require trusted parties for authentication: in this sense, the slogan “you are your own identity provider” is often used. Although it is often associated with distributed ledgers and new standards (formats, protocols), SSI is in fact a paradigm that can be implemented on different technologies and in different flavours. This presentation will provide an overview of the SSI landscape and of some important initiatives at the global and European level.

Luca Boldrin – Innovation Portfolio, Innovation & Strategy, InfoCert

The post InfoCert speaks at the “Identità Digitale” event organized by ClubTI Triveneto and ClubTI Milano appeared first on InfoCert.


Global ID

GiD Report#163 — EU’s portable identity, Supreme Court on data access rights

GiD Report#163 — EU’s portable identity, Supreme Court on data access rights

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

First up — Greg Kidd (GlobaliD), Meg Nakamura (Apto), and JP Thieriot (Uphold) will be participating in the next Linqto Global Investor Conference on June 22. A video preview:

Global Investor Conference (June, 2021)

Also, check out our latest podcast with GlobaliD’s VP guru Antoine!

EPISODE 08 — Owning your identity and data with VP of Design Antoine Bonnin

OK — this week:

A step towards portable identity for the EU
Meanwhile in Big Tech land, your data isn’t your data
Tweet of the week
A landmark Supreme Court decision over data access
On the topic of data rights and Big Tech bullying — New York to “revolutionize” antitrust
At Bitcoin Miami 2021 — it was all about community
Stuff happens

1. A step towards portable identity for the EU

EU Commissioner Margrethe Vestager, Photo: Radikale Venstre

Here’s the FT (via /gregkidd):

The EU is set to unveil detailed plans for a bloc-wide digital wallet on Wednesday following requests from member states to find a safe way for citizens to access public and private services online.
The digital wallet would securely store payment details and passwords and allow citizens from all 27 countries to log into local government websites or pay utility bills using a single recognised identity, said people with direct knowledge of the plans.
The EU-wide app, which can be accessed via fingerprint or retina scanning among other methods, will also serve as a vault where users can store official documents such as a driver’s licence. Using the wallet was not compulsory, those involved said, but citizens who chose to sign up would benefit from an extra-secure digital ecosystem and greater flexibility ideal for post-pandemic life.

And here’s Greg Kidd:

A step in the right direction towards a more portable form of identity — well, at least for Europeans.

The AP on the portable aspects:

All EU residents would be entitled to an e-wallet, but they won’t be mandatory, according to the EU Commission.
But dominant online platforms would be required to accept the wallet, a provision that aligns with the commission’s goal of reining in big tech companies and their control of personal data.
Vestager said people would be able to use their EU digital wallets to access Google or Facebook instead of their “platform-specific” accounts.
“Because of that, you can decide how much data you want to share — only enough to identify yourself,” the commissioner said from Brussels during a virtual media briefing.

Sound familiar?
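The “only enough to identify yourself” idea Vestager describes is selective disclosure. One standard way to implement it is with salted hash commitments: the issuer signs per-attribute digests, and the holder reveals only the attributes (plus their salts) they choose, which the verifier checks against the digests. The sketch below uses hypothetical function names and is illustrative only, not any wallet's actual design:

```python
import hashlib
import secrets


def commit(attributes: dict):
    """Issuer: commit to each attribute with a fresh salt; only the digests need signing."""
    salts = {k: secrets.token_hex(16) for k in attributes}
    digests = {k: hashlib.sha256(f"{salts[k]}:{v}".encode()).hexdigest()
               for k, v in attributes.items()}
    return salts, digests  # holder keeps salts and values; verifier sees digests


def disclose(attributes: dict, salts: dict, keys: list) -> dict:
    """Holder: reveal only the chosen attributes, each with its salt."""
    return {k: (attributes[k], salts[k]) for k in keys}


def check(disclosed: dict, digests: dict) -> bool:
    """Verifier: recompute digests for the revealed attributes only."""
    return all(hashlib.sha256(f"{salt}:{value}".encode()).hexdigest() == digests[k]
               for k, (value, salt) in disclosed.items())


attrs = {"name": "Alice", "birthdate": "1990-01-01", "vaccinated": "yes"}
salts, digests = commit(attrs)
shown = disclose(attrs, salts, ["vaccinated"])   # share vaccination status only
assert check(shown, digests)
assert "birthdate" not in shown                  # the rest stays private
```

The fresh random salt per attribute matters: without it, a verifier could brute-force low-entropy values (like a birthdate) from the digests alone.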

Relevant:

Via /gregkidd — EU set to unveil digital wallet fit for post-Covid life
EU plans digital ID wallet for bloc’s post-pandemic life
EU Digital COVID Certificate
Via /anej — Apple Pay competitors could include an EU digital wallet — 9to5Mac

2. Meanwhile in Big Tech land, your data still isn’t your data

Here’s TechCrunch (via /m — who noted: “This is going in the wrong direction.”):

A change to TikTok’s U.S. privacy policy on Wednesday introduced a new section that says the social video app “may collect biometric identifiers and biometric information” from its users’ content. This includes things like “faceprints and voiceprints,” the policy explained. Reached for comment, TikTok could not confirm what product developments necessitated the addition of biometric data to its list of disclosures about the information it automatically collects from users, but said it would ask for consent in the case such data collection practices began.

Relevant:

Via /m — TikTok just gave itself permission to collect biometric data on US users, including ‘faceprints and voiceprints’ — TechCrunch

3. Tweet of the week

New York governor Andrew Cuomo:

4. A landmark Supreme Court decision over data access

Here’s the EFF (via /gregkidd):

The Supreme Court rightly overturned the Eleventh Circuit, and held that exceeding authorized access under the CFAA does not encompass “violations of circumstance-based access restrictions on employers’ computers.” Rather, the statute’s prohibition is limited to someone who “accesses a computer with authorization but then obtains information located in particular areas of the computer — such as files, folders, or databases — that are off limits to him.” The Court adopted a “gates-up-or-down” approach: either you are entitled to access the information or you are not. If you need to break through a digital gate to get in, entry is a crime, but if you are allowed through an open gateway, it’s not a crime to be inside.
This means that private parties’ terms of service limitations on how you can use information, or for what purposes you can access it, are not criminally enforced by the CFAA. For example, if you can look at housing ads as a user, it is not a hacking crime to pull them for your bias-in-housing research project, even if the TOS forbids it. Van Buren is really good news for port scanning, for example: so long as the computer is open to the public, you don’t have to worry about the conditions for use to scan the port.

And here’s Greg Kidd:

If one accesses a public website or a private website with user permission, it should NEVER be a felony crime to do so. While it is certainly possible to have penalties that punish misuse of public or private information, it is crazy to have a reading of a statute that can be used to penalize mere access to public data or a user’s delegated access to their own data.
Public access to public data is a public good. And private access to one’s own data is a right, not a privilege. Neither LinkedIn nor Microsoft should be able to intimidate startup developers for innovating around these principles on behalf of users. We look forward to building on our past victories in this realm with additional ammo from the Supreme Court that included support from elements in both the liberal and conservative branches.

You can check out the Supreme Court decision here (pdf).

Relevant:

Van Buren is a Victory Against Overbroad Interpretations of the CFAA, and Protects Security Researchers

5. On the topic of data rights and Big Tech bullying — New York to “revolutionize” antitrust

Here’s Dealbook:

The act’s passage would put New York at the “vanguard” of a national movement, 15 state and national labor and policy groups wrote in a letter to legislators that DealBook is first to report. The organizations argue that the law would “rein in many abusive tactics corporations use against other firms and workers that are difficult to challenge under current antitrust law and precedent.”
Market dominance would be presumed with 40 percent market share. Currently, companies qualify as dominant if they have 70 to 90 percent of a market. “It’s not illegal to be dominant,” said Pat Garofalo of the American Economic Liberties Project, one of the groups that signed the letter. “It’s just illegal to block others from the market unfairly.” In the act, an “abuse of dominance” standard would replace the current “consumer welfare” standard.
“Where New York goes, often the rest of the country follows,” Garofalo said, and lawmakers at the federal level could be motivated “if they see the ship is leaving port.”

Here’s Matt Stoller:

What would this bill do? It would change standards for how a firm is considered to be too powerful. While changing standards seems technical and wonky, such a change would in fact be a revolutionary act to break the power that dominant firms have over our economy. (I testified on behalf of the bill last year, and my organization has written an explainer of the legislation.)
Right now, to be considered subject to monopolization law, a firm has to have 70–90% of a market, plus it has to engage in egregious behavior that economists measure as inefficient. This bill would blow up that entire framework. First, a firm would only have to have 40% of a market to be considered dominant. Plus, firms that are powerful enough to set wages across an industry, ahem Amazon, would also be considered dominant. It wouldn’t be illegal to be dominant, but under this legislative framework, dominant firms would no longer be allowed to engage in predatory conduct or block competitors from the market.
The legislation would also expand what would be considered anti-competitive behavior. A bunch of stuff that is now dead-letter law, like predatory pricing or selling below cost to capture market share, could come back. The bill would also let private actors sue under these new standards, so it would allow newspapers, grocery stores, pharmacists, manufacturers and employees to sue if they are being abused by a powerful buyer, seller, and/or distributor.

Meanwhile, on the Federal side of things. Axios:

Divisions within party caucuses, particularly Republicans, are emerging as a new threat to Congressional action against alleged monopolistic behavior by tech giants, Axios’ Ashley Gold and Margaret Harding McGill report.
Why it matters: It’s a blow to the longstanding theory that a bitterly divided Congress could still agree to tighten the antitrust screws on Big Tech, since both sides have beefs with the industry.

Relevant:

Briefing: Facebook Will Face UK and European Antitrust Probes Into Data Use
EU, U.K. launch antitrust investigations into Facebook
New York State to Revolutionize Antitrust

6. At Bitcoin Miami 2021 — it was all about community

GlobaliD senior product manager Paul Stavropoulos:

My biggest takeaway from Bitcoin2021 in Miami — we’re just scratching the surface on how tokens + NFTs will be used together to facilitate community.

Maria Shen (via /markonovak):

I saw crypto being used in the coolest ways in Miami this week & none of them had to do with payments.
All of them had to do with building strong communities.

Maria should check out GlobaliD Groups!

And here’s Michael Casey on “money=community”:

Our $DESK experience was a reminder that money has always been about community. There’s a feedback loop between people’s desire to own and use a currency and the way in which it intermediates human interaction. Regardless of how functionally advanced your currency is as a technology, this community function is a major determinant of its value.
Let’s face it, technologically, the dollar is outdated. The inefficient U.S. banking and payments system still uses paper checks, for crying out loud. It integrates none of the cutting-edge monetary policy and governance features that come with cryptocurrencies and decentralized finance. Yet, it rules the world thanks largely to the self-reinforcing power of a “dollar community.”
Now, for the first time, that community is looking a little less united. As concerns rise about the political and economic sustainability of a dollar-centric global financial system, and as technologies spanning crypto, data services, trading and social media foster new approaches to payments and other financial services, people are exploring alternatives. It’s why everything around money suddenly seems unmoored, as a baffling array of new finance acronyms and memes leaves many in the mainstream confused: NFTs, SPACs, stonks, DeFi.

Relevant:

Michael Casey: Miami, Piranhas and the Power of Community — CoinDesk

7. Stuff happens:

Media industry braces for post-pandemic reality
Pandemic drove small businesses online — and they’re staying
WSJ News Exclusive | FBI Director Compares Ransomware Challenge to 9/11
Blockchain.com to Move US Headquarters to Miami From New York — CoinDesk
Daryl Morey on Crypto, NFTs: ‘It’s the Start of a Major Trend’ — CoinDesk
Alibaba, Google Among More Than 300 Companies Seeking Singapore Crypto Licenses — CoinDesk
The rise of crypto laundries: how criminals cash out of bitcoin
Via /m — Search giant Google lifts 2018 ban on crypto exchange, wallet advertisements
Axios newsletter: The future of concerts
Bitcoin and Ethereum are now, decentralized apps are the future
Jack Dorsey says Square is weighing a possible bitcoin hardware wallet
Austrian Blockchain Company Builds Platform to Tokenize Solar Energy — CoinDesk

GiD Report#163 — EU’s portable identity, Supreme Court on data access rights was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

IBM Security Verify for CIAM


by John Tolbert

IBM Security Verify is an industry-leading Consumer Identity and Access Management (CIAM) solution that can satisfy both B2B and B2C requirements. Customers can choose to operate the product on-premises, deploy it in IaaS, or have IBM host it as a fully multi-tenant SaaS. The micro-services design exemplifies the modern paradigm of Identity Fabrics. IBM Security Verify addresses the full range of common CIAM capabilities plus many advanced use cases for IoT device association. IBM Trusteer integration allows for robust fraud prevention for consumers and customers.


Saviynt Enterprise Identity Cloud


by Richard Hill

Enterprise IT organizations need Identity solutions to protect data, applications, and third-party access to critical IT infrastructure and services. The Saviynt Enterprise Identity Cloud (EIC) takes an all-encompassing approach to these different aspects of identity, delivering an integrated platform to unify Identity Access Management and Governance.


Dark Matter Labs

African Cities — Disrupting the Urban Future

African Cities — Disrupting the Urban Future Harare, Zimbabwe

The future of humanity is urban. Africa is the fastest urbanizing continent in the world with an expected 1.2 billion urban residents by 2050. The continent’s urbanisation comes with extraordinary transformative potential, but the structural hurdles are equally huge. Historically, there is compelling evidence to suggest that urbanisation and economic growth are mutually reinforcing. Yet in Africa, urbanisation is happening in a context of slow structural transformation, pervasive urban poverty and inequalities that compromise a hopeful urban future.

Building on last year’s Istanbul Innovation Days, the 2019 Harare Innovation Days on #NextGenCities will focus on the strategic risks and opportunities inherent in Africa’s rapid urbanisation. It will invite senior civil servants responsible for cities and bring them together with cutting-edge practitioners working on urban solutions from digitally-enabled distributed infrastructure provision to nature-based solutions, and from participatory city-making to sustainable food systems.

Harare Innovation Days is twinned with an Asia-Pacific-focused Innovation Days event on #NextGenGov in March 2020. Following both events, we aim to open up new pathways for targeted support for collaborative experimentation and learning trajectories within governments, starting in the Asia-Pacific, Africa and Arab States regions.

#AfricaIsNow

The numbers may be familiar but are staggering nonetheless. According to the World Bank, the continent’s urban population is projected to reach 1.2 billion by 2050. Currently, the continent has three megacities: Cairo (10 million), Kinshasa (12 million) and Lagos (21 million). But this is just the start: if Nigeria’s population continues to grow and people move to cities at the current rate, Lagos could become the world’s largest metropolis, home to up to 85 or even 100 million people by 2100 — more people than California or Britain today. And Lagos is not alone — according to some scenarios, Kinshasa may have 83.5 million inhabitants at that point; Niamey, currently home to fewer than a million people, may have 46 million; and Kigali could grow from 1 to 5 million residents as early as 2050. Nor is this only a 30-year horizon: all of the world’s 10 fastest-growing cities are going to be in Africa over the next 16 years. In this context, it’s crucial to remember that over 60% of the land projected to become urban by 2030 is yet to be developed.

The accelerated growth, not just of Africa’s megacities but also its medium sized growth hubs and even previously rural areas, poses extraordinary challenges for national governments and municipalities. But behind the evident everyday challenges of Africa’s rapid urbanisation — unemployment and informality; the infrastructure gap; structurally underfunded municipalities; huge waste problems; widespread issues with land registry and corruption — we see a range of more long-term, interdependent strategic risks: whether climate change and the impact of air pollution on the health of urban dwellers; uncertain economic development and rising inequality; the lack of locally rooted evidence and its impact on planning and governance; the unintended consequences of disruptive technology; food insecurity and the risks of nutrient decline; or the impact of rapid development on the natural ecosystems cities rely on. They all call into question the current pathways to greater prosperity and the achievement of the Sustainable Development Goals (SDGs) in cities.

On the other hand, it also provides a massive opportunity. Frequently celebrated under the banner #AfricaIsNow, we are seeing incredible dynamism, creativity and innovation across the continent, spearheaded by industry, forward-looking governments and creative, tech-savvy entrepreneurs, and enabled by new technologies and platforms that support cross-sectoral collaborations.

NextGenUNDP

By building on such disruptive and pioneering ways of working from across the continent, the Innovation Days seeks to unlock an urban future that addresses the current challenges and strategic risks of our time — a future that is uniquely African, rooted in the diverse identities of Africa’s peoples and places, and the continent’s ability to combine global innovations with local solutions. Through the Harare Innovation Days and ongoing support, UNDP aims to accelerate this African future by supporting next generation strategic experimentation.

The 2018 Istanbul Innovation Days on NextGenGov already showed how disruptive technologies are shifting the ability of cities to serve their populations, from the blockchain-enabled real-time pollution monitoring system pioneered by Commons Impact, to MetaSUB’s use of advanced microbiome data to enable predictive approaches like early warnings on looming epidemics or emerging signs of microbial antibiotic resistance. Equally, it recognised that new spaces are needed for citizens and communities to access and make sense of such digital opportunities — showcasing institutions like Madrid’s Medialab Prado that enable such engagement and which develop participative tools like Consul, a software now being adapted and adopted across the world including in the rebuilding of Mogadishu.

Recognising that African cities are a key entry point for addressing the interconnected issues of uncertain economic growth, climate breakdown, social disruption and governance shortfalls, in Harare we will explore how a series of cutting-edge innovations already being pioneered in Africa could further develop cities’ capacity for sustainable and resilient growth. In an uncertain urban future, it is essential we co-build the institutional capacity to explore and lead towards opportunities that are genuinely relevant to Africa without trying to replicate western models and mistakes. To do this, UNDP is reaching out to senior-level civil servants across some of Africa’s fast-changing cities — the key risk holders in the decades ahead. The aim of the Harare Innovation Days is to be a springboard to supporting them with hands-on policy experiments in urban governance, development and infrastructure provision, with UNDP providing the required infrastructure to help these key decision-makers develop experiments that will help address their strategic risks. By convening them alongside some of Africa’s leading innovators and disruptors in the technology, urban planning and policy space, we aim to build a platform and peer-to-peer network for accelerated learning and implementation capacity for urban innovation.

Zones of Experiment

We propose four initial focal themes below. They are not all-encompassing, of course, but they are indicative of the complexities and possibilities of urban change in Africa. They span a series of cross-cutting themes: new pathways to creating secure livelihoods and meaningful jobs in an often informal and changing economic context; accelerating the shift to environmental sustainability and ecological regeneration; using the best of cutting-edge technologies (AI, IoT and real-time satellite data) to provide us with new governance capabilities; and securing pathways to inclusion across highly diverse populations. These are indispensable elements for the achievement of the SDGs.

Participatory city-making

The future of urban development in Africa cannot rely solely on state actors or large-scale corporate investment; to be resilient and legitimate, it needs to tap into the creativity, passion and drive of local communities and embedded entrepreneurs to co-create urban spaces and re-imagine how they are used. This is crucial in order to build on the unique identity of cities and the creativity of young people as a source of strength and dynamism, to engender new cultures of participation that cut across ‘online’ and ‘offline’, and in turn to enhance a sense of belonging (especially in a context of unprecedented migration from rural areas). For example: the Block by Block project uses Minecraft (a game) to give citizens an opportunity to design and re-create local public spaces; i-CMiiST in Nairobi and Kampala is using creative methods to explore more sustainable mobility; and RanLab’s deliberative polling experiments across Africa show that offline settings for participation can enable better implementation of difficult policy decisions, such as the relocation of communities due to flooding.

This is being supported by a range of diagnostic tools, from Arup’s City Resilience Framework to UN-Habitat’s City Resilience Profiling Programme (CRPP), and by newfound collaborative energy. The latter converges around innovation hubs which put African talent and innovation at the forefront of a movement that will shape their urban future. These hubs, from iHub in Nairobi to the Co-creation Hub in Lusaka, serve as data-driven catalysts and tech incubators for young creative talents to experiment with unlikely ideas that address gaps in service delivery in their local communities and beyond. Equally, networks of young civic innovators like i4policy foster a collaborative dialogue amongst policy makers, technologists and local communities, in order to create policy frameworks conducive to bottom-up innovation. Such networked movements show how people are already finding ways for their voices to be heard, shaping fertile ground for creative problem solving as well as the leapfrog innovation the continent needs.

How can we build on such dynamics to re-invent urban governance fit for the 21st century, rather than merely tweaking the structures we have?

Distributed, hybrid infrastructure (and service) delivery:

While many African cities struggle after decades of underinvestment, they also have the opportunity to learn from past mistakes in creating centralised, large-scale public works. Huge increases in the demand for electricity mean there are now more than 100 million urban Africans who live right under a grid but lack an electricity connection, with massive impacts on productivity. Various alternatives are being tested to make systems more affordable, faster to deploy and less fragile, both physically and organisationally.

In response to this, we are seeing a growing series of infrastructure experiments that utilize decentralized approaches, leading to more resilient urban infrastructures that help stimulate local economic development. These dynamics can be found across mobility infrastructure — with Mobilized Construction piloting a digitally enabled approach to sensing road-repair needs and creating micro-contracts that can be procured locally, often at a fraction of the cost — and housing — with ibuild creating a digital ecosystem that provides the transparency and accountability for governments to procure locally and for people to incrementally build shelter based on their socio-economic situation — as well as sanitation and waste management, with start-ups such as Wecyclers, which work with low-income households using an incentive-based model to tackle Lagos’ widespread waste problems. This also provides much-needed opportunities for the creation of innovative enterprises, e.g. in renewables, that strengthen local economies by generating income. Enabling many of these is the rise of technological innovations, from smart contracts to digital cash — ranging from well-known large-scale success stories such as EcoCash to local initiatives such as Grassroots Economics, which issues digital complementary currencies to foster local growth.

How can cities embrace such distributed approaches in their policies and strategies to build the infrastructures for the 21st Century at the scale, speed and inclusion necessary?

Food as a lens for circular economies

Cities have a unique opportunity, and need, to spark a transformation towards a circular economy for food, given that 80% of all food will be consumed in cities by 2050. In many African cities, urban agriculture already plays an important role in poverty reduction, food security, flood protection and (as biogas) energy generation. However, a combination of population growth, development pressures and cultural change is threatening this provision model. On top of production pressures, the arrival of industrial food processing and distribution (such as supermarkets) has in many cases gone hand in hand with a rise in low-nutrient food types that replace traditional, nutritionally superior foods, and with extractive economic models and open food systems that create externalities, leading to reductions in economic, social and environmental sustainability.

In response, several entrepreneurial experiments are combining new technologies with traditional products to recover wealth from waste, such as Kusini Water, which uses macadamia nut shells in mobile, solar-powered water treatment, as well as using the mobile revolution to develop more transparent, equitable value chains, such as Twiga Foods in Nigeria and Fresh in a Box in Zimbabwe. Taking this to the scale of city-wide systems change, Lusaka’s Food Change Lab is starting to work with local stakeholders to create a new urban policy approach to the food system, and Antananarivo has set up a Food Policy Council.

How can new technological capabilities harness the potential inherent in such locally-rooted pathways to better food, jobs, and health?

Nature-based solutions

Urban growth is damaging vital ecosystems globally, with land-use change a key issue alongside pollution, waste and carbon emissions. African cities are starting to recognise the importance of integrating climate change information into long-term planning and design for critical (and green) water-related infrastructure that also drives economic development and inclusion. As our built world is made up of complicated, open-loop systems that subsume and deplete the substrate of our complex environment, it is imperative that we create city systems that don’t depend on depletion and accumulation dynamics.

City-wide innovations focused on strategic infrastructure include multiple examples of innovative financing, such as the Upper Tana River Water Fund, which uses payments from downstream water users to provide education and support for over 20,000 farmers in methods that increase yields while reducing the maintenance costs of Nairobi’s water infrastructure. Other innovations include disruptive technologies, such as the use of real-time sensors to create Digital Aquifers in Kenya; community-based solutions such as EcoLoos, which provide a sustainable waterless sanitation solution while creating fertilisers; and Mass Design, which uses exclusively local, responsibly sourced materials, with community members trained and employed throughout the construction process.

Given cities’ vulnerability to the effects of climate change, how can we build the strategies, regulations and investment pathways to ensure that nature-based solutions become the norm instead of the exception?

NextGen UNDP: An antifragile organization

These innovations hint at alternative futures: ones where innovation has been nurtured in our policy-making and regulatory capacities, and where agile, multi-level governance models are more fit for purpose. In this context of high complexity and accelerated change, our ambition as UNDP is to transform into an antifragile organization, to borrow Nassim Taleb’s term: one that adapts to change and converts strategic risk into strategic opportunity. We understand that the key to antifragility is the ability to fail in small doses and to use external stressors, like the significant strategic risks that cities face, to ‘gain from disorder’ over time and arrive at better outcomes. Our collective ability to deal with such risks and shocks, and to rapidly learn and re-configure our strategies, technology, people and interventions (in short, our transformational capability) will enable our partner governments and their partners to show that #AfricaIsNow isn’t just true for the continent’s booming creative industries but also for its unique urban future.


Tokeny Solutions

Blockchain Layer Enhancement

The post Blockchain Layer Enhancement appeared first on Tokeny Solutions.

Product Focus

Blockchain Layer Enhancement

We have fully tested the integration with all of our solutions on Polygon, and our customers can now choose to deploy their security tokens on Polygon or Ethereum.

This content is taken from the monthly Product Focus newsletter in June 2021.

This month we’ll take a look at our recent integration with Polygon.

Polygon (previously known as “Matic Network”) is growing rapidly, and has recently overtaken the likes of Stellar, Binance Smart Chain, Solana and Tezos in terms of market capitalization. In recent months Polygon has made serious strides, with some blue-chip DeFi protocols migrating to it, including Aave, Curve Finance and 0x.

But what does our integration with Polygon mean for our security token customers?

Over the last six months, we have seen gas fees skyrocket as a result of network congestion and the rising ETH price, which reached an all-time high last month. Transaction processing times have also varied widely depending on network load. The congestion and fee issues should be resolved by the gradual rollout of the various Ethereum 2.0 deliveries over the course of this year and next. In the meantime, however, for our customers and for ourselves, with live activity on the network, these congestion issues have become a handicap.

Foreseeing that this will continue to be a problem for our customers, we benchmarked several so-called Layer 2 solutions and sidechains, as well as some separate Ethereum Virtual Machine (EVM) based blockchains, and selected Polygon as the best alternative to Ethereum Public for deploying our products (ONCHAINIDs, the T-REX protocol and the T-REX platform) for the time being.

We have fully tested the integration with all of our solutions on Polygon to make sure that the deployment of new security tokens or the migration of existing ones will be as seamless as possible for our customers and their investors. The first T-REX security token has already been deployed successfully on Polygon in May on behalf of one of our customers.

Our customers will benefit from Polygon being:

Secure – the network is secured by a proof of stake consensus and by the Ethereum network;

Performant – transactions are executed in seconds for fees that are usually under one cent, making transaction fees virtually irrelevant (we have seen fees reduced by a factor of roughly 10,000 compared to what they have been over the last few months on Ethereum Public);

Easy to use – users can still use their Ethereum-based wallets, such as MetaMask, or custodial wallets like the ones connected to ONCHAINID;

Additionally, the consensus mechanism used on Polygon is far less energy-intensive and, as a result, more environmentally friendly. The Polygon team has an aggressive roadmap to implement interconnection with other blockchains and support interoperability, and over the last few months it has shown that it delivers the products planned on that roadmap, with the introduction of full-stack scaling last week (26/05/2021).

We naturally remain fully compatible with, and supportive of, the Ethereum Public network, and our customers will be offered the choice between deploying their security tokens on one network or the other. Existing customers will also be offered the possibility of migrating existing tokens between the two. Finally, this integration makes us confident that our products and services could be adapted to other EVM-compatible Layer 2 solutions, sidechains and blockchains going forward, according to our customers’ needs.
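As a rough sketch of what this dual-network choice can look like at the configuration level, the snippet below keys deployments off the public chain IDs (1 for Ethereum mainnet, 137 for Polygon PoS). The endpoint URLs and the helper itself are hypothetical illustrations, not part of Tokeny's actual tooling:

```python
# Hypothetical sketch: selecting an EVM-compatible network for a deployment.
# The chain IDs are the public ones (Ethereum mainnet = 1, Polygon PoS = 137);
# the RPC URLs and this helper are illustrative, not Tokeny's actual API.
NETWORKS = {
    "ethereum": {"chain_id": 1, "rpc_url": "https://rpc.example-ethereum.net"},
    "polygon": {"chain_id": 137, "rpc_url": "https://rpc.example-polygon.net"},
}

def select_network(name: str) -> dict:
    """Return the configuration for the requested network, or raise."""
    try:
        return NETWORKS[name]
    except KeyError:
        raise ValueError(f"unsupported network: {name!r}") from None

config = select_network("polygon")
print(config["chain_id"])  # 137
```

Because both networks are EVM-compatible, the same contract bytecode can in principle be deployed to either; only the endpoint and chain ID differ, which is what makes migration between the two feasible.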

Subscribe Newsletter

This monthly Product Focus newsletter is designed to give you insider knowledge about the development of our products. Fill out the form below to subscribe to the newsletter.

Other Product Focus Blogs

Blockchain Layer Enhancement (8 June 2021)
T-REX Factory Enhancements (29 March 2021)
Security Tokens Conditional Transfer (1 March 2021)
Messaging (25 January 2021)
ONCHAINID Notifications (30 November 2020)
Tokens Recovery (2 November 2020)

Tokenize securities with us

Our experts with decades of experience across capital markets will help you to digitize assets on the decentralized infrastructure. 

Contact us



IDnow

EU decision on Identity Wallet: Starting signal for a seamless digital future

Armin Bauer, Co-Founder and Managing Director Technology at IDnow

Last week, the EU Commission published a draft for the so-called digital identity wallet “EUid”. According to it, within 12 months of the law coming into force, every EU state must provide its citizens with a digital wallet.

The draft for the “Digital Identity Wallet” states the following: Every citizen will be given access to a so-called digital wallet. They have to identify themselves once and the data will be stored for further use.

In addition to electronic signatures, documents such as birth and marriage certificates or driving licences are to be stored in the digital wallet in future. This goes hand in hand with numerous use cases – from digital administrative procedures such as address changes or tax returns to the use of health services. A possibility that would revolutionise our daily lives.

According to the draft, each country has three options for implementation: it can launch a solution for its citizens and businesses itself; it can commission a private company to develop the wallet and make it publicly available; or it can run a government-approved certification programme that allows private companies to certify applications for identification. The exact technical standards are still being discussed at EU level, and only after these standards are published will the member states be able to start implementing their respective solutions. In doing so, they will have to pay attention to two aspects in particular: high security standards and a user experience that is as pleasant as possible. This is the only way to ensure that the solution is accepted by as many citizens as possible.

From our point of view, the introduction of the digital wallet is a necessary and right step to simplify the everyday life of citizens and to overcome digital identification borders within the EU. With the new regulation, it is possible to create an overarching infrastructure.

As one of the European market leaders in digital identification, IDnow is also working on its own technology for digital wallets. With the “IDnow Wallet”, data is stored securely on the user’s smartphone in compliance with data protection laws, and fingerprint or facial recognition (Touch ID or Face ID) is used to release the data for identification. The user retains control over his or her own data at all times; due to the zero-knowledge principle, it is not possible for IDnow, as the provider, to gain access to the user’s data. As an identification service certified by the Registration Authority according to §24 1 d) eIDAS, and as one of the largest providers of eIDs (the online ID function), IDnow is ideally positioned to provide all citizens and companies with an identity within the framework of a digital identity wallet in the future.


Fission

Tools for Thought Interchange: Part 2

June 3rd Tools for Thought Interchange Part 2 event, with full video, slides, and links for all presenters: Iian Neill, Frode Hegland, and Brian Rubinton.

The second edition of Tools for Thought Interchange took place on June 3rd – or June 4th if you were joining us from the other side of the international date line! And in fact, Iian Neill did just that from Australia, plus Frode Hegland calling from the other side of midnight in the UK, rounded out by Brian Rubinton in North America.

Fission: IPFS as global data commons

Boris kicked things off with why we at Fission care about the category of tools for thought, and offering the IPFS protocol as part of the answer to interchange: a global data commons.

Slide deck PDF »

For those that aren't familiar with it, InterPlanetary Linked Data (IPLD) is also worth digging into.
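The property that makes IPFS attractive as a global data commons is content addressing: data is referenced by a hash of its bytes, so the same content resolves to the same address no matter which tool or user stores it. The sketch below illustrates the principle only; a real IPFS CID uses multihash and multicodec encoding, not a bare SHA-256 digest:

```python
import hashlib

def content_address(data: bytes) -> str:
    # Toy stand-in for a CID: a plain SHA-256 digest of the content.
    return hashlib.sha256(data).hexdigest()

note_a = b"shared note from tool A"
note_b = b"shared note from tool A"  # same bytes, stored by a different tool

# Identical content yields the identical address, which is what lets
# independent tools for thought deduplicate and reference shared data.
assert content_address(note_a) == content_address(note_b)
print(content_address(note_a)[:12])
```

This is also why a shared web clipper publishing to IPFS maximizes re-use: any tool that clips the same page bytes ends up pointing at the same address.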

JeffG from Fission also briefly talked about an idea we're exploring: an open source browser extension to be used as a shared web clipper by any tool for thought, with content clipped and shared to IPFS for maximum re-use. If you're interested in #ProjectPachyderm, get in touch with JeffG @midijeffg.

As always, drop by the Fission Discord chat to find out more »

Part 2 Presentations

Here are the presentations, notes, and video from Part 2. Part 1 is also available on the blog here.

Iian Neill, Codex Editor & Standoff Editing

Iian Neill talked about Codex Editor and the concepts around it:

"We need a file system for thought" @codexeditor

Multiple apps, semantic artifacts that can be shared across each other. Something "like" an operating OS, or at least the desktop OS metaphor for knowledge workers.#ToolsForThought pic.twitter.com/h3Q3hRUdmm

— FISSION (@FISSIONcodes) June 4, 2021

The follow-up here with Iian, and what he has built with Codex, was meant as a complement to Blaine's talk on atJSON: the concept of standoff editing, where annotations don't just exist as inline markup but are in fact meant to be multiplayer, including with past and future versions of yourself.

"Liberate the user from the tyranny of embedded markup" was a particularly strong phrase!

Slides »
Standoff Properties Editor on Github
Main page for Codex Editor on Patreon

As well as his on camera presentation, Iian also prepared an extended demo to YouTube:

Frode Hegland, Visual Meta

Frode Hegland took us all the way back to the original Tools for Thought: Howard Rheingold's book by that name.

Visual Meta is what Frode demo'd for us, "BibTeX and a little JSON": a way to put metadata directly into documents, that is both human readable and machine readable.

Frode made an amazing case for PDF as the 1000 year format:

"I expect PDF to be around in 1000 years"

"I don't want my son to grow up with PDFs that are basically JPEGs"

Visual Meta is meant to add more info to PDFs. pic.twitter.com/JO2DGEf5kl

— FISSION (@FISSIONcodes) June 4, 2021

His sentiment about building for his son called back to our presentation with Tienson, creator of Logseq, who also mentioned building for his daughter.

Frode invites everyone to join the Future of Text community, as well as to submit work for inclusion in the forthcoming second volume.

Slides »
Augmented Text Company, supporting Visual Meta in Author & Reader, both apps available for macOS
Future of Text Community on Circle »
Future of Text Volume I

Brian Rubinton, Kanopi & RDF for interchange

Brian Rubinton joined us to talk about and demo Kanopi, his fully RDF-based note taking tool.

Crowded market, BUT "Almost everyone is still using digital paper. There is still a very long way to go"@brianru #ToolsForThoughts pic.twitter.com/wStIFLQXWf

— FISSION (@FISSIONcodes) June 4, 2021

This point about "digital paper" in the crowded note-taking market referred to the limited "meaning" embedded in those notes (much like Frode's PDFs-as-JPEGs comment), which was perhaps the theme that emerged from all three presentations: more meaning.

Slides »
Demo playlist on Youtube

Video

Even though we had one less presentation, we still ended up going for almost 2 hours.

We had many wonderful attendees keeping the chat active, including a number of European night owls. The chat log is once again checked into the TFTInterchange repo.

TFTInterchange Community

This two part series was organized by Boris and the Fission team. Tools for Thought are a recurring theme in our presentations, and we'd love to keep supporting Interchange with our Zoom webinar account and our time at the very least!

We'd like to get a more diverse group of people involved, and make this more of a community project, that bridges community discussion, open source code, and interoperable formats.

There seemed to be general interest in at least a monthly community event; given the timezone differences, perhaps we can stagger events and still have a presentation every two weeks or so.

Next steps:

Please share your thoughts and notes from Part 1 and Part 2: #ToolsForThought on Twitter, or add a link in the comments on this post.

Who wants to help organize ongoing talks? @shuyunzhang99 has already volunteered in this thread on Github. She's in Australia, which naturally may give us an organizing lead for that time zone.

What open source code and interchange can we work on together? Boris has proposed ePub to atJSON in the Github repo as a project; please add your own ideas.

Fission invites presenters most Thursdays on a broad set of tech topics. Sign up to get notified of future events »


Urbit

How to Buy a Star: a Guide

Vision

Buying a star can be a daunting prospect, even for those who have owned a planet for some time. A star is much more expensive than a planet, and it comes with new and confusing vocabulary. What's a naked star? Are some stars in lockup? Is mine in lockup? If it is in lockup, how does it get unlocked?

If the prospective buyer is not using an established website such as OpenSea, and is instead buying from an individual, these problems become much worse. What makes a star commodity grade? How can the buyer verify the status of the star?

This documentation will function as a Guide for the Perplexed. It will tell the buyer what to look for, and how to look for it, in entirely non-technical language. More daunting tasks, such as verifying the star's status in Hoon and/or on Ethereum, will be described clearly and plainly. By following this guide, the buyer will be able to verify the star's status, rather than merely trust the seller.
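As a taste of what the guide will cover: some status questions can already be answered arithmetically from the point number alone. In Azimuth's address space, galaxies are points 0-255, stars 256-65,535, and planets 65,536-4,294,967,295, and a point's sponsor is encoded in its low bits. The sketch below illustrates that arithmetic only; it is no substitute for verifying a star's actual status (lockup, spawn history, and so on) on Ethereum:

```python
# Illustrative sketch of Urbit's Azimuth point arithmetic (not a substitute
# for verifying a star's actual status on Ethereum).
GALAXY_MAX = 2**8 - 1    # galaxies: points 0..255
STAR_MAX = 2**16 - 1     # stars:    points 256..65535
PLANET_MAX = 2**32 - 1   # planets:  points 65536..4294967295

def point_class(point: int) -> str:
    if point <= GALAXY_MAX:
        return "galaxy"
    if point <= STAR_MAX:
        return "star"
    if point <= PLANET_MAX:
        return "planet"
    raise ValueError("not a valid Azimuth point")

def sponsor(point: int) -> int:
    """A star's sponsor galaxy (or a planet's sponsor star) lives in the
    point's low bits."""
    if point_class(point) == "star":
        return point & 0xFF    # low 8 bits -> parent galaxy
    if point_class(point) == "planet":
        return point & 0xFFFF  # low 16 bits -> parent star
    raise ValueError("galaxies have no sponsor in this sketch")

print(point_class(256), sponsor(256))  # star 0
```

So a buyer can immediately sanity-check that an offered point number really is a star, and which galaxy sponsors it, before moving on to the on-chain checks the guide will document.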

Design

The documentation will be entirely text. Since I know few of the answers to these questions, I will post in The Marketplace and other channels on Urbit to seek community input. I will gather knowledge from anyone willing to share, and compile an easy-to-read and well-sourced document that can be posted and shared anywhere useful.

Me

I'm a full-stack software engineer and former lawyer. I'm adept at navigating new domains and sharing what I've learned in an accessible and pleasant way.

Milestones

Documentation is merged: 1 star; 1-2 months, probably toward the end of July 2021

Monday, 07. June 2021

Elliptic

US Authorities Seize the Affiliate’s Share of the DarkSide Ransom Paid by Colonial Pipeline

The US Department of Justice and the FBI today announced that they had seized 63.7 BTC of the 75 BTC ransom paid to DarkSide by Colonial Pipeline. Elliptic’s analysis shows that this represents the bulk of the affiliate’s share of the ransom.



Infocert (IT)

Spotlight on InfoCert and GoSign, 9 June at 11:45, at the AWS Summit Online EMEA

The post Riflettori su InfoCert e GoSign, il 9 giugno alle 11:45, all’AWS Summit Online EMEA appeared first on InfoCert.

On 9 June 2021, between 11:45 and 12:15, InfoCert and GoSign will be in the spotlight at the AWS Summit Online EMEA, an event designed to help experts and developers connect, collaborate and deepen their knowledge of AWS (Amazon Web Services) services.

AWS Summit Online EMEA 2021

The free, virtual event brings together the cloud community from across Europe, the Middle East and Africa and, between 9 and 10 June 2021, will alternate training sessions, hands-on labs and the chance to take part in one-to-one meetings with AWS experts.

Wednesday, 9 June 2021 is Builders’ Day, dedicated to experts and developers, covering some of the hottest topics in cloud computing with deep dives into the main new features of AWS services.

Thursday, 10 June 2021 is Innovation Day, aimed at leaders looking for inspiration and innovative ideas to accelerate growth in their own companies.

How to evolve legacy solutions based on commercial databases and innovate rapidly with managed databases and open-source engines

InfoCert’s session takes place on 9 June 2021 at 11:45.

In the session titled “How to evolve legacy solutions based on commercial databases and innovate rapidly with managed databases and open-source engines”, Nicola Bustreo (Data Architect at InfoCert S.p.A.) and Paola Lorusso (Solutions Architect, Database Specialist at Amazon Web Services) will discuss the InfoCert applications migrated to AWS, in particular GoSign, and the gains in scalability, performance, infrastructure management and cost achieved by moving from legacy databases to an open-source solution built on managed databases.

They will also discuss how AWS’s purpose-built database offering allowed the InfoCert development team to choose the right solutions for specific requirements, and how the team was able to innovate and work autonomously on the development of new applications.

Registration and the full agenda for AWS Summit Online EMEA 2021: to attend the event, registration on the AWS website is required; the full agenda is available at this link.



Spherity

rfxcel integrates Spherity Credentialing Service to enable DSCSA Authorized Trading Partner…

rfxcel integrates Spherity Credentialing Service to enable DSCSA Authorized Trading Partner Compliance for its Customers

Beginning in June, rfxcel customers can use the service to confirm their authorized trading partner status in saleable return verification

Spherity, the German decentralized digital identity specialist, announces a partnership with rfxcel, a U.S. based global leader in digital supply chain traceability solutions.


This strategic partnership will see the integration of the Spherity Credentialing Service into rfxcel’s verification routing service software. With that integration in place, rfxcel can offer its customers a solution to establish a digital enterprise identity and credentialize their state license and FEI, becoming an electronically verifiable authorized trading partner (ATP) under the U.S. Drug Supply Chain Security Act (DSCSA). From June onwards, rfxcel customers can use the Spherity Credentialing Service to attach their authorized trading partner status to saleable return verifications of products.
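To make the mechanics concrete, a credential of this kind would broadly follow the W3C Verifiable Credentials data model, along the lines sketched below. Every identifier and the ATP-specific field names here are hypothetical placeholders, not the schema being standardized by the Open Credentialing Initiative:

```python
import json
from datetime import datetime, timezone

# Hypothetical shape of a DSCSA ATP credential, loosely following the W3C
# Verifiable Credentials data model; every identifier below is made up.
atp_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "ExampleATPCredential"],  # type name is hypothetical
    "issuer": "did:example:credential-issuer",
    "issuanceDate": datetime.now(timezone.utc).isoformat(),
    "credentialSubject": {
        "id": "did:example:trading-partner",
        # Hypothetical claim: the subject holds a valid license and is an ATP.
        "authorizedTradingPartner": True,
    },
    # A real credential also carries a cryptographic proof section, omitted here.
}

print(json.dumps(atp_credential, indent=2))
```

The point of the structure is that a counterparty in a saleable return verification can check the issuer's signature and the subject's claim without contacting the trading partner's licensing authority directly.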

Herb Wong, Vice President of Marketing & Strategic Initiatives at rfxcel, says: “The Spherity Credentialing Service is the most comprehensive effort to address the upcoming Authorized Trading Partner requirement for DSCSA. rfxcel was impressed to see how seamlessly it integrated with our solution.” The primary objective of the partnership is to deliver a secure solution to rfxcel’s customers, establishing trust in digital interactions between trading partners in pharmaceutical supply chains and ensuring compliance with the DSCSA.

To drive industry adoption of the provided solution, Spherity and rfxcel are both members of the Open Credentialing Initiative. The consortium is working on rolling out, standardizing and exploring credential based implementations along the pharmaceutical supply chain.

“rfxcel, as an innovation pioneer in the industry, and Spherity have been a good fit from the very beginning of the collaboration. We are now very proud that rfxcel is offering our new Spherity Credentialing Service to its customers, and we look forward to delivering further integrations to establish trust in supply chains,” says Dr. Carsten Stöcker, CEO at Spherity.

About rfxcel

Part of Antares Vision Group, rfxcel provides leading-edge software solutions to help companies build and manage their digital supply chain, lower costs, and protect their products and brand reputations. Blue-chip organizations in the life sciences (pharmaceuticals and medical devices), food and beverage, worldwide government, and consumer goods industries trust rfxcel’s signature Traceability System (rTS) to power end-to-end supply chain solutions in key areas such as track and trace, environmental monitoring, regulatory compliance, serialization, and visibility. Founded in 2003, the company is headquartered in the United States and has offices in the United Kingdom, the EU, Latin America, Russia, India, Japan, the Middle East, and the Asia-Pacific region.

About Spherity

Spherity is building decentralized digital identity management solutions to power the fourth industrial revolution, bringing secure identities to enterprises, machines, products, data and even algorithms. We provide the enabling technology to digitize and automate compliance processes primarily in highly-regulated technical sectors like pharmaceuticals, automotive and logistics. Spherity’s decentralized cloud identity wallet empowers cyber security, efficiency and data interoperability among digital value chains. Spherity is certified according to the information security standard ISO 27001.

Stay sphered by signing up for our newsletter, follow us on LinkedIn or Twitter.

Press Inquiries
For press relations contact:
Marius Goebel
communication@spherity.com

rfxcel integrates Spherity Credentialing Service to enable DSCSA Authorized Trading Partner… was originally published in Spherity on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Crypto Regulatory Affairs: UK Regulator Extends Its Crypto Registration Regime, Again

On June 3, the UK Financial Conduct Authority (FCA) announced a lengthy 9-month extension to its timeline for reviewing crypto registration applications.  



MyDEX

Can I trust you?


This is the second of two blogs on our new White Paper: Achieving Transformation At Scale. The first blog focused on the infrastructure challenge. This blog focuses on the parallel need for institutional innovation.

Sometimes, when a society confronts a new challenge, the institutions it previously relied on to provide solutions cannot cope. New, different institutions are needed. We think this is the case with personal data. Traditionally, our society has looked to two main types of institution to unleash social and economic opportunities: private sector firms focused on maximising their profits and state-owned enterprises. But as this blog explains, these types of institution cannot rise to the particular challenges posed by personal data. A different type of institution is needed, and thankfully we have one to hand: the Community Interest Company (CIC).

Many people are still not familiar with CICs, which often come across as a rather strange hybrid beast. CICs are:

- asset locked. This means any assets a CIC develops cannot be sold to another entity that is not also asset locked and equally committed to pursuing its community purpose. Mydex is not seeking a trade sale or ‘exit’: it is committed to continuing the operation and extension of its platform as permanent infrastructure to benefit its community (citizens).
- dividend capped. Only 35% of distributable profits can be returned to shareholders. The remaining 65% must be reinvested in furthering the community benefits for which the CIC was established.

Why has Mydex chosen this unfamiliar CIC status?

Mission logic

One simple explanation is that when Mydex was established back in 2007, its founders didn’t just want to sell a product in order to make money. They wanted to produce a service that brings benefits to people and communities and recognised they needed to make money in order to fund this service provision. Making money isn’t the purpose of the business. Benefiting the community by achieving its mission is the purpose, and making money is a means to achieving that goal.

A second reason is that we recognised that personal data throws up huge issues and challenges relating to trust. We reasoned as follows: If there is a lack of trust surrounding the collection and use of personal data, its full personal, social and economic potential will be hampered by mutual suspicion, power struggles and conflict and therefore never realised. A new type of institution that builds trustworthiness into what it does, and how, is needed for this challenge.

How CIC status helps us rise to this challenge is not immediately obvious but the logic is powerful. It’s worth exploring.

Economic logic

The unique and particular challenges (and opportunities) of personal data lie in the fact that unlike physical products and most services, data is a resource that can be used by many different people and organisations for many different purposes without ever getting ‘used up’. Because of this, it breaks the boundaries imposed by current notions of ‘private property’. Institutions organised around the notion of ownership of private property and profit maximisation are struggling to come to terms with a new world where value is created by data sharing.

This takes us to the first unique challenge for Mydex: the question “What makes an enterprise economically viable and successful?”

It’s commonly assumed that the acid test of an enterprise’s economic success is how much money it makes. But that relates to its financial success, not its economic success.

If you stop to think about it, organisations’ economic results always lie outside their traditional legal and accounting boundaries — in the value their products or services generate in being used by other parties (while how much money they make is an internal measure). So, for example, the economic (and social) value of electricity isn’t measured by the money electricity suppliers happen to make. It lies in all the many uses to which our society puts electricity.

This is true of all enterprises. The job of a car or phone maker is to provide people with cars or phones — that provide them with mobility and help them communicate: things they want to do in their lives. The job of a hospital is to treat the sick; the job of an orchestra to delight audiences with its music; the job of a local shop to make accessing the necessaries of life easy.

For most enterprises, this external economic impact is implicit. It’s not something they worry about too much, because they are focused on the internal measures that matter to them. But in the case of Mydex it needs to be made explicit, because the whole purpose of the organisation is the external value it helps to create: making data available to people (especially citizens) outside its traditional organisational boundaries, so that they can use this data for their own purposes. Adopting CIC status makes this external purpose explicit and focuses attention (of people both inside and outside the organisation) on this external purpose.

Financial logic

If it’s true that financial success is not the same as economic success, then how much money an organisation happens to make has got nothing to do with its external economic impact. If it’s a charity or public service, it could deliver huge economic benefits but not ‘make’ any money at all. If it’s a mafia syndicate, it could make huge amounts of money while its external economic impacts are 100% negative.

But to survive, an organisation needs to cover its costs, and how it does so matters.

If it’s a charity, it needs income from donations. If it’s a public service, it needs to be paid for by taxes. If it sells products or services, it needs customers willing to pay for them. If it needs external investment, it needs investors willing to invest.

Each of these approaches to funding has its advantages and disadvantages. At Mydex, we chose not to be a charity for two reasons. First, because with a focus like ours on personal data we feared that we would end up spending so much time and effort seeking benefactors and donations that this quest could end up diverting our attention away from actually delivering our mission. Second, this constant need to attract benefactors might place us under pressure to bend our mission to these benefactors’ whims.

Likewise, we don’t think an organisation with a mission like ours, relating to personal data, should rely directly on taxpayer funding for two reasons. First, there are immense risk and trust issues involved in the state having comprehensive citizen databases under its control. Second, taxpayer funded services often find themselves at the mercy of shifting political winds. That is why we chose to be a company that can cover its costs from what it sells: so that it isn’t dependent on external funders and remains free to make its own decisions. For us, this strategic independence is extremely important.

But does this mean we simply have to bend to the will of a different set of external parties, namely customers (e.g. the organisations who pay Mydex fees to connect to its platform) and investors?

It could, in theory. But we have designed our business model carefully to avoid this risk. We have designed our revenue streams so that we are financially incentivised to be a neutral enabler, not a ‘data monetiser’: we only sell connections to our platform to enable safe, easy, low cost data sharing; we don’t sell or rent any data itself. And the dividend cap for shareholders means the community always benefits from the lion’s share of any profits we happen to make.

Getting this balance right is crucial because of the extraordinary potential of the Mydex business model: the more it grows the more profitable it gets. Exponentially. As a platform business, Mydex’s core operating costs do not rise rapidly with volume use. But the more organisations that connect to this platform, the more revenues the platform generates. In other words, as Mydex grows its revenue streams grow much faster than its costs — which means that Mydex has the potential to become extremely profitable. By creating a legally-enforceable dividend cap that requires it to reinvest two thirds of any profits it makes in its community mission, CIC status ensures that Mydex’s external, economically beneficial community purpose always remains paramount.

(This has an important knock-on internal cultural impact. It means that in everything we do and how we do it, we focus all our efforts on doing those things that will continually improve the external value we generate — our community contribution — on ‘making’ rather than ‘taking’. It creates a discipline and a yardstick for decision-making.)

Strategic logic

But this external value focus creates a potential problem. Why should investors bother investing in a company that only returns a third of the profits it makes to them when they could in theory get all the profits?

The simple answer to this question is that a third of a very large sum is much bigger than all of a tiny sum.

There is a paradox to this answer which goes to the heart of Mydex’s CIC status. It relates to the two separate elements: the ‘very large sum’ and the ‘tiny sum’.

Let’s start with the very large sum. With its data sharing infrastructure Mydex is creating the equivalent of a mass production assembly line for the entire data economy. Henry Ford’s assembly lines reduced the costs of making motor cars by over 90%. They made a transformational product affordable to ordinary people, unleashed a tidal wave of product innovation and transformed the way societies and economies worked. Mydex is doing the same with personal data: slashing the costs of accessing and using it, making its benefits available to ordinary people, unleashing innovation and transforming the way our society and economy uses our data in the process. The potential scale of this business is enormous and global. It could generate very large sums of money.

What about the tiny sum? A year or two ago, the UK Treasury published a paper on the Economic Value of Data. It contained a crucial insight. It noted that “innovative uses of data can generate powerful positive externalities (i.e. benefits) that may not always accrue to the data creator or controller. This can result in data being under-exploited or under-shared.”

In other words, the Treasury was accepting that private companies focused only on profit maximisation have no reason to pursue the transformational external economic benefits we have been talking about: they have no reason to build the data sharing infrastructure that Mydex is building.

This means that without a new type of institution that prioritises the external benefits identified by Treasury they won’t ever happen. If we stick with existing institutions with their existing priorities, the very large sum becomes a tiny sum. Forever. Investors who insist on having access to all the profits will get close to none instead. The opportunity will be stillborn.

This goes deep into Mydex’s role. The UK Treasury was highlighting a strategic collective action problem. It would greatly benefit all parties if investment was made into the infrastructure that makes the safe, efficient, privacy protecting sharing of personal data possible, so that all its uses and potential can be realised. But it’s not in the immediate interests (or role) of any current data controllers (e.g. organisations collecting and using personal data) to take on the cost, risk or responsibility of making this investment.

Somebody else needs to take on this role. But this somebody has to be different. They cannot be focused on grabbing all the benefits for themselves. They have to be focused on creating external community value. And these priorities need to be baked into how they work. That’s what CIC status does: bake these purposes into the organisation’s legal structure. As a CIC Mydex is legally required to share financial rewards equitably. And, as a CIC, we are legally required to keep to the above promises in perpetuity (unlike venture capitalist funded outfits that are incentivised to make all the promises in the world and to break these promises once they have achieved a profitable ‘exit’).

To put it simply, to break the Treasury’s collective action problem and to fully unleash personal data’s ‘positive powerful externalities’ we need a new type of institution that every stakeholder can trust. Only a new, neutral, independent, non-threatening, non-competing, trustworthy body can break the collective action logjam and Mydex’s CIC status formally confirms and signals to everyone concerned that this is its positive, permanent, enabling role.

The final piece of the jigsaw

There is one more piece of this jigsaw that finally locks all the others into place. There are two main types of Community Interest Company: companies limited by guarantee and companies limited by shares.

Companies limited by guarantee are not-for-profit enterprises, while companies limited by shares can make a profit and distribute (some) dividends to shareholders. Mydex has chosen to be a company limited by shares. Why?

To some purists, CICs limited by shares are not ‘real’ community interest companies. The purists’ assumption is that if any shareholder stands to make any money out of investing in the company, then by definition they are extracting value from it and therefore exploiting the community that the company is supposed to be serving.

We don’t see it that way. Mydex is building nationwide infrastructure that will last for decades and building such infrastructure takes time (decades) and money. Which means it needs investment.

Most financial investors today adopt the role of cherry pickers. They search around for ripe fruit ready for the picking and invest in the cherry picking operation: big, quick, low-risk returns. Mydex is doing the opposite. We are planting and tending a cherry orchard from seedling to maturity — without which there will be no cherries to pick (the ‘positive externalities’ the Treasury was talking about). But this requires a different type of investor: one who ‘gets’ the mission and is prepared to be patient (because reaching the point where those increasing returns kick in will take many years).

Paying dividends (limited to one third of any profits made) to such investors is not exploitation of the community. It is paying for a service that makes a community benefit possible, just as paying staff to develop the software is not staff exploiting the community but making the community benefit possible. Once again, CIC status helps us to turn potential conflict into alignment to achieve a positive way forward.

Locust or bee?

Mydex’s mission — to empower every individual with their own data — is potentially global, long term and paradigm-changing: if successful it would put the modern data economy on a new and different footing.

In nature, we find many strategies for survival. There are predators and prey, parasites and hosts, foragers, scavengers, hunters. Locusts flourish by devouring the value other organisms have produced. Bees flourish by helping plants flourish — to flower and fruit.

Today’s personal data economy is dominated by locusts, intent on extracting as much profit as they can from individuals’ data. In choosing to be a Community Interest Company, Mydex brings together economic, financial and strategic logic to make it possible for the company to flourish as a bee, and therefore for the orchards to flourish and fruit too. We flourish by helping others to flourish. That, we believe, is why in the long term we will be successful. And that is why we are a Community Interest Company.

Can I trust you? was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


Okta

Everything You Ever Wanted to Know About Session Management in Node.js


Session management is a pretty scary term for a lot of developers. Most of the mechanics of session management are abstracted away from developers, to the point where they don’t properly learn about it until it’s necessary. Usually, this means a late night trying to figure out a vulnerability, a bug, or how to work with a new session management library. Hopefully, I can remove some of the magic behind session management in Node.js and give you a good fundamental place to start. In this article, you will learn what sessions are, how to manage them in Node.js, and some details that should help you minimize bugs and vulnerabilities.

Overview of Sessions

I will start at the beginning. When a user uses your web application, they make an HTTP request to the webserver. The server then knows what to do with this request and returns a response. The user’s request must contain all the necessary information for the server to make a decision. For example, is the user authenticated? Is he/she authorized to view a specific page? Does the user have a shopping cart that needs to be displayed? All this information that makes the user experience feel seamless, as though he/she is using one continuous application, must be transmitted from the client to the server. If the application were to transmit all this data back and forth on each request, you would introduce massive security and performance concerns. There are many ways to streamline this effort, but for the sake of this article, I will focus on cookie-based sessions.

When a user sends a request to your web application, they will add a session cookie to the request headers. If this is the first time a user has requested your site, the server will create a new session cookie and return it to the client. The session is given an ID that both the client and server can reference. This enables the application to split the session data between that which is stored on the client-side and that which is stored on the server. For example, the cookie may tell the application the session ID and that the user is authenticated. The server can use this information to access the user’s shopping cart in the server’s store.

Of course, any time an application sends information to the client, it could possibly end up in an enemy’s hands. The majority of your users aren’t necessarily bad actors, but bad actors are certainly looking for ways to exploit your application. Most security concerns should be addressed by you, the developer and maintainer of the application, and by the developer of whatever session management library you’re using.

Session Management Risks

As an aside, I should note that it’s generally better to use a well-established session management library than to roll your own. You will see a few examples of that later in this article. For now, let’s take a look at some common session security concerns.

Session Prediction

Each session has an associated session ID with it. It is very important that this session ID is properly randomized, such that an attacker cannot simply guess a few options and bypass any security associated with the session ID. Suppose your session IDs were sequential integers. An attacker could log in, create a new session, and see their session ID is 12345. The attacker could then try to pass the session ID 12344 or 12343 to the server in an attempt to hijack a session from another user.

Session Sniffing

In session sniffing, an attacker can use a sniffing application such as Wireshark or a proxy to capture network traffic between a client and server. As you’ve learned, that traffic will ultimately contain a request with a session cookie in it. This cookie will have the session ID which can then be hijacked.

Man-in-the-Middle Attacks

In a man-in-the-middle attack, an attacker sits between the web server and the client. The attacker can then pass requests from the client to the server and respond without detection from either. But along the way, the attacker has gained access to the session.

Client Side Attacks

There are many strategies for attacking the client. Some of the best known are Cross-Site Scripting, Trojans, and malicious JavaScript code. Sometimes it just takes some good old-fashioned social engineering to obtain session information from a client. The idea here is that the attacker will attempt to exploit the client itself to gain access to the cookie in the browser’s storage. For example, an attacker who can inject malicious JavaScript code could inject the following:

alert(document.cookie);

With that simple line of code, the attacker can now gain access to the cookie along with all the session goodies in it.

Good Session Management Practices

All of this is probably a little scary, but many people are working on the other side to help prevent these attacks or minimize their impacts. Most of these strategies are rolled into session management libraries, and any library that is continuously maintained should be up to date with the latest security enhancements. But it’s important for you, as a developer who takes security seriously, to understand what security should be in place.

Session Secret

Any good session management library should come with an option to change the session secret. It may go by slightly different names, but the idea is always the same: the secret is used to compute a keyed hash (an HMAC) over the session ID. This helps prevent session hijacking by ensuring the session cookie hasn’t been tampered with.

Session Expiration

Another good practice is to expire the session after some predetermined time. There are two ways to expire a session: (1) based on inactivity or (2) absolutely. When you base your expiration on inactivity, it will keep the session open until the user hasn’t made a request for some amount of time. When you choose to expire it absolutely, then the session will expire after some predetermined time, regardless of activity. The session will then need to be refreshed. You should at least set an inactivity session expiration so you don’t have stale and vulnerable sessions available for attack.
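The two policies can be sketched as a simple check. The timeout values and field names here are illustrative; a real session store tracks these timestamps for you:

```javascript
// (1) inactivity window and (2) absolute lifetime, both illustrative values.
const IDLE_TIMEOUT_MS = 30 * 60 * 1000;          // 30 minutes without a request
const ABSOLUTE_TIMEOUT_MS = 12 * 60 * 60 * 1000; // 12 hours regardless of activity

function isExpired(session, now = Date.now()) {
  const idle = now - session.lastActivityAt > IDLE_TIMEOUT_MS;
  const absolute = now - session.createdAt > ABSOLUTE_TIMEOUT_MS;
  return idle || absolute; // expire if either policy is violated
}
```

Combining both is common: the inactivity window limits exposure of abandoned sessions, while the absolute cap bounds how long even an actively-used (or actively-hijacked) session can live.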

Cookie Expiration

Similar to session expiration, you can also expire the cookie that was sent to the browser. Many times, cookies are set to expire when the session expires. However, it is possible to allow the cookie to remain valid indefinitely. This is a poor decision for the same reasons discussed under session expiration. Generally, session expiration is a strong tool for minimizing the impact of attacks, and implementing cookie expiration is also helpful.

Regeneration of Session After Login

When a user first accesses your site, he/she can use an anonymous session. This is a fairly common practice where you want to track a user’s movements in your application but don’t want to require them to log in. For example, consider a shopping site where an anonymous user can have a shopping cart but can’t check out until he/she logs in. In this case, there is still a session ID provided to the user. When the user does log in, you should regenerate the session ID to prevent session fixation attacks against your web application.

Session Management in Node.js

Node.js has become wildly popular over the last 10 years. With it, several frameworks and session management libraries have cropped up to support it. Most frameworks use their own session management middleware. For example, express, the most popular server framework for Node.js, has the accompanying express-session for session management. Similarly, koa uses koajs/session for its session management.

For this article, I want to focus on express-session, as it is one of the most mature and widely used session management libraries available for Node.js. For a full rundown of the express-session package, you can view the readme here. Below are some of the highlights:

Registering the middleware for express-session is very simple.

var session = require('express-session')
var app = express()

app.use(
  session({
    secret: 'SomeSuperLongHardToGuessSecretString',
    resave: true,
    saveUninitialized: false,
  })
);

This is the minimum you need to do to get express-session working in a development environment. As discussed, the secret is used in hashing the session ID, to ensure it isn’t tampered with between the client and the server. This should be some very long, complex string that is hard to guess. This string should be rotated periodically to ensure that if it was compromised, it doesn’t stay that way long. The secret can also take an array of secrets to make it even harder to guess.

The resave option forces the session to be saved back to the server store on each request, even if the session wasn’t modified. The saveUninitialized property forces a new, unmodified session to be saved. How best to set resave and saveUninitialized is left to your discretion. Generally, saveUninitialized: false is used to reduce session storage on the server for unauthenticated requests. resave defaults to true, but to reduce overhead you can set it to false if your store allows it.

Speaking of the store, you’ll notice this example doesn’t implement one. The default server-side session storage is MemoryStore. According to the documentation, it is purposely not designed for production. This means it’s fine to leave the above code this way in development, but you should implement a different store in production. Not doing so can result in memory leaks, and it does not scale past a single process. A list of compatible session stores can be found here.

The following example uses express-sessions (note sessions instead of session) as a server-side store using MongoDB:

app.use(
  session({
    secret: 'SomeSuperLongHardToGuessSecretString',
    resave: true,
    saveUninitialized: false,
    store: new (require('express-sessions'))({
      storage: 'mongodb',
      instance: mongoose,
      host: 'localhost',
      port: 27017,
      db: 'test',
      collection: 'sessions',
      expire: 86400
    })
  })
);

Now your application will use your MongoDB instance to store the server session data. Express-sessions also supports a Redis implementation. Furthermore, there are many libraries for supporting other databases and in-memory solutions. Your stack will likely dictate what package you need to use.

Coming back to the security aspect of sessions, you learned that you should expire the session and the cookie. The example below builds on our working example to do just that.

app.use(
  session({
    secret: 'SomeSuperLongHardToGuessSecretString',
    resave: true,
    saveUninitialized: false,
    store: new (require('express-sessions'))({
      storage: 'mongodb',
      instance: mongoose,
      host: 'localhost',
      port: 27017,
      db: 'test',
      collection: 'sessions',
      expire: 86400
    }),
    cookie: { maxAge: 2628000000 },
  })
);

In the above example, only the session ID is stored in the cookie, so you can set the cookie.maxAge value to expire the session and the cookie in one shot. There is also a cookie.expires option; however, it is recommended that you set maxAge instead.

Now from your routes, you should be able to access the session object from your request object. Earlier you learned that you should regenerate your session after the user logs in. Let’s take a look at that using express and the ExpressOIDC from Okta.

const oidc = new ExpressOIDC({
  issuer: {yourOktaDomain} + "/oauth2/default",
  client_id: {yourClientId},
  client_secret: {yourClientSecret},
  appBaseUrl: process.env.APP_BASE_URL,
  scope: "openid profile",
  routes: {
    login: {
      path: "/users/login",
    },
    callback: {
      path: "/authorization-code/callback",
    },
    loginCallback: {
      afterCallback: "/users/afterLogin",
    },
  },
});

app.use(oidc.router);

app.get("/users/afterLogin", ensureAuthenticated, (request, response, next) => {
  request.session.regenerate(function (err) {
    // will have a new session here
  });
});

In the above example, you are registering the ExpressOIDC middleware provided by Okta to handle the login. After a successful login, you redirect the user to the /users/afterLogin route, which then has access to the request object. Express-session has attached the session object to the request for you, and you can use the session API to call regenerate. This will create a new session for the logged-in user.

Learn More

Session management is a topic that you could spend days researching and understanding. As I noted, the session management package you will use in Node.js will largely depend on your stack and your server framework. However, by becoming familiar with implementing safe and optimized session management in one framework, you can carry that knowledge to all other Node.js frameworks.

Why JWTs Suck as Session Tokens
Build a Simple CRUD Application with Node and MySQL
Build A Simple Web App with Node and Postgres

Make sure you follow us on Twitter and subscribe to our YouTube channel. If you have any questions, or you want to share what tutorial you’d like to see next, please comment below.

Sunday, 06. June 2021

Identosphere Identity Highlights

Identosphere #35 • SSI for Europe • Covid Creds coming together • Issue DID+VC w Azure AD

Who's Hiring? Who's Funding? What Who's Who thinks about What and Why! Where and When are upcoming events? What recordings are available from previous events?

Welcome and Thanks to our Patrons! Read more and Subscribe • Support this publication + get Patron-only content!

Hiring

GIMLY IS HIRING - work closely with the UX lead and product owner in planning the development work

Indicio Hiring Interns - Test Engineer • Mobile Developer • Python Developer • Project Manager • Business Development Assistant to the CEO

Indicio Hiring Sr Devs - Sr Python Developer • Sr Mobile Developer

Provenance Managing Director • every great product should come with Provenance: accessible, trustworthy information about origin, journey and impact.

Coming Up

Paving the Way to a Safer Travel Experience - Heather Dahl & Scott Harris, Indicio.tech; Adrien Sanglier, SITA • June 8

Hyperledger Global Forum • June 8-10

Building a Hyperledger Indy Network – A Technical Overview • Lynn Bendixsen

Panel: Paving the Way to a Safer Travel Experience • Heather Dahl & Scott Harris; Adrien Sanglier, SITA

Panel: Start Simple to Scale Decentralized Identity • Heather Dahl & Kenneth Ebert, R. J. Reiser, Liquid Avatar Technologies

Panel: Self-Sovereign Identity for Economic Empowerment: Lessons from Africa • Anna Johnson, Trinsic; Thea Sommerseth Myhren, Diwala; Lohan Spies, DIDx; Fabian Portmann, Farmer Connect; Bryan Pon, Kiva

Identiverse 2021 • June 21-23 (Denver)

RFP from UN for SSI • RFP: Implementation of blockchain based self-sovereign identity (SSI) Solutions • United Nations Global Market Place

The United Nations International Computing Centre (UNICC) invites you to submit a proposal for the implementation of blockchain based self-sovereign identity (SSI) solutions. (DEADLINE 06/22)

Interesting and Thoughtful

The shadowy hierarchy

I remain curious about how I can make better or wiser decisions.  I am sharing this as part of my journey as I unpack my own boundaries and models that prevent me from making better decisions.  

Local-first software: You own your data, in spite of the cloud

include the ability to work offline and collaborate across multiple devices, while also improving the security, privacy, long-term preservation, and user control of data.

Apple vs (or plus) Adtech, Part II

To review… in Settings—> Privacy—> Tracking, is a single OFF/ON switch for “Allow Ads to Request to Track.” It is by default set to ON. 

101

A beginner's guide to self-sovereign identity (SSI)
19 FAQs on Verifiable Credentials and Self-Sovereign Identity
Do I Need a Verifiable Credential?
What is a DID? Part 1 • XSL Labs (also in French: Qu'est-ce qu'un DID ? Partie 1 • XSL Labs)
Defining Self-Sovereign Identity with Kaliya Young | Coding Over Cocktails Podcast
Adrian Doerk on the difference between identification, authentication and authorization

Identification: Who are you?

Authentication: Is it you again?

Authorization: What rights do I want to grant you?

Use-case Digital signatures — digital transformation in the real estate sector

The implementation of the electronic signature was a major development in management.

99% of our documents are signed digitally and we have eliminated many face-to-face meetings.

‘new normal’ in housing is digital, democratic and sustainable

how can governments, citizens and landlords of all types harness these opportunities and collaborate to shape housing of the future so that it works for everyone, everywhere? 

SSI Updates

Introducing the Evernym Mobile SDK

It’s through your feedback that we’ve been able to iterate, refine, and ultimately launch a product that will help make self-sovereign identity more accessible to all.

Anonyome Labs Listed in IAPP Privacy Tech Vendor Report 2021

joins 355 other qualified organizations in the 2021 compendium of privacy tech vendors. This is a significant increase from the 44 vendors listed in the inaugural report in 2017.

Canada’s Community of Digital Identity Leaders 100+ Members

more than ever before, our communities, our businesses, and our citizens are looking to the leaders within the DIACC to help deliver a robust, secure, trusted digital ID ecosystem that works for all Canadians.

The Future of Self-Sovereign Identity (SSI) (video)

We were joined by Drummond Reed and Alex Preukschat, co-authors of Manning Publication's new book 'Self-Sovereign Identity,' for a conversation on the book's development and recent release and what the future holds for SSI as a technology, architecture, and movement.

COVID Creds

Platform Architecture for Covid-19 Digital Passports

This highlights a couple of key vendor solutions:

Appii – Appii has developed their Health Passport, a service that verifies your identity through a selfie photo, is populated by recording your test result at one of their partner sites (e.g. Lloyds Pharmacy), and provides a digital certification.

Digi.me – Digi.me is a specialist in general data sharing services and has developed a number of apps that build on this capability, including a Covid-19 solution.

Health data must be private and secure by design, always

At digi.me, we practice what we preach, with privacy and security always core considerations for our health data capability as well as our Consentry health pass as they move forwards.

How can we make platform livelihoods better for young women, especially during and after COVID-19?

But who is the “we”? The research asks exactly that — who is the “we” that needs to make the platform work better for women?

How festival organisers can maximise Covid safety and eradicate ticket touts

Festival organisers will also need to do better at managing delays than other sectors. In recent weeks, we’ve seen Heathrow airport reporting delays of up to six hours. This would be catastrophic at a festival

Verify Vaccination Data w ZKP using ASP.Net Core and Mattr

verify a person's vaccination data, implemented in ASP.NET Core and MATTR. The ZKP BBS+ verifiable credentials are issued and stored on a digital wallet using a Self-Issued Identity Provider (SIOP) and Open ID Connect. […] only the required data is used and returned to the verification application.

JWTs done right: Quebec's proof of vaccination

my proof of vaccination finally arrived, and the result is… actually pretty okay. Still, there's always some fun to be had in zero-knowledge hacks

PocketCred Verifiable Credentials

We at Pravici have been working to build a digital pass that citizens can carry in their mobile device or digital card to prove that they have taken a test or vaccine. Our software application features user-friendly creation of schemas* and proof templates**, as well as QR code technology for credential issuance and verification.

Beyond the Basics

Issuing your own DIDs & VCs with Azure AD

Expert Q&A about SSI with Dr. Milly Perry and Martin Schäffner

In May, Dr. Milly Perry, blockchain expert and former research director at the Open University of Israel, spoke with Martin Schäffner, initiator of the SSI Working Group at the European Blockchain Association, at an Israeli Chamber of Information Technology webinar. (recording)

We asked both experts, which questions they would like to know their peers’ thoughts about. Here is their exchange about Verifiable Credentials, biometrics, pitfalls and barriers, NFTs, the role of governments and the thing that could make SSI obsolete.

Compare and Contrast — IRMA vs Verifiable Credentials

Data Watch

“Secure Platform” for Europe - a Trusted and Secure Foundation for a Human-Centric Digital World • MyData

solutions for creating an open and secure IT infrastructure where data privacy can always be guaranteed. […] written by esatus, founding member and lead of the “Secure Platform” working group, a thematic group within the IT Security Association Germany (TeleTrusT).

Schema.org is ten!

Schema.org was founded on the idea of making it easier and simpler for the ordinary, everyday sites that make up the web to use machine-readable data, and for that data to enable an ecosystem of applications used by millions of people. […] if we can all keep these founding concerns in mind as we improve, refine and curate our growing collection of schemas, we'll be doing our part to continue improving the web.

SSI for Europe

Digital Identity for all Europeans

Available to any EU citizen, resident, or business in the EU who wants to use it

Widely useable as a way of identification or to confirm certain personal attributes for the purpose of access to public and private digital services across the EU

Giving full control to users to choose which aspects of their identity, data and certificates they share with third parties, and keep track of such sharing

eSSIF-Lab Principles

Since parties are autonomous, their trust is highly subjective. As a consequence, the idea of having 'trusted registries', 'trusted issuers' that do not take this subjectivity into account basically act as (centralized) authorities, denying that parties are autonomous.

Commission proposes a trusted and secure Digital Identity for all Europeans

there is no requirement for Member States to develop a national digital ID and to make it interoperable with the ones of other Member States, which leads to high discrepancies between countries. The current proposal will address these shortcomings by improving the effectiveness of the framework and extending its benefits to the private sector and to mobile use.

TechCrunch Europe wants to go its own way on digital identity

Alongside today’s regulatory proposal they’ve put out a recommendation, inviting member states to “establish a common toolbox by September 2022 and to start the necessary preparatory work immediately” — with a goal of publishing the agreed toolbox in October 2022 and starting pilot projects (based on the agreed technical framework) sometime thereafter.

“This toolbox should include the technical architecture, standards and guidelines for best practices,” the commission adds, eliding the large cans of worms being firmly cracked open.

A trusted and secure European e-ID - Regulation

The legal instrument aims to provide, for cross-border use:

– access to highly secure and trustworthy electronic identity solutions,

– that public and private services can rely on trusted and secure digital identity solutions,

– that natural and legal persons are empowered to use digital identity solutions, 

– that these solutions are linked to a variety of attributes and allow for the targeted sharing of identity data limited to the needs of the specific service requested,

– acceptance of qualified trust services in the EU and equal conditions for their provision.

Beyond SSI

Holochain Gets Some Parts Upgrades

Tools for Thought Interchange: Part 1 • Fission Codes

also known as second brains, digital gardens, or simply personal note taking apps — […] The question we posed was: how do we work on interchange between these systems?

What does ‘Good’ Digital Identification look like?

Not Looking Good / Very Worrying Development

Internet Identity: The End of Usernames and Passwords (via centralized issuance of a number?)

The business models of identity

A post by Verim justifying their pay-to-play model for identity credentials, adding another layer of complication.

Thanks for Reading!

Read more \ Subscribe @ newsletter.identosphere.net

Support this publication @ patreon.com/identosphere


KuppingerCole

Analyst Chat #79: DNS and Privacy

Your DNS server knows what websites you use, what the name of your mail server is, and which corporate services you use while working from your home office. And there are even broader challenges when it comes to protecting sensitive personal data in that context. Alexei Balaganski and Matthias continue their conversation about a fundamental Internet resource, the Domain Name System, this time walking the fine line between technology and trust.




Spruce Systems

DIDKit v0.2.1 Now Available on Cargo

This week, we’re proud to announce the v0.2.1 release of DIDKit on Cargo, and DIDKit-WASM v0.1.7 on npm.

DIDKit is a cross-platform toolkit for working with W3C Verifiable Credentials and Decentralized Identifiers. It has SDKs in JavaScript, Java, Python, and more, and it even runs in the browser frontend. It supports a variety of cryptographic algorithms allowing it to seamlessly bridge trust across different blockchains and public key infrastructures, including traditional X.509 and TLS.

We wrote DIDKit in Rust due to its memory safety, expressive type system, and suitability across a variety of systems and environments. For example, the Rust ecosystem has already explored WASM compilation targets in support of single-page apps running in browsers, and we wanted to be able to support those and also browser extensions with DID and VC operations.

To try out DIDKit using the command line from a Rust-enabled environment, simply run the following:

$ cargo install didkit-cli
$ didkit --version
didkit-cli 0.1.0
$ didkit generate-ed25519-key | tee key.jwk
{"kty":"OKP","crv":"Ed25519","x":"g2rSWdI1YGbOTwkzQl2HbqZ-LJNS69p-GZdNdv6N7OU","d":"1mZOfhcQa3JG_P-jqyyAdVSaSluh9p1BwP0GATOtECI"}
$ didkit key-to-did key --key-path key.jwk
did:key:z6MkoJFvkSmF2bjqGbu3fBj71hTRmj9ScwVFoA99JGW2iTnt

Once you have DIDKit installed, you can run an example script here that will show you how you can issue, present, and verify a credential. For more details, please see our documentation here.

Follow us on Twitter

Follow us on LinkedIn

Saturday, 05. June 2021

Ocean Protocol

NFTs & IP 3: Combining ERC721 & ERC20

Representing base IP with ERC721 and fungible IP licenses with ERC20, to combine benefits of both

Summary

The goal of this series is to practically connect NFTs with Intellectual Property (IP) to help NFT creators and collectors. It uses the language of IP and focuses on Solidity code-level implementations.

Article 1 of this series focused on the ERC721 non-fungible token standard, and article 2 on ERC20 fungible token standard. This article links ERC721 & ERC20, and describes how combining them gets benefits of both.

1. Introduction

1.1 Scenarios

There are two big-picture scenarios on how base IP and sub-licenses relate.

1. Just base IP. This is quite simple. It's a single artifact and there are no sub-licenses. For example, an artist paints a painting and holds the copyright on it. That painting is sold to a collector, who then gets exclusive rights to it. The base IP started as copyright held by the painter, then was transferred to the collector.

2. Sub-licenses against the base IP. This has many varieties. Typically the sub-licenses are fungible. For example, an author writes a book and holds the copyright on it; he grants an exclusive license of this base IP to a publishing house. The house sells 10,000 physical books, each having its own fungible sub-license against the base IP. As a second revenue stream, it might also sell 20,000 e-books, each having its own fungible sub-license (a different license compared to the physical books).

1.2 Implementing Scenarios in Blockchain

Here’s how these scenarios might get implemented in blockchain.

For scenario 1 above, the Single-Edition ERC721 described in Article 1 is appropriate.

Scenario 2 is much richer, leading to several implementation possibilities. Article 1 described how Limited-Edition ERC721 could be used, and Article 2 described how ERC20 could be used. This article will describe a third approach that combines both. But first let’s do some comparison.

1.3 Comparing ERC721 and ERC20

Here are pros and cons for ERC721 for limited-edition / fungible IP.

ERC721 Benefits. First, the history of each unique token matters, and its provenance can be traced. Second, the user experience is decent, since ERC721 wallets explicitly hold ERC721 tokens and render images alongside them. Third, ERC721 markets have good price discovery for truly non-fungible assets, typically using auctions.

ERC721 Challenges. Limited-edition ERC721 assumes the tokens are non-fungible, even if the actual asset is fungible IP licenses. This makes the pricing problem harder, because it converts an apples-to-apples comparison into an apples-to-oranges one. It also can't use AMMs and other powerful ERC20 price discovery tools. Finally, limited-edition ERC721 doesn't model who owns the base IP at all, let alone make it easy to manage.

Here are pros and cons for ERC20 for Limited-edition / fungible IP.

ERC20 Benefits. First, tokens can go directly in AMMs, which allows automatic price discovery, liquidity, and curation. Second, tokens play well with the rest of ERC20 & DeFi infrastructure, from ERC20 wallets to bridges to loans. Third, base IP is represented by the ERC20.owner address, and therefore it can be transferred and managed.

ERC20 Challenges. While base IP can be transferred and managed, the tooling for doing so is poor. For example, wallets only store the private key for the ERC20.owner address; it doesn't show up as a token.

Here are the key takeaways. First, the benefits of one are roughly the challenges of the other. Second, neither does well in managing base IP: it's nonexistent for ERC721 and poor for ERC20. Yet managing base IP is extremely important! We want to make it easy for people to sell base IP, have multiple revenue streams against it, link it with physical objects, and more.

1.4 Combining ERC721 + ERC20: How

This section describes at a high level how to combine the benefits of ERC721 and ERC20, and better manage base IP.

The basic idea is to tokenize the base IP and tokenize sub-licenses against it. Let’s bring in specific token standards.

Base IP is non-fungible, so ERC721 is a good way to represent it. Fungible IP sub-licenses are best represented by the fungible token standard ERC20. And we can generalize this to multiple revenue streams against the base IP: some use ERC20, some may use ERC721, and others may use other standards or fully custom approaches [1]. A “good default” is simply ERC20.

1.5 Combining ERC721 + ERC20: Benefits

This leads to benefits at a high level, benefits from ERC721, and benefits from ERC20. Here are high-level benefits:

Simple mental model: non-fungible base IP → non-fungible token standard, and fungible IP licenses → fungible standard.

Enables multiple revenue streams against the base IP, with different sub-licenses. For example, a book might have a Kindle version, softcover print version, and hardcover print version [8].

Base IP inherits benefits of ERC721:

Direct wallet support for base IP: render images and transfer base IP, versus just holding the private key for ERC20.owner.

Sell base IP in marketplaces focusing on ERC721 NFTs, with corresponding price discovery tuned for ERC721, e.g. OpenSea.

Easier to link physical objects with base IP, e.g. WISeKey chips in paintings, linked directly to NFTs.

Tools to fractionalize ownership of base IP, e.g. the Niftex tool. This allows co-ownership of base IP, such as VitaDAO for drug discovery data.

Fungible IP sub-licenses inherit benefits of ERC20:

Fungible IP tokens can go directly in AMMs, for automatic price discovery, liquidity, and curation.

Fungible IP tokens play well with the rest of ERC20 & DeFi infrastructure, from ERC20 wallets to bridges to loans.

This combines the UX benefits of ERC721 with the fungibility benefits of ERC20. ERC20 doesn’t have to be limited-edition either; for example ERC20 tokens (licenses) could be minted on-the-fly.

At Ocean Protocol for V3, we built an ERC20-based IP framework. We’re now working on ERC721+ERC20 for Ocean V4.

The rest of this article gets more technical; it describes what ERC721+ERC20 code is doing from an IP perspective.

2. ERC20 + ERC721 Code & IP

2.1 Definitions

To Publish means to claim base IP, ready for sub-licensing of limited editions. The claim is via ERC721._safeMint() at a new tokenId with to=self, and metadata points to a T&C stating this is a claim of base IP. Metadata follows the ERC721Metadata extension. Then, use ERC20.constructor() and ERC20.mint(to=self, value=n licenses) to prepare for sub-licensing. The ERC20 can be linked to the ERC721 in one of many ways.

To Sub-license means to transfer one (of many) sub-licenses to a new licensee. This is implemented as ERC20.transfer(to=licensee, value=1.0) [6]. For legals, the T&C states that owning ≥1.0 tokens means having a sub-license with specific rights. Those sub-license rights are associated with the given ERC20 contract, not the ERC721 contract.

Base IP: represented by the tuple of {ERC721 deployment, tokenId}.

Base IP holder: represented by the owner of the address at ERC721.owners[tokenId].

Sub-licensee: any address holding ≥ 1.0 ERC20 tokens.

2.2 ERC20 + ERC721: Sequence Diagram

Here’s a worked example focusing on behavior. There are 3 limited editions.

Step ❶ is a Publish action. The publisher calls ERC721._safeMint() to claim the base IP, which sets ERC721._owner[tokenId=7] to addressP. Then the publisher calls ERC20.constructor() and ERC20.mint() to mint 3 sub-license tokens with addressP as initial owner; this sets ERC20.owner to addressP and ERC20.balances[addressP] to 3.0.

Step ❷, ❸ and ❹ only involve the ERC20 contract, and work exactly like Limited-Edition ERC20 sub-license steps.

Step ❺ transfers the base IP to a new exclusive license holder, by calling ERC721.safeTransferFrom() which changes ERC721.owners[tokenId=x] to a new address.
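The five steps above can be simulated end-to-end with plain dictionaries that mirror the two contracts' state. This is a minimal stdlib Python sketch, not Ocean's actual contracts; the function names and the addresses of the three sub-licensees are illustrative assumptions:

```python
# Minimal simulation of the ERC721 (base IP) + ERC20 (sub-license) state
# changes in steps 1-5. All names and addresses are hypothetical.

erc721_owners = {}   # tokenId -> owner address (the base IP holder)
erc20_balances = {}  # address -> sub-license token balance

def publish(publisher, token_id, editions):
    """Step 1: claim the base IP via the NFT, then mint sub-license tokens."""
    erc721_owners[token_id] = publisher          # models ERC721._safeMint()
    erc20_balances[publisher] = float(editions)  # models ERC20.mint()

def sublicense(sender, licensee, amount=1.0):
    """Steps 2-4: transfer one sub-license; only ERC20 state changes."""
    erc20_balances[sender] -= amount             # models ERC20.transfer()
    erc20_balances[licensee] = erc20_balances.get(licensee, 0.0) + amount

def transfer_base_ip(token_id, new_owner):
    """Step 5: ERC721.safeTransferFrom() moves the base IP itself."""
    erc721_owners[token_id] = new_owner

publish("addressP", 7, editions=3)
for buyer in ("address1", "address2", "address3"):
    sublicense("addressP", buyer)
transfer_base_ip(7, "address3")

print(erc721_owners[7])            # address3 now holds the base IP
print(erc20_balances["addressP"])  # publisher has sold all 3 editions
```

Note how steps 2–4 never touch `erc721_owners`, and step 5 never touches `erc20_balances`: the two token standards carry independent state, exactly as in the sequence diagram.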

ERC721+ERC20: UML Sequence Diagram

2.3 ERC20 + ERC721: Structure Diagram

Here’s the same worked example, showing system state after each action. The two images below illustrate.

When ❶ (Publish) is complete, the ERC721._owners[tokenId=7] value is addressP, meaning the publisher is the base IP holder. The ERC20._balances attribute holds one entry with key addressP and value 3.0, meaning the publisher holds three sub-licenses.

The ERC20 can link to the ERC721 in one of many ways. In this example, we riff on ERC20 Ownable mixin with an NftOwnable mixin, replacing ERC20.owner with ERC20.nftOwner that points to the NFT tuple {ERC721 address, tokenId}. Ability to mint ERC20 tokens and other permissions that Ownable gave to the address at ERC20.owner now go to the address at ERC721._owners[tokenId].
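The NftOwnable idea can be sketched in a few lines: instead of storing its own owner address, the license contract resolves its owner by asking the ERC721 registry who currently holds the linked tokenId. This is a stdlib Python sketch under that assumption; the class and method names are hypothetical, not Ocean's real code:

```python
# Sketch of an NftOwnable-style permission check: the ERC20-like contract's
# minting rights follow whoever currently owns the linked ERC721 token.

class Erc721Registry:
    def __init__(self):
        self.owners = {}  # tokenId -> owner address

class LicenseToken:
    """ERC20-like contract whose 'owner' is resolved through an NFT."""
    def __init__(self, registry, token_id):
        self.nft_owner = (registry, token_id)  # replaces Ownable's owner field
        self.balances = {}

    def owner(self):
        registry, token_id = self.nft_owner
        return registry.owners[token_id]       # look up the NFT holder

    def mint(self, caller, to, amount):
        if caller != self.owner():
            raise PermissionError("only the base IP holder may mint")
        self.balances[to] = self.balances.get(to, 0.0) + amount

reg = Erc721Registry()
reg.owners[7] = "addressP"
lic = LicenseToken(reg, token_id=7)
lic.mint("addressP", "addressP", 3.0)  # the publisher can mint

reg.owners[7] = "address3"             # base IP transferred (as in step 5)
lic.mint("address3", "address3", 1.0)  # minting rights moved automatically
```

The design choice this illustrates: because the permission check dereferences the NFT on every call, transferring the ERC721 token transfers all ERC20 admin rights in the same transaction, with no separate ownership handoff.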

Steps ❷, ❸ and ❹ only involve the ERC20 contract. They only change ERC20._balances variables just like Limited-Edition ERC20 sub-license steps. ERC721 state is not changed.

ERC721+ERC20: UML Structure Diagram, steps 1–4

When ❺ is complete, base IP has been transferred. ERC721.owners[tokenId=7] now holds the value address3 (not shown). The address3 owner now controls ERC20 minting and other permissions.

ERC721+ERC20: Structure Diagram, step 5

3. Conclusion

Article 1 of this series focused on the ERC721 non-fungible token standard, and article 2 on ERC20 fungible token standard.

This article connected IP to ERC721+ERC20 Solidity implementations, and described how combining them yields the benefits of both ERC721 and ERC20.

4. Appendix: Related Approaches

4.1 NFT Vaults

This is another approach to representing base IP & fractional licenses. Start with the ERC20 setup of article 2, but put the private key that owns the ERC20.owner attribute into an NFT “vault” using NFwallets, Charged Particles, or emblem.vault.

This seems simple at first glance. However, each ERC20’s owner would be represented by a different NFT, which causes ambiguity if only one of those NFTs is transferred. Also, the workflow to publish is clumsier. Finally, the overall mental model is more complex, which hinders adoptability.

4.2 Fractional ownership of ERC721

Base IP can have fractional ownership. Here’s a traditional example: a company holds a patent, and that company has 10 shareholders. Each shareholder therefore has fractional ownership of the base IP.

Standards like EIP1633, EIP2981 and EIP3601, and tools like Niftex allow one to shard ownership of an ERC721 token, where ERC20 tokens represent fractions of ownership of the ERC721 token. That's highly useful for sharding ownership of base IP, including the setup in this article, where base IP is represented by ERC721.

The main goal of this article is to contemplate representing ERC20 tokens as fungible IP licenses against ERC721 base IP.

In short, there are (at least) two complementary ways to use ERC20 in relation to ERC721: ERC20 to shard an ERC721 token, and ERC20 for fungible IP licenses against base ERC721 IP. This article focuses on the latter.

Interestingly, by using ERC20 for fungible IP licenses, we get fractional ownership of these IP licenses (if decimals > 0, which is usually the case).

Acknowledgements

Article 1 of this series acknowledges the many people that have helped towards the whole series, including this article. Thank you again!

Notes

[1] Having base IP represented as ERC721 tokens is general enough to allow monetization beyond basic ERC20 IP sub-license datatokens. For example, have Superfluid ERC20 datatokens for streaming payments in C2D or subscriptions. Or, have ERC721 IP sub-licenses. Basically anything, since they just point to the ERC721.

NFTs & IP 3: Combining ERC721 & ERC20 was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Urbit

Azimuth’s First Contract Upgrade

Recently, the first upgrade to the Azimuth smart contracts which make up Urbit ID was put up to an upgrade proposal vote by the Galactic Senate. This vote is still open and will last for 30 days (June 20th) or until an absolute majority is reached. We’d like to take this opportunity to discuss two things. First, what the proposed changes are, and second, a review of how Urbit governance works and where we’re at in the decentralization process.

Before we dive into what the proposed changes are, let’s review what the Senate is actually capable of changing. Urbit ID is really two sets of smart contracts: Azimuth and Ecliptic. Azimuth is the data of the public key infrastructure - roughly, this is the list of ships and which Ethereum addresses own them, along with other data such as networking keys and sponsorship status. The Senate has no ability to touch this data directly. This is in direct contrast to all existing centralized services, where your account is always at risk of being taken away from you. What the Senate can change is Ecliptic, which is the “business logic” that decides how you can interact with the data in Azimuth. These are mechanisms such as what powers various proxies have, how stars/planets are released over time, and how sponsorship works. Put another way, the data and database format must remain the same, but the rules by which we interact with it may change according to the governance rules we detail below.

Changelist

The Galactic Senate is voting on the following changes to Ecliptic:

Fixed ERC721 compatibility
Self-modifying proxies
Upgraded Claims contract

Fixed ERC721 compatibility

In Ecliptic, the contract includes various functions and events that make it conform to the ERC721 (non-fungible token) standard. However, two events have ever-so-slightly non-conforming descriptions. This causes Ethereum explorers like Etherscan to incorrectly recognize Azimuth as ERC20 (fungible token), rather than ERC721.

The proposed change simply modifies the event descriptions to accurately match the ERC721 definition. This does not affect any functionality within the contract itself.

Self-modifying proxies

Owners of Azimuth assets are allowed to configure "proxy addresses" for those assets: Ethereum addresses allowed to act "as" the owner, but only for a subset of operations. For example, setting the management proxy will let you change your networking keys and sponsor using that address, just like your ownership address can. This is useful for keeping the ownership keys in very cold storage while still letting you perform lower-value operations.

Currently, only the ownership address is allowed to change proxy addresses. Fairly early on after Azimuth's deployment, we realized it might be nice for proxies to be able to change themselves.

This means that, in addition to being able to configure networking keys and sponsorship, your management proxy would be able to assign a new management proxy. This would allow you, or a trusted third party holding your management proxy, to rotate your proxy keys without needing to take your ownership keys out of cold storage.
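As a rough model of the change, the new proxy-rotation rule can be sketched like this (stdlib Python; a deliberate simplification of Ecliptic's actual logic, with hypothetical names):

```python
# Simplified model of Azimuth proxy permissions. Under the proposed change,
# the management proxy itself (not only the owner) may assign a new
# management proxy, so the ownership key can stay in cold storage.

class Point:
    def __init__(self, owner, management_proxy=None):
        self.owner = owner
        self.management_proxy = management_proxy

    def set_management_proxy(self, caller, new_proxy):
        # Old rule: only the owner. New rule: owner OR the current proxy.
        if caller not in (self.owner, self.management_proxy):
            raise PermissionError("caller may not change the management proxy")
        self.management_proxy = new_proxy

ship = Point(owner="cold-storage-key", management_proxy="hot-key-1")
ship.set_management_proxy("hot-key-1", "hot-key-2")  # proxy rotates itself
```

In this model, a compromised or retiring hot key can hand off to a fresh one without the cold-storage owner key ever being used, which is the convenience the upgrade is after.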

Upgraded Claims contract

The Claims contract lets asset managers associate various "claims" with their identity. For example, an Ethereum address to send donations to, or proving ownership of a Twitter account.

For ease of use by off-chain tools and services, the contract emits events (notifications) whenever any claims are updated. However, the current version of the contract has a bug where it does not emit events when all claims are removed at once (clearClaims()). This "gap" in the event stream makes it much more difficult to write off-chain tools around this contract. Considering Claims has seen practically no use since it was first deployed, it should not be a problem to simply start over fresh with a new contract that contains the fix for the described bug.
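The bug matters because off-chain indexers reconstruct contract state purely by replaying events. A minimal stdlib Python sketch (hypothetical names, not the real Claims contract) of why a state change without an event leaves an indexer out of sync:

```python
# Sketch of the Claims event-gap bug: an off-chain indexer mirrors contract
# state by replaying the event log, so any state change that emits no event
# desynchronizes the on-chain and off-chain views.

events = []  # models the on-chain event log

class Claims:
    def __init__(self):
        self.claims = {}

    def add_claim(self, who, claim):
        self.claims.setdefault(who, []).append(claim)
        events.append(("ClaimAdded", who, claim))  # event emitted as expected

    def clear_claims_buggy(self, who):
        self.claims.pop(who, None)  # state changes...
        # ...but no event is emitted: this is the gap being fixed.

contract = Claims()
contract.add_claim("~zod", "twitter:@zod")
contract.clear_claims_buggy("~zod")

# An indexer replaying the log still believes the claim exists:
indexed = {}
for name, who, claim in events:
    if name == "ClaimAdded":
        indexed.setdefault(who, []).append(claim)

print(contract.claims.get("~zod"))  # None: cleared on-chain
print(indexed.get("~zod"))          # stale off-chain view of the claim
```

Starting over with a fixed contract restores the invariant that replaying the event stream always reproduces current state.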

The reason we must do an Ecliptic upgrade to switch to a new Claims contract is that it is tied in with the logic for transferring an asset. If an asset is flagged for "reset" (common when transferring to/from someone other than yourself), its configured proxies, networking keys and claims are cleared during transfer. As such, Ecliptic will need to know the address of the new Claims contract.

Urbit Governance

The ultimate goal of Urbit is to become a digital republic manifested as a peer to peer network owned and controlled by its users. The goal of Tlon, then, is to become one of many companies who build products for Urbit rather than being the primary driving force behind its development. We’d like to take this opportunity to spell out our perspective on where we’re at in that process. Here are some previous posts that are relevant, but please note that some of them are several years old and as such may not accurately reflect our current position, but still serve as useful historical markers. We hope to revisit and refresh these documents soon.

2016.5.16 - Interim Constitution
2016.5.16 - The Urbit Address Space
2016.6.24 - The DAO as a Lesson in Decentralized Governance
2019.1.11 - Governance of urbit.org

Galactic Senate

The Galactic Senate is composed of all galaxy holders, which at present consists of more than 100 individuals and a few organizations, including Tlon and the Urbit Foundation. Using the Azimuth voting contract, the Senate can present and vote on two kinds of proposals: document proposals and upgrade proposals. Thus far, all matters which the Senate has voted on have been document proposals. The most recent vote declared the Urbit network as (i) being secure (as confirmed by a third party audit) and (ii) having reached continuity (as in, no further network breaches are expected). Previous votes were to declare that Azimuth is live and that Arvo is stable.

Address space distribution

Perhaps the most informative measure of how decentralized Urbit is comes from how many independent parties hold address space. By the very nature of Urbit, it is impossible to know this with great accuracy (a common feature of decentralized projects). However, the movement of galaxies is closely watched and thus we have a fairly good idea of how distributed they are.

In the beginning, Urbit’s creator Curtis Yarvin was in possession of the entire address space - all 256 galaxies, and everything underneath it. A network of one is no network at all, and so over the last decade Urbit’s development has been primarily driven by selling or giving away these galaxies. On June 1, 2016, the allocation was:

95, to the Tlon Corporation.
50, to urbit.org, the future community foundation.
40, to Tlon employees and their family members (24 to Curtis, who started in 2002; 16 to everyone else, who started in 2014).
34, to outside investors in Tlon.
37, to 33 other individuals, who donated to the project, contributed code or services, won a contest, or were just in the right place at the right time.

Since then, Tlon has sold a number of its galaxies to fund development and others have changed hands to the point that Tlon and urbit.org no longer possess a majority share of galaxies. In January 2019, Curtis gave all of his galaxies to Tlon when he left the project. In August 2020 we shared an update on the known distribution of address space, where Curtis’ galaxies are marked as Tlon’s “naked galaxies”. Shortly thereafter, Tlon disbursed its naked galaxies among the employees who wanted one and did not already possess one, thus removing Tlon and urbit.org’s controlling share of the Senate. Galaxies held by current and former Tlon employees are entirely independent of Tlon Corporation—they may do with them, and vote with them, as they please. ~ravmel-ropdyl plans to give more color to this decision in the near future.

Distribution of stars is much more difficult to know. There is an active market for stars on OpenSea, so we know that they are changing hands frequently, but checking the number of distinct Ethereum addresses that hold stars does not tell you very much, since a single person can control multiple addresses.

Software and smart contracts

As was written in The DAO as a Lesson in Decentralized Governance, we are keenly aware of the threat of decentralization theater. Urbit has made steady progress towards decentralization, and with the developments we outline below, it’s never been clearer that the project is beginning to leave the cradle of Tlon. It’s also important to note that the decentralization of a project is always a movement away from an initial centralized state—you cannot go from A to Z without first traversing every letter in between. While Urbit was centralized at the beginning, the journey towards a network owned and controlled by the users is now well underway.

In the past, nearly all Urbit software was written by Tlon. While Tlon is still the only corporation actively updating Urbit’s MIT-licensed open source software, namely Vere (the runtime), and Arvo (the kernel), unaffiliated individuals have been making enormous contributions to Urbit over the past couple of years via the grants program. Recent examples include the Bitcoin node and wallet integration, WebRTC integration, Port, an Urbit installer and ship manager, and |fuse, an important primitive for 3rd party software distribution.

The only real power Tlon holds over Urbit is the ability to push OTA updates via ~zod, and suggest that people download our binaries. To the best of our knowledge, all extant galaxies retrieve their OTA updates from ~zod and forward them to their stars, who forward them to their planets. This software distribution route is merely a convenient default setting. Ships initially retrieve OTA updates from their sponsor, but they have full authority to retrieve updates from whomever they want with the |ota command, or even to only install software updates by hand if they so choose.
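This default flow, where updates cascade down the sponsorship tree unless a ship overrides its source, can be sketched as a toy model (the class and field names here are illustrative, not actual Urbit internals):

```python
class Ship:
    """Toy model of OTA propagation down the sponsorship tree."""

    def __init__(self, name, sponsor=None):
        self.name = name
        self.sponsor = sponsor    # default update source
        self.override = None      # alternate source, as one would set with |ota
        self.children = []        # ships this one sponsors
        self.version = 0
        if sponsor:
            sponsor.children.append(self)

    def update_source(self):
        return self.override or self.sponsor

    def push_update(self, version):
        if version <= self.version:
            return
        self.version = version
        # only ships still pointing at us as their source follow along
        for child in self.children:
            if child.update_source() is self:
                child.push_update(version)
```

A planet that sets an override simply stops receiving pushes from its sponsor, which mirrors how redirecting a ship’s update source detaches it from the default route.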

In the future, one can easily imagine many organizations shipping their own distributions of Urbit from their own urbit nodes, somewhat analogously to the large number of Linux distributions which exist today. There is nothing we can do to stop a group from forking Urbit, declaring ~sampel-palnet the new ~zod (both for updates and for sponsorship purposes), and retrieving all OTA updates via that route and building their own binaries. This is all by design, and we wouldn’t want it any other way.

The use of the Azimuth smart contracts is also somewhat optional. By changing a few lines of code, users could point their ships at another set of PKI smart contracts, rid themselves of hierarchical peer discovery entirely, and make use of some other routing strategy such as Kademlia-based DHT routing. This would likely give rise to an entirely separate network. The fact that the network is fully capable of such a schism helps keep the Senate honest and accountable to the users. If they ever were to vote on a contract modification that greatly angered or upset a portion of the user base, they would be inviting such a schism and be entirely deserving of it. In that sense, the power to control the network is already held by the users; they just have not yet had a reason to make use of it.
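For a sense of what that alternative would look like: Kademlia routes by the XOR of node IDs rather than by any sponsorship hierarchy. A minimal sketch of the metric (illustrative Python, unrelated to Urbit’s actual code):

```python
def xor_distance(a: int, b: int) -> int:
    """Kademlia's metric: the XOR of two node IDs, read as an integer."""
    return a ^ b

def closest_peers(target: int, peers: list, k: int = 3) -> list:
    """The k peers closest to `target` under the XOR metric; in Kademlia,
    lookups and storage are routed toward these nodes."""
    return sorted(peers, key=lambda p: xor_distance(target, p))[:k]
```

Peer discovery then needs no hierarchy at all: any node can route toward any address by repeatedly asking the closest peers it currently knows.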

Legitimacy

The only means by which Tlon and the Senate hold any power is legitimacy. Vitalik Buterin recently wrote a very informative article on the role of legitimacy in decentralized governance, and I do not know of a better source to understand how Tlon and the Senate are kept in check by the users. Besides the DAO split resulting in Ethereum and Ethereum Classic mentioned in the article, perhaps the most prominent example of the power users have over a decentralized network is the saga of Steem and Justin Sun. The short version of the story is that Justin Sun bought a controlling share of the Steem blockchain and attempted to exert control over it, and the users banded together to fork the blockchain into one called Hive, which was identical to Steem except with Sun’s token balance set to zero. An analogous outcome for Urbit would be that the Senate makes sweeping changes to the PKI and invites a schism like that outlined above.

For these reasons, there are strong disincentives for any one organization or individual to acquire too many galaxies. If galaxy ownership becomes more centralized, individuals will be less incentivized to build on the network, which will reduce the value of address space. Furthermore, such centralization might provoke a fork if such centralized powers are perceived as malicious by the users. This is in contrast to real-life republics, where there is little risk for one party to accumulate as much power as they can manage, since the cost to the citizens to overthrow that power is enormous. Thus, only especially egregious displays of power ever result in rebellion. Within the Urbit republic, users need only change the code they run to leave the Senate powerless, and so the threat of rebellion is much more real. We estimate that within a digital republic like Urbit, the centrifugal forces outweigh the centripetal forces, and that the outcome of this is more stable than the reverse.

Why have governance at all?

An important question the reader may still have lingering is: why does Urbit need governance at all? The short answer is: (i) there will always be necessary protocol upgrades that cannot be foreseen, and (ii) a total lack of governance inevitably gives rise to a shadow government with undefined powers that is almost impossible to hold accountable.

With (ii), we see this to be the case in many early blockchain projects, where power is shared by the developers and the miners, but what each is capable of doing has never been codified, and thus users of the blockchain can never have a great deal of certainty about what changes might occur in the future. This doesn’t guarantee a bad outcome, but it does create more wariness among developers when they cannot know whether they really have a say in the project's direction. Recognizing this issue, many newer blockchain and blockchain-adjacent projects have built-in governance mechanisms that lay out exactly what an individual can expect should they join and contribute to that network.

As for (i), with Urbit being a wholly new kind of network, it would take incredible luck to get the PKI exactly right on the very first try without ever needing another modification. Thus, some mechanism was needed to guarantee that the PKI’s logic could be altered. Leaving control of Azimuth in the hands of Tlon would have just granted Tlon dictatorial power, which neither Tlon nor Urbit users desired. There’s certainly room to argue that the Galactic Senate is not a perfect system: our interim constitution did outline potential roles for stars and planets to play in governance in the future, so there is ample opportunity for the system to evolve in response to the will of the users. As always, it is important to keep in mind the unfinished nature of the Urbit of today and the still moldable clay of the Urbit of tomorrow—nothing yet is set in stone.

Friday, 04. June 2021

Holochain

Holochain Gets Some Parts Upgrades

Holochain Dev Pulse 97

A lot of what I've shared recently has been about Holo hosting -- getting all the bits really robust so they handle both 'happy path' user interactions (everything is working as expected) and 'sad path' ones (HoloPort goes down, login fails, etc). But I haven't spent much time talking about how Holochain itself is improving -- and when I have, it's usually been about bugfixes to support the aforementioned Holo work.

But behind the scenes, Holochain has been getting some big upgrades to some of its parts. This is mainly to make it faster -- fast enough for the next stages of Elemental Chat testing. (And of course, Elemental Chat testing is really about Holo hosting and Holochain.)

LMDB to SQLite: the road to infinite scalability

A lot of peer-to-peer networks, from decentralised social networks like Secure Scuttlebutt to blockchains like Bitcoin, are built on something called a 'gossip protocol'. It's a simple protocol, and it works like it sounds like it would work: I know a bunch of people (or rather, computers), and when one of my friends shares something with me I share it with everyone else I know. With this protocol, messages spread rapidly through networks without a central authority.

And it also creates a lot of network traffic. That might be fine for blockchains, in which every machine is online most of the time and needs to know about everything anyway. But for networks that power everyday applications, you want to be respectful of your users' bandwidth -- and you can't expect their machines to be on all the time either. (Even blockchains tweak gossip to make it more efficient.)
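The relay behaviour described above fits in a few lines (illustrative Python, not Holochain's implementation):

```python
class Node:
    """A gossip participant: relays each message to every known peer once."""

    def __init__(self, name):
        self.name = name
        self.peers = []    # nodes this one knows about
        self.seen = set()  # message IDs already relayed

    def receive(self, msg_id, payload):
        if msg_id in self.seen:
            return                       # already relayed; stop the flood here
        self.seen.add(msg_id)
        for peer in self.peers:          # tell everyone we know
            peer.receive(msg_id, payload)
```

Even in this toy version the cost is visible: every node hears every message once per peer that knows it, which is exactly the bandwidth problem noted above.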

Holochain is based on the idea of 'neighbourhoods' of authority -- each piece of data gets an address, and it's only sent to a group of machines that have taken responsibility for the range of addresses that covers it. That means traditional gossip is too chatty -- as an authority, you need to constantly make decisions about what to share with your neighbours, given that they aren't responsible for the exact same range that you are.
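In other words, placement is a function of the data's address and each machine's claimed range. A toy model of that idea (illustrative Python with hypothetical names, not Holochain's API):

```python
import hashlib

def address(data: bytes, bits: int = 32) -> int:
    """Content address: a hash of the data, truncated to a fixed address space."""
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, "big") % (1 << bits)

class Authority:
    """A machine responsible for one contiguous arc of the address space."""

    def __init__(self, lo: int, hi: int):
        self.lo, self.hi = lo, hi
        self.store = {}

    def covers(self, addr: int) -> bool:
        return self.lo <= addr <= self.hi

def publish(data: bytes, authorities) -> int:
    """Send data only to the machines whose range covers its address."""
    addr = address(data)
    for authority in authorities:
        if authority.covers(addr):
            authority.store[addr] = data
    return addr
```

Because only the covering neighbourhood stores each entry, two authorities gossiping with each other only need to discuss the overlap of their ranges rather than the whole network's data.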

It gets more complicated when machines are turning on and off all the time (which happens a lot when they're just users' devices). A machine that's been