Last Update 3:53 PM January 25, 2022 (UTC)

Company Feeds | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Tuesday, 25. January 2022

FindBiometrics

Keesing Integrates Mobai Face Verification Into Onboarding System


Keesing Technologies has teamed up with Mobai AS to deliver a more comprehensive remote onboarding solution. To that end, Keesing will be integrating Mobai’s Face Verification technology into its own AuthentiScan onboarding platform.

The result is a simple onboarding solution that can verify someone’s identity in either an in-person or online situation. To use it, individual end users are asked to take a selfie and a picture of a photo ID. Keesing’s document verification tech will confirm that the document is legitimate, comparing the new image to a reference database that includes identity documents from virtually every country.

Once that process is complete, Mobai’s facial recognition technology will compare the selfie to the image on that document to make sure that they are a match. Mobai’s solution comes with a Presentation Attack Detection Service (PAD) to prevent spoofing and mitigate the risk of fraud.
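The flow described above — document check, presentation attack detection, then face comparison — can be sketched as follows. This is a minimal illustration with hypothetical function names; Keesing and Mobai do not publish these APIs, and the stub checks merely stand in for the real services.

```python
from dataclasses import dataclass

# Stub checks that stand in for the vendors' real services (all hypothetical).
def check_document(id_photo: bytes) -> bool:
    # Real system: compare the document against a reference database
    # covering identity documents from virtually every country.
    return id_photo.startswith(b"VALID")

def run_pad(selfie: bytes) -> bool:
    # Real system: Presentation Attack Detection (PAD) to catch spoofing.
    return not selfie.startswith(b"SPOOF")

def compare_faces(selfie: bytes, id_photo: bytes) -> bool:
    # Real system: face-embedding comparison; here a toy suffix match.
    return selfie[-4:] == id_photo[-4:]

@dataclass
class OnboardingResult:
    document_valid: bool
    liveness_passed: bool
    face_match: bool

    @property
    def verified(self) -> bool:
        # Identity is confirmed only if every check passes.
        return self.document_valid and self.liveness_passed and self.face_match

def onboard(selfie: bytes, id_photo: bytes) -> OnboardingResult:
    document_valid = check_document(id_photo)
    liveness_passed = run_pad(selfie)
    # Only run the face match on selfies that pass liveness detection.
    face_match = compare_faces(selfie, id_photo) if liveness_passed else False
    return OnboardingResult(document_valid, liveness_passed, face_match)
```

The key design point is that the face match only counts when the selfie has first passed presentation attack detection.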

Future versions of AuthentiScan could also take advantage of Mobai’s Morphing Attack Service to help protect people’s personal information. Mobai suggested that the platform’s ease of use makes it ideal for anyone who struggles with passwords. Both companies suggested that the improved AuthentiScan can help meet the growing demand for contactless identity verification offerings. AuthentiScan itself is compliant with the latest Know Your Customer regulations.

“We know how crucial a secure and smooth onboarding process is for users, and that a trusted means of identity verification is key in this respect,” said Keesing Managing Director Jan Lindeman. “Mobai is known for providing its customers with safe and accessible biometric services and we are very pleased with the addition of Mobai biometrics to our solutions.”

Keesing first added facial recognition capabilities to the AuthentiScan portfolio back in 2019. The company was also one of several identity providers that offered free identity services to emergency responders in the early stages of the COVID-19 pandemic.

(Originally posted on Mobile ID World)



auth0

Application Session Management

Let's see how to maintain application sessions in different scenarios

FindBiometrics

Paravision Takes Top Spot in Latest NIST Evaluation


Paravision is celebrating its strong performance in NIST’s most recent Face Recognition Vendor Test (FRVT). The 1:N test evaluated the ability of different facial recognition algorithms to match a newly captured image to one in an existing database.

In that regard, Paravision proved to be the most accurate solution in the world in the Visa/Border test, with a 0.3 percent False Positive Identification Rate and an even better False Negative Identification Rate of only 0.22 percent. Paravision’s overall error rate was 18 percent lower than that of its closest competitor.
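For readers unfamiliar with the metrics: in a 1:N test, the False Positive Identification Rate (FPIR) is the share of searches by people not in the database that wrongly return a candidate, while the False Negative Identification Rate (FNIR) is the share of searches by enrolled people that fail to find them. A toy calculation with made-up counts (not NIST’s data) shows how the quoted figures relate:

```python
def fpir(false_positives: int, non_mate_searches: int) -> float:
    """Fraction of searches by unenrolled subjects that wrongly return a hit."""
    return false_positives / non_mate_searches

def fnir(false_negatives: int, mate_searches: int) -> float:
    """Fraction of searches by enrolled subjects that fail to find them."""
    return false_negatives / mate_searches

# Toy counts chosen to reproduce the rates quoted above (not NIST's data).
print(fpir(30, 10_000))   # 0.003  -> 0.3 percent
print(fnir(22, 10_000))   # 0.0022 -> 0.22 percent

# An "18 percent lower error rate" is a relative comparison:
ours = 0.0022
competitor = 0.0022 / (1 - 0.18)   # back-solved for illustration
print(round(1 - ours / competitor, 2))   # 0.18
```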

Paravision also took the top spot in the self-service immigration kiosk category. Both the Visa/Border test and the self-service kiosk test ask algorithms to match faces in challenging scenarios. For example, the Visa/Border test captures faces at odd angles in high-throughput areas (like airports) with poor spatial resolution and contrast, while the kiosks often capture faces from above and generate images that may be poorly cropped if the person using the kiosk is either very short or very tall. In both cases, those new images are compared to high-quality originals like those on passports and driver’s licenses.

The two tests were performed with a dataset of 1,600,000 images. Paravision was the only US vendor in NIST’s top ten, and dropped its error rate by 70 percent in the last year alone. The algorithm submitted for the test is Paravision’s 5th generation facial recognition solution, which will be available commercially sometime in the spring of 2022.

“Paravision is now independently recognized as the most accurate face recognition platform in the world and the only U.S. company in the top ten,” said Paravision CEO Doug Aley. “Given the scale of legacy providers and the massive funding and resources going specifically to Chinese AI firms, the ranking is a huge testament to our approach.”

The enhanced Paravision algorithm will offer better support for mobile devices and edge facial recognition applications. The company’s top overall score improves on its 2021 performance, when it was only the top-ranked Western facial recognition developer in the 1:N evaluation.

January 25, 2022 – by Eric Weiss



Police and Privacy Activists Prepare for Surveillance Showdown in India’s Courts


A pending court case is reigniting the debate about the right to privacy in India. S Q Masood’s petition against the Telangana police first hit the headlines earlier this month, and stems from an incident that occurred in May of 2021.

In his complaint, Masood alleges that he was stopped by the police in Hyderabad without cause, and that the police forced him to remove his mask before taking his picture without his consent. He characterizes the incident as a violation of his fundamental right to privacy, which was guaranteed by the Indian Supreme Court back in 2017.

However, that ruling does not seem to have deterred law enforcement. The use of face-based surveillance technology has exploded in Telangana in the past few years, to the point that Amnesty International, the Internet Freedom Foundation (IFF), and Article 19 listed the state as the most surveilled place in the entire world. There are now more than 600,000 CCTV cameras scattered across Telangana, most of which are located in the capital of Hyderabad.

Civil rights groups like the IFF (which is backing Masood’s case) argue that that rapidly expanding network violates India’s constitution, and infringes on people’s civil liberties. In that regard, Masood’s case could be particularly telling. Activists are worried that police could use facial recognition technology to monitor, profile, and discriminate against minority groups like Muslims, Dalits, and transgender people, in much the same way that they allege that the police profiled and harassed Masood on the streets last May. Police in New Delhi also used facial recognition to arrest more than 1,100 protesters in early 2020.

In New Delhi, faces were cross-referenced with driver’s licenses and other official databases, a fact that raised additional concerns about data security. India does not yet have a meaningful data protection law, and the one that has been proposed creates broad exceptions for national security agencies. That means that Indian citizens do not know how their personal data will be used, or potentially used against them in a surveillance state. The federal government is moving forward with plans for a national surveillance system despite those objections.

With that in mind, the upcoming case could end up being a major showdown between privacy advocates and Indian law enforcement. The former groups are hoping that the court case will raise public awareness about the issue, and force the government to slow the expansion of its network. The IFF has asked for a three-year facial recognition ban to give lawmakers more time to craft meaningful guidelines for the ethical use of the technology.

Source: Al Jazeera

January 25, 2022 – by Eric Weiss



KuppingerCole

OneWelcome Customer Identity and B2B identity


by John Tolbert

OneWelcome is the combined brand after the merger of iWelcome and Onegini, two leading Identity as a Service providers that are based in the Netherlands. OneWelcome offers a full-featured IDaaS for B2C use cases and B2B relationship management. As a European-headquartered identity service provider, OneWelcome is uniquely attuned to the business and regulatory compliance requirements of GDPR. OneWelcome has extensive privacy and consent management features and data residency compliance.

Infocert (IT)

The “SPID in ogni dove” tour gets underway: InfoCert, through its partner network, promotes the social initiative conceived by the non-profit association MigliorAttivaMente


The project is a travelling social initiative whose main goal is the free, assisted activation of digital identity for citizens, with the aim of guiding into the digital dimension those who might otherwise be excluded from it.

Following the global pandemic that began in February 2020, the need to use digital tools to make up for the impossibility of carrying out the most common administrative procedures, or of dealing with the public administration in person, sharply increased the need for citizens and businesses to equip themselves with tools such as SPID. Many citizens, however, risk being cut off from the digital world, since access to these resources requires a minimum level of familiarity with technology and devices that not everyone possesses.

With the aim of leaving no one behind, the social promotion association MigliorAttivaMente was founded to help citizens “come aboard” the country’s digital dimension, so that they can use the services offered by companies and the public administration online, without wasting time and resources. The idea was born during the pandemic; the association itself came to life in 2021, already has numerous social initiatives to its credit, and is preparing to take further important steps.

The “SPID in ogni dove” project is an important initiative aimed at equipping users who are less familiar with technology with their own digital identity. It will unfold over the stages of a tour in which InfoCert is involved as the provider for the free, assisted activation of the service. This will be made possible by the commitment of the InfoCert partners Gedea, Studio Eureka, TeamService and Unappa, which will provide the staff responsible for activating the digital identities, while MigliorAttivaMente will coordinate the tour stops and ensure that as many users in difficulty as possible have the opportunity to access the service.

The tour cities have already been chosen (Roma, Milano, Napoli, Palermo, Verona, Brescia, Reggio Emilia, Perugia, Rimini, Salerno, Monza, Trento, Pescara, Cagliari), and the tour will begin in spring 2022. MigliorAttivaMente has already started recruiting the volunteers who will handle citizen reception, while InfoCert, through its partners, will provide the identification officer and, of course, the SPID service. The venues in each city are still being selected and will be announced through InfoCert’s and MigliorAttivaMente’s channels as soon as they are confirmed.

To learn more

https://www.migliorattivamente.org/



FindBiometrics

Display Expert Doubts Return of Touch ID for This Year’s iPhones


A recent Apple rumor speculates that the tech giant will not in fact be implementing an in-display Touch ID sensor in this year’s upcoming iPhone 14 series.

The report — courtesy of Cult of Mac — traces back to a Twitter thread from noted display industry analyst Ross Young, in which he stated his belief that the 2022 edition of the iPhone will come with a hole-punch display. In the thread, Young replied to a question about an in-display fingerprint sensor by saying he doesn’t see Touch ID making its way into this year’s lineup.

If true, this could come as a major disappointment for iPhone users who have been waiting for Apple to go multimodal with its biometric authentication while also providing an alternative to Face ID.

If and when Apple finally decides to add fingerprint authentication back to its flagship phones, it could be playing catch-up with Google’s Pixel lineup.

A November 2021 report revealed that Google planned on including Face Unlock — its answer to Face ID — in last year’s Pixel 6 release, but instead chose to go with just an in-display fingerprint sensor for biometric authentication. In Google’s case, however, it is suggested that Face Unlock can be added back to the Pixel phones at a later date via a software update.

Rumors of Apple moving to in-display Touch ID have been around for the better part of a year now, with many having predicted the feature would be included in the iPhone 13, which launched in the Fall of 2021.

Though this is still a rumor, Cult of Mac is careful to point out that Ross’s previous predictions regarding Apple products have been deemed 100% accurate thus far by website AppleTrack.

Source: Cult of Mac

(Originally posted on Mobile ID World)



Civic

Liquid Meta Partners with Civic Technologies to Bring Capital Liquidity to Permissioned dApps


Partnership will responsibly facilitate access to DeFi for global institutional investors and enterprises

TORONTO, January 25, 2022 – Liquid Meta Capital Holdings Ltd. (the “Company” or “Liquid Meta”) (NEO:LIQD), a leading decentralized finance (DeFi) infrastructure and technology company, today announced a partnership with Civic Technologies, Inc. (“Civic”), a leading innovator in blockchain-powered digital identity solutions. Together, the companies will bring trusted, secure permissioned identity services to DeFi and enable Liquid Meta to provide capital liquidity to permissioned dApps. This will promote the overall adoption of DeFi.

“Liquid Meta’s mission is to build a bridge between Traditional Finance and Decentralized Finance,” said Jonathan Wiesblatt, CEO.  “Our partnership with Civic Technologies will provide us with the enhanced capabilities and tools needed to expand upon this mission and to create an environment for traditional institutional investors to access this nascent and emerging industry in a more secure and trusted capacity.” 

The partnership is timely, given the growing interest in DeFi from institutional investors. At its core, DeFi relies on permissionless access, but this principle often runs counter to the obligations of large capital investors, which must know the counterparties they transact with in order to reduce counterparty risk. Permissioned access through the partnership will provide a gateway into DeFi for these investors. Liquid Meta is already providing liquidity to some of the most exciting decentralized applications and exchanges in DeFi.

As the first publicly traded, pure-play liquidity mining operation in the world, Liquid Meta is building proprietary software and tools to access, automate and unlock tremendous growth within DeFi. The Company is focused on generating cash flow in the fast-growing DeFi segment of the blockchain industry.

“The team at Liquid Meta has a long history in the financial services industry, public capital markets and the crypto and technology industries. The leadership team is attuned to the operational rigor that institutional investors require in order to explore investments in new categories like DeFi,” said Chris Hart, CEO of Civic. “We can’t think of a better partner that’s more aligned with our mission to open the DeFi ecosystem to new investors through permissioned markets.”

Civic provides a KYC and KYB solution, through Civic Pass, that a dApp provider can use as an input to their compliance program. Liquid Meta will use Civic Pass to determine which participants meet a dApp’s rigorous standards for verification prior to allowing them the ability to trade. Civic Pass is seamlessly integrated into a dApp’s onboarding flow. Once an institutional investor has completed the Civic Pass screening process, the dApp will use the results of the screening to allow trading.
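The gating pattern described here — screen once, then allow trading only for approved participants — can be sketched as below. The class and method names are illustrative, not Civic’s actual API; in production, the screening result would come from the Civic Pass KYC/KYB process rather than being set directly.

```python
from dataclasses import dataclass, field

@dataclass
class PermissionedPool:
    """Toy permissioned dApp pool that only lets screened participants trade."""
    approved: set = field(default_factory=set)

    def record_screening(self, wallet: str, passed: bool) -> None:
        # Hypothetical: in the real flow this result comes from the
        # Civic Pass screening process integrated into onboarding.
        if passed:
            self.approved.add(wallet)
        else:
            self.approved.discard(wallet)

    def trade(self, wallet: str, amount: float) -> str:
        # Trading is gated on a prior, successful screening.
        if wallet not in self.approved:
            raise PermissionError(f"{wallet} has not passed screening")
        return f"executed trade of {amount} for {wallet}"
```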

For more information about Liquid Meta, visit https://liquidmeta.io/.

For more information about Civic or to become a Civic business partner, visit https://www.civic.com

About Liquid Meta

Liquid Meta is a decentralized finance infrastructure and technology company that is powering the next generation of open-access protocols and applications. The Company is creating the bridge between traditional and decentralized finance while ushering in a new era of financial infrastructure that benefits anyone, anywhere.  

To learn more visit: Website | LinkedIn | Twitter

About Civic

Civic is a leader in the decentralized identity space, focused on real-world applications of blockchain-powered identity verification technology. Civic offers solutions for DeFi users to selectively share personal information, while also providing cryptocurrency projects a path to meeting global compliance standards. By delivering adaptable identity verification solutions for businesses and intuitive digital identity tools to end-users, Civic supports the long-term viability of the greater cryptocurrency economy. Civic uses identity.com’s open-source, blockchain-based ecosystem to verify credentials. Civic aims to be the most trusted Metaverse, Web3, DeFi on-chain identity platform in the world, used by billions every day. Vinny Lingham and Jonathan Smith co-founded Civic in 2015.

Cautionary Notice

Neo Exchange has not reviewed or approved this press release for the adequacy or accuracy of its contents.

Notice Regarding Forward-Looking Information:

This news release contains “forward-looking information” and “forward-looking statements” (collectively, “forward-looking statements”) within the meaning of the applicable Canadian securities legislation. All statements, other than statements of historical fact, are forward-looking statements and are based on expectations, estimates and projections as at the date of this news release. Any statement that involves discussions with respect to predictions, expectations, beliefs, plans, projections, objectives, assumptions, future events or performance (often but not always using phrases such as “expects”, or “does not expect”, “is expected”, “anticipates” or “does not anticipate”, “plans”, “budget”, “scheduled”, “forecasts”, “estimates”, “believes” or “intends” or variations of such words and phrases or stating that certain actions, events or results “may” or “could”, “would”, “might” or “will” be taken to occur or be achieved) are not statements of historical fact and may be forward-looking statements. 

Forward-looking statements involve known and unknown risks, uncertainties and other factors which may cause the actual results, performance or achievements of Liquid Meta to be materially different from any future results, performance or achievements expressed or implied by the forward-looking statements. Factors that could cause actual results to differ materially from those anticipated in these forward-looking statements are described under the caption “Risks and Uncertainties” in the Company’s Filing Statement dated as of December 17, 2021 which is available for view on SEDAR at www.sedar.com. Forward-looking statements contained herein are made as of the date of this press release and Liquid Meta disclaims, other than as required by law, any obligation to update any forward-looking statements whether as a result of new information, results, future events, circumstances, or if management’s estimates or opinions should change, or otherwise. There can be no assurance that forward-looking statements will prove to be accurate, as actual results and future events could differ materially from those anticipated in such statements. Accordingly, the reader is cautioned not to place undue reliance on forward-looking statements.

Liquid Meta’s operations could be significantly adversely affected by the effects of a widespread global outbreak of a contagious disease, including the recent outbreak of illness caused by COVID-19. It is not possible to accurately predict the impact COVID-19 will have on operations and the ability of others to meet their obligations, including uncertainties relating to the ultimate geographic spread of the virus, the severity of the disease, the duration of the outbreak, and the length of travel and quarantine restrictions imposed by governments of affected countries. In addition, a significant outbreak of contagious diseases in the human population could result in a widespread health crisis that could adversely affect the economies and financial markets of many countries, resulting in an economic downturn that could further affect operations and the ability to finance its operations.

Further information

For further information regarding Liquid Meta, please contact:

James Bowen, CFA

Liquid Meta Investor Relations 

416-519-9442

investors@liquidmeta.io 



Spherity

Spherity Credentialing Service Now Available on SAP® Store

By integrating with the SAP Information Collaboration Hub, Spherity’s solution allows pharmaceutical supply chain trading partners to share and verify enterprise license status data securely in compliance with U.S. life science market regulations

Spherity announced today that its Spherity Credentialing Service is now available on SAP® Store, the online marketplace for SAP and partner offerings. The solution integrates with the SAP Information Collaboration Hub and offers a service that enables U.S. pharmaceutical supply chain trading partners to easily comply with the DSCSA 2023 Authorized Trading Partner requirements. This supports the FDA’s objective of protecting patient safety. With the solution integrated with SAP’s verification routing service feature, pharmaceutical companies can be assured that they only interact with trading partners that are properly authorized.
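Conceptually, the authorized-trading-partner check amounts to verifying at interaction time that the counterparty holds a valid credential attesting to its license status. The sketch below uses hypothetical names; the real system exchanges W3C Verifiable Credentials through SAP’s verification routing service rather than plain Python objects.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TradingPartnerCredential:
    """Toy stand-in for an Authorized Trading Partner (ATP) credential."""
    holder: str
    license_status: str    # e.g. "valid" under U.S. state or federal law
    expires: date
    issuer_trusted: bool   # would be a cryptographic signature check in reality

def is_authorized(cred: TradingPartnerCredential, today: date) -> bool:
    # A product-verification request is answered only if the requester's
    # ATP credential is trusted, current, and shows a valid license.
    return (cred.issuer_trusted
            and cred.license_status == "valid"
            and cred.expires >= today)
```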


“During a two-year pilot, test and validation phase with SAP, industry pioneers and our partners in the Open Credentialing Initiative, we co-developed an interoperable solution that can be connected with verification routing services,” said Georg Jürgens, Manager Industry Solutions at Spherity. “Spherity Credentialing Service is one of the first products on the market to provide electronic interactions with a real-time compliance check of the counterparty’s authorized trading partner status under U.S. state or federal law.”

Spherity Credentialing Service has a lean onboarding path for businesses using SAP solutions and requires no additional technical integration. “All I had to do is sign up with Spherity and tell my key account at SAP that I want to use credentialing for product verifications,” said David Mason, Supply Chain Compliance and Serialization Lead at Novartis.

SAP Store, found at store.sap.com, delivers a simplified and connected digital customer experience for finding, trying, buying and renewing more than 1,800 solutions from SAP and its partners. There, customers can find the SAP solutions and SAP-validated solutions they need to grow their business. And for each purchase made through SAP Store, SAP will plant a tree.

Spherity is a partner in the SAP PartnerEdge® program. The SAP PartnerEdge program provides the enablement tools, benefits and support to facilitate building high-quality, disruptive applications focused on specific business needs — quickly and cost-effectively.

About Spherity

Spherity is a German software provider, bringing secure and decentralized identity management solutions to enterprises, machines, products, data and even algorithms. Spherity provides the enabling technology to digitalize and automate compliance processes in highly-regulated technical sectors. Spherity’s products empower cyber security, efficiency and data interoperability among digital value chains. Spherity is certified according to the information security standard ISO 27001.

SAP and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP SE in Germany and other countries. Please see https://www.sap.com/copyright for additional trademark information and notices. All other product and service names mentioned are the trademarks of their respective companies.

For more information, press only:

Marius Goebel

Manager Marketing & Sales, CISO

marius.goebel@spherity.com



Okta

Boost Your Productivity Using Okta CLI with Fig


CLIs are great. I love the speed and productivity increases I get when using a CLI, but memorizing commands – especially when commands need arguments, options, flags, and so on – can be daunting. Luckily, there are tools available for CLI fans out there, and one tool I’ve been enjoying is Fig.

Fig powers up your CLI productivity

Fig adds autocompletion to supported terminals, which makes using CLIs so much easier. Using Git? You’ll see a small window pop up with different commands and options.

It’s a cool way to reinforce remembering Git command-line operations. But most of us use Git on a daily basis, so we probably have a good command… of our Git commands. 😳 Hilarious pun aside, the branch names in the context menu are pretty darn helpful.

Boosting the power of Okta CLI

How about when it’s a CLI that you might not use daily? This is where the autocompletion shines and really helps you power up!

We’re pleased to announce that Okta CLI autocompletion is available in Fig! When you use a supported terminal, you’ll see hints to help you navigate through the commands and flags, so managing your Okta applications has never been easier.

The autocomplete context menu lists all the commands available and shows you which command requires an argument. The okta start command allows you to pass in a name, and you can see the optional argument at a glance.

Managing Okta apps

You may need to create Okta apps and get app configurations. When you need to create an Okta app, the Okta CLI helps you step through the process. However, if you are an Okta power user, you may be fine creating the Okta app in one shot by passing in the app settings. In this case, the Okta CLI autocomplete is pretty handy as it shows you all the options you can pass in!

Help is on the way!

Getting more detailed help information is easier too! The autocomplete context menu helps you navigate to the help output for a command. And yup, you can get help about the help command. Pretty meta, right?

Supported terminals for Okta CLI using Fig

Now that you’re excited, here’s the fine print. Fig currently only supports macOS. 😰 And only certain terminals and shells within macOS at that. Read Fig’s FAQ for supported terminals and shells, as well as information on upcoming work to support other platforms and more terminal/shell integrations. Don’t forget to get on the waiting list for your OS or shell, or submit an issue so Fig knows how to prioritize future work.

Learn more

I hope this autocomplete feature helps you! I’d love to hear how you use Okta CLI and what tools you use to power up your terminal commands. We welcome issues and PRs in the okta-cli GitHub repo too.

If you want to learn more about Okta, Okta CLI, security practices, and tooling, check out these posts:

Introducing Okta CLI
Developers Guide to GPG and YubiKey
Use Okta like a Rockstar
The Development Environment of the Future

Follow us on Twitter and subscribe to our YouTube Channel so you never miss any of our awesome content!


Identosphere Identity Highlights

Identosphere 66 • Transatlantic Interop • ENISA Reports on Digital ID • Capability-based Data Security • Ceramic

A weekly digest of upcoming events, company news, organizational updates, development in open standards and the latest headlines in self-sovereign identity.
Thanks to Patrons like you. Consider supporting this publication with a monthly payment via Patreon.

…or reach out to Kaliya, and she can send you an invoice

If you have a “budget” as part of your employment, spend some of it on us; we can send you an invoice ; )

Read previous issues and Subscribe : newsletter.identosphere.net

Content Submissions: newsletter [at] identosphere [dot] net

Upcoming

How To: Own Your Identity 1/25 UFOstart (IAMX)
Speaking: Markus Sabadello, Michael Shae, Tim Brückmann, Tim Heidfeld

Identity and Security Meetup Sydney (video) (Auth0 sponsored) Next meeting 1/25

Ally Haire talks about identity using blockchain, and Vandana Verma tells us how to stay secure with third-party dependencies!

Hyperledger Indy Technical Deep Dive 2/3

Data Space Launchpad MyData • Until 2/22

Decentralized Autonomous Organizations to revolutionize the way we work? Alex Puig • Caelum Labs 2/24

Build using Decentralized Identities with Microsoft

Build an app that uses Verifiable Credentials • 01/25

Setting up your DID infrastructure • 01/25

Hackathon: Let’s build something cool with DID • 01/27

Class on Digital Identity starting up at University of South Florida EME 4390: Digital Identity USF

Students examine identity in a digital world. Topics covered include identifying digital identity, managing digital identities, examining marginalized digital identities, and the governing of digital identities. This is primarily a project-based course.

Transatlantic Interop Transatlantic SSI Interop Markus Sabadello

The "Transatlantic SSI Interop" experiment was successfully conducted to demonstrate interoperability between the EU EBSI Diploma use case, and the US SVIP Permanent Resident Card use case. This was jointly planned and executed by EU partner Danube Tech and US partner Digital Bazaar.

SSI Interop Video NGIatlantic.eu

Results from an interoperability project in the area of Decentralized Identity, conducted by EU company Danube Tech and US company Digital Bazaar.

Hiring Senior React Native Mobile Software Engineer Indicio

The ideal candidate is a self-motivated multi-tasker and a demonstrated team player. You will be a lead developer responsible for the development of new React Native mobile (iOS and Android) apps in the Decentralized Identity industry. You should excel in working with large-scale applications and frameworks autonomously. As you’ll work remotely, you must have outstanding communication and leadership skills.

Evernym is Hiring

The @Avast Identity team has several openings in Product Management, Program Management, Engineering, and DevOps.

Funding Round Merit grabs $50M Series B to expand digital credentials platform TechCrunch ← it’s not SSI, though

The company spent the last five years working with various governments to build connectors to these licensing databases to allow third parties to access the data and have it update automatically. So if you are a plumbing company, you can display your employees’ plumbing credentials on the company website and have them update automatically when the license is renewed (or show that it wasn’t).

BLOQZONE RECEIVES €155K EU GRANT FROM ESSIF-LAB FOR PROJECT SSICOMMS

We don’t know what this is… is it DIDComm? If not, what is it?

The project SSIComms adds SSI to internet communications by adding SSI wallets to the renowned SYLK Suite, an award winning ensemble of communications solutions with the SIP protocol at its core. This enables users to respond to presentation requests for credentials entirely voluntarily and according to SSI principles during communications sessions.

Explainer Custodial vs Non-custodial Wallets Affinidi

The biggest disadvantage of non-custodial wallets is their inaccessibility when you lose or forget the password. Since you are the sole custodian, there is a huge responsibility on you to always remember the password of your wallet.

A brief history of SSI: Where does it come from? A timeline Jolocom

A short timeline highlights just how quickly SSI has developed, tracing the path of the evolution of internet identity.

Verifiable Credentials W3C Verifiable Credentials Education Task Force 2022 Planning Kerri Lemoie 

We’ve been hard at work writing use cases, helping education standards organizations understand and align with VCs, and we’ve been heading towards a model recommendation doc for the community.

The World of Anonymous Credentials Dock

A credential is called a verifiable credential when its authenticity can be cryptographically checked by anyone because the credential contains a cryptographic signature by the issuer, and the issuer's public key is well known.
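The signature check described above can be sketched in a few lines. This toy example uses HMAC purely as a dependency-free stand-in for the issuer's signature; a real verifiable credential uses an asymmetric scheme (e.g. Ed25519) so that anyone holding only the issuer's well-known *public* key can verify it, and the `sign_credential`/`verify_credential` names here are hypothetical:

```python
import hashlib
import hmac
import json

# Hypothetical issuer key material. In a real VC this would be an
# asymmetric keypair; HMAC is a symmetric stand-in for illustration only.
ISSUER_KEY = b"issuer-signing-key"

def sign_credential(claims: dict) -> dict:
    """Issuer attaches a proof computed over a canonical form of the claims."""
    payload = json.dumps(claims, sort_keys=True).encode()
    proof = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "proof": proof}

def verify_credential(credential: dict) -> bool:
    """Anyone with the issuer's verification key can check authenticity."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["proof"])

vc = sign_credential({"degree": "BSc", "holder": "did:example:alice"})
assert verify_credential(vc)

# Tampering with the claims invalidates the proof.
tampered = {"claims": {**vc["claims"], "degree": "PhD"}, "proof": vc["proof"]}
assert not verify_credential(tampered)
```

The key property being illustrated: the proof binds the issuer to the exact claims, so any alteration by the holder or a third party is detectable.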

WHY THRIVACY?: Think about it. What did you leave behind when you bought the last round of drinks?

Your Thrivacy wallet allows you to request all your important, personal information that can be used to identify who you are to be created into what we call verified credentials. Then those same verified credentials or VCs can be downloaded and stored in your own personal wallet that is kept inside your cell phone.

SSI Meme of the week SSI By Memes

Development Building capability-based data security for Ceramic

The 3Box Labs team recently published a new standard for creating capability containers for accessing decentralized data to the Chain Agnostic Standards Alliance. Capability containers are an approach for managing advanced data security and permissions, commonly referred to as “Object Capabilities” or “OCAPs.”

This new standard is currently in development for use on Ceramic. Once deployed in a future version of the protocol, it will allow Ceramic to be fully compatible with the new Sign-in with Ethereum (SIWE) specification as well as provide advanced data flow control features for resources stored on the Ceramic network.

The SSI Kit Walt ID

Introducing the SSI Kit, which offers developers and organisations an easy and fast way to use Self-Sovereign Identity (SSI).

The Journey of an SSI Developer Affinidi

A Responsible Reporting Nightmare: Right-clicking is Not a Crime Me2BA

This is a story of a politician who cried “hacker” after a reporter informed a state agency that sensitive information was embedded in their website’s HTML source code.

Design\UX Backchannel: A relationship-based digital identity system Ink and Switch

Using Backchannel as a model example, we propose four design principles for trusted digital relationships. Then we used Backchannel to design and build three sample apps: chat, location sharing, and document preview. We also tested these designs with journalists, researchers, and designers. Based on this testing, we outline common user experience challenges and recommended solutions.

Demonstration https://demo.animo.id/

It allows people to experience SSI first-hand by choosing a character and 'playing' through their unique use cases. The student can enrol in college, visit a club or join a gym, while the businesswoman can attend a conference and check into a hotel, for example. It was built with Aries Framework JavaScript's REST API and supports the Lissi and Trinsic wallets.

Lissi Connect Demo ← in German

The login is only the start of the mutual customer relationship. Users do not want to monitor and maintain dozens of communication interfaces, but prefer a solution that brings these aspects together […] The media break and the fragmentation of the current systems pose a major challenge for users and organizations. However, once stored in the user's wallet, this information can be easily managed, sorted and presented as needed.

Liquid Avatar and Ontario Convenience Stores Association (OCSA) Announce Successful Pilot of Digital Age-Verification to Reach 8000+ Retail Locations

The Smart Age program provides digital age verification, supported with biometric authentication for restricted product sales like lottery tickets, tobacco, alcohol and other goods and services through a mobile device using verifiable digital credentials and biometrics without a user divulging any personally identifiable information to the store clerk. 

ENISA Reports on Digital ID Beware of Digital ID attacks: your face can be spoofed!

Digital identification is the focus of two new reports by the European Union Agency for Cybersecurity (ENISA): an analysis of self-sovereign identity (SSI) and a study of major face presentation attacks.

Digital Identity: Leveraging the SSI Concept to Build Trust 

This report explores the potential of self-sovereign identity (SSI) technologies to ensure secure electronic identification and authentication to access cross-border online services offered by Member States under the eIDAS Regulation. It critically assesses the current literature and reports on the current technological landscape of SSI and existing eID solutions, as well as the standards, communities, and pilot projects that are presently developing in support of these solutions.

Standardization Indicio Wins British Columbia Code With Us Challenge to Upgrade Hyperledger Indy

Most of Hyperledger Indy’s development occurred prior to the completion of the standard DID Specification by the W3C and, as a result, identifiers written to one network are currently not resolvable on other networks. A new did:indy DID Method will fix that and make it easier for decentralized identity products and services to interoperate across different Indy networks.

Vote for First Implementer’s Drafts of OpenID Connect SIOPv2 and OIDC4VP Specifications OpenID

The official voting period will be between Tuesday, February 1, 2022 and Tuesday, February 8, 2022, following the 45-day review of the specifications. 

Use Case TheirCharts Doc Searls

If you’re getting health care in the U.S., chances are your providers are now trying to give you a better patient experience through a website called MyChart.

This is supposed to be yours, as the first person singular pronoun My implies. Problem is, it’s TheirChart. 

Spherity is Partnering with Legisym Offering Joint Compliance Product for the U.S. Life Sciences Market Spherity

With the successful completion of the ATP Credentialing Pilot in 2020 and the joint founding of the Open Credentialing Initiative in early 2021, the Spherity-Legisym partnership is already proving successful in collaboration and forward thinking.

Are we building for these use-cases? These vending machines sell internet access five minutes at a time: For many Filipinos, coins are the currency of the internet RestofWorld

Gatekeeping of internet access is a fact of life in the Philippines, where the market is shaped by the telecommunications duopoly. President Rodrigo Duterte threatened to seize the telco giants, Globe and PLDT, if they didn’t improve their service by the end of 2020. Like much of his bluster, though, the threat has failed to have an effect. 

Thoughtful Web3, Coherence, and Platform Sovereignty Phil Windley

In The crypto-communists behind the Web3 revolution, Benjamin Pimentel argues that "The future of decentralized finance echoes a decidedly Marxist vision of the future." He references various Silicon Valley icons like Jack Dorsey, Marc Andreessen, Elon Musk, and others, comparing their statements on Web3 and crypto with the ideology of communism.

Why do you care about identity? Phil Wolff

I love that it’s more than a technical discussion. It’s laws. Sociology. Politics and civics. Commerce. Psychology. Ethics. History. Economy. All the humanities.

Identity touches everything. It always will. Computation and communication continue their pervasion, and identity spreads with them.  

oldie but goodie Batman & Identity: Crash Course Philosophy #18

Hank explores different ways of understanding identity – including the Indiscernibility of Identicals, and essential and accidental properties. In what ways does change affect identity? In what ways does it not? What does it mean for a thing to persist over time?

Identity not SSI Use FIDO2 Passwordless Authentication with Azure AD Damien Bod

This article shows how to implement FIDO2 passwordless authentication with Azure AD for users in an Azure tenant. 

What is Knowledge-based Authentication (KBA)? Ping Identity

Answering security questions based on personal information when you log in to an app or system is called knowledge-based authentication (KBA).

Companies Self-Sovereign Identity – a game changer regarding privacy Adnovum Another Swiss company is promoting SSI

After rejection of the e-ID Act in March 2021, the Swiss Federal government is working at full speed on a new proposal. The first results were published in a discussion paper on September 2. Self-Sovereign Identity (SSI) is one of three possible solutions proposed for the implementation of the future e-ID.

The Wallet Walt.id

This post introduces our new open source product: the Wallet, which enables developers and organisations to put Self-Sovereign Identity (SSI) into the hands and pockets of the people via easy-to-use applications.

IdRamp CEO Mike Vesey: Pink Locker Rooms School of Cyber Hard Knocks

Passwords and zero-trust and pink locker rooms, oh my! In this episode, Mike discusses IdRamp, what self-sovereign identity is, why we still have passwords today, zero-trust, what the near future holds, pink locker rooms (!), his path to IdRamp, and, as always, his toughest lesson learned.

Thanks for Reading!

Read more \ Subscribe: newsletter.identosphere.net
Support this publication: patreon.com/identosphere
Contact \ Submission: newsletter [at] identosphere [dot] net

Monday, 24. January 2022

FindBiometrics

After Corporate Restructuring, Fingerprints to Deliver Q4 Update


Fingerprint Cards has set a date for its next corporate update: the company will deliver its Q4 results and host a conference call on January 28.

It could prove to be a particularly interesting update for a couple of reasons. One is that the Q4 and full-year results for fiscal 2021 will likely shed light on a year that has continued to deliver pandemic-related disruptions. Fingerprint Cards is a leading provider of fingerprint sensors for mobile devices, and the mobile sector has been an area in which supply chain issues have been challenging over the past several months.

The conference call, meanwhile, may offer some further insights into the transformative changes underway within the company. Fingerprint Cards’ board of directors announced a strategic review of the company in July of 2021. In November, the company announced the result: that FPC would essentially split into two subsidiaries. The Fingerprint Technology Company subsidiary would be based in Shanghai, and would focus on mobile biometrics; while the Fingerprint Cards Switzerland subsidiary would be headquartered in Zug, and would carry on the work of FPC’s Payments & Access business line.

Operationally, it isn’t yet clear how this structural change has impacted Fingerprint Cards, but the company’s forthcoming conference call could help to clarify the matter. Scheduled for 9:00 a.m. CET on January 28, the call will be conducted in English, and will be hosted by Fingerprint Cards CEO Christian Fredrikson together with Per Sundqvist, the company’s Chief Financial Officer.

It’s possible that the executives will also touch on a more recent development for the company – a SEK 300 million issuance of senior secured bonds. Announced in December, the issuance was aimed at securing working capital to help FPC follow through on its “ambitious growth plans,” Fredrikson explained at the time. He also noted that both of FPC’s subsidiaries were seeing increasing demand, a propitious sign going into 2022.

January 24, 2022 – by Alex Perala

The post After Corporate Restructuring, Fingerprints to Deliver Q4 Update appeared first on FindBiometrics.


Finicity

CEOs Sound Off: Fintechs Press for Open Banking Mandates


Co-founder and former CEO of Finicity Steve Smith and other CEOs with experience across the global payments spectrum offer up their views on the current open banking landscape in the United States. Should U.S. policymakers and regulators take action to push it beyond market-driven moves, such as bilateral agreements between companies?

Read more here.

The post CEOs Sound Off: Fintechs Press for Open Banking Mandates appeared first on Finicity.


FindBiometrics

Korea’s National Biometric Testing Center Certifies FacePhi Tech

“This certification allows FacePhi to be recognized as a certified company both in Korea and abroad, and is a clear example of the company adhering to strict international ISO standards.” – Dongpyo Hong, CEO, FacePhi APAC

FacePhi has received an important new certification in its efforts to expand its global reach: the biometrics specialist has earned K-NBTC certification from the Korean Internet Security Agency.

The certification is issued by KISA’s National Biometric Testing Center, and is based on the evaluation of FacePhi’s facial recognition technology. That evaluation was, in turn, based on methodology from the National Institute of Standards and Technology (NIST), as well as ISO standards.

In announcing the certification, FacePhi explained in a statement that its facial recognition algorithm was “tested for changes in lighting conditions of different scenarios, changes in facial expression, pose and the use of accessories.”

The Spain-based company first announced its South Korean subsidiary, dubbed “FacePhi APAC”, in May of 2019. Headed by former CrucialTec USA President Dongpyo Hong, the outfit has since gained some traction, with deployments including a patient identification system at Kangbuk Samsung Hospital in Seoul and a selfie-based onboarding system for Hanwha Investment & Securities Co.

Now, the K-NBTC certification is poised to help FacePhi further accelerate its efforts in the region.

“This certification allows FacePhi to be recognized as a certified company both in Korea and abroad, and is a clear example of the company adhering to strict international ISO standards,” explained Dongpyo Hong. “For FacePhi APAC, 2022 will be a year of expansion in the financial sector and of continuous offers of our solutions to clients in new industries.”

Meanwhile, FacePhi continues to enjoy success at home, having announced partnerships with Spain’s SoYou financial services startup and the Valencia CF soccer club this month. The latter could also help FacePhi to extend its global reach, with the European football club having agreed to act as “a global ambassador” for FacePhi that will help to promote its biometric technology at major soccer events including the 2022 World Cup in Qatar.

January 24, 2022 – by Alex Perala

The post Korea’s National Biometric Testing Center Certifies FacePhi Tech appeared first on FindBiometrics.


IRS Makes Selfie Identification Mandatory for Online Services


The IRS is making ID.me its official identity provider. The agency first started trialing ID.me’s platform back in 2017, initially using the technology to administer its child tax credit programs. The service has since expanded to encompass other digital services, and it will soon be mandatory for any taxpayers who want to access IRS services online.

As it stands, anyone who tries to create a new Online Account with the IRS will now be forced to create an ID.me account in order to do so. Those who opened an IRS account before the ID.me requirement was put in place will not need to make an ID.me account immediately, but will be expected to have one by summer of this year. The IRS is yet to provide a firm date for the transition.

Once created, taxpayers can use their account to view their tax records, make tax payments, and manage things like their advance child tax credit payments. ID.me can also be used with other participating federal agencies like the Social Security Administration and the Department of Veterans Affairs, and to access government services in 27 US states. Citizens will not need to re-enroll in the ID.me program, but they will need to give each agency permission to access the information in their ID.me account.

To create an account, taxpayers will need to take a photo of an identity document (either a US passport, a state driver’s license, or a US passport card), and then take a video selfie. ID.me will use facial recognition to match the selfie to the image on the ID. If ID.me is unable to complete the match, the user will be referred to a human agent, who will verify that person’s identity during a video call. In those cases, the user will be expected to provide additional documents to prove their identity.

The IRS is hoping that its partnership with ID.me will help reduce taxation fraud. ID.me received FedRAMP Authority to Operate in June, and boasted that its remote identity technology had helped distribute $1.2 billion in unemployment claims as of October of 2020. In December, the company also hired a new Chief Product Officer to drive its expansion efforts.

Source: CNET

(Originally posted on Mobile ID World)

The post IRS Makes Selfie Identification Mandatory for Online Services appeared first on FindBiometrics.


ENISA Lays Groundwork for European Digital IDs With Two New Reports


The European Union Agency for Cybersecurity (ENISA) has released a pair of new reports that could influence the development of digital identity technologies in EU member states. One of the reports addresses some of the concerns with facial recognition technology, while the other details the potential for self-sovereign identities (SSI).

With regards to the former, ENISA noted that COVID-19 has created a need for technologies that can verify someone’s identity online. Facial recognition has helped meet that demand, and has been used to facilitate everything from financial transactions to citizen interactions with various government agencies.

The problem, according to the ENISA report, is that facial recognition is still vulnerable to spoofing. The report lists photos, video replays, masks, and deepfakes as the primary methods of attack, and recommends some steps that organizations can take to guard against those threats. Most notably, the report encourages organizations to set a minimum video quality for face checks, and to implement presentation attack detection systems that can gauge depth and spot some of the inconsistencies found in deepfakes. It also advises organizations to cross-reference identity documents with lists of lost, stolen, and expired IDs, and to adhere to industry standards and best practices when implementing an authentication system.

ENISA’s SSI report, meanwhile, could inform the creation of a broader European Digital Identity scheme. The digital IDs would be available to all EU citizens, and would theoretically serve as a trusted digital identity that could be used for a range of cross-border interactions.

ENISA noted that a good SSI should give individuals more control over their personal information, allowing them to choose what and when to share when proving their identity. Priorities for SSI include data minimization, accuracy, and consent, as well as utility more generally. Once completed, citizens will be able to store a European Digital Identity in a wallet on a mobile phone, and share information by clicking an icon when accessing online services.

In doing so, citizens would only need to share enough information to complete a transaction, which allows people to do business with a greater degree of anonymity. The two reports are part of ENISA’s ongoing efforts to support the EU’s eIDAS Regulation, which seeks to create an interoperable standard for electronic interactions in Europe. ENISA has previously advocated for the use of passwordless FIDO2 authentication technologies.

January 24, 2022 – by Eric Weiss

The post ENISA Lays Groundwork for European Digital IDs With Two New Reports appeared first on FindBiometrics.


ValidSoft Looks to Expand With New Chief Growth Officer


ValidSoft is looking to raise its profile in the North American market. To that end, the company has decided to bring on Joe O’Donnell as its new Chief Growth Officer to oversee its ongoing expansion efforts.

O’Donnell’s career includes previous stops at Splunk, Cybereason, Palo Alto Networks, and Cisco Systems. In each instance, O’Donnell was asked to execute sales strategies that would lead to rapid growth, and he will be expected to do the same during his tenure at ValidSoft. In that regard, O’Donnell will be responsible for growth operations, a mandate that covers everything from business development to partnerships and go-to-market initiatives.

ValidSoft indicated that O’Donnell will not have to do everything alone, insofar as the company has tried to bolster its sales and growth teams with hires from a number of leading cybersecurity and call center providers. The company noted that fraud losses (and the associated costs) now exceed $3 trillion on an annual basis, which creates a demand for innovative security solutions that can help mitigate that threat. ValidSoft itself is best known for its voice authentication technology, which has proven to be popular with financial institutions.

“Joe is a world-class executive, and this appointment signals a critical growth phase for ValidSoft,” said ValidSoft Founder, Executive Chairman, and CEO Patrick Carroll. “We have spent the last few years building world-class, unique privacy-certified solutions adopted by some of the largest and most respected brands in the world. The addition of Joe and others comes at the right time and adds senior expertise that will take us to the next level and help us achieve our bold growth plans.”

“Classic authentication deployments, including multi-factor authentication implementations, are fallible,” added O’Donnell. “It’s time to flip the script and elevate voice to be the prime factor for authentication.”

Opus Research has identified ValidSoft as a leader in the voice biometrics market. MarketsandMarkets has predicted that the demand for voice-based authentication solutions will push the voice recognition market to $3.9 billion by 2026.

January 24, 2022 – by Eric Weiss

The post ValidSoft Looks to Expand With New Chief Growth Officer appeared first on FindBiometrics.


Transmit Security Brings On New Chief Marketing Officer to Spearhead International Expansion


Transmit Security is bringing on a new Chief Marketing Officer. Chris Pick joins the company following a stint at Tanium, where he held the same position.

As CMO, Pick will be asked to develop and run a new international marketing strategy to further the company’s growth efforts all over the world. He will be expected to help define a new cybersecurity category for Transmit Security as part of that process. In that regard, Transmit noted that Pick accomplished similar feats at Tanium and Apptio, where he served as Chief Marketing Officer for nine years before moving on to Tanium. At Apptio, Pick redefined the Technology Business Management category, and made Apptio a leader in that field.

Transmit itself is anticipating a period of significant growth. The company brought in $543 million in a record Series A funding round in June, and is now providing identity coverage for 115 million people in sectors like finance and retail. Transmit holds multiple patents for its biometric authentication technology.

“This year we are making very significant moves as a company, because we are disrupting the status quo that has dominated a critical area of technology for decades,” said Transmit President and Co-Founder Rakesh Loonkar. “Chris has differentiated experience tackling huge, intractable problems in technology, and his marketing strategies are innovative and impactful to large enterprises.”

“Passwords are the weakest link in online services,” added Pick. “It’s time we move the entire industry past the password problem, and I joined Transmit Security because it has the team, vision and vast resources to do so.”

The announcement comes only a month after Transmit brought on Chris Kaddaras as its new Chief Revenue Officer. Transmit has previously reported that customers will abandon sites if the login process takes too long, and it has released a passwordless BindID solution to help solve that problem. BindID offers support for face and fingerprint recognition.

January 20, 2022 – by Eric Weiss

The post Transmit Security Brings On New Chief Marketing Officer to Spearhead International Expansion appeared first on FindBiometrics.


Continuum Loop Inc.

Webinar Recording – 2021 Redux/2022 Trends To Watch


Thanks to all for participating in the 2021 Redux/2022 Trends To Watch webinar on January 19th, 2022. For the attendees, and those folks that couldn’t make it, we’ve put together a breakdown and some key items for you.

Link to the recording

Link to PDF of the presentation

Darrell started off the webinar talking about his top 5 stories from 2021. Check out his anecdotes:

Good Health Pass Interoperability Blueprint
SSI Book: Self-Sovereign Identity – Decentralized digital identity and verifiable credentials
The Current and Future State of Digital Wallets
Governments slinging code – Data sovereignty: the UK & Canadian governments have issued great policy pieces on why open source helps and the concept of SSI
Evernym Acquisition
cheqd launch

Darrell points out a few key trends to keep an eye on in the Decentralized Identity and SSI space for 2022.

Identity & Crypto/Blockchain
Orchestration/Convergence
Standards Shakeup
Convergence of Credential Exchange
Interoperability is still nascent (and premature)
Trust Registries rolling out

Have a look at our past webinars, and the breakdown for each, on Trust Registries and The Wallet Report Update. They provide great content to help you navigate and understand the SSI world.

QUESTIONS!?

There were some great questions sent in; we really value and appreciate your engagement! Our Question Period started at about 45:00, though some questions came in earlier and were touched on throughout the presentation. Here is a rundown of what we covered:

In regards to Trust Registries, do you foresee these registries being run in a centralized or decentralized manner? Are there any leaders? What business models have you seen?
Are there any sector-agnostic trust registry protocols and/or standards yet, or is the generic concept still in its early days?
What scares you the most about 2022 and beyond for SSI?
What about Mobile Driver's Licenses (mDLs)? Didn't hear you mention that Apple supports them now.
Did you hear about tbDEX and what Square/Block are doing?
Where should I look for more detail on interoperability, standards, and how things are being put together?
Why are you cynical about interoperability? Trust Over IP Foundation – Governance Stack Working Group (GSWG) & Technology Stack Working Group (TSWG)
What is the biggest challenge or challenges the governments in Canada are struggling with in moving forward with digital IDs?

Be sure to reach out via our Contact Us page – we’re happy to jump on a call to see how we can help you examine Decentralized Identity and how to take control of your own digital world. We will also get you hooked up with our (semi-)monthly newsletter where we share a few key things that have been shifting in the industry.

The post Webinar Recording – 2021 Redux/2022 Trends To Watch appeared first on Continuum Loop Inc..


Dock

Join Dock’s Developer Program


Supporting developments that further the growth and adoption of the Dock network, developers are encouraged to apply for Dock’s grant program. Choose a task, submit an application to Dock, and receive a reward once the project is completed.

Dock’s mission is to solve universal problems with existing data solutions. The grant program aims to further the growth and adoption of the Dock network for decentralized identities and verifiable credentials. All applications will be reviewed by the Dock team, and those that fall within the guidelines and meet the objectives will be awarded a grant to begin their project.

Grant Objectives

Projects that could receive a grant must fulfil one or more of the following objectives:

Development of innovative applications using Dock's credentialing technology for specific use cases.
Designing ecosystem components to further usage and adoption.
Low-level infrastructure development.

The Dock team has provided a list of potential tasks that can be completed under the grant program. Developers are also welcome to pitch additional projects to the Dock team, as long as these projects aim to further the growth and adoption of the Dock network for decentralized identities and verifiable credentials.

Guidelines and Criteria

Projects submitted for evaluation must clearly demonstrate how they fulfil one or more of the grant objectives listed above. In addition to meeting these objectives, applicants must ensure the project's main aim fits the following guidelines and criteria:

Individuals and all team members must have proven experience in their field and the skills required to complete the project.
On submission, the project should be well-defined and technically sound, listing milestones you intend to reach in an agreed timeframe. The Dock team may choose to fund the project based on a series of milestones you set with specific deliverables.
All production code produced after receiving the grant must be open-sourced. It must not rely on closed-source software for full functionality.

When completing the application, it’s essential to include every detail from team members, their skills and duties, to deliverables you expect to produce at each milestone and how you plan to achieve them. When evaluating your project, the Dock Association may contact you and the team for additional information and/or clarification.

Project Ideas

Applicants are welcome to apply to Dock's grant program with any project that fulfils the objectives and meets the guidelines listed above. Applicants are encouraged to propose and develop their own project ideas; here are some task ideas we'd like to see developed:

Anonymous Credentials Applications
Build the application using our Typescript crypto package, SDK, and blockchain.

ICV/Auth0 Credential Authorisation Flow
Use the API to issue short-lived credentials that allow for a user to login to a system or authorize their identity.

Improve TS Package
Improve the TS package using the WASM Wrapper. See what you can do to improve and enhance the project.

Client Library
Provide client wrapper code for our Dock API in various languages.

Improve WASM Wrapper
Improve the WASM Wrapper here. See what you can do to improve the WASM project.

Documentation and Examples

Improve documentation that may be confusing or is lacking sufficient content. 

Application Process

After choosing a project, developers should complete the Grant Program Application. This form should be submitted to the Dock Association for review. As mentioned above, the team may contact you and your team for additional information and/or clarification.

After evaluation, the Dock Association will choose a number of projects they wish to see produced. Selected developers will be notified and presented with a grant agreement outlining the terms of the grant, the project scope, timeline and milestones.

Robin’s Work

A member of the Dock community, Robin Bortlik, has taken part in the Grant Program and built a Ruby Wrapper for Dock’s API. The Ruby Wrapper can be found here.

An API Wrapper provides a way to access the API through a particular programming language or interface, simplifying the process of interacting with it. All in all, the Wrapper helps streamline the steps of making API calls, and thanks to Robin’s contribution, Ruby developers will gain all the benefits of using Dock’s API.
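To make the idea concrete, here is a minimal sketch of what a thin API wrapper provides: it hides the base URL, auth headers, and JSON handling behind plain method calls. The class name, endpoint path, and header format below are illustrative only, not Dock's actual API surface; consult Dock's API documentation (or Robin's Ruby wrapper) for the real interface.

```javascript
// Hypothetical sketch of a thin API wrapper. Everything here is
// illustrative: the endpoint path and auth header are assumptions,
// not Dock's documented API.
class ApiWrapper {
  constructor(apiKey, baseUrl) {
    this.apiKey = apiKey;
    this.baseUrl = baseUrl;
  }

  // Assemble the request options that every call shares.
  requestOptions(method, body) {
    const options = {
      method,
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${this.apiKey}`,
      },
    };
    if (body !== undefined) options.body = JSON.stringify(body);
    return options;
  }

  // A caller then writes wrapper.issueCredential(data) instead of
  // hand-building the HTTP request each time.
  issueCredential(credential) {
    return fetch(`${this.baseUrl}/credentials`, this.requestOptions('POST', credential));
  }
}
```

This is the convenience a language-specific wrapper like Robin's brings: one place to change auth, serialization, or error handling for every call.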

Dock’s API allows you to issue, verify, and revoke verifiable credentials in a few simple clicks. Making data available via an API also makes data migration faster and easier, improving data quality review and cleanup.

Apply to Dock's Developer program here and help further the growth and adoption of the Dock Network.


auth0

How Developers Will Work in 2022

From tooling changes to the ascent of the hybrid event model, it’s all change, all the time.

Dark Matter Labs

Towards healthy, sustainable and just Swedish and planetary food systems


In a post-pandemic world, food is not just our biggest challenge, but also our biggest opportunity for sustainable change.

The ways that food is produced, distributed and consumed have major negative impacts on people and the planet. Food production is the largest human pressure on Earth, causing a mass extinction of species, and accounting for 23% of annual greenhouse emissions. Food is at the core of various social justice challenges such as working conditions, access to affordable good food, cultural diversity and burgeoning food-related health crises of both malnutrition and obesity. This means that many of the broader environmental and social challenges can be addressed if the way that food is produced and managed is transformed. As Johan Rockström, Director of Potsdam Institute for Climate Impact Research, said in 2016:

If we get it right on food, we get it right for both people and planet.

Food transformations will require a series of fast and slow transitions over the next decades. The world’s governments have agreed on the UN Global Goals for 2030. Here in Sweden the country plans to have net zero emissions by 2045. However, neither the world nor Sweden are on track to achieve these goals, and with every day that passes, impacts accumulate and the challenges grow.

Food is essential and a source of great pleasure. A shared meal can be a ritual or celebration, and a simple everyday pleasure. Food is a rich and meaningful expression of our shared cultural diversity, and the way we make, distribute, and enjoy it, speaks loudly as to what we stand for as a society. Food can be a generator of wealth, in numerous forms, and its cultivation can restore ecological functions. Given that food is the strongest connection humanity has with the biosphere, that most of our food systems are productive but unequal, and that food cultures can either be enriching or destructive, transforming our global food systems represent both a grand challenge and a grand opportunity. If we are able to make a just transition to food cultures that restore the biosphere rather than degrade it, we will not only improve the lives of the world’s people, and enhance the biosphere, but also learn how to become better stewards of the Earth.

Sweden and other Nordic countries are well-positioned to lead the way in the global transformations towards sustainable food systems. Sweden shares many of the world’s problems, while beginning to take action to address them. Current diets in Norway, Sweden, Denmark, Iceland and Finland are unhealthy, carbon intensive, and harm terrestrial and aquatic ecosystems, both in the Nordics and elsewhere. About half the population in Nordic countries is overweight or obese, food waste is a major problem, and meat consumption is well above amounts recommended for health or sustainability.

Sweden is also taking action to change its food system. The Nordic region has world-class dietary guidelines, strict agricultural regulations and emerging innovations to food challenges. In Sweden, diverse innovations are taking place in farming, marketing, logistics and retail. Similarly, many cities, municipalities, and the national government are working to develop policy to improve the food system, for both health reasons and environmental objectives. Furthermore, there is widespread public discussion, led in particular by chefs, to build on the success of the New Nordic cuisine to create a cuisine that is good for both people and the planet, exploring how to reverse the recent trends towards greater meat consumption and increased environmental damage. As Nordic populations diversify, the various cuisines in the region are enriched accordingly. This means Sweden has the potential to lead global food system transformations.

The unequal and often unplanned impacts of the Covid-19 pandemic have highlighted many flaws and fractures in the fabric of everyday life, not least because the pandemic moves as a complex system rather than respecting the way Sweden is organised into health, mobility, urbanism, security, governance and other sectors. We believe that the responses to COVID-19 within the Swedish food systems may reveal aspects of the resilience measures needed to meaningfully address the risks and opportunities facing those systems.

While some of the rapid responses of food system actors to COVID-19 might accelerate a ‘just transition’ towards sustainable and resilient food systems, others may impede it. To better navigate towards sustainability, we want to understand how responses to COVID either built or eroded resilience within the Swedish food system, and whether these responses enhanced or reduced its transformative capacities. Such understanding can help us better align existing and future research and innovation activities, in order to enable a ‘just transition’ of Sweden’s complex food systems; systems defined not by territorial boundaries but by the planetary flows they are interlinked with.

Transformations require significant changes in multiple dimensions of society. These changes must take place at different levels, from practices and behaviours, to rules and regulations, to values (financial and non-financial), and world-views. Transformation involves changing the relationships among people, but also profoundly changing the relationships between people and nature. History shows us that crises can create openings for transformation. But while positive societal transformations may arise from crises, the consequences of crises are often not positive. The risk is that opportunities created by the crisis are missed and that crisis responses, despite good intentions and innovation, fail to address accumulated problems and merely restore a slightly improved or even damaged status quo. Navigating crises requires understanding which capabilities and capacities are needed, and then developing and mobilizing those capabilities and capacities for change.

Our project aims to learn from the current COVID-19 crisis and support the enabling of capacities for transforming towards Swedish food systems that promote the health, equity and sustainability of people and the planet. The project will identify and map risks and opportunities emerging in the Swedish food systems due to the COVID-19 pandemic through a co-creative process with public, private and civic food system actors. This process will develop understanding and articulate alternative pathways for food system transformations. It will do this through a Rapid Transition Lab.

Rapid Transition Lab

The Rapid Transition Lab will identify opportunities and pathways for Swedish food system actors to engage in a rapid transition, in response to the COVID-19 crisis. It will assess different strategies within the Swedish food systems, and support the design of more transformative strategies, practices, and institutions. Vinnova, the Swedish government’s innovation agency, which funds this work, plans to connect the learning to multiple related stakeholders in and around policymaking and practice relating to food systems in Sweden.

The Rapid Transition lab will consist of a sensemaking and a futuring project and be structured around three overlapping phases: First, the project will build a comprehensive understanding of the impacts of and the responses to Covid-19 in the Swedish food systems. Second, based on these insights, existing food system scenarios, and through workshops engaging multiple stakeholders, the project will build transformative food system scenarios. In the third phase, the project will use these scenarios to identify strategies and pathways and possible portfolios of experiments for actors to contribute to the transformations of the Swedish food systems. These will be accomplished by developing a process of collective sense-making.

Learning from the crisis to reframe food systems: By exploring the interconnected cascading risks and opportunities that have emerged from responses to COVID-19 and the 2018 drought in Sweden, in the context of Swedish food system transformations, we will create broader views of food systems and cultures.

Suggesting strategies to support capacity building for transformations: Building exploratory scenarios, informed by the responses to the COVID-19 crisis in the Swedish food systems, can reveal capacity-building strategies that might better enable transformations. The scenarios will build on previous and ongoing scenario work to develop systemic, holistic and multidimensional scenarios that will be used to identify a portfolio of pathways and experiments.

Input into strategic policymaking and innovation practice: Key stakeholders in and around policymaking and practice relating to food systems in Sweden will be engaged to actively contribute to and learn from these exploratory processes. The analysis of the crisis and the development of the scenarios should be relevant to these stakeholders.

The Rapid Transition Lab will be linked to other ongoing transdisciplinary food system projects that Vinnova or the Stockholm Resilience Centre lead or are partners in. These include MISTRA Food Futures, Nordic Food Policy Lab, Sustainable Finance Lab, and NorthWestern Paths. The Rapid Transition lab will complement these projects by focusing on the responses to Covid 19 pandemic in the Swedish food systems and use these to build scenarios that in turn will be used to develop transformation strategies and identify capacities for change.

Further reading

More on Nordic food system transformations can be found in four policy briefs from the SRC, https://www.stockholmresilience.org/research/research-news/2020-11-24-nordic-countries-are-well-suited-to-collaborate-on-food-systems-transformation.html

What can the COVID-19 pandemic teach us about resilient Nordic food systems? http://norden.diva-portal.org/smash/get/diva2:1450471/FULLTEXT01.pdf

Who we are

Stockholm Resilience Centre Stockholm Resilience Centre (SRC) is an international research centre on resilience and sustainability science. The centre is a joint initiative between Stockholm University and the Beijer Institute of Ecological Economics at the Royal Swedish Academy of Sciences. Since its launch in 2007, SRC has developed into a world-leading science centre for addressing the complex challenges facing humanity. This project will build upon ongoing work on food, scenarios and transformations within the SRC, which include MISTRA Food Futures, NorthWestern Paths, as well as the Seeds of the Good Anthropocene project. link to SRC Food theme

Dark Matter Labs Dark Matter Labs (DML) is a lab working to transition society in response to climate breakdown and the technological revolution. Collaborating with stakeholders from various contexts, we aim to discover, design and develop the new institutional ‘dark matter’ that inhibits movement towards more democratic, distributed and long-term futures. Through the Rapid Transition Lab, our ambition is to move towards a participatory portfolio of experiments that investigate dark matter in the Swedish food systems; systems defined not by their territorial boundaries but by the planetary systems they are interlinked with.

Vinnova Vinnova is Sweden’s innovation agency. The purpose of the agency is to help build, refine and coordinate Sweden’s innovation capacity, contributing to sustainable growth. The vision is that Sweden is an innovative force in a sustainable world. The work is governed by the Swedish government, and is based on the global sustainable development goals of the 2030 Agenda adopted by the United Nations. The agency identifies areas where its efforts can make a difference and creates opportunities and incentives for organizations to work together to meet important societal challenges. It is an expert authority with 200 employees, and provides funding as well as strategic and system design expertise to the Rapid Transition Lab project.

This blog is co-written by Per Olsson and Garry Peterson from Stockholm Resilience Centre; Alexander Alvsilver and Dan Hill from Vinnova; and Linnea Rönnquist, Aleksander Nowak, and Juhee Hahm from Dark Matter Labs. Rapid Transition Lab is funded by Vinnova, and the project will continue throughout 2021 and 2022. If you want to find out more, please contact us.

Towards healthy, sustainable and just Swedish and planetary food systems was originally published in Rapid Transition Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Hackers Exploit Bug to "Steal" $1 Million in NFTs from OpenSea Users

A bug has been exploited to purchase NFTs from users of OpenSea, at well below market value. NFTs with a market value of $1.1 million have been purchased in this way.



Crypto Regulatory Affairs: US Federal Reserve Issues Paper on CBDCs

US Federal Reserve Issues Paper on CBDCs

The Federal Reserve Board (the Fed) released a hotly anticipated paper on Thursday, January 20th, in which it discusses many of the potential risks and benefits of issuing a central bank digital currency (CBDC) version of the dollar. Formal CBDC working groups have already been established in other countries, including Australia and China, though this represents the first major discussion of the issue by the Fed – the primary monetary authority and bank holding company regulator in the US.


Okta

Simplify Building Vue Applications with NuxtJS


Nuxt calls itself the intuitive Vue framework. It aims to make a developer-friendly experience while not sacrificing performance or degrading the integrity of your architecture. It has been exciting to see the community and tooling around VueJS grow and evolve — there’s no better time to get started in this ecosystem than now.

In this tutorial, you will build a small web application that retrieves some posts from an API and displays them for authenticated users. For authentication, you will integrate Okta into your Nuxt application. Okta’s simple authentication system and the power of Nuxt means you can configure and set up your authentication in just a few moments.

What you’ll need

Your favorite IDE (I will be using Visual Studio Code)
Node.js
An Okta Developer Account
The Okta CLI tool

Set up your Okta application

Install the Okta CLI and run okta login. Then, run okta apps create. Select the default app name, or change it as you see fit. Choose Single-Page App and press Enter.

Use http://localhost:3000/login for the Redirect URI and set the Logout Redirect URI to http://localhost:3000/.

What does the Okta CLI do?

The Okta CLI will create an OIDC Single-Page App in your Okta Org. It will add the redirect URIs you specified and grant access to the Everyone group. It will also add a trusted origin for http://localhost:3000/. You will see output like the following when it’s finished:

Okta application configuration:
Issuer:    https://dev-133337.okta.com/oauth2/default
Client ID: 0oab8eb55Kb9jdMIr5d6

NOTE: You can also use the Okta Admin Console to create your app. See Create a Vue App for more information.

Build your web application with Nuxt

If you wish to follow along using the completed project you can check out the GitHub repository here.

Nuxt provides a scaffolding tool called create-nuxt-app to make scaffolding your application easy. You can run the following command to create the application:

npx create-nuxt-app okta-vue-nuxt-example

The task runner will prompt you with several options. There are a few important selections to make for this tutorial:

Programming language: JavaScript
Package manager: Npm
UI framework: Bootstrap Vue
Nuxt modules: Axios - Promise based HTTP client

We will use Axios to fetch data in our application. Next, you can select the linting tool of your choice and a testing framework.

Rendering mode: Universal (SSR / SSG)
Deployment target: Server (Node.js hosting)

You can use the following commands to enter the project directory and run the application.

cd okta-vue-nuxt-example
npm run dev

If you open http://localhost:3000/ in your browser, you will see the default Nuxt page.

The Nuxt project layout is pretty straightforward. First you have a .nuxt folder where your compiled server code will end up. Next there is a components folder. You won’t use the folder in this tutorial, but breaking pages into components is common practice in larger projects. These components are then reusable in a number of pages. Next you will find a pages folder where your pages will go. Your routes will be inferred by Nuxt from these views. The static folder is where you can house css, images, or other static content to display. The store directory contains your Vuex store files.

You will also add a layouts folder later. As you might have guessed, this folder will contain layouts. There are several other directories that are configured out of the box for Nuxt, including middleware, modules, plugins, and dist. These are out of scope for this article but it is important to know they exist.

Finally, you will need two packages from npm. The first is @nuxtjs/dotenv, a Nuxt-friendly implementation of dotenv. You will use this to store sensitive information that you don’t want to end up in your source control. The second is @nuxtjs/auth-next, which will control your authentication.

npm i @nuxtjs/dotenv@1.4.1
npm i @nuxtjs/auth-next@5.0.0-1637333559.35dbd53

With your dependencies installed it’s time to start building your application. First, add a new file to your root directory and name it .env, then add the following code to it.

OKTA_DOMAIN=https://{yourOktaDomain}
OKTA_CLIENT_ID={yourClientId}

Be sure to replace the placeholder variables with your actual Okta information.

Now open your nuxt.config.js file located in the root directory and replace its contents with the following code.

export default {
  // Global page headers: https://go.nuxtjs.dev/config-head
  head: {
    title: "todolist-article",
    htmlAttrs: {
      lang: "en",
    },
    meta: [
      { charset: "utf-8" },
      { name: "viewport", content: "width=device-width, initial-scale=1" },
      { hid: "description", name: "description", content: "" },
      { name: "format-detection", content: "telephone=no" },
    ],
    link: [{ rel: "icon", type: "image/x-icon", href: "/favicon.ico" }],
  },

  // Global CSS: https://go.nuxtjs.dev/config-css
  css: [],

  // Plugins to run before rendering page: https://go.nuxtjs.dev/config-plugins
  plugins: [],

  // Auto import components: https://go.nuxtjs.dev/config-components
  components: true,

  // Modules for dev and build (recommended): https://go.nuxtjs.dev/config-modules
  buildModules: [],

  // Modules: https://go.nuxtjs.dev/config-modules
  modules: [
    // https://go.nuxtjs.dev/bootstrap
    "bootstrap-vue/nuxt",
    // Doc: https://axios.nuxtjs.org/usage
    "@nuxtjs/axios",
    "@nuxtjs/dotenv",
    "@nuxtjs/auth-next",
  ],

  /*
  ** Axios module configuration
  ** See https://axios.nuxtjs.org/options
  */
  axios: {},

  auth: {
    strategies: {
      okta: {
        scheme: "openIDConnect",
        endpoints: {
          configuration: `${process.env.OKTA_DOMAIN}/oauth2/default/.well-known/oauth-authorization-server`,
          logout: undefined,
        },
        clientId: process.env.OKTA_CLIENT_ID,
        grantType: "authorization_code",
        responseType: "code",
      },
    },
  },

  // Build Configuration: https://go.nuxtjs.dev/config-build
  build: {},
};

The first thing this file does is make use of dotenv. It registers the modules you added earlier through the task runner and via npm. Finally, it sets up the options for your OAuth configuration. Here is where you use the environment variables that you set up in your .env file.

You should note the line of code that reads logout: undefined. It is critical for logging the user out correctly, because Okta requires the idToken to be passed as a query parameter. However, nuxt-auth won’t include that parameter under the hood. The solution is to override the logout URL obtained from the configuration endpoint with undefined and manually log the user out in your page file. You will implement this shortly.
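To illustrate the manual step, here is a minimal sketch of building the Okta logout URL yourself. Okta's OIDC logout endpoint accepts id_token_hint and post_logout_redirect_uri as query parameters; the helper name buildOktaLogoutUrl is illustrative, and where exactly you retrieve the idToken from your auth state depends on your nuxt-auth version.

```javascript
// Illustrative helper: construct the Okta OIDC logout URL with the
// id_token_hint query parameter that nuxt-auth omits by default.
function buildOktaLogoutUrl(oktaDomain, idToken, postLogoutRedirectUri) {
  const url = new URL(`${oktaDomain}/oauth2/default/v1/logout`);
  url.searchParams.set('id_token_hint', idToken);
  url.searchParams.set('post_logout_redirect_uri', postLogoutRedirectUri);
  return url.toString();
}

// In a page's logout() method you might then redirect the browser:
//   window.location = buildOktaLogoutUrl(
//     process.env.OKTA_DOMAIN,
//     idToken, // retrieve this from your auth state
//     'http://localhost:3000/'
//   );
```

Note that the post_logout_redirect_uri must also be registered as a logout redirect URI in your Okta application, as configured earlier.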

To use the OAuth configuration properly, Nuxt requires you to add a file to the store folder called index.vue. You can leave this file empty, but it must exist for Nuxt to use it. If you do not have a store folder, create one now and add a blank index.vue file to it.

Add your Nuxt pages

Now, you can start adding pages to your application. Nuxt builds your routes by looking in the pages folder. You can read more about Nuxt’s custom routing in their documentation.

Before working on your pages you should set up your layout. Add a new folder called layouts to the project directory and add a file called default.vue.

This file, and all of the views in this project, will use the Vue template syntax. Vue templating is very similar to most other template syntaxes. It relies heavily on the v-bind HTML attribute to handle events or bind attribute values. You can handle events with the v-on syntax, which can be shortened to @, such as @click="doSomething". The b-* components are from the Bootstrap Vue library that should have been installed via the npx task runner.

The layout page will display the headers and footers and incorporate some branching logic to determine if a user should see a login or logout button. It also contains some common CSS and JavaScript that will be needed on each page that uses the layout. The <nuxt /> element on this page will act as a placeholder for the code on your page. Nuxt will render your page code in this section.

<template>
  <div>
    <b-container>
      <b-navbar toggleable="lg" type="dark" variant="info">
        <b-navbar-brand href="#">Posts</b-navbar-brand>
        <b-navbar-nav>
          <b-nav-item href="/dashboard"> Dashboard </b-nav-item>
        </b-navbar-nav>
        <b-navbar-nav v-if="loggedIn" class="ml-auto">
          <b-button @click="logout" size="sm" class="my-2 my-sm-0" type="submit"
            >Logout
          </b-button>
        </b-navbar-nav>
        <b-navbar-nav v-else class="ml-auto">
          <b-button @click="login" size="sm" class="my-2 my-sm-0" type="submit"
            >Login
          </b-button>
        </b-navbar-nav>
      </b-navbar>
      <nuxt />
      <footer id="sticky-footer" class="py-4 bg-dark text-white-50">
        <div class="fluid-container footer">
          <small>Copyright &copy;{{ year }} </small>
          <br />
          A small app built with
          <a href="https://nuxtjs.org/" target="blank">Nuxt</a>, Protected by
          <a href="https://www.okta.com/" target="blank">Okta</a>, Written by
          <a href="https://profile.fishbowlllc.com" target="blank">Nik Fisher</a>.
        </div>
      </footer>
    </b-container>
  </div>
</template>

<script>
export default {
  data() {
    return {
      loggedIn: this.$auth.$state.loggedIn,
      year: new Date().getFullYear(),
    }
  },
  methods: {
    logout() {
      this.$auth.logout();
    },
    login() {
      this.$auth.loginWith('okta').then(result => window.location = "/Dashboard");
    }
  },
}
</script>

<style scoped>
.fluid-container.footer > *:last-child {
  margin-bottom: 0px;
  color: #fff;
}
</style>

You will also need a basic landing page that isn’t under authentication. Your landing page will give some information about the application. It will also contain a redirect for authenticated users to route them to the Dashboard page.

Open the index.vue page in the pages directory and replace the code with the following.

<template>
  <div id="page-content">
    <b-jumbotron
      style="margin-top: 5vh"
      header="Lets Get Some Posts"
      lead="A Simple App with Nuxt and Okta"
    >
      <p>For more information visit their websites</p>
      <b-button variant="primary" href="https://nuxtjs.org/" target="blank"
        >Nuxt</b-button
      >
      <b-button
        variant="outline-primary"
        href="https://www.okta.com/"
        target="blank"
        >Okta</b-button
      >
    </b-jumbotron>
  </div>
</template>

<script>
export default {
  beforeMount() {
    if (this.$auth.$state.loggedIn) window.location = '/Dashboard'
  },
}
</script>

Next, you can add the Dashboard page. This page fetches data from a server using Axios in its beforeMount() lifecycle hook and displays it in a table. The Dashboard page also makes use of the auth middleware to enforce authentication. When unauthenticated users attempt to hit this page, they will be rerouted to your login page.

Add a new file to the pages folder and name it dashboard.vue. Copy the following code into your dashboard.vue file.

<template>
  <div>
    <b-table :items="posts" :fields="fields"> </b-table>
  </div>
</template>

<script>
import Vue from 'vue'

export default Vue.extend({
  middleware: ['auth'],
  data() {
    return {
      fields: ['userId', 'title', 'body'],
      posts: [],
    }
  },
  async beforeMount() {
    this.$axios
      .$get('https://jsonplaceholder.typicode.com/posts')
      .then((res) => {
        this.posts = res
      })
      .catch((err) => {
        console.log(err)
      })
  },
})
</script>

Finally, you will need to add the login page, where Nuxt will route unauthenticated users. Add a new page to the pages folder named login.vue. Add the following code to it.

<template>
  <div id="page-content" class="p-4">
    <a class="btn btn-primary" @click="$auth.loginWith('okta')">Login with Okta </a>
  </div>
</template>

<script>
export default {
  middleware: ['auth'],
  data() {
    return {}
  },
  beforeMount() {
    if (this.$auth.$state.loggedIn) window.location = '/Dashboard'
  }
}
</script>

Testing your Nuxt application

Now that your application is complete, run npm run dev from your terminal to build and serve the application. You should see the home page first.

Click on Dashboard or Login and this should route you to the Okta login form. You can enter your Okta credentials here and you will be routed to the Dashboard page where you can view the posts.

Do more with Nuxt and Vue

With a little bit of code, you can combine Nuxt and Okta to make secure Vue SPAs or universal applications. You can find the source code for the example created in this tutorial on GitHub.

In this post, you learned how to use the @nuxt/auth-next package to secure your Nuxt application using Okta. You also learned how to create the application in Okta and configure the Vue app to use it. Finally, you learned how to use the @nuxtjs/axios package to pull data from a sample API.

If you’d like to learn more about building web apps with Vue, you might want to check out these other great posts:

Use Schematics with Vue and Add Authentication in 5 Minutes
Learn How to Build a Single-Page App with Vue and Spring Boot
Vue Login and Access Control the Easy Way

Make sure you follow us on Twitter and subscribe to our YouTube channel. If you have any questions, or you want to share what tutorial you’d like to see next, please comment below.

Sunday, 23. January 2022

KuppingerCole

Analyst Chat #109: From IT GRC to Integrated Risk Management Platforms

The three biggest threats to business resilience are IT Risk, Compliance Risk, and Vendor Risk. Integrated Risk Management Platforms address these risks. KuppingerCole's Lead Analyst Paul Fisher has analyzed this market segment recently and he joins Matthias to talk about recent developments and the market in general.





Northern Block

The Future of Work will happen through DAOs (with Stepan Gershuni)


Listen to this Episode On Spotify

Listen to this Episode On Apple Podcasts

About Episode

Stepan Gershuni is working towards making it easier for DAOs to onboard talent through Professional ID credentials. There is an oversupply of talent looking to join DAOs and Stepan is trying to help them manage their excess talent pools and onboard people faster.

 

During this conversation, we discuss:

- Where are the opportunities to apply SSI principles and frameworks towards DAOs?
- How do you decentralize hiring, education, and the coordination of work?
- Are .eth good or bad identifiers? Are .eth names and decentralized identifiers (DIDs) mutually exclusive?
- How can the decentralized discovery of professional skills happen?
- And more!

 

Related Content

DAOs will offer more interesting opportunities to workers to contribute towards meaningful work. And SSI will allow workers to build up a verifiable portfolio of knowledge and skillsets, which can be used to participate in more and more opportunities through DAOs.

Read more here: The Future of Work will happen through DAOs


About Guest

Stepan Gershuni is the Lead Product Manager at Affinidi, Founder of Credentia and Deep Skills.

 

Twitter: https://twitter.com/sgershuni LinkedIn: https://www.linkedin.com/in/sgershuni/

The post The Future of Work will happen through DAOs (with Stepan Gershuni) appeared first on Northern Block | Self Sovereign Identity Solution Provider.

Friday, 21. January 2022

Holochain

Small Holochain Release

Holochain Dev Pulse 111

This week is a quiet one for news. But if you're a developer, the following few changes are important ones!

holochain-conductor-api → holochain-client-js

After a discussion both inside the team and with key developer community members, the name of the JavaScript conductor client has been changed. Everyone feels this better reflects its purpose as a client library for the conductor admin and app APIs.

The GitHub repo can now be found at https://github.com/holochain/holochain-client-js, and the NPM package is now called @holochain/client.

Of course, this is a breaking change, and you’ll need to update your package.json file in any project that uses this library.
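Assuming your project previously depended on the library under its old name (commonly published as @holochain/conductor-api; check your own lockfile for the exact name and versions), the package.json change would look something like this:

```diff
 "dependencies": {
-  "@holochain/conductor-api": "...",
+  "@holochain/client": "..."
 }
```

After editing, reinstall (e.g. npm install) and update your import statements to pull from the new package name.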

If you’re a Rust front-end developer, there’s a package for you as well on GitHub. If you’re working on a different front-end language and are building a conductor client library for it, we’d love to tell other developers about it!

Finally, the hApp client call tutorial has been updated to use the new NPM package name (and has been updated to work with Holochain 0.0.119 and newer too).

Holochain 0.0.123: unified zome calling

The only breaking change with this release is that call_remote has been merged into call in the host API, and the signature has changed (#1180). If your hApp accesses the host API directly, you’ll need to refactor your code. If you use the HDK instead, the call function’s signature has changed too. The call_remote function is still available and the signature hasn’t changed, but you will need to recompile your zome against HDK 0.0.119 if you use it.

A network-level bug has also been fixed (#1181). Attempts to send validation receipts back to authors created an infinite loop; this was hanging the conductor on app startup.

Save the date: FOSDEM 2022 on 5–6 Feb

As mentioned two Dev Pulses ago, we’re once again co-hosting the Web3 / DWeb / peer-to-peer computing room at FOSDEM, Europe’s largest open-source conference. In addition to three sessions related to Holochain and one related to Fluence, our co-host, there are a lot of other exciting-looking projects on the schedule. I’ve had the pleasure of speaking to people involved with many of these projects over the past few years, and they’re all breaking ground in unique ways and contributing to the evolution of the decentralised web.

Cover photo by ThisisEngineering RAEng on Unsplash


Safle Wallet

Safle partners with Celer Network to Enable Multi-Chain Open Canonical bridging for $SAFLE


Safle is delighted to announce yet another new partnership! We have partnered with Celer Network and have completed the integration of our ecosystem’s native SAFLE token on cBridge. Through this integration, users will now be able to bridge their SAFLE directly between Polygon, Ethereum and BSC — allowing them to gain access to cBridge’s fast finality speeds and extremely low costs!


Cross-chain liquidity on the SAFLE token is in high demand due to our project’s network-agnostic approach to liquidity provision. After considering several existing bridging solutions, we have chosen to adopt the Open Canonical Token Standard to avoid binding ourselves to a singular bridging solution, allowing different bridges to concurrently serve the project. SAFLE holders can find a quick and easy tutorial for how to use cBridge for cross-chain transfers here.

Enabling bridging of the SAFLE token is the first step in the aligned vision of Safle and Celer. With Celer’s generalized message passing SDK coming close to its release, the two teams will discuss more opportunities in offering SAFLE users access to a wide spectrum of DeFi scenarios on different chains with a unified user experience in the near future.

A setback only paves the way for a comeback! WGMI SafleNauts.

🥞 🍣

About Celer Network

Celer Network is a layer-2 scaling platform that brings fast, secure and low-cost blockchain applications on Ethereum, Polkadot and other blockchains to mass adoption. Celer launched the world’s first Generalized State Channel Network and continues to push the frontier of Layer-2 scaling with advanced Rollup technology. Core applications and middleware like cBridge, layer2.finance, and more ecosystem applications built on Celer have attracted large audiences in the DeFi, blockchain interoperability, and gaming spaces.

Learn more about Celer: https://www.celer.network

Telegram | Twitter | Youtube | Facebook | Meetup | Medium | Github | Discord

About Safle

A next-generation non-custodial wallet, self-sovereign identity protocol, and Web 3.0 infrastructure provider for the decentralised ecosystem, governed by the community. Safle is a decentralised blockchain identity wallet that enables secure private key management and a seamless experience for dApps, DeFi and NFTs. In order to maintain a balance between developers and retail users, Safle intends to develop a wallet infrastructure in a completely non-custodial fashion using Open Governance Mechanisms via the SafleDAO, coordinated and maintained by the Safle token economy. The native $SAFLE token will not only enable token holders to propose and vote on changes (governance privileges) to functionalities and feature sets of the wallet and node services, but will also create a self-sustaining token economic model where value is generated by providing access to finance and identity in the decentralised digital world.

Website | GitHub | Discord | Twitter | Instagram | Telegram Ann | Telegram Chat


Dark Matter Labs

New Value Flows: Impact modelling and financing neighbourhood retrofit, v0.1

New Value Flows: Impact modelling and Financing neighbourhood retrofit, v0.1

This is part of a series of Weeknotes as part of our TransCap x CivicValue x CommCap exploration. We have republished it here in our Provocations to spark a conversation on how we can rethink value flows in neighbourhood-level climate transitions.

Neighbourhoods are facing an imminent need to transition for climate change and deal with deep-seated issues of fuel poverty and other injustices. One of the most intractable challenges is the financial cost-benefit balance in undertaking a neighbourhood retrofit programme.

Under current conventional approaches, the comprehensive range of benefits and co-benefits that projects generate often goes unaccounted for, with the result that some projects are deemed (reductively) unviable investments, or worse, a liability. For example, as we have seen in our TreesAI project, urban trees are often seen as costs to municipalities, rather than value-creating natural assets, a logic that justifies trees being removed rather than nurtured.

Dark Matter Labs’ approach to the TransCap x CommCap project is based on our working hypothesis, ‘Transitioning Together’, which aims to lay the framework of a system that puts streets and neighbourhoods in the driving seat of their climate transition, empowering residents to sense-make, deliberate, implement, and access financing for their neighbourhood’s climate transition projects.

A fundamental part of our hypothesis is about rewiring value flows in order to make these community-driven climate transitions possible. This includes an alternative approach that attempts to account for a project’s benefits and co-benefits in a comprehensive, efficient and holistic way. This also includes wiring a project’s revenue and financing to its impact as a way of incentivising the maximisation of benefits.

While the community will be the primary beneficiary of climate transition, there will be other stakeholders/actors who too will co-benefit: such as public or philanthropic bodies who may have their liabilities reduced, nearby property-owners who may benefit from land value uplift, or local businesses that may benefit from revenue uplift. By putting in place mechanisms (e.g. Smart Covenants or outcome-buying contracts / social impact bonds) to capture these co-benefits, we would provide a long-term revenue source that can shift the cost-benefit balance in favour of doing these necessary projects, and also provide a framework for stakeholders to invest in outcomes-based preventative measures to reduce their liabilities (such as investing in housing retrofit, rather than spending on healthcare system costs for dealing with the health consequences of substandard, unhealthy homes).

The Transitioning Together model so far: revenues from captured co-benefits replenish a city-wide evergreen fund that funds community-driven climate transition projects.

Initial experiments in value modelling

As part of our work developing this model, Dark Matter Labs and Civic Square are engaging with the community of Link Road (in Edgbaston, Birmingham) as a pioneer community for this approach. Based on data from the tangible, real-world context of this neighbourhood in Birmingham, we have taken a first attempt at creating this value model that represents a climate transition project’s costs, comprehensive estimates of its various benefits and co-benefits, and also the potential for these co-benefits to be captured.

Some questions that are beginning to emerge from this work include:

- Is this model, based on rewiring value flows and capturing co-benefits, viable for funding community-driven climate transition projects? In particular, we want to find out the approximate scale of the potential revenues that could derive from capturing co-benefits, and how they compare to the costs of a project.
- Is it possible to quantify and model the comprehensive set of co-benefits being created by climate transition projects? For each type of climate transition project, it is necessary to develop an understanding of the types and quantities of (co-)benefits being generated, from the perspectives of the resident, a co-beneficiary, and an investor. Is there enough research to allow this to be estimated, and how can we avoid double-counting?
- How can qualitative co-benefits be accounted for in the model? While capturing co-benefits and outcomes requires some degree of quantification, how they are metricised and captured opens up a space for contestation. More critically, we are reflecting on how qualitative benefits can be factored into the model.
- How do we model the value that is created beyond the projects themselves? Value created by such an approach can exist beyond the projects themselves: for example, the capacity a community develops and the experience they gain after successfully implementing their first project can unlock further, more ambitious projects; this has ‘sequential’ value. The coexistence of multiple projects in a neighbourhood can also create a ‘systemic’ value that is greater than the sum of its parts. How can we account for these in the model?
- Finally, are there any ethical risks of financialising civic or public value, including any perverse incentives created by doing so? This work also represents a transition from a model of centralised state taxation and spending, where the creation of civic value, captured or not, is considered part of the state’s remit. While this model has both its opportunities and risks, we must also consider the new set of opportunities and risks created by taking a more decentralised approach to civic value creation.

Value modelling approach

Diagram showing the components of the interactive value model.

The structure of our initial value model is set out in the diagram above: the model is designed to take several global inputs (e.g. assumptions on inflation, the investment horizon, base interest rate, etc.). The model also allows the selection of a portfolio of different types of civic assets and climate transition projects (e.g. home energy efficiency retrofit, urban agriculture, community-owned PV energy generation, etc.), and the scale of the project (e.g. number of homes retrofitted, m2 of gardens used for growing, m2 of PV panels, etc.). Based on these inputs, the model calculates for each civic asset/project:

- Costs: This calculates the upfront and ongoing costs of building and operating the project, based on fixed costs and costs that scale per unit of the asset.
- Impacts/benefits generated: This calculates the impacts generated over a comprehensive set of factors quantified in their natural units (tonnes of carbon emissions prevented and/or sequestered, health outcomes improved through no. of avoided hospitalisations/deaths, no. of employment opportunities created, litres of stormwater retained, no. of households brought out of fuel poverty, etc.). The calculations use assumptions based on local statistics and a review of academic literature, and refining and testing these calculation methods through further research is part of our ongoing work.
- Revenues: This calculates the revenues generated over a comprehensive set of factors quantified in the local currency (GBP). This includes direct revenues such as one-off or annual grants, but also potential revenues from spillover value capture. The spillover capture calculation takes each benefit generated and identifies third-party liability holders who may see potential liability reductions and therefore cost savings. A percentage of these estimated cost savings can be captured as a revenue source for the project, to be negotiated with these liability holders as potential ‘outcome buyers’ (for example, the local NHS Clinical Commissioning Group may pay for the outcome of reduced cold-weather hospitalisations from home retrofit, based on a percentage of the savings from x fewer patients going to A&E per year).

These cost, impact, and revenues calculations will then be aggregated across the entire portfolio, giving an idea of financing requirements and potential investment returns. We are still reflecting on how secondary value, such as the ‘sequential value’ and ‘systemic value’ described above can be incorporated into the model’s current working form. Double-counting remains challenging to disentangle, especially in areas such as the mutually linked nature of physical and mental health outcomes.
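As a sketch of the spillover-capture arithmetic described above, the calculation for a single benefit stream might look like the following. All figures and names here are illustrative placeholders, not the project's actual estimates.

```javascript
// Sketch of the spillover value-capture calculation: a negotiated percentage
// of a liability holder's estimated cost savings becomes project revenue.
// All figures are illustrative placeholders.
function capturedSpilloverRevenue(avoidedCostPerUnit, unitsAvoidedPerYear, capturePercent) {
  return avoidedCostPerUnit * unitsAvoidedPerYear * capturePercent;
}

// e.g. an outcome buyer avoiding 10 hospitalisations a year at £2,000 each,
// with 25% of the saving captured as an outcome payment:
const annualRevenue = capturedSpilloverRevenue(2000, 10, 0.25);
console.log(annualRevenue); // 5000 (£/year)
```

In the full model this calculation would be repeated per benefit stream and per liability holder, then summed into the project's revenue line.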

A version 0.1 of our model

We have created a first attempt of the model based on the logic above in a spreadsheet. The estimates for costs, impacts, and revenues in this version are ‘back of the envelope’ calculations based on common sense assumptions and high-level research we have done so far. This model will be further developed as part of our next stage of work, as the types of impacts modelled are expanded, and deeper research is done to quantify and predict these impacts with more rigour. We have tried to make this modelling as tangible as possible by basing the modelling on the real-world location of Link Road, Birmingham, where Dark Matter Labs and Civic Square are working with a group of residents to test this approach.

Main dashboard of the interactive model, with current work being done to replace placeholder figures with more research.

The main interface of the model is the dashboard, which gives an overview of the neighbourhood climate transition portfolio, and the whole portfolio’s estimated costs, impacts and revenues. The dashboard allows global inputs, such as investment horizon and the assumed annual inflation rate. The dashboard also allows different civic assets/climate transition projects to be included as part of the neighbourhood portfolio, and the scale of those assets (e.g. no. of homes retrofitted, m2 of urban agriculture, etc.). By changing these inputs to model different scenarios, the user of this modelling tool, whether it is the residents themselves or their advisors, can experiment and find an ideal portfolio of projects that work for their vision of their neighbourhood, and test its viability both in financial terms and the benefits/impacts the community is seeking.

Beyond the dashboard, each civic asset will have its own page with a more detailed breakdown of the capital and operating costs, its impacts and co-benefits, and also potential both one-off and recurring direct revenues (e.g. government grants/subsidies) and revenues from captured spillover values, with the calculated values updated based on the scale of the asset, and also global inputs. In this version 0.1 of our model, we have begun to build out these costs, impact and revenue models for an average home retrofit (up to EPC C rating), with the ambition to refine the model with more research and sophisticated modelling methods.

Financial modelling assumptions and findings:

The key assumptions on the cost side are:

- Cost per home of retrofitting: This is the most significant variable and will depend on the characteristics and size of the existing home, the lifestyle of its occupants, target improvement levels, and the local supply chain. The literature includes a wide range of costs. Most helpful for Link Road are probably the actual average costs incurred through the BEIS supply chain pilots, which were mainly undertaken on owner-occupied ‘early adopters’ seeking to do whole-house retrofits. This ended up costing in the region of £20-£25k per home. We acknowledge that some homeowners would be comfortable with improving to EPC C and including a low/zero-carbon heat source such as an Air-Source Heat Pump (ASHP). We have therefore assumed a total cost of around £20k per home for this work, but acknowledge that it could be more expensive.
- We have factored in operating/maintenance costs as an overall percentage of the capital costs, acknowledging that at regular periods and at the end of the horizon (e.g. 30 years), the buildings and their retrofitted work may need to undergo periodic maintenance and upgrades. We hope to make this more granular depending on the type of intervention in future versions of the model.
- Our approach to deep retrofit beyond the individual house means we should consider improving the streetscape and open spaces as part of retrofit. In our model, we have treated public realm improvements as a separate civic asset. These costs tend to be modest in comparison with the domestic improvements.

Key assumptions on the revenue side:

- Energy savings are hard to quantify, as again they depend on the home, its occupants, and the selected level of comfort. We have used high-level government assumptions around improving to EPC level C saving around £220 per household per year. This could be an underestimate, especially given rising energy costs. These savings can be capitalised at a suitable discount rate over a reasonable life-cycle of the elements installed to achieve these outcomes.
- Carbon emissions avoided are also based on the same government assumptions: approximately 1.69 tonnes of carbon per house per year. Although these emissions reductions are not all currently traded, there is potential for this to be financialised beyond the existing ETS system. We have used the current (possibly underpriced) value of carbon in the UK ETS market as a conservative indication of a potential source of revenue. This is subject to change based on the carbon intensity of the grid, the price of carbon, and other regulatory changes in emissions trading.
- We have also started to estimate other outcomes. These include social outcomes (e.g. fuel poverty alleviation), health outcomes (e.g. excess winter mortality, respiratory hospitalisations), and environmental outcomes (e.g. reduction of NOx emissions), with more in the pipeline. These are high-level back-of-envelope estimations using data as specific to the context (Link Road / Birmingham / West Midlands / UK) as possible. We are also reviewing the approaches of modelling precedents, such as the C40 Cities model, GMCA’s Unit Cost Database, and the UK Government’s HIDEEM model, as a way of informing our own modelling methodology.

Emerging findings & next steps

The model is preliminary and requires substantial development as part of the second stage of work. The following points are findings that are beginning to emerge.

- Energy savings make the biggest contribution in the average household. This could be around 15–30% of up-front costs depending on capitalisation assumptions, modelled savings, lifestyle and poverty of the household, etc. At the time of writing, energy costs are rising and there is a general expectation that OFGEM will increase the tariff cap. Without further intervention this will result in greater fuel poverty, but it will further strengthen the case for using energy savings as part of the model.
- Carbon emissions savings (if traded) make up around 10–20% of the up-front capital, if this can be realised. Alternative avenues beyond the ETS system should be considered as a way of realising the value of these emission savings.
- The potential revenue from other spillover values such as health appears to be small based on the first tranche of research. However, in some implemented projects, such as Warm Homes Oldham, which is co-funded by the local public health service, estimated savings to public healthcare are on a comparable scale to energy savings. This requires further exploration and verification, and does not rule out the viability of capturing these spillover values when they are aggregated across neighbourhoods and considered at a city scale (as proposed by the Transitioning Together model). This means that the contractual, technological, and other mechanisms to allow this value capture would have to be tested at a small scale, albeit with negligible returns, in order for this to be scaled. This would also help us understand whether the returns on these values can be efficiently measured and distributed.
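The capitalisation of annual energy savings mentioned above can be sketched with a standard present-value-of-annuity formula. The 3.5% discount rate and 30-year life-cycle used here are illustrative assumptions, not the model's actual inputs.

```javascript
// Capitalising an annual saving into a lump-sum present value:
// PV = A * (1 - (1 + r)^-n) / r, for annual saving A, discount rate r, n years.
function presentValueOfAnnuity(annualSaving, discountRate, years) {
  return annualSaving * (1 - Math.pow(1 + discountRate, -years)) / discountRate;
}

// The ~£220/year EPC C energy saving over an assumed 30-year life-cycle
// at an assumed 3.5% discount rate:
const capitalised = presentValueOfAnnuity(220, 0.035, 30);
console.log(Math.round(capitalised)); // roughly £4,000, i.e. ~20% of a £20k retrofit
```

Under these assumptions the capitalised saving lands inside the 15–30% range quoted above; a lower discount rate or longer life-cycle pushes it toward the top of that range.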

We are now entering the Strategic Design phase of the project. This has a number of strands. For the impact and finance workstream we will:

- Refine and test the assumptions and modelling approach.
- Improve our understanding of the scale at which different elements of the model would work best, both in terms of any internal cross-subsidy required and in terms of efficiency.
- Shortlist the liability holders and commence discussions with them, to test the political viability of outcome contracting.
- Build a legible, user-friendly model that can be used in engagement with residents and other stakeholders.

We’ll be posting further updates of our progress as the work progresses. Stay tuned!

This blog was written by Calvin Po and Dan Hill. Tom Beresford and Jack Minchella are also a part of the TransCap x CivicValue x CommCap exploration. Raj Kalia has also developed the work on the ‘Transitioning Together’ hypothesis.

New Value Flows: Impact modelling and financing neighbourhood retrofit, v0.1 was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.


Europechain

The Relationship Between NFTs And Physical Assets

The relationship between NFTs and the physical world can be very vague. In this article we demystify the relationship between the two and discuss the possibilities!

Pablo Picasso was, by all accounts, a rather prolific artist. Throughout his lifetime, no fewer than 50,000 pieces of art are attributed to the Spaniard. Paintings, sculptures, prints, even tapestries, Picasso accrued an impressive body of work. Picasso passed away from natural causes at his home in France on April 8, 1973. He was 91, and died without a will. The artist’s passing marked the...



Elliptic

Consumer Protection the Main Focus for Crypto Regulators in 2022

Last year saw significant regulatory change impact the cryptoasset space, and 2022 is gearing up to bring more substantial regulatory activity that will shape the industry long into the future. 



Affinidi

Custodial vs Non-custodial Wallets


A digital wallet is a software program designed to keep your digital assets safe and secure. It can be used to store cryptocurrencies, verifiable credentials, and more.

These digital wallets can be broadly categorized into custodial and non-custodial wallets, depending on the level of control you have over your wallet. Specifically, this categorization is based on who stores the private keys of your wallet.

Before we jump into the differences, let’s recap how wallets work. Note that in this article, we will look only at Self-Sovereign Identity (SSI) wallets that store your verifiable credentials.

Working of an SSI Wallet

Any credential such as your name, age, address, educational qualification, work experience, etc. can be issued in a digital format known as a Verifiable Credential (VC). The obvious advantage of such a credential is that it is highly portable, easy to share, machine-verifiable, secure, and tamper-proof because it is cryptographically signed.

Here, “cryptographically signed” means Public-Key Infrastructure (PKI) is used to protect the information contained in a VC from tampering. PKI involves two parts, a private key and a public key, and these keys work only as a pair.

Every verifiable credential is signed with the issuer’s private key, and anyone can use the corresponding public key to verify that the credential is authentic and has not been altered.

The same principle applies to a wallet too. You need its private key to open the wallet and view or share the verifiable credentials stored in it. This is akin to a password.
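This sign-with-the-private-key, verify-with-the-public-key flow can be sketched with textbook RSA. The parameters below are toy sizes, purely for illustration; real VC suites use production schemes such as Ed25519 or BBS+:

```python
import hashlib

# Textbook RSA using two known Mersenne primes. These sizes are far too
# small for real use; they only illustrate the key-pair mechanics.
p, q = (1 << 61) - 1, (1 << 31) - 1
n, e = p * q, 65537                    # (n, e) is the public key
d = pow(e, -1, (p - 1) * (q - 1))      # d is the private key

def digest(credential: dict) -> int:
    # Hash the credential's claims down to an integer below the modulus.
    data = "|".join(f"{k}={v}" for k, v in sorted(credential.items()))
    return int.from_bytes(hashlib.sha256(data.encode()).digest(), "big") % n

def sign(credential: dict) -> int:
    # Only the issuer, who holds the private key d, can produce this value.
    return pow(digest(credential), d, n)

def verify(credential: dict, signature: int) -> bool:
    # Anyone holding the public key (n, e) can check the signature.
    return pow(signature, e, n) == digest(credential)

vc = {"name": "Alice", "dateOfBirth": "1990-01-01"}
sig = sign(vc)
assert verify(vc, sig)                         # untampered credential verifies
assert not verify({**vc, "name": "Bob"}, sig)  # any tampering breaks it
```

Note that verification needs no secret at all, which is why anyone can check a VC while only the issuer can mint one.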

From this standpoint, we can say that the entity that stores and maintains the private key of the wallet is its custodian.

Custodial Wallets

A custodial wallet is a type of digital wallet where the organization that offers this service holds the private key needed to open the wallet and access its contents. Here, the organization is the custodian of the wallet.

Advantages of Custodial Wallets

So, what are the advantages that come with custodial wallets?

- Most custodial wallets are web-based and hence device-agnostic. You can access them from any part of the world with just a browser and an Internet connection.
- Highly convenient, as the holder doesn’t have the responsibility of remembering the password or the private key.
- Extensible, as many organizations offer the same wallet for storing not just VCs but also cryptocurrencies and other digital assets going forward.

Disadvantages of Custodial Wallets

Custodial wallets are not perfect, as they come with some shortcomings as well.

- The password is stored on the servers of a third party, which can make it a lot less secure.
- Identifying a secure and reputable wallet requires considerable research.
- Custodial wallets come with a fee.

In all, a custodial wallet is a convenient and extensible option for storing your verifiable credentials. It takes the responsibility of storing and remembering passwords from the user, but at the same time, brings a third party into the transaction, thereby making it a lot less secure. Furthermore, the organization that controls the wallet’s keys can determine in the future how and where its contents can be used.

What are Non-Custodial Wallets?

As the name suggests, non-custodial wallets are those wallets that are solely controlled by the owner. In other words, you are the only one who knows the password to open the wallet, and hence, you’re the only one who can access it. No other entities are involved in managing your wallet.

Advantages of Non-custodial Wallets

The biggest advantages of non-custodial wallets are privacy and security. Since no one other than you can access the wallet, you can be assured that its contents will never be accessible to anyone else, and hence it offers the highest levels of privacy.

Furthermore, only you can determine how your VCs can be used, and with whom they can be shared. In this sense, non-custodial wallets adhere to the principles of Self-Sovereign Identity completely.

Disadvantages of Non-custodial Wallets

The biggest disadvantage of non-custodial wallets is their inaccessibility when you lose or forget the password. Since you are the sole custodian, there is a huge responsibility on you to always remember the password of your wallet.

The other minor disadvantage is that non-custodial wallets can be tied down to a device, depending on its implementation. This means you will always need the associated device to access the wallet application.

So, which of the two is better? It simply depends on your preference!

Affinidi’s Wallet

If you decide to go ahead with custodial wallets, note that Affinidi’s wallet is one of the best choices today in terms of security and flexibility. You can get a glimpse of it at wallet.affinidi.com.

Here are some screenshots of our wallet.

For more information and questions, please reach out to us on Discord, visit our Dev Portal, join our mailing list, and follow us on LinkedIn, Twitter, and Facebook.

Custodial vs Non-custodial Wallets was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


Radiant Logic

Identity Deathmatch: Unification vs. Integration

A look at the process of Identity Unification, and its advantages over Integration. The post Identity Deathmatch: Unification vs. Integration appeared first on Radiant Logic.

Civic

An Update on the Verified by Civic Pass Program & the Big Daddy Ape Club NFT Project


We care deeply about our community and the victims involved in the Big Daddy Ape Club (BDAC) rug pull. We believe in a safer, healthier NFT ecosystem, and we are taking action in areas where we believe we can make an impact.

The Verified by Civic Pass program is designed to verify the real-world identities of creators, so that they may build trust among their communities. The program is a free service that is just one piece of the trust and safety puzzle within the NFT space.

As a company that specializes in identity verification, we do not endorse projects in this program, nor do we perform due diligence on them beyond our identity verification services. Our program aims for the highest levels of accuracy. You can read more about the full details of our service here and here.

The identity of the individual who presented themselves as the founder of the BDAC project was verified through our program. We are working with law enforcement to assist in their investigation, but do not know how long their investigation will take. We’ve also been in touch with Solanart, TRM Labs and Solrarity.

Anyone around the world can file a tip to help further this case at https://www.ic3.gov/. We have been told by the authorities that evidence related to communication with the project creators (screenshots, etc.) is particularly useful. Likewise, the authorities have told us it’s also helpful if victims file complaints locally within their countries.

FAQ for the Verified by Civic Pass & Big Daddy Ape Club NFT Project

What does it mean to be Verified by Civic Pass?

Please note that the Verified by Civic Pass program is designed to verify the real-world identities of creators. We do not endorse or vouch for projects participating in our program, nor do we perform due diligence on them beyond our identity verification services. The program is a free service that is just one piece of the trust and safety puzzle within the NFT space. Please see service details here.

What is Civic doing for the victims of the rug pull?

The Verified by Civic Pass program is designed to verify the real-world identities of creators. The program is a free service intended to be one piece of the trust and safety puzzle for the NFT community. In this case, we’re working with law enforcement to assist in their investigation. Victims can file tips to help further this case at https://www.ic3.gov/.

How do I get verified for the Verified by Civic Pass program?

At this time, we’re pausing the Verified by Civic Pass program. We’ll be in touch with more information when we have an update to share.

Who were the individual(s) Civic verified in connection with BDAC?

We’re working with law enforcement to assist in their investigation but cannot share details at this time.

The post An Update on the Verified by Civic Pass Program & the Big Daddy Ape Club NFT Project appeared first on Civic Technologies, Inc..


Radiant Logic

Future-Proof Your Security Investments with an Identity Data Fabric

An Identity Data Fabric eliminates major security hurdles for sizable organizations. The post Future-Proof Your Security Investments with an Identity Data Fabric appeared first on Radiant Logic.

Wednesday, 22. December 2021

Radiant Logic

From Static Directories to Context Servers

An introduction to the Radiant Logic blog by founder and former CEO Michel Prompt. The post From Static Directories to Context Servers appeared first on Radiant Logic.

Welcome to Radiant 2.0!

CEO Joe Sander welcomes readers to Radiant Logic’s Blog 2.0—we’re happy you’re here! The post Welcome to Radiant 2.0! appeared first on Radiant Logic.



Cloud Directories, IDaaS, and Federation IdPs vs. the Rest of Your IAM Infrastructure

An overview of the challenges for integrating identity across on-premises and cloud systems. The post Cloud Directories, IDaaS, and Federation IdPs vs. the Rest of Your IAM Infrastructure appeared first on Radiant Logic.

Keep the Silos and Elevate Your Identity with an Identity Data Fabric

Unify identity across data silos into a single source of truth about all users. The post Keep the Silos and Elevate Your Identity with an Identity Data Fabric appeared first on Radiant Logic.

Are Data Silos Your Key to Better Cybersecurity?

Our Identity Data Fabric platform can help you beef up your cybersecurity posture. The post Are Data Silos Your Key to Better Cybersecurity? appeared first on Radiant Logic.

Deliver Customer Data Privacy and Compliance by Unifying Identity

This rise in large-scale data breaches has led to much stricter data privacy laws. The post Deliver Customer Data Privacy and Compliance by Unifying Identity appeared first on Radiant Logic.

KuppingerCole

New Methods to Accelerate Endpoint Vulnerability Remediation

IT endpoints are no longer just workstations and servers confined to corporate headquarters, branch offices, customer sites, and data centers; they can now be just about anything, located anywhere, from employee homes to airports, hotels, and the cloud. But every endpoint represents a potential entry point for cyber attackers and needs to be managed.





Radiant Logic

The Rise of the Identity Data Fabric

Why a radically simple approach to identity data is the right one. The post The Rise of the Identity Data Fabric appeared first on Radiant Logic.

Coinfirm

Meet the Team: Adam Rylski

Hello, I’m Adam Rylski and I have been at Coinfirm since late 2019. When I joined the company, I was already an experienced Software Developer, yet I had almost zero knowledge about cryptocurrencies and knew Bitcoin only from memes. But with all the help from my awesome colleagues, I was able to gain very detailed...

Indicio

Liquid Avatar Technologies and Ontario Convenience Stores Association (OSCA) Deliver Successful Pilot of Digital Age-Verification Solutions to Reach over 8,000 Retail Locations

Liquid Avatar The post Liquid Avatar Technologies and Ontario Convenience Stores Association (OSCA) Deliver Successful Pilot of Digital Age-Verification Solutions to Reach over 8,000 Retail Locations appeared first on Indicio Tech.

Coinfirm

Crypto Crime Investigations Series: Part 1

Crypto-related crime reached an all-time high in 2021. Will 2022 be any different? From personal wallets to leading exchanges, many have been victims of serious crypto crime to the tune of billions. Join us for our first Crypto Crime Investigations Series of 2022 to find out about: Money Laundering, Terrorism Financing, Crypto Asset...

Elliptic

Q&A With Mark Aruliah: Elliptic’s New Senior Policy Advisor, EMEA

In this interview, Mark Aruliah discusses what led him to transition to the crypto space from a career working as a regulator, and why it’s so critical for regulators to get their heads around this emerging technology.



Spherity

Spherity is Partnering with Legisym Offering Joint Compliance Product for the U.S.

Spherity is partnering with Legisym to offer a joint compliance product for the U.S. Life Sciences Market. Legisym, LLC is a trusted expert in the U.S. Life Sciences Market, providing services to pharmaceutical companies around the world since 2009.

Legisym and Spherity have worked closely together to bring to maturity a joint offering that meets the security requirements of the U.S. Life Sciences Market. As part of the joint development, both companies have collaborated with SAP and Novartis, which have already subjected the product to extensive quality testing and functional validation. Spherity and Legisym are pleased to officially announce their partnership as of today.

Spherity is partnering with Legisym offering a joint compliance product, Credentialing Service, for the U.S. Life Sciences Market.

In November 2013, the U.S. Congress enacted the Drug Supply Chain Security Act (DSCSA) in order to protect patients’ health. To ensure that only legitimate actors are part of the supply chain, the regulation requires U.S. pharmaceutical trading partners to ensure that they only interact with other trading partners that are authorized. An authorized trading partner is one holding a valid state-issued license or a current registration with the Food and Drug Administration (FDA).

Today in 2022, U.S. pharmaceutical supply chain actors have no interoperable, electronic mechanism to validate each other's authorized status. With more than 60,000 interacting trading partners involved in the U.S. Life Sciences Industry and an FDA recommendation to respond to data requests in under one minute, a solution that provides compliance with the regulations by 2023 is in high demand. Legisym and Spherity have decided to cooperate and offer an interoperable, highly secure service that enables pharmaceutical supply chain actors to become Authorized Trading Partners (ATP) under the U.S. DSCSA.

Legisym, as a trusted identity and license verification service provider, perfectly complements Spherity’s digital wallet technology for managing verifiable credentials. The verifiable credential technology is used to represent the authorized status of interacting trading partners in a highly efficient, secure and DSCSA-compliant way. To use credentialing for Authorized Trading Partner (ATP) requirements under DSCSA, trading partners need to go through a one-time due diligence onboarding process with Legisym. Once the verifiable credentials are issued, they are stored in a secure digital wallet which comes embedded with the Credentialing Service provided by Spherity. Using this technology enables U.S. pharmaceutical supply chain actors to interact with digital trust, as they can now digitally verify their ATP status in every interaction.

Georg Jürgens, Manager Industry Solutions at Spherity says, “together with our partner Legisym we focused on making the adoption of credentialing for trading partners as simple as possible. Manufacturers, wholesalers and dispensers can all acquire a digital wallet and ATP credentials within minutes without integration effort and use this innovative solution for DSCSA-regulated interactions.”

“Legisym is thrilled to be working alongside Spherity to bring the first production-level ATP Credentialing solution to the industry,” said Legisym President & Co-Owner David Kessler. “With the successful completion of the ATP Credentialing Pilot in 2020 and the joint founding of the Open Credentialing Initiative in early 2021, the Spherity-Legisym partnership is already proving successful in collaboration and forward thinking.”

Legisym and Spherity, along with other adopters, founded the Open Credentialing Initiative (OCI). This newly formed organization incubates and standardizes the architecture using digital wallets and verifiable credentials for DSCSA compliance with Authorized Trading Partner requirements. Besides U.S. pharmaceutical manufacturers, wholesalers, and dispensers, the OCI is open to solution providers integrating the ATP solution.

For press relations, contact communication@spherity.com. Stay sphered by joining Spherity’s Newsletter list and following us on LinkedIn.

About Legisym, LLC
For over a decade, Legisym, LLC has successfully provided the pharmaceutical industry with affordable and effective regulatory compliance technologies. In early 2020, driven by the 2023 Authorized Trading Partner (ATP) requirements, Legisym began leveraging its existing Controlled Substance Ordering System (CSOS) and license verification technologies and experience to engage as a credential issuer. By performing a thorough credential-issuer due diligence process to first establish a root of trust, Legisym promotes confidence in a trading partner’s digital identity prior to the issuance of any ATP credentials.

About Spherity

Spherity is a German software provider bringing secure and decentralized identity management solutions to enterprises, machines, products, data and even algorithms. Spherity provides the enabling technology to digitalize and automate compliance processes in highly regulated technical sectors. Spherity’s products empower cyber security, efficiency and data interoperability among digital value chains. Spherity is certified according to the information security standard ISO 27001.

Spherity is Partnering with Legisym Offering Joint Compliance Product for the U.S. was originally published in Spherity on Medium, where people are continuing the conversation by highlighting and responding to this story.


OWI - State of Identity

Behind the Biometric


We're still at the dawn of the applications of biometrics. What is the industry's role in leading consumer education? Join host Cameron D'Ambrosi and David Ray, COO & General Counsel at Rank One Computing as they discuss how biometrics have evolved from government applications to B2C use cases. You'll uncover how widespread mobile adoption has ushered biometrics across new industry verticals, unlocking the power of understanding who's on the other side of that device.


Elliptic

Multichain DeFi Hacker Returns $1 Million to Victims

A hacker who exploited a bug in DeFi protocol Multichain has returned $1 million to their victims, keeping the remaining $200,000 as a "bounty".



Dock

The World of Anonymous Credentials

The main idea in anonymous credentials is that rather than considering the credential data as arbitrary bytes which are then signed by the issuer, anonymous credentials add "structure" to the credential.

A credential comprises one or more claims made by the issuer of a credential about the subject. For example, a driving license containing the licensee's name, SSN, date of birth, etc. is a credential, where such data and details are claims made by the license issuer about the licensee, who is the subject. A credential is called a verifiable credential when its authenticity can be cryptographically checked by anyone, because the credential contains a cryptographic signature by the issuer and the issuer's public key is well known. To convince a service provider, the verifier, of the authenticity of a verifiable credential, the credential along with the signature is shared with the service provider, who can then verify the signature.

However, the above approach may be harmful to the user’s privacy. The entity in possession of the credential, the holder, has to share the credential in its entirety with the verifier, which may reveal a lot more information than what was needed. As an example, the holder may have wanted to share only his name and date of birth from his license, not the rest of the credential data like his SSN, all while still being able to convince the verifier that he does own a verifiable credential containing his name and date of birth. This desired property is called selective disclosure, where the holder can choose to disclose only parts of the credential rather than the entire credential.

Another problem to solve is linkability. Even if the holder achieves selective disclosure, revealing only the part of the credential data that is sufficient for the verifier to provide service, the holder would not want the verifier to link his previous interactions with him. This desired property is called unlinkability. There are cases, however, when the holder is willing to be linked across several of his interactions, but that should be voluntary, and we do support that.

In some cases, even the above two properties might not be sufficient, for example when the holder wants to reveal only part of the data:

- If a holder wants to prove he is over 18 or above 65 to avail of a service, he doesn't need to show his date of birth; being able to prove such conditions, called predicates, is sufficient for the verifier.
- To prove that he or she is not a resident of a given city or state, the holder should not need to share an exact address.
- To avail of social welfare when the holder's salary is below a certain amount, the holder should not need to reveal the salary from a bank statement, but should be able to show an anonymous credential that convinces the verifier of the same.
- Similarly, an investor should be able to convince a verifier that the total value of his assets is greater than some amount without revealing the actual values of those assets.

One solution to the above problems is called anonymous credentials, sometimes called Attribute-Based Credentials (ABC). The main idea in anonymous credentials is that rather than treating the credential data as arbitrary bytes which are then signed by the issuer, anonymous credentials add "structure" to the credential. Each claim in the credential is treated as an attribute, and these attributes are signed in a specific way by the issuer to create a signature. The attributes and the signature together form the anonymous credential. For example, if a credential has 3 claims made by the issuer about the holder, say name, city, and SSN, these 3 claims correspond to 3 attributes, one for each claim. Because of the "structure" in this credential, the holder can prove to a verifier, using a zero-knowledge proof, that he has this credential from the issuer without revealing any of the attributes. To be precise, the holder can prove that these attributes exist and have been signed by an official issuer, without disclosing the attributes themselves or the signature.

The holder can also reveal one or more attributes as part of the zero-knowledge proof from this anonymous credential, thus achieving selective disclosure. One or more zero-knowledge proofs created from the same anonymous credential cannot be linked together, thus giving us unlinkability.

To achieve both of these properties, we use a signature scheme called BBS+. A useful property of BBS+ (and similar) signatures is that when using several BBS+ signatures, attributes common to the signatures can be proven equal without revealing them. For example, if you have a driving license credential and passport, both with a BBS+ signature, you can prove that your SSN attribute is the same in both your driving license and passport without revealing your SSN. This property allows linking many credentials together and is helpful in cases when you need to prove that all credentials are about the same subject (or subject id).
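The selective-disclosure idea can be sketched with salted per-attribute hash commitments. This is a deliberately simplified, hypothetical stand-in for BBS+: unlike BBS+, it does not give unlinkability (the signature itself is a stable identifier), and an HMAC stands in for the issuer's asymmetric signature so the sketch stays self-contained:

```python
import hashlib, hmac, json, os

def commit(name: str, value: str, salt: bytes) -> str:
    # A salted hash hides the attribute value while binding the issuer to it.
    return hashlib.sha256(salt + f"{name}={value}".encode()).hexdigest()

# Issuer: commit to each attribute, then sign the full list of commitments.
attrs = {"name": "Alice", "city": "Oslo", "ssn": "123-45-6789"}
salts = {k: os.urandom(16) for k in attrs}
commitments = {k: commit(k, v, salts[k]) for k, v in attrs.items()}
issuer_key = b"issuer-secret"  # stand-in for the issuer's signing key
payload = json.dumps(commitments, sort_keys=True).encode()
signature = hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()

# Holder: disclose only name and city (value plus salt); ssn stays hidden.
disclosed = {k: (attrs[k], salts[k]) for k in ("name", "city")}

# Verifier: check the signature covers all commitments, then check that
# each disclosed attribute matches its commitment.
expected = hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()
assert hmac.compare_digest(signature, expected)
assert all(commitments[k] == commit(k, v, s) for k, (v, s) in disclosed.items())
```

The verifier learns name and city, is convinced an ssn commitment exists and was signed, yet never sees the SSN; BBS+ achieves the same disclosure property while additionally keeping separate presentations unlinkable.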

Because credentials are essentially attestations made by an authority, i.e. the issuer, meaning they act as evidence or proof of something, it should be possible for the authority to permanently or temporarily revoke those attestations. A revoked credential should not be accepted by any verifier. Naively, this kind of signaling requires the verifier to check the credential’s unique ID against a public list of all revoked credentials. However, revealing such a unique ID goes against our unlinkability requirement, as the unique ID ties together all zero-knowledge proofs created from that credential. Therefore, we need a system in which the holder can prove, in zero knowledge, that their credential ID is not in the revocation list:

- The verifier is assured the holder’s credential ID is not in the public list of all revoked credentials.
- The verifier does not get access to the credential ID itself.
- The holder therefore remains unlinkable.

For this zero-knowledge membership, we use an accumulator which is a set-like database but with a very small size and allows such zero-knowledge membership checks. Accumulators also enable other use cases where such a membership or non-membership check is required.

Say that you have a bank details credential that contains your account number as one of the attributes. To convince a regulator that your account is not in a "suspicious-accounts" list, you can prove to the regulator that your account is not in the accumulator which contains all suspicious accounts. We use a pairing-based accumulator called VB accumulator.
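A Merkle tree gives a feel for how a compact witness can prove membership in a set. This is a hypothetical simplification: unlike the VB accumulator, a Merkle proof reveals the leaf being checked and so is not zero-knowledge, but the witness-plus-root mechanics are analogous:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(leaves):
    # Build the tree bottom-up; each level hashes adjacent pairs of nodes.
    levels = [[h(leaf) for leaf in leaves]]
    while len(levels[-1]) > 1:
        level = levels[-1]
        if len(level) % 2:                 # duplicate the last node on odd levels
            level = level + [level[-1]]
        levels.append([h(level[i] + level[i + 1]) for i in range(0, len(level), 2)])
    return levels

def prove(levels, index):
    # The membership witness: one sibling hash per level, with position bits.
    proof = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        proof.append((index % 2, level[index ^ 1]))
        index //= 2
    return proof

def verify(root, leaf, proof):
    # Recompute the path to the root using only the leaf and the witness.
    node = h(leaf)
    for is_right, sibling in proof:
        node = h(sibling + node) if is_right else h(node + sibling)
    return node == root

accounts = [b"acct-1", b"acct-2", b"acct-3", b"acct-4"]
levels = build_tree(accounts)
root = levels[-1][0]
proof = prove(levels, 2)                   # witness for acct-3
assert verify(root, b"acct-3", proof)      # member: witness checks out
assert not verify(root, b"acct-5", proof)  # non-member fails
```

The published root plays the role of the accumulator value; a cryptographic accumulator shrinks the witness further and, combined with a zero-knowledge proof, lets the holder run this check without revealing the leaf.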

Other common requirements with anonymous credentials include being able to prove conditions, or predicates, about the attributes, like proving that the total income from a bank statement credential is less than some limit. Here the total income might not be a single attribute but may be composed of several attributes, with one attribute per income source. The attributes therefore need to be added together before comparing against the limit, and some cases may need more complex operations on one or more attributes.

These capabilities are not possible with BBS+ alone, so we additionally use a zk-SNARK, which can prove arbitrary conditions on the attributes. The zk-SNARK we use is the same as the one used in Zcash, adapted to be linked with BBS+ signatures.
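The relations such a circuit encodes are ordinary arithmetic over the hidden attributes. A sketch of two example statements follows; in the real system these are proven in zero knowledge, so the verifier never sees the inputs, whereas this plain-Python version only shows what the circuit checks:

```python
def over_18(age: int) -> bool:
    # Circuit statement: a single attribute satisfies a lower bound.
    return age >= 18

def total_income_below(incomes: list[int], limit: int) -> bool:
    # Circuit statement: several attributes (one per income source)
    # are summed first, and the sum is compared against the limit.
    return sum(incomes) < limit

assert over_18(42)
assert total_income_below([30_000, 12_500], 50_000)
```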

Anonymous credentials offer great privacy but too much privacy can sometimes be crippling for regulators or law enforcement. Consider an exchange as a verifier that allows customers to make transactions after they prove possession of a government ID, but does not ask for any other information. At any point, a regulator or law enforcement may want to know details of who made certain transactions, but the only information the verifier has and is able to provide, is that someone made the transactions with a government ID.

This isn’t sufficient for the regulator or law enforcement; traceability is needed here. However, not everyone in this chain needs traceability; only certain authorities do. We solve this problem with a technique called verifiable encryption.

Using the example above, a customer who possesses a government ID can be required to verifiably encrypt their SSN for the regulator, without revealing the SSN to the exchange itself. The authorities can decrypt the SSN and track who the government ID belongs to, while the customer is assured that the exchange cannot track them; only the authorities who need to trace them can. Note that verifiable encryption is not mandatory with Dock credentials; it’s opt-in.

The core cryptographic library that implements the signature schemes and proof protocols is implemented in Rust. The library contains different primitives in separate crates. The notable ones are BBS+ signatures, accumulators, and the composite proof system. The composite proof system allows us to combine these primitives like proving knowledge of 2 BBS+ signatures with proof of accumulator membership.

Since a large part of this code needs to run in a JavaScript environment like a browser or a React Native app, we built a WASM wrapper over the Rust code. The wrapper is quite thin and mostly handles serialization/deserialization logic. It intentionally lacks abstractions like classes and is just a set of modules with free-floating functions. Finally, there is a TypeScript library that uses the WASM wrapper and provides various abstractions for creating signatures, proofs, and accumulators.

As verification of proofs and accumulator membership/non-membership requires publicly available authenticated storage, we deployed the ability to post BBS+ and accumulator public keys on our chain and also to create, update and remove accumulators. The changes have been live on the mainnet for some time and corresponding modules and tests are added in SDK for BBS+, accumulators, and composite proof by fetching objects from the chain.

For a more detailed explanation of the primitives, see the explainer in the GitHub repo. The usage with examples is described here. It is broken down into the following sections:

- BBS+ signature
- Accumulator
- Composite proof system

The composite proof subsection shows several examples, like selective disclosure, creating proofs over multiple BBS+ signatures, a BBS+ signature with accumulator membership, getting a blind BBS+ signature (meaning even the issuer is not aware of certain attributes), and opt-in linkability, where the holder chooses to be linkable across several proofs of his credential, but only at a certain verifier.

We are exploring options for predicates like range proofs, verifiable encryption, and support for arbitrary computation on credential data in general, and we already have PoCs showing that they work with BBS+ signatures.

This PoC shows the usage of BBS+ and LegoGroth16, which is an adaptation of Groth16, the zk-SNARK used in Zcash. The PoC shows how to:

- Prove that some attribute of a BBS+ signature satisfies some bound, which can be used to prove that an age is less than or greater than some number.
- Prove that the sum of certain attributes satisfies a certain bound, which can be used to prove that the sum of liabilities is less than some number.
- Prove that the sum of certain attributes of one credential is less than the sum of certain attributes of another credential, which can be used to prove that the sum of liabilities is less than the sum of assets.

This PoC shows verifiable encryption based on SAVER. We show in this test that a user can verifiably encrypt any of his attributes for a third party, and convince the third party that the encryption has been done correctly.


Ocean Protocol

Data Governance Across Continents: What does it mean for internet users?


Udbhav Tiwari, Public Policy Advisor at Mozilla, compares the lack of data privacy protection laws in certain parts of the world to the others.

We have often spoken about the relevance of Web3 and decentralization in our podcast. Still, it is essential to talk about pioneers in the Web2 space fighting for free internet and data privacy. They are the advocates working with governments for data regulations that protect internet users. One such organization is Mozilla. In this episode of Voices of the Data Economy, we talked with Udbhav Tiwari, Public Policy Advisor at Mozilla. During this discussion, he explains the role of Mozilla in the free internet ecosystem, discrepancies in data regulations across the Asia Pacific and particularly India, and more. Here are edited excerpts of the podcast.

Mozilla’s role in the internet ecosystem: 20 years of Mozilla Firefox

Mozilla explicitly works towards treating the internet as a global public resource that should be accessible to everyone. It does this with the help of different products such as Firefox, Pocket, and Mozilla VPN.

“My role in the global public policy team at Mozilla is to also look out for laws and regulations that may impact these products. I oversee if the laws can be improved to ensure that they lead to that open Internet — I then advocate for those laws. Or if there are particularly worrying things in certain laws and regulations that would be bad, not just for Mozilla’s products, but also for the internet — I also tend to work on those issues. And so far my remit has largely been in the Asia Pacific region but I also work on some other issues globally.”

The main aspects of Udbhav’s role are:

data protection and privacy
content regulation and intermediary liability
connectivity, encryption, and security

The Mozilla Foundation is a digital rights NGO. It has fellows who work on internet health and internet security around the world. At Voices of the Data Economy, we have interviewed two fellows in the past: Julia Reinhardt and Anouk Ruhaak. The foundation has initiatives like YouTube Regrets and *Privacy Not Included.

Data Regulations across Asia and India

Udbhav mentions that Europe has had a data protection law for much longer than other parts of the world (of course, now we have GDPR).

“Here in the Asia Pacific region, it’s quite far from being the norm even to have a data protection law in the first place. If one were to turn the clock back, I would say even ten years ago, one could easily count on two hands the number of countries in the Asia Pacific region that had a dedicated data protection law on the books. But thanks to the proliferation of technology and public demands from civil society that want better protections for users and consumers, things are changing.”

Australia has had a privacy law for a fair amount of time, but it is in the middle of re-examining that law so that it better suits modern technologies. Moving a little further north, countries like the Philippines have had a privacy law for some time now; in fact, business imperatives around the business process outsourcing industry played a considerable role in the Philippines having a privacy law in the first place. Sri Lanka is also looking at a data protection law. Even countries like Indonesia have been debating data protection for some time, and Indonesia now has a data protection law, although some provisions within it still need to be notified, and specific regulatory entities under the law still need to be set up.

“Even in India, the protection, the conversation around data protection has been going on for well over, I think, 12 years now, approximately, there have been different versions of different bills that different committees have proposed.”

Indian data regulation law and its relation with non-personal data

The drafted data protection law in India has the following issues:

The first fundamental provision concerns how the law applies to governments. Section 35 of the Indian draft law would allow the government to exempt any part of itself from any part of the law. Everybody understands the need for exceptions, such as national security and law enforcement access, and data protection laws worldwide, including the GDPR, contain them. But in India, many people think the draft has significantly exceeded that narrow scope. Hypothetically, the government could say things like:

These 15 sections will not apply to the Reserve Bank of India
These five sections will not apply to UIDAI, India’s digital identity authority
The whole law will not apply to the Intelligence Bureau, or to the CBI, India’s central investigation agency

The second provision concerns the independence of the Data Protection Authority that this law proposes. The Data Protection Authority is the regulator that enforces data protection law: it is the body you can complain to if something goes wrong between you and a data collector, and its staff also conduct proactive audits to enforce the provisions of the law. If you don’t have a robust Data Protection Authority, the law becomes a toothless tiger, because nothing can be done unless there is a capable infrastructure of people who can enforce it.

Both the draft versions and the new version of India’s data protection law contain some pretty strong language that splits data into three categories:

Personal data
Sensitive personal data
Critical personal data

“Critical personal data isn’t defined; it’s something that the government will be able to notify. For example, they can say health data is critical. And of these three classifications, critical data will have to stay in the country and simply cannot leave it. Sensitive data can leave the country for a short amount of time, but a copy of that data must always be kept in the country, and personal data can currently go where it wants to. So as you can imagine, this fundamentally rejigs the way the internet and its infrastructure work in a very big way.”

Non-personal data has a negative definition: anything that is not personal data is non-personal data. All privacy and data protection laws around the world fundamentally apply to personal data, that is, data that identifies an individual or has identifying characteristics. For example, if I write a poem on a piece of paper or in Google Docs, and I don’t put my name on it or associate it with an account, it is data, but it is not personal data. Similarly, imagine Amazon has a list of the top 100 products sold in India every day, without any information about who bought them, just a pure list of products. That is an example of non-personal data. It is data, and it has uses and inferences associated with it, but it doesn’t meet the definition of personal data because it has no identifying characteristics and cannot be used to identify individuals.

India’s status in Data Regulation

India, in many ways, is trying, and some people now openly say so, to carve out what it calls a fourth model of internet regulation. The three dominant models are:

The United States: a free market with very little regulation
Europe: pretty strong, prescriptive regulation
Countries like China: regulation that derives much more from the power of the state
“I think India, in all of the issues that we have discussed, is trying to pick and choose different models from these different countries in order to be able to implement them in practice. I think it is very deliberately doing so to set an example for other similarly placed countries. So it’s not just what we do here, but it’s also how we do things in a manner that other countries can follow.”

Here is a list of selected timestamps for the topics discussed during the podcast:

2:20–7:25: Mozilla’s role in the internet ecosystem

7:25–13:20: Data regulations in the Asia Pacific with a focus on India

13:20–28:35: What is non-personal data, and how is it regulated?

28:35–32:05: What should a data protection law in India look like?

32:05–43:06: How does big tech behave in India?

Data Governance Across Continents: What does it mean for internet users? was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KYC Chain

Sanctions and PEP screening in the KYC process

Bribery, theft, lost opportunities: the threats posed by corrupt politicians to communities are immense. In parallel, inadvertently facilitating the laundering of illegally acquired funds is a major threat for financial and virtual asset service providers. By implementing ongoing screening against sanctions lists and PEP data in their KYC process, businesses can lower their risk of regulatory fines.

Infocert (IT)

Complete, Clear, Conversational and Contemporary: 4 Cs for infocert.it

The new InfoCert-Tinexta Group website is online, designed to speak to the ever larger and more heterogeneous audience of Digital Trust Services users. The project is part of a broader company strategy aimed at enriching and strengthening the relationship with customers through personalized and engaging user experiences in every area.

Rome, 20 January 2022 – InfoCert (Tinexta Group), the largest European Certification Authority, announces the release of the new version of infocert.it, the website dedicated to its solutions, including PEC (certified email), Digital Signature, timestamps, digital preservation, electronic invoicing, and SPID.

The radical renewal of infocert.it has great strategic value for the company, also because of its role as the gateway for customers, in particular SMEs, professionals, and private individuals, to the world of trust services: InfoCert's e-commerce has over 1 million customers, and the site records over 100 million page views every year.

Moreover, this first release of the site is the key to InfoCert's digital communication in Europe. The companies of the InfoCert group will be aligned to this new format, with the ambitious goal of making the image and offering of the InfoCert group ever more recognizable. This element was fundamental in determining the current choices of user experience and look & feel, which have the important role not only of communicating a brand abroad, but above all of capturing the needs of different geographies and cultures.

The project is constantly evolving: the transformation of the e-commerce section and the creation of areas dedicated to specific customer targets are already planned for the first quarter of 2022.

The new site stands out for four main characteristics:

Complete: users can find on the site not only information on all the products in the InfoCert offering for digitizing their business, but also a space to better understand the world of digital trust services.
Clear: information is concise, precise, and easy to consult, also thanks to numerous video tutorials.
Conversational: the style focuses on usability and informality, also thanks to the use of engaging images representative of the main target audiences.
Contemporary: the design not only meets specific functional needs but is also in line with the most recent and innovative aesthetic and experiential standards.

In short, the new website is built on these 4 Cs for a fresh, personal, and engaging experience on any device, first and foremost mobile.

“The new portal aims to be in line with the evolution of InfoCert’s positioning: an all-Italian company that has firmly established itself as Italy’s leader in Digital Trust services and that in recent years has been adopting an increasingly global strategy, in Europe and beyond. This new site is only the first stage of a broader strategy whose results will be fully visible in the coming months. We want to guarantee a continuous, clear, and coherent brand experience to existing and potential customers, regardless of the channel or device they use, in order to continue, and even accelerate, the successful path we have followed for years. We aim to consolidate and enrich the relationship of knowledge and trust with all of our ever more numerous interlocutors. This is essentially due to a true explosion of digitalization, driven in large part by the pandemic, both in the professional sphere and beyond. Added to this is a growing international audience, given our increasingly global dimension.”

Marco Lauro, Director Web & Inside Sales, InfoCert – Tinexta Group

To celebrate the new website, InfoCert is launching a special promotion with discounts of up to 50% on several solutions.

InfoCert SpA

InfoCert, Tinexta Group, is the largest European Certification Authority, active in over twenty countries. The company provides digitalization, eDelivery, Digital Signature, and digital document preservation services, and is an AgID-accredited digital identity manager within SPID (the Italian Public Digital Identity System). InfoCert invests significantly in research and development and in quality: it holds a significant number of patents, while its ISO 9001, 27001, and 20000 certifications testify to its commitment to the highest levels of service delivery and security management. InfoCert's Information Security Management System is ISO/IEC 27001:2013 certified for EA:33-35 activities. InfoCert is the European leader in Digital Trust services fully compliant with the requirements of the eIDAS Regulation (EU Regulation 910/2014) and the ETSI EN 319 401 standards, and aims to keep growing internationally, including through acquisitions: it holds 60% of CertEurope, the largest Certification Authority in France; 51% of Camerfirma, one of the main Spanish certification authorities; and 16.7% of Authada, a cutting-edge German identity provider. Finally, InfoCert owns 80% of the shares of Sixtema SpA, the technology partner of the CNA world, which provides technology solutions and consulting services to SMEs, trade associations, financial intermediaries, professional firms, and other organizations.

For more information:

InfoCert Press Relations Advisor
BMP Comunicazione per InfoCert team.infocert@bmpcomunicazione.it
Pietro Barrile +393207008732 – Michela Mantegazza +393281225838 – Francesco Petrella +393452731667
www.infocert.it

Tinexta S.p.A.
Chief Investor Relations Officer Josef Mastragostino investor@tinexta.com
Chief External Relations & Communication Officer Alessandra Ruzzu +39 331 622 4168 alessandra.ruzzu@tinexta.com
Press Office Manager Carla Piro Mander Tel. +39 06 42 01 26 31 carla.piro@tinexta.com

Media Advisor
Barabino & Partners S.p.A. Foro Buonaparte, 22 – 20121 Milano Tel.: +39 02 7202 3535
Stefania Bassi: +39 335 6282 667 s.bassi@barabino.it

Specialist
Intermonte SIM S.p.A. Corso V. Emanuele II, 9 – 20122 Milano Tel.: +39 02 771151

The post Completo, Chiaro, Conversazionale e Contemporaneo: 4 C per infocert.it appeared first on InfoCert.


KuppingerCole

HCL BigFix

by Richard Hill

Due to the sheer number of endpoint devices (billions) worldwide and their potential risk and impact to the organization through endpoint vulnerabilities, Endpoint Management has become imperative to IT Security. HCL BigFix offers a unified platform to discover, manage, and bring those endpoints under compliance in an automated way.

PingTalk

What is Knowledge-based Authentication (KBA)? | Ping Identity

When you set up a new account, you are often asked to create a password and choose a security question and answer (e.g., What is your mother's maiden name?). Answering security questions based on personal information when you log in to an app or system is called knowledge-based authentication (KBA). While KBA is still widely used, people freely share the same information on social media sites, reducing its security value.

 


Okta

How to Create a React App with Storybook

UI designers and front-end developers are tasked with creating clean and consistent user interfaces. At the same time, testing is a cornerstone of software development. Each part of a software project is tested individually and isolated from the other elements in unit tests. This practice has been challenging to achieve in the context of user interfaces.

Now Storybook provides an open-source framework that lets you test UI components in isolation from the rest of the website. Storybook presents you with a browser of all the components in your web application. You can test each component independently and in different configurations. The tool runs as a separate application outside your main application, which means that you can test your components without worrying about application-specific dependencies or requirements.

In this tutorial, I will show you how to use Storybook to create a simple React application. The application will be a unit conversion app, and I will use Storybook to showcase the individual components and the application page itself. I will not assume any prior knowledge of React or Storybook. I will assume that you are familiar with JavaScript and Node, and have an up-to-date version of the npm package manager installed on your computer.

Prerequisites:

Node 14 Okta CLI

Table of Contents

Creating React components with Storybook
Creating the unit converter application using Storybook component stories
Adding authentication with Okta to the application
Learn more about React, Storybook, and Single-Page Apps

Creating React components with Storybook

In this section, I will show you how to create a React application and implement components displayed in Storybook. These components will serve as the basis for the unit conversion application. To start, open a terminal in a folder of your choice and run the following command to create a new React application.

npx create-react-app@5 react-storybook --use-npm

The create-react-app command creates a new folder, react-storybook, and initialises a basic application skeleton. Next, turn this basic React app into a Storybook application. Navigate into the newly created folder and run the following command.

npx sb@6 init

When prompted, answer yes to install the sb package. Initializing Storybook will create a new folder, stories inside the src folder, and populate it with some pre-made demo components and stories to be used by Storybook. Open the project folder in your favourite IDE.

You can test out Storybook straight away. Open a terminal session in the project folder and run the following command.

npm run storybook

The command runs the Storybook app and opens a browser tab (http://localhost:6006). For now, you will only see the components that Storybook installs by default. You can keep Storybook running while you develop your app.

Using your IDE, create a new file named src/stories/Components.jsx. This will be the module that will contain some basic UI components. For the sake of this tutorial, I will place all these components into a single module. In practice, you might want to spread them out over several files. Open src/stories/Components.jsx and paste in the following code.

import React, { useState } from 'react';
import PropTypes from 'prop-types';

import './Components.css';

export function Input({ size, type, label, name, placeholder, onChange }) {
  return (
    <label className={`input-component input-component--${size}`}>
      <span>{label}</span>
      <input
        type={type === 'text' ? 'text' : 'number'}
        step={type === 'floating-point' ? 'any' : undefined}
        name={name}
        placeholder={placeholder}
        onChange={onChange}
      />
    </label>
  );
}

Input.propTypes = {
  size: PropTypes.oneOf(['medium', 'large']),
  type: PropTypes.oneOf(['text', 'number', 'floating-point']),
  label: PropTypes.string.isRequired,
  name: PropTypes.string.isRequired,
  placeholder: PropTypes.string.isRequired,
  onChange: PropTypes.func,
};

Input.defaultProps = {
  size: 'medium',
  type: 'text',
  label: 'Enter a value',
  name: 'input',
  placeholder: 'Please enter a value',
  onChange: undefined,
};

export function Select({ size, label, options, onChange }) {
  return (
    <label className={`select-component select-component--${size}`}>
      <span>{label}</span>
      <select className="select-component" onChange={onChange}>
        {options.map((option) => (
          <option value={option.value}>{option.description}</option>
        ))}
      </select>
    </label>
  );
}

Select.propTypes = {
  size: PropTypes.oneOf(['medium', 'large']),
  label: PropTypes.string.isRequired,
  options: PropTypes.arrayOf(PropTypes.shape({
    value: PropTypes.string.isRequired,
    description: PropTypes.string.isRequired,
  })).isRequired,
  onChange: PropTypes.func,
};

Select.defaultProps = {
  size: 'medium',
  label: 'Options',
  options: [],
};

export function Tabs({ children }) {
  const [active, setActive] = useState(0);
  const onTabClick = (newActive) => () => {
    setActive(() => newActive);
  };

  return (
    <div className="tabs-component">
      <div className="tabs-row">
        {children.map((child, index) => (
          <div
            className={`tab ${index === active ? 'active' : ''}`}
            onClick={onTabClick(index)}
          >
            {child.props.label}
          </div>
        ))}
      </div>
      <div className="tabs-content">
        {children[active]}
      </div>
    </div>
  );
}

Tabs.propTypes = {
  children: PropTypes.instanceOf(Array).isRequired,
};

Tabs.defaultProps = {
  children: [],
};

This module exports three components. Input is a configurable <input> element with a label for entering text or numbers, Select is a dropdown <select> element wrapped in a label, and Tabs is a component that shows its children in a separate tab. I am using the React feature propTypes to specify the properties that each React component expects as arguments, allowing Storybook to extract this meta-information and display it to the user. To provide a bit of styling for the components, create a file src/stories/Components.css, and fill it with the following contents.

.input-component {
  display: flex;
  flex-direction: column;
  margin-bottom: 1rem;
}

.input-component span {
  display: block;
  margin-bottom: 0.5rem;
}

.input-component.input-component--large input {
  font-size: 1.2rem;
  padding: 0.5rem 1rem;
}

.select-component {
  display: flex;
  flex-direction: column;
  margin-bottom: 1rem;
}

.select-component span {
  display: block;
  margin-bottom: 0.5rem;
}

.select-component.select-component--large select {
  font-size: 1.2rem;
  padding: 0.5rem 1rem;
}

.tabs-component .tabs-row {
  font-family: 'Nunito Sans', 'Helvetica Neue', Helvetica, Arial, sans-serif;
  display: flex;
}

.tabs-component .tabs-row .tab {
  border: 1px solid #EEEEEE;
  border-bottom: none;
  border-top-right-radius: 4px;
  border-top-left-radius: 4px;
  padding: 0.5rem 1rem;
  cursor: pointer;
}

.tabs-component .tabs-row .tab.active {
  background-color: #EEEEEE;
  cursor: auto;
}

.tabs-component .tabs-content {
  border: 1px solid #EEEEEE;
  padding: 0.5rem 1rem;
}

With this, the components are usable as React components in your application. But you also want them to be browsable through Storybook. For this, you will need to create one file for each component. Start by creating a file src/stories/Input.stories.jsx and enter the following code in it.

import React from 'react';
import { Input } from './Components';

export default {
  title: 'Components/Input',
  component: Input,
};

const Template = (args) => <Input {...args} />;

export const Normal = Template.bind({});
Normal.args = {
  label: 'Normal Input',
  placeholder: 'Enter your value',
  size: 'medium',
};

export const Large = Template.bind({});
Large.args = {
  label: 'Large Input',
  placeholder: 'Enter your value',
  size: 'large',
};

export const Number = Template.bind({});
Number.args = {
  label: 'Integer Number',
  placeholder: 'Enter your value',
  size: 'large',
  type: 'number',
};

export const FloatingPoint = Template.bind({});
FloatingPoint.args = {
  label: 'Floating Point Number',
  placeholder: 'Enter your value',
  size: 'large',
  type: 'floating-point',
};

The export default at the top of the file tells Storybook what the component’s name is and which React component the stories in this file refer to. The subsequent exports Normal, Large, Number, and FloatingPoint represent individual stories or use cases for that component. Each story defines a member args that specifies the properties to pass to the component. Creating stories in this way is quick, so now create the next one for the Select component. Create a file src/stories/Select.stories.jsx and paste the following contents into it.

import React from 'react';
import { Select } from './Components';

export default {
  title: 'Components/Select',
  component: Select,
};

const Template = (args) => <Select {...args} />;

export const Default = Template.bind({});
Default.args = {
  size: 'medium',
  label: 'Select an Option',
  options: [
    { value: 'a', description: 'Option A' },
    { value: 'b', description: 'Option B' },
    { value: 'c', description: 'Option C' },
  ],
};

export const Large = Template.bind({});
Large.args = {
  size: 'large',
  label: 'Select an Option',
  options: [
    { value: 'a', description: 'Option A' },
    { value: 'b', description: 'Option B' },
    { value: 'c', description: 'Option C' },
  ],
};

This file defines two stories for the Select component. One story shows it in normal size, and the other shows it in a large size. Finally, do the same for the Tabs component. Create a file src/stories/Tabs.stories.jsx and fill it with the contents below.

import React from 'react';
import { Tabs } from './Components';

export default {
  title: 'Components/Tabs',
  component: Tabs,
};

const Template = (args) => <Tabs {...args} />;

export const Default = Template.bind({});
Default.args = {
  children: [
    <div label="One">Content One</div>,
    <div label="Two">Content Two</div>,
    <div label="Three">Content Three</div>,
  ],
};

Now, you are ready to test out your new components in Storybook. If you haven’t done so already, open the terminal in the project folder and run the following command.

npm run storybook

The command runs the Storybook app and opens a browser tab (http://localhost:6006). You can browse the components in the left sidebar. The stories you just created can be found under the Components header, and when you select, for example, the Input -> Number story, you should see something like shown in the image below.

The component shows up in the main view, and the icons above let you change the background, the screen size, and even allow you to check the dimensions of the component’s layout. Below the main view, you can manually adjust the options passed to the component. I invite you to play around with all the features Storybook provides.

Creating the unit converter application using Storybook component stories

I will use the convert-units library to implement the unit conversion app. Open a second terminal in your project folder and run the command below.

npm install -E convert-units@2.3.4

Now, in your IDE, create a new file, src/stories/Converter.jsx, and fill it with the contents below.

import React, { useState } from 'react';
import PropTypes from 'prop-types';
import * as convert from 'convert-units';

import { Input, Select } from './Components';

export const Converter = ({ measure }) => {
  const possibilities = convert().possibilities(measure).map((unit) => {
    const descr = convert().describe(unit);
    return { value: descr.abbr, description: `${descr.singular} (${descr.abbr})` };
  });

  const [fromUnit, setFromUnit] = useState(possibilities[0].value);
  const [toUnit, setToUnit] = useState(possibilities[0].value);
  const [fromValue, setFromValue] = useState(1);
  const [toValue, setToValue] = useState(convert(1).from(fromUnit).to(toUnit));

  const updateFromUnit = (event) => {
    setFromUnit(() => event.target.value);
    setToValue(() => convert(fromValue).from(event.target.value).to(toUnit));
  };

  const updateToUnit = (event) => {
    setToUnit(() => event.target.value);
    setToValue(() => convert(fromValue).from(fromUnit).to(event.target.value));
  };

  const updateValue = (event) => {
    setFromValue(() => event.target.value);
    setToValue(() => convert(event.target.value).from(fromUnit).to(toUnit));
  };

  return <div className="converter">
    <Select label="From:" options={possibilities} onChange={updateFromUnit}></Select>
    <Select label="To:" options={possibilities} onChange={updateToUnit}></Select>
    <Input label="Value:" type="floating-point" onChange={updateValue}></Input>
    <p>{fromValue} {fromUnit} = {toValue} {toUnit}</p>
  </div>;
};

Converter.propTypes = {
  measure: PropTypes.string.isRequired,
};

Converter.defaultProps = {
  measure: 'length',
};

The component takes a single property called measure, which specifies the type of units to be converted and can be something like mass or length. The code for this component consists of four parts. First, it queries the convert-units library for all the possible unit conversion options for that measure and maps them into an array of objects ready to use with the Select component. Next, it defines four state properties, followed by three event handlers. The handlers react to changes in the user input and update the state accordingly; they contain the actual calls to the convert-units library where the unit conversion happens. Finally, the component is assembled from all these parts and returned. Just as with the individual components, you can create a story for this more complex component. Create a file src/stories/Converter.stories.jsx and paste in the following contents.

import React from 'react';
import { Converter } from './Converter';

export default {
  title: 'Components/Converter',
  component: Converter,
};

const Template = (args) => <Converter {...args} />;

export const Default = Template.bind({});
Default.args = {
  measure: 'length',
};

export const Mass = Template.bind({});
Mass.args = {
  measure: 'mass',
};

When you installed Storybook with the npx sb command, the initialization script added a few components as examples to demonstrate Storybook’s capabilities. You will be reusing two of these components for the unit-conversion app. Open src/stories/Header.jsx and replace its contents with the following code.

import React from 'react';
import PropTypes from 'prop-types';
import { Button } from './Button';
import './header.css';

export const Header = ({ user, onLogin, onLogout }) => (
  <header>
    <div className="wrapper">
      <div>
        <h1>Unit Converter</h1>
      </div>
      {user ? <div> Hello {user.given_name} </div> : ""}
      <div>
        {user ? (
          <Button size="small" onClick={onLogout} label="Log out" />
        ) : (
          <>
            <Button size="small" onClick={onLogin} label="Log in" />
          </>
        )}
      </div>
    </div>
  </header>
);

Header.propTypes = {
  user: PropTypes.shape({}),
  onLogin: PropTypes.func.isRequired,
  onLogout: PropTypes.func.isRequired,
};

Header.defaultProps = {
  user: null,
};

I have modified the header component to show the correct application name and allow some structured user data to be passed in. In the story for the header, in file src/stories/Header.stories.jsx, modify the arguments passed to the LoggedIn story to reflect this change.

LoggedIn.args = {
  user: { given_name: "Username" },
};

Now, open src/stories/Page.jsx and modify its contents to match the code below.

import React from 'react';
import PropTypes from 'prop-types';
import { Header } from './Header';
import './page.css';
import { Tabs } from './Components';
import { Converter } from './Converter';

export const Page = ({useAuth}) => {
  const [user, login, logout] = useAuth();

  return <article>
    <Header user={user} onLogin={login} onLogout={logout} />
    <section>
      <Tabs>
        <Converter measure="length" label="Length" key="length"></Converter>
        <Converter measure="mass" label="Mass" key="mass"></Converter>
        <Converter measure="volume" label="Volume" key="volume"></Converter>
      </Tabs>
    </section>
  </article>;
}

Page.propTypes = {
  useAuth: PropTypes.func.isRequired
};

Page.defaultProps = { };

This component displays the application page, including the header and a tabbed container that allows switching between Converter components configured to convert different measures. The page needs a useAuth hook passed in that returns the user information and callbacks to log the user in or out. In the stories for the page, in src/stories/Page.stories.jsx, you need to create a mock function that supplies fake user data. Edit the contents of this file to look like the following code.

import React from 'react';
import { Page } from './Page';

export default {
  title: 'Pages/Page',
  component: Page,
};

const mockUseAuth = (loggedIn) => () => [
  loggedIn ? {given_name: "Username"} : undefined,
  () => {},
  () => {}
];

const Template = (args) => <Page useAuth={mockUseAuth(true)} {...args}/>;

export const LoggedIn = Template.bind({});
LoggedIn.args = {
  useAuth: mockUseAuth(true),
};
LoggedIn.parameters = {
  controls: { hideNoControlsWarning: true },
};

export const LoggedOut = Template.bind({});
LoggedOut.args = {
  useAuth: mockUseAuth(false),
};
LoggedOut.parameters = {
  controls: { hideNoControlsWarning: true },
};

Note how mockUseAuth uses currying to return a function that can be used as the useAuth hook in the Page component. You can now use Storybook again to test the Converter component and the full application page. If it’s not still running, run npm run storybook again. You can navigate to Pages -> Page in the left sidebar, and you should see something like the image below.
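The currying works like this: the outer call fixes the loggedIn flag, and the zero-argument inner function it returns is what the Page component later invokes as its hook. The mechanics can be seen in isolation, outside React, with a minimal sketch (plain Node, no Storybook needed):

```javascript
// mockUseAuth(loggedIn) returns a zero-argument function -- the "hook" --
// that closes over the loggedIn flag fixed at creation time.
const mockUseAuth = (loggedIn) => () => [
  loggedIn ? { given_name: 'Username' } : undefined,
  () => {}, // login callback (a no-op in the mock)
  () => {}, // logout callback (a no-op in the mock)
];

const useAuthLoggedIn = mockUseAuth(true); // fix loggedIn = true
const [user] = useAuthLoggedIn();          // invoke the resulting "hook"
console.log(user); // { given_name: 'Username' }
```

Because each story gets its own pre-configured function, the Page component itself never needs to know whether it is running against the mock or the real Okta-backed hook.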

Adding authentication with Okta to the application

You have created a page that uses a useAuth hook to manage user authentication. For the Storybook stories, you made a mock implementation of this hook. This section will show you how to implement the hook using Okta's authentication service. First, register the application with Okta.

Before you begin, you’ll need a free Okta developer account. Install the Okta CLI and run okta register to sign up for a new account. If you already have an account, run okta login. Then, run okta apps create. Select the default app name, or change it as you see fit. Choose Single-Page App and press Enter.

Use http://localhost:3000/callback for the Redirect URI and set the Logout Redirect URI to http://localhost:3000.

What does the Okta CLI do?

The Okta CLI will create an OIDC Single-Page App in your Okta Org. It will add the redirect URIs you specified and grant access to the Everyone group. It will also add a trusted origin for http://localhost:3000. You will see output like the following when it’s finished:

Okta application configuration:
Issuer:    https://dev-133337.okta.com/oauth2/default
Client ID: 0oab8eb55Kb9jdMIr5d6

NOTE: You can also use the Okta Admin Console to create your app. See Create a React App for more information.

Next, install the necessary libraries. Open the terminal and run the command below.

npm install -E @okta/okta-react@6.4.1 @okta/okta-auth-js@5.10.0 react-dom@17.0.2 react-router-dom@5.3.0

Open the file src/index.js and modify its contents to match the code below.

import React from 'react';
import ReactDOM from 'react-dom';
import './index.css';
import { App } from './App';
import { Page } from './stories/Page';
import reportWebVitals from './reportWebVitals';
import { BrowserRouter as Router, Route, useHistory } from 'react-router-dom';
import { LoginCallback, SecureRoute, Security } from '@okta/okta-react';
import { OktaAuth, toRelativeUrl } from '@okta/okta-auth-js';
import { useAuth } from './auth';

const oktaAuth = new OktaAuth({
  issuer: 'https://{yourOktaDomain}/oauth2/default',
  clientId: '{yourClientID}',
  redirectUri: `${window.location.origin}/callback`,
});

function SecuredRoutes(props) {
  const history = useHistory();
  const restoreOriginalUri = async (_oktaAuth, originalUri) => {
    history.replace(toRelativeUrl(originalUri || '/', window.location.origin));
  };

  return (
    <Security oktaAuth={oktaAuth} restoreOriginalUri={restoreOriginalUri}>
      <Route path="/" exact render={(props) => <App {...props} useAuth={useAuth}/>} />
      <SecureRoute path="/converter" exact render={(props) => <Page {...props} useAuth={useAuth}/>} />
      <Route path="/callback" component={LoginCallback} />
    </Security>
  );
}

ReactDOM.render(
  <React.StrictMode>
    <Router>
      <SecuredRoutes />
    </Router>
  </React.StrictMode>,
  document.getElementById('root')
);

reportWebVitals();

Here {yourClientID} is the client ID that you obtained earlier and {yourOktaDomain} is your Okta domain. This change does several things. The oktaAuth instance provides a global authentication singleton. The main render function now contains a Router element that allows the application to navigate different routes. Finally, SecuredRoutes is a component that wraps the routes in a Security component. This component makes a useOktaAuth hook available for all components contained within it. Inside this component, you define the routes. Note how you pass a useAuth hook into the App and the Page components. Create a new file src/auth.js and add the following code to implement this hook.

import { useEffect, useState } from 'react';
import { useOktaAuth } from '@okta/okta-react';

export const useAuth = () => {
  const { oktaAuth, authState } = useOktaAuth();
  const [user, setUser] = useState(null);

  useEffect(() => {
    if (authState?.isAuthenticated) {
      if (!user) {
        oktaAuth.getUser().then(setUser);
      }
    } else {
      setUser(null);
    }
  }, [authState, user, oktaAuth]);

  const login = async () => oktaAuth.signInWithRedirect('/');
  const logout = async () => oktaAuth.signOut('/');

  return [user, login, logout];
};

Finally, you need to modify the existing App component to use the authentication hook. Open src/App.js and adjust the content to look like this.

import './App.css';
import { Link } from 'react-router-dom';
import { Header } from './stories/Header';

export const App = ({useAuth}) => {
  const [user, login, logout] = useAuth();

  return (
    <div className="App">
      <Header user={user} onLogin={login} onLogout={logout} />
      <h1>Unit Converter</h1>
      <p>
        <Link to="/converter">Go to the app!</Link>
      </p>
    </div>
  );
}

Congratulations, you have completed your React application with Storybook. You can now open the console in the project folder and run the following command to start the app.

npm start

You should see the application's front page in your browser. When you click the Go to the app! link, you'll be taken to the Okta sign-in page. After successfully signing in, you will be redirected to the unit converter page, which looks like the image below.

Learn more about React, Storybook, and Single-Page Apps

In this tutorial, I have shown you how to create a React application and use Storybook to browse the application’s components. Storybook is a great tool that can enhance your development workflow.

It lets you view and test your components in isolation. You can specify the location of each component in a hierarchical menu and then browse through the components in your browser. You can have multiple stories showcasing different use cases for each component. You can also modify the component parameters and see the impact on the visual appearance in real time. Storybook can keep running during the development process, and it will reflect any changes you make to your code.

The application you wrote was a simple unit-conversion app. I guided you on using the convert-units library to convert length, mass, and volume. You assembled the individual components to create a larger component containing multiple input elements. I have shown you how Storybook lets you create stories, test these complex components, and complete application pages.

If you want to learn more about any of these topics, please follow the links below.

Build a React App with Styled Components
The Best Testing Tools for Node.js
Build a Simple React Application Using Hooks
Develop Secure Apps with WebSockets and Node.js

You can find the code for this tutorial on GitHub at https://github.com/oktadev/okta-react-storybook-example.

If you liked this tutorial, chances are you like others we publish. Please follow @oktadev on Twitter and subscribe to our YouTube channel to get notified when we publish new developer tutorials.

Wednesday, 19. January 2022

Anonym

4 Major Data Privacy Trends to Watch in 2022

Protecting customer data will continue to be an urgent issue for businesses worldwide in 2022. The risks and requirements are not going away.

In 2021 we saw users demanding and gaining more control over their personal data. Big tech/ad tech took more hits to their data-driven business models, even while introducing variously effective “privacy” measures (such as Google phasing out third-party cookies and scrapping app developers’ access to ad IDs, and Apple introducing the ATT feature and privacy labels in the App Store). When WhatsApp caused global outrage by demanding its 2 billion-plus users consent to sharing their personal data with Facebook, and Elon Musk switched millions of those users to private messaging app Signal with a two-word tweet, the world was never surer that consumers are tired of trading their personal information for access to services. This sentiment is only intensifying.

Throughout 2021, the regulatory landscape expanded, placing even greater compliance burdens on business and making cross-border data transfers more difficult. COVID-19 and the mass move to remote working environments continued to demand businesses find ways to protect employee devices and secure corporate data and resources as cybersecurity threats ramped up. Contact tracing also amplified the risks of sharing and storing personal information. Cisco confirmed the pandemic had seriously elevated the importance of data privacy as a business priority and, as we enter the third year of the pandemic, this imperative hasn’t changed. Privacy budgets and resourcing skyrocketed among businesses in response, and governments and health authorities are still struggling to balance public health with protecting personal privacy.

Obviously, data privacy will remain firmly on the agenda in 2022, so the big trends to watch for are:

Privacy regulations and compliance obligations will continue to expand and tighten in the US and globally. 

The International Association of Privacy Professionals (IAPP) quotes Goodwin Partner and IAPP Senior Fellow, Omer Tene as saying, “In 2022, expect an avalanche of new laws and regulations, attempting to govern and impose order on a dizzying array of tech developments. New regulatory efforts will range from data protection laws in India and China to AI regulation in the EU to automated decision making rules in US states. Add to that a flurry of enforcement activities, and you get a perfect storm of tech regulation.”

Tene says it's unlikely the US will get its long-awaited national privacy law in 2022, but at least six states will move to pass their own (Maryland, Oklahoma, Ohio, New Jersey, Florida and Alaska), and it will be interesting to see whether any of those states includes a private right of action (PRA), because if they don't, the game could change. “A couple of years ago, conventional wisdom was that the more states pass privacy laws, the greater the pressure will build on the business community, and consequently Congress, to pass preemptive federal legislation. Absent a PRA, however, an interesting dynamic may develop, where the more state privacy laws, the less appetite businesses – who are growing accustomed to complying with the emerging (PRA-less) state framework – have for federal pre-emption,” Tene says.

More tech legislation will flow out of Europe too. In fact, Tene calls it “a veritable alphabet soup of tech regulation affecting digital platforms, digital services, online marketing, data intermediaries and more …”. The AI Act, affecting algorithmic decision making across the economy, and the Data Act, which broadens legal obligations, including cross-border transfer restrictions, to non-personal data will also be things to watch.

We’ll see more regulatory enforcement.

Governments in the US and Europe are expected to ramp up regulatory enforcement around breach reporting and risk reporting. GDPR enforcement increased in 2021 and this is set to continue. Tene says, “Importantly, regulators are expanding the lens from an early focus on data breaches to challenging companies’ legal bases for processing data and, notably, cross-border data flows. In 2022, expect an additional step up the enforcement ladder. We expect regulators to focus on issues such as protecting children’s data, restricting the use of sensitive health and financial information, and curbing the excess of digital marketing.” Of course, if greater enforcement is to happen, the regulators will need to be adequately resourced; enforcement times for complex cases are not getting any shorter. Stepped-up enforcement would also be long overdue: the Schrems II decision came in the middle of 2020, and there has been minimal, if any, enforcement action resulting from that change.

More limits on big tech/ad tech.

Regulators, the media and the general public have big tech/ad tech firmly in their sights and will continue to demand more privacy-first data tracking and sharing practices and business models. As we reported last year, the Federal Trade Commission could go harder on consumer privacy protection and cybersecurity with President Biden’s recent nomination of digital “privacy hawk” and law professor Alvaro Bedoya and House Democrats’ proposal to allocate $1 billion for a new privacy and data security bureau. 

Tene notes: “Under the leadership of new Chair Lina Khan, the FTC has issued strong statements and strategic plans for broad rulemaking efforts, including rules to curb “abuses stemming from surveillance-based business models” and “lax security practices” and to “ensure that algorithmic decision-making does not result in unlawful discrimination.” But he notes this in part depends on Bedoya’s appointment proceeding in the Senate, which may not be quick. 

Consumers will continue to favor brands that are genuinely good, honest stewards of their personal data.

Cisco released research findings in October 2021 which it says ”demonstrates the growing importance of privacy to the individual and its implications on the businesses and governments that serve them.” Eighty-six percent of respondents said they care about data privacy and want more control, and 79 percent said they'd be willing to vote with their wallets by not supporting businesses that don't protect their data, and would indeed pay for better data protection. This sounds like the growing market of “privacy actives” we discussed last year, which no brand can afford to ignore. Cisco reminds businesses that data abuses have eroded trust in brands, which is one reason it has released its New Trust Standard, which we'll explore in a separate article soon.

Clearly, the opportunity for businesses to build customer trust and loyalty by responding authentically to what their customers want and need is enormous. As PwC says, brands that get customer privacy and safety right will disrupt the market. Our Sudo Platform privacy and cybersafety services platform can help you to rapidly develop and deploy branded customer solutions.

Want a longer term view? Read our 10-year vision for privacy which we released in 2020.

Explore Sudo Platform and how this persona- or digital identity-based services platform is helping to solve the data privacy problem for enterprises.

Photo By Miha Creative

The post 4 Major Data Privacy Trends to Watch in 2022 appeared first on Anonyome Labs.


Safle Wallet

SAFLE x QuickSwap LP Mining


Safle and QuickSwap are bringing everyone the opportunity to earn more with their token holdings. We have just launched the SAFLE-USDT pair on Quickswap, where you can provide liquidity with your SAFLE and USDT to earn transaction revenue paid in these two assets. You will receive an LP token that can be staked to earn dQUICK tokens, which earn you a share of the overall trading activity on the Quickswap platform (paid in QUICK tokens).

Deeper Dive: Everything you need to know about dQUICK & the Dragon’s Lair

So if you're a SAFLE hodler ready to earn a syrupy yield, take note of the three steps below! 🤤

1. Go to the SAFLE/USDT pool and add liquidity with your SAFLE and USDT. By providing liquidity, you will get a SAFLE/USDT Liquidity Pool Token (LP token), which represents your share in that pool as a liquidity provider.

2. After completing step 1, proceed to deposit your SAFLE/USDT LP token by clicking on “Farms” and then on “LP Mining”, then search for the SAFLE/USDT pool and click on “Deposit”. You will start earning dQUICK tokens, which earn you a share of the overall trading activity (fees) on the Quickswap platform, paid in QUICK tokens!

3. Stake your dQuick (or QUICK) in Dragon’s Syrup Pools to earn more — COMING SOON!
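For a rough sense of what the LP token in step 1 represents, a constant-product pool mints LP tokens pro rata to your deposit. The sketch below is illustrative only: the reserve numbers are invented and it omits fees, slippage, and Quickswap's actual contract mechanics.

```javascript
// Illustrative constant-product pool accounting (x * y = k).
// All numbers here are hypothetical, not Quickswap pool state.
const pool = { safle: 100000, usdt: 50000, totalLpSupply: 1000 };

// A proportional deposit mints LP tokens pro rata to the deposit size.
function addLiquidity(pool, safleIn, usdtIn) {
  const minted = (safleIn / pool.safle) * pool.totalLpSupply;
  pool.safle += safleIn;
  pool.usdt += usdtIn;
  pool.totalLpSupply += minted;
  return minted; // the provider's LP token balance
}

// Deposit 10% of current reserves (10,000 SAFLE + 5,000 USDT).
const lp = addLiquidity(pool, 10000, 5000);
// The provider now holds 10,000 of the 110,000 SAFLE in the pool,
// and their LP share reflects exactly that fraction.
console.log(lp / pool.totalLpSupply); // 0.0909... (1/11 of the enlarged pool)
```

Trading fees accrue to the reserves, so redeeming the same LP share later returns more of both assets; staking the LP token for dQUICK is a separate reward layer on top of this.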

About QuickSwap

QuickSwap is a permissionless decentralised exchange (DEX) based on Ethereum, powered by Polygon. By accessing Polygon Web Wallet with Metamask, QuickSwap users can exchange tokens, mine liquidity, and buy cryptocurrencies with a credit card.

Website | Quickswap on Twitter | Quickswap Telegram

About Safle

Safle is a next-generation non-custodial wallet, self-sovereign identity protocol, and Web 3.0 infrastructure provider for the decentralised ecosystem, governed by the community. It is a decentralised blockchain identity wallet that enables secure private key management and a seamless experience for dApps, DeFi and NFTs. To maintain a balance between developers and retail users, Safle intends to develop the wallet infrastructure in a completely non-custodial fashion using open governance mechanisms via the SafleDAO, coordinated and maintained by the Safle token economy. The native $SAFLE token will not only enable token holders to propose and vote on changes (governance privileges) to functionalities and feature sets of the wallet and node services, but will also create a self-sustaining token economic model where value is generated by providing access to finance and identity in the decentralised digital world.

Website | GitHub | Discord | Twitter | Instagram | Telegram Ann | Telegram Chat


KuppingerCole

Intelligent SIEM Platforms


by Alexei Balaganski

This report is an overview of the market for modern, intelligent Security Information and Event Management (SIEM) platforms and provides you with a compass to help you to find the solution that best meets your needs. We examine the market segment, vendor service functionality, relative market share, and innovative approaches to providing SIEM solutions.

Global ID

GiD Report#195 — Can you trust your million dollar NFT?

GiD Report#195 — Can you trust your million dollar NFT?

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. Check out last week’s report here.

ICYMI — Trusted Airdrops with GlobaliD

This week:

Can you trust your million dollar NFT?
What people are saying
The question of atoms v. bits
Tweet of the week — guy who created Ethereum edition
The investment everyone is talking about
Stuff happens

1. Can you trust your million dollar NFT?

Bored Ape Yacht Club

One of the fundamental novelties of NFTs is that the framework has brought a sense of scarcity and permanence to a place — the internet — that has historically been of abundance and transience.

That NFTs exist on decentralized blockchains further gives users a sense of ownership over their items — perfect for digital collectibles and other use cases such as gaming.

But how permanent are NFTs really? And how decentralized are they actually?

That's one of Moxie Marlinspike's points from a blog post last week on his “first impressions of web3” (via /rcb) that's making the rounds in the cryptosphere.

Here’s Moxie on the problem with NFTs:

Most people think of images and digital art when they think of NFTs, but NFTs generally do not store that data on-chain. For most NFTs of most images, that would be much too expensive.
Instead of storing the data on-chain, NFTs instead contain a URL that points to the data. What surprised me about the standards was that there’s no hash commitment for the data located at the URL. Looking at many of the NFTs on popular marketplaces being sold for tens, hundreds, or millions of dollars, that URL often just points to some VPS running Apache somewhere. Anyone with access to that machine, anyone who buys that domain name in the future, or anyone who compromises that machine can change the image, title, description, etc for the NFT to whatever they’d like at any time (regardless of whether or not they “own” the token). There’s nothing in the NFT spec that tells you what the image “should” be, or even allows you to confirm whether something is the “correct” image.
So as an experiment, I made an NFT that changes based on who is looking at it, since the web server that serves the image can choose to serve different images based on the IP or User Agent of the requester. For example, it looked one way on OpenSea, another way on Rarible, but when you buy it and view it from your crypto wallet, it will always display as a large [poop] emoji. What you bid on isn’t what you get. There’s nothing unusual about this NFT, it’s how the NFT specifications are built. Many of the highest priced NFTs could turn into [poop] emoji at any time; I just made it explicit.

That’s the current NFT reality. But technology has never been just about code. It’s also an inherently social endeavor — as Matt Levine points out in his Money Stuff newsletter:

The NFT does not by itself convey ownership of the underlying thing in either a legal or practical sense. It conveys ownership in some more metaphysical sense: If you buy a Bored Ape Yacht Club NFT, then the people who are part of the BAYC NFT community will treat you as the owner of your ape. This is essentially a social fact and can be true even if the immutable code of the blockchain says that you don’t own the ape, because you were hacked or whatever. The technology is a scaffolding on which to hang a social system, but the social system is what does or does not convey “ownership” in a meaningful sense.

It’s also likely that as the space develops — web3 and NFTs really only entered the mainstream conversation last year — that new solutions will emerge to address some of those technological gaps. (For instance, your NFT might be stored on Filecoin’s Interplanetary File System.)

But there will also be cases where some gaps can't be addressed or where, more practically, it doesn't make sense to address them with decentralized solutions.

As Moxie points out, that’s something the industry and community as a whole will continue to grapple over going forward. There are philosophical merits to decentralization, but in order to deliver a compelling and convenient product to end users, some sort of balance will be struck between the vision of a decentralized world and the efficiency of centralized solutions. (To Moxie’s point, that’s already the case with NFTs and platforms like OpenSea.)

For Moxie, it’s about expectations:

This isn’t a complaint about OpenSea or an indictment of what they’ve built. Just the opposite, they’re trying to build something that works. I think we should expect this kind of platform consolidation to happen, and given the inevitability, design systems that give us what we want when that’s how things are organized. My sense and concern, though, is that the web3 community expects some other outcome than what we’re already seeing.
However, even if this is just the beginning (and it very well might be!), I’m not sure we should consider that any consolation. I think the opposite might be true; it seems like we should take notice that from the very beginning, these technologies immediately tended towards centralization through platforms in order for them to be realized, that this has ~zero negatively felt effect on the velocity of the ecosystem, and that most participants don’t even know or care it’s happening. This might suggest that decentralization itself is not actually of immediate practical or pressing importance to the majority of people downstream, that the only amount of decentralization people want is the minimum amount required for something to exist, and that if not very consciously accounted for, these forces will push us further from rather than closer to the ideal outcome as the days become less early.

All of which is hardly a radical take. Decentralization in and of itself is no panacea — even if it is a cornerstone for where we’re evolving toward.

And as web3 continues to grow in prominence and end product, these conversations will only continue and new gaps will be identified.

One gap that will command increasing focus in 2022 is the question of identity. If technology is a fundamentally social endeavor, we need to know who we’re dealing with, whether from a personal, community, or regulatory perspective.

Relevant:

Via /rcb — Moxie Marlinspike: My first impressions of web3
Matt Levine — Web3 Takes Trust Too
The Web3 Debate
Web3 Just Had Its Emperor’s-New-Clothes Moment
Via /rcb — Moxie Marlinspike has stepped down as CEO of Signal
Moxie Marlinspike is leaving Signal — TechCrunch
Via /easwee — Casey Newton on the Danger of Signal Adding Anonymous Payments

2. What people are saying

Here’s FTX’s Sam Bankman-Fried’s tweetstorm on the subject, referencing this blog post response from Dan Finlay:

I think Dan’s ‘right to exit’ is a key piece. Sure, there are centralized apps in crypto — FTX is one! But you have the right to exit: if you don’t like FTX, you can withdraw assets to another exchange, or to your private wallet.
And, in fact, everything in the crypto universe — or at least almost everything — is plugged into decentralized rails which connect all of the (sometimes centralized) pieces. It’s not perfect, from a decentralization angle. But it still helps a ton.
And those rails are not just decentralized, they’re also open to competition. Don’t like using blockchain X to transfer assets? Try blockchain Y! Banks have had the same ~2 ways of sending funds for decades, and can’t change that. Crypto exchanges generally have ~20.
And, similarly, if some infrastructure provider sucks — RPC nodes, for instance — people can create a new one. That doesn’t, of course, mean that they *will* — and I think Moxie makes some really compelling points that we probably don’t do enough of that as an industry.
There is concentration of usage in various places, and some of those places are not really making progress.
In general, I really encourage people to find pieces of crypto infrastructure that seem shitty, and build competitors to them. That’s how we move forward.
And Moxie’s other point — that crypto is too profit-motivated right now — seems totally correct to me. Hopefully we can take a longer-term view of the ecosystem.

Sam's first point reminds me of something an old friend (who properly introduced me to Bitcoin back in 2013) told me: You can build centralized services on a decentralized framework, but that doesn't hold true the other way around.

And having that decentralization layer, having that option — whether or not people decide to take it — is powerful in and of itself.

To Sam’s second point, it’s probably no surprise that a lot of the real work gets done during crypto winters. (After all, that’s when Solana was developed.)

Relevant:

Dan Finlay: What Moxie Missed on Web3 Wallets

3. The question of atoms v. bits

Elsewhere in web3, here’s a compelling take from the FT’s Rana Foroohar (via /rcb) on why we shouldn’t forget about the real world despite the froth around the metaverse:

As Tim O’Reilly, a technology big thinker and one of the popularisers of the term Web 2.0, wrote recently: “The easy money to be made speculating on cryptoassets seems to have distracted developers and investors from the hard work of building useful real-world services.”
But when we step back from the dust that will eventually settle around Web3, it will be the ubiquitous, industrial changes driven by companies like Tesla that will probably be most impactful. They are transforming old industries and building real world assets.

It reminds me of something Peter Thiel likes to say, paraphrasing: We’ve done a lot of work in the world of bits, but we need to do more work in the world of atoms.

(By the way, it’s not necessarily clear that those two things are mutually exclusive. And as Rana points out in her piece — referencing the laying down of fiber to set up the internet revolution — they often go hand in hand.)

Relevant:

Via /rcb — When the Web3 bubble pops, real world assets will survive
Turkish Lira Is Now More Volatile Than Bitcoin
Bitcoin Could Hit $100,000 if Investors Treat It Like Gold, Goldman Sachs Says
JPMorgan: 2022 Could Be ‘Year of the Blockchain Bridge’ — Blockworks
Via /markonovak — The future of cities is in social DAOs
Virtual Currencies: Additional Information Could Improve Federal Agency Efforts to Counter Human and Drug Trafficking
Powell Says Private Coins Could Compete With Fed Digital Dollar
The Mnuchin Files: New Documents Shed Light on Trump-Era Crypto Policy
When the Brands Came for Crypto: Can a Subculture Survive the Social Media Managers?
Investors Buy Up Metaverse Real Estate in Virtual Land Boom | WSJ
Fidelity Report: Crypto Regulation Will Be Required in 2022 — Blockworks
When the Crypto Founder Is Real but the Name Is Fake
Analysis: Crypto companies bet new mayor will make New York digital asset hub
Via /TravisXRP — China to create own NFT industry based on state-backed blockchain network
Arkansas Tries a New Strategy to Lure Tech Workers: Free Bitcoin
Iran’s Central Bank, Ministry of Industry to Allow Crypto Payments for Foreign Trade Settlements — Blockworks
Credit Rating Agency Moody’s Sounds Alarm on El Salvador’s Bitcoin Policy — Decrypt
CFTC Fines Crypto Betting Service Polymarket $1.4 Million, Orders Shutdown of Three Markets — Blockworks

4. Tweet of the week — guy who created Ethereum edition

Nicogs.eth:

5. The investment everyone is talking about

Paradigm and Sequoia are investing $1.15B in Ken Griffin’s Citadel Securities at a $22 billion valuation.

(Noteworthy: Ken Griffin is the guy who outbid ConstitutionDAO for that old piece of paper because, apparently, his son told him to.)

It’s an intriguing marriage between the old and the new — as the crypto kids like to say, tradfi (traditional finance) versus DeFi.

Anyway, here’s The Information’s Hannah Miller on why this deal matters:

The investment seems to be more strategic in nature, however, rather than strictly venture-oriented. Citadel Securities is interested in getting into crypto market making, but is waiting for greater regulatory clarity surrounding digital currencies, especially in the U.S., a company spokesperson told The Information.
Paradigm and Sequoia can also connect Citadel Securities to exchanges and companies that would enable the market maker to build a crypto business. Both VC firms have invested in crypto exchange FTX, for example, as well as in Fireblocks, a startup that offers a platform for market makers to manage crypto trading.
In addition to potentially securing a powerful new partner for their portfolio companies, Paradigm and Sequoia could also benefit from investing in a company that’s well-positioned to be a major player in crypto trading, in the event that the Securities and Exchange Commission recognizes digital currencies as securities. That move seems likely given SEC Chair Gary Gensler’s recent attitude.
Even though many crypto advocates don’t consider digital currencies to be securities, Paradigm and Sequoia seem to be betting on that to change with their investment. Citadel Securities already has two decades of experience in trading securities and established relationships with regulators, although it has recently tangled with the SEC. Its deep connections could help it quickly become a crypto market maker in compliance with emerging regulations. Citadel Securities could also let Paradigm, which helped create a digital currency lobbying group, gain better access to regulators.

Relevant:

Paradigm, Sequoia to Invest $1.15B in Citadel Securities
Ken Griffin’s Trading Powerhouse Gains New Partners
WSJ News Exclusive | Crypto Exchange FTX Sets Up $2 Billion Venture Fund
Visa Partners With ConsenSys to Develop CBDC On-ramp Tool — Blockworks
US Banks Form Group to Offer USDF Stablecoin
PayPal Explores Launch of Own Stablecoin in Crypto Push
Venmo, PayPal and Zelle must report $600+ in transactions to IRS

6. Stuff happens

Via /easwee — Dev corrupts NPM libs ‘colors’ and ‘faker’ breaking thousands of apps
SEC’s Gensler won’t say whether ether is a security, amid crypto market slide
SEC Pushes for More Transparency From Private Companies
Why FTC Chair Lina Khan’s Attempt to Stop Tech ‘Surveillance’ Faces Long Odds
Traders Will Be Watching if December CPI Confirms Inflation at 4-Decade High
What FTC’s Meta Victory Means
A Facebook antitrust suit can move forward, a judge says, in a win for the F.T.C.
Meta’s real antitrust problems are only beginning
Inflation in 2021 hit 7%, highest since 1982
Via /coddsquad — Merit grabs $50M Series B to expand digital credentials platform — TechCrunch

GiD Report#195 — Can you trust your million dollar NFT? was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


KILT

Announcing SocialKYC, Built on KILT


We are thrilled to announce the launch of SocialKYC, a service for regaining control over your digital identity using KILT Protocol. SocialKYC is a building block for making online services in Web3 better and safer, providing a way to return your credentials to where they belong — with you!

KYC or “Know Your Customer” is standard practice while opening an account with a bank or exchange, where customers must prove they possess government-issued credentials like a passport. SocialKYC gives the user power to extend trust in a similar way, by building a digital identity using their social accounts.

SocialKYC and Sporran, the KILT wallet, were built by B.T.E. BOTLabs Trusted Entity GmbH (B.T.E.), a subsidiary of BOTLabs GmbH, the initial developer of KILT. Used together, SocialKYC and Sporran allow users to manage and store their personal credentials, and decide which elements of their private information they share with online services.

SocialKYC issues credentials to users confirming the ownership of their email address or social media accounts after the user proves that they control the account. Unlike sign-in processes on the internet to date, SocialKYC then forgets about the user and the credential as soon as the credential is issued. The credential, the personal information, and the control remain with the user.

How it works:

1. SocialKYC first sends the user a simple task to verify their control of a specific account (confirm your email address; Tweet this message to prove this is your account).
2. After a successful check, SocialKYC issues a credential to the user.
3. The user can save the credential in their Sporran wallet, which is installed locally on their computer. The credential states the ownership of the specific account.
4. This personal data is not stored or shared by SocialKYC. It remains in the user’s Sporran wallet under their full control.
5. Using Sporran on their local computer, users can later send their credential to any online service that accepts the credential.
6. Users can also choose to publish one or more of their credentials, making them accessible to anyone. Published credentials can be unpublished by the user at any time.
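The flow above can be sketched in a few lines. This is illustrative only; the class and method names are hypothetical and are not the actual SocialKYC or Sporran APIs:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Credential:
    account: str  # e.g. an email address or Twitter handle
    claim: str    # what the credential attests

class SocialKYCIssuer:
    """Issues a credential once the user has proven control of an account."""
    def issue(self, account: str, challenge_passed: bool) -> Credential:
        if not challenge_passed:
            raise ValueError("account control not proven")
        # Nothing is retained server-side after issuance.
        return Credential(account=account, claim="account-ownership")

@dataclass
class SporranWallet:
    """Credentials live locally, under the user's control."""
    credentials: List[Credential] = field(default_factory=list)

    def store(self, cred: Credential) -> None:
        self.credentials.append(cred)

wallet = SporranWallet()
wallet.store(SocialKYCIssuer().issue("alice@example.com", challenge_passed=True))
print(len(wallet.credentials))  # 1
```

The key property the sketch highlights is that the issuer returns the credential rather than storing it: only the wallet holds state.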

Create your identity and get your first KILT credentials now! Here’s how to get started:

Get Your Sporran

The first step is setting up your Sporran, a browser extension for Firefox and Chrome that serves as a wallet for your KILT credentials and KILT Coins. Download this at sporran.org, then follow our step-by-step guide to set it up.

Create Your KILT Identity

In KILT, identity starts with a unique decentralized identifier (DID), represented by a string of numbers and letters:

did:kilt:light:014sxSYXakw1ZXBymzT9t3Yw91mUaqKST5bFUEjGEpvkTuckar

You will get your first DID as part of the Sporran setup. Your digital identity is built by adding different credentials like email, Twitter account, passport, certificates, etc. to this identifier. You may even want to create several digital identities; one personal, one for work, one for gaming.
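As a rough illustration of the identifier’s structure (the KILT SDK provides its own DID utilities; this sketch just splits the string shown above):

```python
def parse_kilt_did(did: str) -> dict:
    """Split a KILT DID string into method, variant, and address parts."""
    parts = did.split(":")
    if parts[:2] != ["did", "kilt"]:
        raise ValueError("not a KILT DID")
    # "light" DIDs carry an extra variant segment before the address
    if len(parts) == 4 and parts[2] == "light":
        return {"method": "kilt", "variant": "light", "address": parts[3]}
    return {"method": "kilt", "variant": "full", "address": parts[2]}

d = parse_kilt_did("did:kilt:light:014sxSYXakw1ZXBymzT9t3Yw91mUaqKST5bFUEjGEpvkTuckar")
print(d["variant"])  # light
```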

Get Your KILT Credentials

Once your Sporran is set up, follow our guide to creating your email credential or Twitter credential via SocialKYC here.

By default, these free credentials are anchored on the KILT blockchain and are linked to a “light” DID that lives on the computer it was created on.

Credentials for platforms such as Discord, GitHub, and others are coming soon, which will support use cases like gaming, e-sports leagues, health care, academia, media, and decentralized social networks. We’ll announce SocialKYC partner integrations as they happen, so keep an eye on our Twitter account and blog.

Upgrading to a Full DID

The Sporran also provides the option to upgrade to a “full” on-chain DID. This allows you to create a service endpoint such as a URL, providing a way to make your credentials public and making it easier for online services to access them.

Upgrading to a full DID requires a transaction fee payable in KILT (currently less than 0.01 KILT) and a deposit of 2 KILT. The deposit is designed to incentivize users to clear the data from the blockchain once it is not needed any longer, reducing wasted storage space. If you choose to delete a DID at a later date, this deposit is returned.

This could also be useful, for example, for companies that want to share their product credentials. The full DID also allows the owner to set additional keys for different purposes beyond authentication, including attestation and delegation. This would allow the owner to issue their own credentials and create delegation hierarchies — useful in a company where several employees would be involved in the process.

On-chain DIDs will also be useful for individuals who want to publish their credentials, use key rotation for extra security, and anchor their DID on the blockchain.

See how to upgrade here.

Start Building!

We’ve created the infrastructure. The rest is up to you, the Web3 community! Got an idea? Build it! We can’t wait to see you create services that use SocialKYC to make Web3 a reality.

Go to our dev site, read our documentation, and check out the open-source SDK. If you’re not already in the community, join our Discord channel. We’re excited to see the apps and concepts you come up with. Don’t forget, KILT Treasury funds are available for creating infrastructure or initiatives that benefit the network.

We’ve started the ball rolling with a fun use case — a SocialKYC launch raffle on ClanKILT. Set up your credentials with SocialKYC, then enter to win limited edition KILT and Sporran swag.

Keep an eye out for more exciting uses for the full DID coming soon, like DIDsign!

About B.T.E. BOTLabs Trusted Entity GmbH

B.T.E. BOTLabs Trusted Entity GmbH (B.T.E.) is a subsidiary of BOTLabs GmbH, a blockchain R&D company founded in 2018 and based in Berlin. BOTLabs GmbH is the initial developer of KILT Protocol, now a fully decentralized blockchain identity network for issuing self-sovereign, anonymous and verifiable credentials. BOTLabs GmbH is also a founding member of the International Association for Trusted Blockchain Applications (INATBA) and the Decentralized Identity Foundation (DIF), and a member of Blockchain for Europe.

B.T.E. collaborates with developers, enterprise and government entities to build applications and services that restore user control and protect digital identity. Use cases include gaming, health care, IoT, academia, sustainability and energy. The Sporran wallet, Stakeboard platform and SocialKYC identity verification are the first services developed by B.T.E.

About KILT Protocol

KILT is a decentralized blockchain identity protocol for issuing self-sovereign, anonymous and verifiable credentials. KILT’s mission is to return control over personal data to its owner, restoring privacy to the individual. Developers can use KILT’s open-source Javascript SDK to quickly build applications for issuing, holding and verifying credentials and create businesses around identity and privacy.

Announcing SocialKYC, Built on KILT was originally published in kilt-protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Managing Trial Periods with Auth0 Actions

How to use Auth0 Actions to let users access your application or website for a trial period.

Dock

DOCK Staking Now Available on Binance

One of the world’s largest cryptocurrency exchanges confirms staking DOCK tokens is now available. Binance launches DOCK staking with up to 37.27% APY.


With the launch of these new high-yield DOCK staking activities, community members can stake DOCK tokens from 19 January 2022 at 12 pm UTC.

Token holders staking on Binance will play an important role in securing the network and the Dock blockchain by helping select the active set of validators, and will share in the emission rewards that are paid out. So if you’ve already got your account set up, you can begin staking DOCK tokens on Binance’s simple and easy-to-use platform.

As well as enabling staking on the Binance exchange, together we’re running three activities for token holders to increase their rewards; activities 2 and 3 run for a limited time only.

Activity 1:

Deposit and Stake DOCK for 10, 30, 60, or 90 days and enjoy up to 37.27% APY.

By locking a minimum of 10 DOCK tokens for 10 days, you could earn up to 10,000 DOCK, and by locking the tokens for 30 days, you could earn up to 200,000 DOCK. Find out more from Binance here.
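For a rough sense of what an APY figure means over a short lock period, here is a simple pro-rata estimate. This assumes linear accrual over the lock period; Binance’s actual payout schedule, compounding, and tier caps may differ:

```python
def estimate_reward(principal: float, apy: float, days: int) -> float:
    """Estimated reward for `principal` tokens locked for `days` at the given APY."""
    return principal * apy * days / 365

# e.g. 1,000 DOCK locked for 30 days at 37.27% APY
reward = estimate_reward(1_000, 0.3727, 30)
print(f"{reward:.2f} DOCK")  # 30.63 DOCK
```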

Activity 2 ending 26th January 2022, 11:59pm UTC:

The first 1,000 users who spot-trade DOCK tokens and participate in 30/60/90 days of DOCK locked staking are eligible to each receive 400 DOCK.

Activity 3 ending 26th January 2022, 11:59pm UTC:

The first 500 users who participate in Locked Staking for the first time and subscribe to DOCK Staking during the promotion period are eligible to share in various rewards from 100 - 500 DOCK.

Terms and Conditions apply, see Binance for details.

AMA on Binance

Want to know more about Dock and its projects? We’re holding 3 AMAs on Binance’s Telegram Channels. Join the English, Turkish, or Chinese Telegram chat and ask us anything!


Ontology

Ontology Weekly Report (January 11–17, 2022)

Highlights

Ontology has announced the opening of a new European office in Berlin, Germany. We believe that both the industry heritage and the policy environment have given Germany a key strategic position in the field of blockchain and Web3. By opening its new office in Europe, Ontology aims to play a role in increasing privacy across the continent and highlight the benefits that its decentralized solutions can bring to users and regulators alike. We are excited to contribute to the continent’s ongoing efforts to create a more secure web. We look forward to growing our presence in Europe and contributing to the ecosystem through a host of new partnerships, community members, and employees.

Latest Developments

Development Progress

- We have completed the launch of Ontology’s EVM TestNet. Developers are welcome to conduct various application tests. Related documents can be obtained at the Ontology Developer Center.
- We are 100% done with the survey on the improvement of the ONT and ONG decimals and we are 85% done with testing.
- We are 79% done with the survey on the RISCV’s simplified instruction set architecture.
- We are 61% done with the truebit challenge design survey.
- We are 33% done with the Rollup VM design.

Product Development

- ONTO is hosting an NFT lottery campaign with NFTmall, and has released the first co-branded NFT in 2022. Participants who buy one Cyberpunk ONTO 2022 NFT have the chance to win a great bonus. The event is in progress; you still have the opportunity to get rewards.
- ONTO hosted an NFT giveaway campaign with X WORLD GAMES. Participants who completed related tasks had the opportunity to earn NFT equipment. The event has successfully concluded, with more than 2,700 participants.
- ONTO hosted an NFT giveaway campaign with SIL Finance. Participants who completed related tasks had the opportunity to earn SIL Augmented Functional NFTs (SAFN). With the help of SAFN, stakers can enhance their mining experiences in several ways. The event has successfully concluded, with more than 1,000 participants.
- ONTO hosted an NFT giveaway campaign with Dragon Kart. Participants who completed all the tasks had the opportunity to earn a Dragon Kart NFT. The event has successfully concluded, with more than 1,100 participants.

On-Chain Activity

- 124 total dApps on MainNet as of January 17th, 2022.
- 6,862,340 total dApp-related transactions on MainNet, an increase of 5,613 from last week.
- 16,949,618 total transactions on MainNet, an increase of 20,769 from last week.
Community Growth

- This week, DuMont, an outstanding Harbinger from our English community, published “Staking vs Candidate Node Ownership”, which introduced the Ontology staking strategy to community members in detail. The article answered questions such as “How do I participate in Ontology staking?” and “How do I launch a candidate node?”. Community members reported that they benefited greatly from it.
- This week, Humpty Calderon, our Head of Community, was invited to join an AMA held by BingX. He shared Ontology’s long-term efforts and future goals, with a focus on DID and data. He explained how users will be onboarded into the Ontology experience for secure, self-sovereign control over their digital existence. Humpty also discussed the Ontology 2022 roadmap: from the very inception of the project, the aim has been to build the infrastructure required for users to interact with this new generation of decentralized applications, and that aim has not changed.
- We held our weekly Discord Community Call and Telegram Community Call, with the theme “Staking”. Newcomers to staking tend to rely on APY when choosing a node, while Ontology Harbinger Polaris and SG Node believe it is necessary to pay more attention to each node’s revenue distribution ratio. Checking the comparison before the start of each consensus round can reduce problems such as staking reward losses.
- We held our weekly Telegram Community Discussion, led by Benny, an Ontology Harbinger from our Asian community. He talked with community members about DID, including its function, meaning, and the changes it brings to Web3. The Ontology community believes that DID will be an indispensable foundation of Web3; by comprehensively deploying DID, Web3 can protect identity and privacy and truly return data ownership to users.
As always, we’re active on Twitter and Telegram, where you can keep up with our latest developments and community updates.

Global News

- Li Jun, Founder of Ontology, was invited to participate in an exclusive interview hosted by Crypto Coin Show, and shared the latest progress of the Ontology EVM. He said that Virtual Machines (VMs) play a crucial role both in the aforementioned need for interoperability and in the expansion of applications built on Ontology. The Ontology EVM will be launched in Q1 2022, and aims to establish seamless interoperability between Ontology and the Ethereum platform and to offer an inclusive experience to developers and users.
- Moonstake published a series of articles to introduce their strategic partners. In the article “What You Need to Know about Ontology in 2022”, Moonstake elaborated on Ontology’s plans and goals for 2022, stating that they are proud to be a strategic partner of Ontology. They will continue to provide strong support for the ever-growing Ontology ecosystem in 2022, building Web3 infrastructure together with Ontology.

Ontology in the Media

Finextra — “Five Digital Identity Trends to Watch in 2022”

“In 2020, the world went online and fraudsters were quick to pounce on the opportunities as businesses rapidly digitised their operations. The wave of opportunistic fraudsters drove the rate of identity document (ID) fraud up by 41% in 2020. Given the rise in sophisticated identity fraud over the past 12 months and the staggering losses of $712 billion as a result, it is critical that organisations understand the most common trends and take the necessary steps to protect themselves.”

With the continuous development of decentralized technology, decentralized identity is getting more and more attention. Whether it is traditional Internet giants or blockchain start-ups, they are building their own decentralized identity solutions based on blockchain technology, and are committed to building a new generation of Internet with more security and privacy. Ontology has worked on bringing trust, privacy, and security to Web3 through providing decentralized identity (DID) and data solutions. From the very inception of the project, the aim has been to build the infrastructure required for users to interact with this new generation of decentralized applications.

Want more Ontology?

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Ontology Weekly Report (January 11–17, 2022) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Coinfirm

SPC NXT and Coinfirm: a Strategic Partnership

LONDON, 19 January 2022 – SPC Next Consulting (SPC NXT), a premier corporate advisory and technology consulting company based in India has announced a strategic partnership with Coinfirm, a leading crypto AML compliance and analytics firm, in a move to introduce highly advanced blockchain technologies in the regulatory space to key industries. SPC NXT is...

Elliptic

Crypto Regulatory Affairs: Hong Kong Regulator Releases Blueprint for New Crypto Framework

Hong Kong Regulator Releases Blueprint for New Crypto Framework 

On January 12 2022, the Hong Kong Monetary Authority (HKMA) released a discussion paper on cryptoassets. Later this year, a bill will be introduced there to regulate businesses operating in the space. At first, the regime will only apply to cryptoasset exchanges, while custodians and wallet providers will be included at a later stage. The HKMA’s paper sets out considerations for expanding the scope of Hong Kong’s regulatory framework, and in particular, how it should regulate stablecoins. 


IDnow

ID Verification with AutoIdent – meeting UAE Regulations 


Many industries are seeing a significant increase in the use of digital services. Of course, this has been ongoing for some time, but the COVID-19 pandemic has expedited it.    

With more digital services comes an increased risk of fraudulent activity

As customer onboarding and verification become automated, there are new challenges with which we must deal. Regulation has been devised to address such matters through the control and use of technical solutions and the checks and processes which currently exist.  

IDnow has developed solutions that offer state-of-the-art video-based and biometric onboarding. These have been designed to meet regulations in Europe and the Middle East. They continue to do so as regulations expand and tighten, particularly as our industry evolves and becomes more engrained as a standard part of the AML / KYC process.  

The UAE is one of the most advanced countries in the Middle East in this respect. It has technology, AML, and electronic signature regulation in place. The country has also adopted full use of national identities. IDnow’s AutoIdent product is ready to meet UAE regulations and help financial institutions realize the security and user experience benefits of automated verification. (See also our article: UAE accelerates use for automated identity verification)

Growth of identity verification and regulation  

Video and automated onboarding have grown in recent years, along with the regulations controlling them. Of course, there is a need to ensure secure processes, both in identity matching and in the wider area of safeguarding against fraud and money-laundering activity.

Europe and the UK have led the development of identity regulation. The first AML laws (the EU Anti-Money Laundering Directives) were released in 1991, following Financial Action Task Force (FATF) guidance. Germany – traditionally a highly regulated market – later moved to allow video-based identity verification, and many other countries have since done the same. The UK, being more deregulated in this area, is one of the most supportive.

This use and approval of video-based verification has quickly expanded to further automated and biometric verification. Many EU countries now permit some form of electronic identification and verification, and the trend is well set for expansion.

IDnow’s verification solutions have been developed alongside this growth. They aim to fully support new and evolving technology while also meeting the latest regulations. Many such regulations throughout the Eurozone have been shaped with input from IDnow and our network of national and international contacts.

VideoIdent was launched in 2014, largely to meet German regulators’ requirements for video-based identity verification. As regulations and technology evolved, AutoIdent was launched as a single solution offering greater user experience, faster conversion rates, and added security against fraud. 

AutoIdent takes VideoIdent further, offering fully automated biometric identity verification with video or agent fallback. As a fully automated or hybrid solution, AutoIdent encompasses the most recent developments in digital identity verification.

Read on: ID Verification with AutoIdent – meeting European Regulations

The regulatory trend to allow biometric solutions that rely on automation and Artificial Intelligence (AI) to onboard customers is evolving across Europe, Asia, and now the Middle East.  

UAE regulations  

For technical regulation, the UAE follows joint guidelines issued by The UAE Central Bank, Securities and Commodities Authority, the Dubai Financial Services Authority (DFSA) of the Dubai International Financial Centre (DIFC), and the Financial Services Regulatory Authority (FSRA) of Abu Dhabi Global Market (ADGM). They have jointly issued “Guidelines for Financial Institutions Adopting Enabling Technologies.”

These regulations apply to all financial institutions that are licensed and supervised under UAE law. In terms of identity verification, these are advanced and far-reaching regulations – going further than many European countries as of early 2022. They permit video onboarding as well as fully automated biometric solutions. The use of NFC technology for reading national identity cards is also permitted.  

In addition to these technical regulations, anti-money laundering laws cover monitoring and screening requirements. The primary law in the UAE is ”Federal Decree-Law No. (20) of 2018 On Anti-Money Laundering and Combating the Financing of Terrorism and Financing of Illegal Organisations.” These AML regulations have been designed to follow international FATF guidance, much like the European AMLD regulations.   

The use of electronic signatures is regulated under two separate laws. Federal Law No. 1 of 2006 (eCommerce Law) applies countrywide in the UAE, while Dubai International Financial Centre (DIFC) has introduced separate legislation under the DIFC Electronic Transactions Law of 2017. These are more open than EU laws, however, and they do not define specific Qualified Electronic Signature standards.  

Just as AutoIdent meets European regulations and is used with several financial institutions, it also complies with UAE regulations. The following are some of the main areas to consider.  

Use of AutoIdent to meet UAE regulations for automated identity verification  

UAE regulations permit the use of automated and biometric identity verification. Under the UAE joint regulations, an automated solution should carry out the following steps:  

- ID Proofing Check (to check the identity document is valid and genuine)
- Similarity Check (to match the person and the identity document)
- Liveness Check (to ensure the person is physically present)

IDnow’s AutoIdent solution is designed to support these steps. Fully automated identity verification is also backed by manual verification. If the AI is unsure of a case (or country regulations require a hybrid approach), manual video-based verification is available to ensure the completion of the process.
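As a sketch of how the three checks and the fallback path combine (illustrative decision logic only, not IDnow’s actual engine; the confidence threshold is an assumed parameter):

```python
def verify_identity(doc_genuine: bool, face_match: bool, liveness: bool,
                    confidence: float, threshold: float = 0.9) -> str:
    """Combine the three automated checks; route uncertain passes to an agent."""
    # Any hard failure on proofing, similarity, or liveness rejects outright.
    if not (doc_genuine and face_match and liveness):
        return "rejected"
    # Checks passed, but the model is unsure: video/agent fallback.
    if confidence < threshold:
        return "manual_review"
    return "approved"

print(verify_identity(True, True, True, 0.97))   # approved
print(verify_identity(True, True, True, 0.60))   # manual_review
print(verify_identity(True, False, True, 0.99))  # rejected
```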

Supporting NFC technology  

The UAE is a leader in the region with its Emirates ID national identity card. This is compulsory for citizens and uses NFC technology.  

AutoIdent has the functionality to read biometric data from NFC chips, developed to meet ICAO 9303 standard. NFC is currently used to read German identity cards but can work with any chips using global standards – such as the Emirates ID.   
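For a flavour of the standard, ICAO 9303 also defines the check digits printed in a document’s machine-readable zone (MRZ), computed with a repeating 7-3-1 weighting. This small routine validates an MRZ field; reading the NFC chip itself involves the full ICAO 9303 protocol, which is far beyond this sketch:

```python
def mrz_check_digit(field: str) -> int:
    """ICAO 9303 check digit: weights 7,3,1 repeating; '<' filler counts as 0."""
    weights = (7, 3, 1)
    total = 0
    for i, ch in enumerate(field):
        if ch.isdigit():
            value = int(ch)          # digits keep their value
        elif ch.isalpha():
            value = ord(ch.upper()) - ord("A") + 10  # A=10 ... Z=35
        else:
            value = 0                # '<' filler
        total += value * weights[i % 3]
    return total % 10

print(mrz_check_digit("AB2134<<<"))  # 5
```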

Meeting AML regulations  

AutoIdent has been developed in line with the EU AMLD regulations and fully meets the requirements of the latest 4AMLD, 5AMLD, and 6AMLD updates. These regulations have evolved in response to changing criminal behaviour and the increasing sophistication of the methods used.

UAE AML regulations, despite being first released later than EU regulations, incorporate many of the main advances seen in AMLD. This includes key areas such as Politically Exposed Persons (PEPs) and checking sanction lists.

Under AML regulations (AML-CFT Decision 4.2(b), 7.2, 15, 22, 25), enhanced due diligence must be carried out for PEPs and customers associated with high-risk countries. AutoIdent offers this functionality through add-on modules on the single platform, handling fully compliant AML screening and monitoring, including PEP and sanction list checks.

Use of Qualified Electronic Signatures  

The use of QES is relatively new in the UAE but is permitted in many areas. Crucially, the UAE’s laws allow the use of electronic signatures and state that a contract may not be deemed invalid or unenforceable solely because it is in electronic format. However, acceptance and legal basis are more open than under European regulations.    

Electronic Signatures can only be used in some areas. Traditional paper signatures are currently still required for transactions involving immoveable property, negotiable instruments, civil documents, and other processes that require a notary public.   
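The carve-outs above amount to a simple eligibility rule. The sketch below is illustrative Python only; the category names are my own labels for the transaction types listed above, not a legal taxonomy.

```python
# Sketch of the rule described above: certain transaction types still
# require a traditional wet-ink signature under UAE law. The category
# names are illustrative labels, not a legal taxonomy.

WET_SIGNATURE_REQUIRED = {
    "immoveable_property",
    "negotiable_instrument",
    "civil_document",
    "notarized_process",
}

def can_sign_electronically(transaction_type):
    return transaction_type not in WET_SIGNATURE_REQUIRED

print(can_sign_electronically("commercial_contract"))   # True
print(can_sign_electronically("immoveable_property"))   # False
```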

AutoIdent fully supports the use of Qualified Electronic Signatures (QES) under the eIDAS regulation. Neither the UAE E-Commerce Law nor the DIFC law prescribes specific technical criteria for acceptable signatures, so a QES is not automatically deemed acceptable. Even so, working to the highest QES standards with a reputable provider is the best way forward; with limited use to date, the area is somewhat untested.

Webinar: The evolution of ID Verification

Moving to more automation in the future  

Many countries are rapidly moving towards greater use of automated identity verification and electronic signatures. In Europe, Germany, Austria, Spain, and the UK all permit this process, and more are moving this way. The UAE is a leader in the Middle East, but other countries will likely follow.  

Recently, in August 2021, Abu Dhabi Islamic Bank reported that it had successfully implemented automated customer onboarding. This uses facial recognition techniques combined with reading NFC data from a customer’s Emirates ID card and passport.   

IDnow firmly believes that the industry is moving to document and biometric-centered identity verification – in the Middle East and elsewhere. There is growing evidence that automatic techniques are more secure than manual ones in many areas (especially facial recognition). The FATF now supports this, and many more regulators are likely to follow its guidance.   

For a more in-depth look at AutoIdent, how it has evolved, and how it offers a well-thought-out balance between security and user experience, take a look at IDnow’s recent webinar.  

By

Rayissa Armata
Head of Regulatory Affairs at IDnow
Connect with Rayissa on LinkedIn


IDnow announces Johannes Meerloo as new COO


Munich, January 19th 2022, IDnow, a leading provider of Identity Verification-as-a-Service solutions, is pleased to welcome Johannes Meerloo to the management team as Chief Operations Officer (COO). In this role, he will be responsible for the group-wide Operations division.

Johannes Meerloo is a co-founder of identity Trust Management AG, which was acquired by IDnow in 2021. He has spent the last eleven years as Managing Director, Board Member and Chief Operating Officer (COO) there, overseeing the company’s entry into international markets following a comprehensive rebranding. Prior to that, he was a member of the management board of ID 8 GmbH, the predecessor of identity Trust Management AG. He thus has more than 15 years of experience in the identity management sector.

IDnow acquired identity.TM as well as the French market leader ARIADNEXT last year. One of Johannes Meerloo’s first tasks will be to merge the operational areas as a result of these M&A activities. In combination, IDnow will be able to offer customers of the combined companies an optimized service from a single source and across a broad product portfolio, setting new standards in the European market.

“I look forward to continuing to work with Johannes Meerloo in his new role. We have known each other for a long time and I greatly appreciate his extensive and long-standing experience in the identity sector. The restructuring and optimization of our operations area will require exactly this experience and I am glad that we could win Johannes for this task,” says Andreas Bodczek, CEO of IDnow.


PingTalk

Predicting 2022?s Top Identity Trends

As our lives grow more digital each day, we're witnessing an exponential growth in data collection and data breaches. This is especially common among financial institutions and entities with large, complex and legacy infrastructures.  



Okta

How to Deploy a .NET Container with AWS ECS Fargate


In a previous article, we learned how to host a serverless .NET application using AWS Lambda. We talked about the history of serverless and how companies are using these types of technology to simplify delivering APIs and functionality faster than traditional methods. Some problems will arise in this type of application when you need more capability than standard HTTP requests like GET, POST, PUT, DELETE, etc. Web Sockets is a great example of this.

Table of Contents

Understanding containers vs. virtual machines
Hosting containers with AWS ECS Fargate
Build and dockerize a .NET chat application
Authentication for Your .NET chat app
Set up a .NET Core web application
Add Docker support to your .NET chat application
Authorize Your chat app
Add chat functionality with Vue.js and SignalR
Deploy your .NET chat application to AWS
Create an ECR repository
Set up the ECS Fargate cluster
Recap
Learn more about AWS, .NET, and authentication

Understanding containers vs. virtual machines

In 2013, Docker was released, beginning a shift in how we think about hosting applications and managing infrastructure. Infrastructure teams and operations had been leveraging virtual machines with great success for decades, so you may be tempted to think, “isn’t this just a virtual machine?” In spirit, yes; in application, no.

To talk about containers, we have to talk about virtual machines. A virtual machine (VM) is a virtual representation of a computer, all the way from its boot process to loading an entire operating system (OS). This provides a lot of flexibility. You gain fine-grained control over an entire virtual computer, as well as all the pitfalls of a whole computer. The operating system has to be maintained, patched, and updated; disks have to be managed, and all manner of operational overhead goes into their care and feeding. Notably, VMs are huge because an entire OS has to go on them. So when you start a VM, you have to go through the entire boot sequence of a traditional computer.

There are three major problem areas that containers hope to solve: ease of management, size, and speed to start. There are others, but for simplicity, I will focus on those that are the most relevant to business processes. Containers differ from traditional virtual machines in one significant way. They do not have an actual OS. This may come as a surprise to you, even if you’ve dabbled in containers before. Instead, containers rely on an abstraction of an operating system that provides hooks into the host OS using standards and conventions supplied by the container system.

Docker is the de facto container standard for all practical purposes, but other container technologies do exist. The notion of a standardized container comes from the shipping industry. In shipping, there was no standardization, which made transport difficult because you had to figure out how to hold each item on a boat, train, or truck. Containers provided a standard format for moving goods: “You can put whatever you want inside the container so long as it conforms to a standard set of dimensions and can be opened and bolted down in a uniform way.” This simplified the logistical process of moving real-world items. In this same way, standardizing the way a software container interacts with its host OS to delegate the responsibility for executing tasks simplifies management of the most crucial part, the application.

Hosting containers with AWS ECS Fargate

Now that we have a better idea of what makes a container different from a virtual machine, how does this solve our three problems?

First, containers make management easier by letting us write build scripts that create a stable, repeatable representation of the host needed to run the application. Second, since an OS ultimately hosts the container, that OS can enforce company and security policies; whatever the host allows is the most a given container can do. For example, if the host OS only allows inbound traffic from a specific subnet and port, the container cannot override this restriction. The container is ultimately bound by the host OS’s networking rules.

This kind of management reliability continues into data management, memory management, and any other policies that need to be enforced. Ultimately, the result is business agility: operations can open the doors to developers knowing that established guardrails are in place. As for size, our second key problem area: since you no longer need a whole OS, container images are often just a few MB instead of many GB. On speed, our third key problem area: since there isn’t a complete boot cycle, a container can start up about as fast as the hosted application.

The difficulty with containers moves further up the abstraction chain (as is the style of such things) than with virtual machines. The challenge comes down to networking and managing container images across many resources so that you can treat a set of compute resources as one homogeneous unit. Fortunately, products like Kubernetes, OpenShift, Mesos, Nomad, and others can help you solve these challenges. However, you are then ultimately back to managing a fleet of computers, along with the associated management overhead.

This is the sweet spot of AWS Elastic Container Service (ECS) Fargate. Fargate gives you networking abstractions across a virtual network known as a VPC (virtual private cloud). This network abstraction is built right into the heart of AWS and is well vetted for any type of workload, including high-security government workloads. Fargate takes this a step further by abstracting away the machine management. You can set up traditional clusters and manage your machines if you want, but leveraging Fargate simplifies one more part of your process. Ultimately the goal with using cloud vendors is to let them handle the infrastructure management so you can focus on managing your business.

Time to jump in and try out .NET containers with AWS Fargate!

What you will need to get started

You’ll need the following to continue:

Basic knowledge of .NET
.NET SDK 3.1
An AWS account (you’ll be using a free tier product)
Okta CLI
AWS CLI V2
Docker Desktop for Windows/macOS (not required if you are on Linux)

This tutorial assumes you already have Docker set up and running.

Build and dockerize a .NET chat application

So what do you want to build? To keep things simple and see the value in using something other than standard HTTP protocols, you’ll use SignalR to build a very basic chat application.

Secure: only logged-in clients should be able to use the chat functionality
Chat users’ names must come from their validated identity
Real-time

To achieve this, I am going to use four technologies:

Okta for identity management
.NET to host the application
SignalR to provide the socket management abstraction
Vue.js to provide the rendering for the front-end

Authentication for Your .NET chat app

Authentication is vital in any application, but doubly so when you need to depend on who someone is. Okta makes it really easy to delegate access to those who need it and shut down access across the entire suite of applications if a bad actor gets a set of credentials.

Before you begin, you’ll need a free Okta developer account. Install the Okta CLI and run okta register to sign up for a new account. If you already have an account, run okta login. Then, run okta apps create. Select the default app name, or change it as you see fit. Choose Web and press Enter.

Select Other. Then, change the Redirect URI to http://localhost:5000/authorization-code/callback and accept the default Logout Redirect URI of http://localhost:5000.

What does the Okta CLI do?

The Okta CLI will create an OIDC Web App in your Okta Org. It will add the redirect URIs you specified and grant access to the Everyone group. You will see output like the following when it’s finished:

Okta application configuration has been written to: /path/to/app/.okta.env

Run cat .okta.env (or type .okta.env on Windows) to see the issuer and credentials for your app.

export OKTA_OAUTH2_ISSUER="https://dev-133337.okta.com/oauth2/default"
export OKTA_OAUTH2_CLIENT_ID="0oab8eb55Kb9jdMIr5d6"
export OKTA_OAUTH2_CLIENT_SECRET="NEVER-SHOW-SECRETS"

Your Okta domain is the first part of your issuer, before /oauth2/default.

NOTE: You can also use the Okta Admin Console to create your app. See Create a Web App for more information.

Note the Issuer URL and Client ID. You will need this in the next step.

Note: To test out your chat application, don’t forget to manually add a second user to your Okta org. Log in to your Okta org and navigate to the Directory > People page. Click the Add person button and fill out the form.

Set up a .NET Core web application

Let us create a new ASP.NET Core web application using the dotnet CLI. Navigate to the location where you want to create the project and run the following command.

dotnet new webapp -n Okta.Blog.Chat

Let’s call the project Okta.Blog.Chat.

Now, set up your Okta application credentials by opening appsettings.json and add the following to the JSON object after AllowedHosts:

"OktaSettings": {
  "OktaDomain": "{yourOktaDomain}",
  "ClientId": "{yourOktaClientID}",
  "ClientSecret": "{yourOktaClientSecret}"
}

Note: You can find the required values in the .okta.env file created in the folder where you executed the okta apps create command. The value of {yourOktaDomain} should be something like https://dev-123456.okta.com. Make sure you don’t include -admin in the value!

Make sure to exclude appsettings.json from Git so that you don’t accidentally commit your client secret. You can generate a .gitignore file by running the command dotnet new gitignore.

Now you’ll need to add the authentication library. Run the following command inside the Okta.Blog.Chat folder.

dotnet add package Okta.AspNetCore

Now modify Startup.cs to use the Okta authentication provider.

Modify the method ConfigureServices(IServiceCollection services) to look like the code below. A line is commented out on purpose. You will use it later for authorization.

public void ConfigureServices(IServiceCollection services)
{
    var oktaMvcOptions = new OktaMvcOptions()
    {
        OktaDomain = Configuration["OktaSettings:OktaDomain"],
        ClientId = Configuration["OktaSettings:ClientId"],
        ClientSecret = Configuration["OktaSettings:ClientSecret"],
        Scope = new List<string> { "openid", "profile", "email" },
    };

    services.AddAuthentication(options =>
    {
        options.DefaultAuthenticateScheme = CookieAuthenticationDefaults.AuthenticationScheme;
        options.DefaultSignInScheme = CookieAuthenticationDefaults.AuthenticationScheme;
        options.DefaultChallengeScheme = OktaDefaults.MvcAuthenticationScheme;
    })
    .AddCookie()
    .AddOktaMvc(oktaMvcOptions);

    services.AddRazorPages()
        .AddRazorPagesOptions(options =>
        {
            //options.Conventions.AuthorizePage("/Chat");
        });

    services.AddSignalR();
}

Make sure to add imports.

using Microsoft.AspNetCore.Authentication.Cookies;
using Okta.AspNetCore;
using Okta.Blog.Chat.Hubs;

This adds the authentication provider and SignalR support; the commented-out line will later mark the Chat page as requiring authorization.

Next, modify the method Configure(IApplicationBuilder app, IWebHostEnvironment env) to look like the code below. A line is commented out on purpose; you will use it later during the SignalR setup.

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }
    else
    {
        app.UseExceptionHandler("/Error");
        // The default HSTS value is 30 days. You may want to change this for production scenarios, see https://aka.ms/aspnetcore-hsts.
        app.UseHsts();
    }

    app.UseHttpsRedirection();
    app.UseStaticFiles();
    app.UseRouting();

    app.UseAuthentication();
    app.UseAuthorization();

    app.UseEndpoints(endpoints =>
    {
        endpoints.MapRazorPages();
        //endpoints.MapHub<ChatHub>("/chathub");
    });
}

The key difference here is app.UseAuthentication();.

Next, create a new folder called Hubs and a new class file in that folder called ChatHub.cs. This will provide our chat backend.

using Microsoft.AspNetCore.SignalR;
using System.Threading.Tasks;

namespace Okta.Blog.Chat.Hubs
{
    public class ChatHub : Hub
    {
        public async Task SendMessage(string message)
        {
            if (this.Context.User.Identity.IsAuthenticated)
                await Clients.All.SendAsync("ReceiveMessage", this.Context.User.Identity.Name, message);
        }
    }
}

You’ll see in this class you’ve made use of the user’s authentication status. This way, even if someone knows the backend is SignalR, you’ve mitigated their ability to use the system unless explicitly authenticated. Any additional authorization logic could go here as well.

Next, you need to make sure you have the pages you need. In the Pages folder, go ahead and delete the page Privacy.cshtml and Privacy.cshtml.cs and add a new page named Chat.cshtml by running the following command from the Pages folder.

dotnet new page -n Chat

Edit Pages/Shared/_Layout.cshtml and modify the second nav item from this:

<a class="nav-link text-dark" asp-area="" asp-page="/Privacy">Privacy</a>

To this:

<a class="nav-link text-dark" asp-area="" asp-page="/Chat">Chat</a>

Now, if you run the dotnet run command or run the application from Visual Studio, you should be able to navigate to the chat page with the heading “Chat”. Make sure that it runs successfully before moving on to the next step.

Add Docker support to your .NET chat application

Before you add the chat functionality, add the Docker support. Add a file named Dockerfile to the root Okta.Blog.Chat folder with the following content:

# build and publish the app
FROM mcr.microsoft.com/dotnet/sdk:3.1-bullseye AS build
WORKDIR /src

## copy csproj and restore as distinct layers
COPY Okta.Blog.Chat.csproj ./
RUN dotnet restore

## copy everything else and publish app
COPY . ./
RUN dotnet publish -c release -o /app --no-restore

# final stage/image
FROM mcr.microsoft.com/dotnet/aspnet:3.1-bullseye-slim AS base
WORKDIR /app
COPY --from=build /app ./
ENTRYPOINT ["dotnet", "Okta.Blog.Chat.dll"]

We build and publish the application in the first stage. The final stage then copies the published files and sets ENTRYPOINT, which translates to “run dotnet with our published DLL”: nothing extraordinary, right? The FROM keywords use prebuilt Docker images as bases, so the work of installing .NET is already done.

Now add a .dockerignore file with the below content so that these files are not copied to the final image.

**/.dockerignore
**/.git
**/.vscode
**/bin
**/obj

If you use Visual Studio, you’ll now have the ability to debug right into a running container in your debug toolbar. If you press F5 or click the play button, you’ll run your app in a Docker Container.

If you are not using Visual Studio, run the following command to build and start the container.

docker build . -t okta-chat
docker run -it --rm -p 5000:80 okta-chat

The app should be accessible at http://localhost:5000/.

Authorize Your chat app

Modify Startup.cs and uncomment the following line.

options.Conventions.AuthorizePage("/Chat");

Note: At this point, if you try to authenticate on the /Chat path using Google Chrome, you’ll get an error. This is because you are running the app without TLS, and Chrome blocks Set-Cookie headers with SameSite=None when the Secure attribute is not present (which will be present only for HTTPS requests).
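Chrome’s rule can be sketched as a simple predicate. This is an illustrative Python sketch of the cookie policy only, not Chrome’s actual implementation:

```python
# Sketch of the cookie rule Chrome enforces: a cookie marked
# SameSite=None is only accepted when it is also marked Secure,
# and the Secure attribute only applies over HTTPS.

def cookie_accepted(same_site, secure):
    if same_site == "None" and not secure:
        return False  # Chrome rejects SameSite=None without Secure
    return True

print(cookie_accepted("None", secure=False))  # False: the plain-HTTP container
print(cookie_accepted("None", secure=True))   # True: works once TLS is enabled
```

The cross-site login callback needs SameSite=None, which is why the fix below is to run the container with HTTPS rather than to change the cookie.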

Let’s make sure our docker container can run with HTTPS.

First, you need to create a certificate. You can use the .NET CLI for creating self-signed certificates for development. Please note that this is only for development purposes and should not be used in production.

Run the following command to create a self-signed certificate. Use the correct command based on your OS and use an appropriate password.

# clean existing certs if any
dotnet dev-certs https --clean

# create a new cert on macOS/Linux
dotnet dev-certs https -ep ${HOME}/.aspnet/https/aspnetapp.pfx -p mypass123

# create a new cert on Windows
dotnet dev-certs https -ep %USERPROFILE%\.aspnet\https\aspnetapp.pfx -p mypass123

# trust the cert
dotnet dev-certs https --trust

Now you can run the container with HTTPS using the below command. You will use the certificate you created above and mount it as a volume.

# macOS/Linux
docker run -it --rm -p 5000:80 -p 5001:443 -e ASPNETCORE_URLS="https://+;http://+" -e ASPNETCORE_HTTPS_PORT=5001 -e ASPNETCORE_Kestrel__Certificates__Default__Password="mypass123" -e ASPNETCORE_Kestrel__Certificates__Default__Path=/https/aspnetapp.pfx -v ${HOME}/.aspnet/https:/https/ okta-chat

# Windows
docker run -it --rm -p 5000:80 -p 5001:443 -e ASPNETCORE_URLS="https://+;http://+" -e ASPNETCORE_HTTPS_PORT=5001 -e ASPNETCORE_Kestrel__Certificates__Default__Password="mypass123" -e ASPNETCORE_Kestrel__Certificates__Default__Path=/https/aspnetapp.pfx -v %USERPROFILE%\.aspnet\https:/https/ okta-chat

The app should be accessible at https://localhost:5001/ now.

In your Okta Developer Portal, go to Applications > Applications and click the application name you created using the CLI. Edit the General Settings, and add URI for Sign-in redirect URIs and Sign-out redirect URIs as below:

Sign-in redirect URI = https://localhost:5001/authorization-code/callback
Sign-out redirect URI = https://localhost:5001/

Try running the application. You should see that to open the chat page, you’ll be redirected to the Okta Single Sign-On portal and then redirected back. You are now successfully authenticated.

Add chat functionality with Vue.js and SignalR

You’ll use Vue for state management since you can take as little or as much of it as you want. With Vue, you can start with a CDN script tag and use it for a single component or a single page, or go all-in with a robust build system on Node. For this exercise, you’ll use a CDN-hosted script.

But first, you need to finish one thing on the backend.

Since you have created your ChatHub.cs, open Startup.cs and uncomment the following line:

endpoints.MapHub<ChatHub>("/chathub");

Now modify the Chat.cshtml file to look like this:

@page
<div id="chatApp">
    <div class="container">
        <div class="row">
            <div class="col-2">Message</div>
            <div class="col-4">
                <input type="text" v-model="message" id="message" />
                <input type="button" v-on:click.stop.prevent="sendMessage" id="sendButton" value="Send Message" />
            </div>
        </div>
    </div>
    <div class="row">
        <div class="col-12">
            <hr />
        </div>
    </div>
    <div class="row">
        <div class="col-6">
            <ul id="messagesList">
                <li v-for="(item, index) in chatLog" :key="index">{{ item.User }} - {{ item.Message }}</li>
            </ul>
        </div>
    </div>
</div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/aspnet-signalr/1.0.27/signalr.min.js"></script>
<script src="https://unpkg.com/vue@3.2.24/dist/vue.global.prod.js"></script>
<script>
    const connection = new signalR.HubConnectionBuilder().withUrl("/chatHub").build();

    const VueChatApp = {
        data() {
            return {
                isConnected: false,
                message: "",
                chatLog: [],
            };
        },
        mounted() {
            connection.on("ReceiveMessage", (user, message) => {
                this.receiveMessage(user, message);
            });

            connection
                .start()
                .then(() => {
                    this.isConnected = true;
                })
                .catch((err) => {
                    return console.error(err.toString());
                });
        },
        methods: {
            receiveMessage(user, message) {
                this.chatLog.push({
                    User: user,
                    Message: message,
                });
            },
            sendMessage() {
                connection
                    .invoke("SendMessage", this.message)
                    .then(() => {
                        this.message = "";
                    })
                    .catch((err) => {
                        return console.error(err.toString());
                    });
            },
        },
    };

    Vue.createApp(VueChatApp).mount("#chatApp");
</script>

We use a CDN for the SignalR client library, and the connection to the ChatHub uses the endpoint you mapped earlier:

const connection = new signalR.HubConnectionBuilder().withUrl("/chatHub").build();

The state is stored in the data section of our Vue app. There are three properties in use: a flag to let us know if it is connected, the message currently being typed, and a chat log.

data() {
    return {
        isConnected: false,
        message: "",
        chatLog: [],
    }
},

Your app has two methods: receiveMessage and sendMessage.

When receiveMessage is called, it appends an object to the chat log with the user and the message.

When sendMessage is called, you use the SignalR connection to invoke “SendMessage” and pass along the message property. Once the message is sent, you clear it so a new message can be typed.

receiveMessage(user, message) {
    this.chatLog.push({
        User: user,
        Message: message,
    });
},
sendMessage() {
    connection
        .invoke("SendMessage", this.message)
        .then(() => {
            this.message = "";
        })
        .catch((err) => {
            return console.error(err.toString());
        });
},

When the app is created, a hook is added for “ReceiveMessage” that calls the ViewModel receiveMessage method described previously.

Then the connection to the hub is started, and if successful, isConnected is set to true.

connection.on("ReceiveMessage", (user, message) => {
    this.receiveMessage(user, message);
});

connection
    .start()
    .then(() => {
        this.isConnected = true;
    })
    .catch((err) => {
        return console.error(err.toString());
    });

If you run your application at this point, you’ll have a secured chat application running in a container!

Deploy your .NET chat application to AWS

Now that the chat application is complete, it’s time to deploy. First, you need to build the docker image and deploy it for use in AWS. AWS has private container repositories via its Elastic Container Registry (ECR) product.

Create an ECR repository

First, you need to make one final change to the Dockerfile to get TLS working on Fargate.

Add the following two lines to the Dockerfile right after the dotnet publish command:

# build and publish the app
...
RUN dotnet publish -c release -o /app --no-restore

ARG CERT_PASSWORD
RUN dotnet dev-certs https -ep /app/aspnetapp.pfx -p ${CERT_PASSWORD}

# final stage/image
...

This will create a self-signed development certificate right within the docker image, which you can use to run the application with TLS on Fargate for demo purposes.

Note: This type of development certificate is not recommended for production use; it is used here for simplicity in a demo. For a production setup on ECS, use an AWS Application Load Balancer (ALB) with an HTTPS listener configured to use a certificate signed by a certificate authority, routing traffic to a reverse proxy such as Nginx, which in turn routes traffic to the application. The reverse proxy and the application should also be configured to use TLS with valid certificates.

Log in to your AWS console and navigate to ECR.

Click Create Repository.

For visibility settings, choose Private and name your repository. Then click Create Repository at the bottom of the wizard.

Navigate to okta-chat and click View push commands. This has all the steps you’ll need to build and push your image to ECR.

Follow the steps and push the image to ECR. Make sure to pass --build-arg CERT_PASSWORD=mypass123 to the docker build command like below.

docker build --build-arg CERT_PASSWORD=mypass123 -t okta-chat .

Select the copy button in the URI column next to your image and save that for later. This is the path to your image.

Set up the ECS Fargate cluster

Now set up your ECS cluster.

On the left menu, click Clusters under Amazon ECS, then click the Create Cluster button.

Click Networking Only > Next step from the options.

I named my cluster “okta-test”; name yours as you see fit. Click Create on the following screen, then click View Cluster.

Now that the cluster is set up, you’ll need to create a task for your image.

Click Task Definitions on the left menu and click Create new Task Definition.

For launch type, select FARGATE and click Next step.

For the Task Definition Name, I named mine okta-chat.

Set the Task memory (GB) to 0.5 GB. Set the Task CPU to 0.25 vCPU.

.NET Core applications are very efficient, as are containers. For many applications, you’ll find you can serve many requests with smaller boxes than you might expect.

Leave other options unchanged.

Now you need to define the image you want to use. Click Add Container.

Name the container the same thing: okta-chat.

For the Image, you’ll need the path you copied from the ECR repository earlier.

Set the soft limit to 512, and add ports 80 and 443.

Scroll down to the ENVIRONMENT section and add the following key-value pairs:

ASPNETCORE_URLS="https://+;http://+"
ASPNETCORE_HTTPS_PORT=443
ASPNETCORE_Kestrel__Certificates__Default__Password="mypass123"
ASPNETCORE_Kestrel__Certificates__Default__Path=./aspnetapp.pfx

Click Add at the bottom of the pop-up. Now click Create at the bottom of the page. Go back to your cluster, click on the Tasks tab, and click Run new Task.

Note: For production, create services instead of standalone tasks to take full advantage of the elastic scaling capabilities of ECS; I’m using tasks directly for the simplicity of the demo. Using an ALB (application load balancer) to route traffic to the application also requires services.

Select FARGATE as Launch type.

Select your default VPC and Subnets under VPC and security groups.

Click the Edit button next to the security groups. Click on Add rule to add a new inbound rule for HTTPS port 443. Click Save on the pop-up.

Now, click Run Task.

You’ll be taken back to your cluster. Select the task you created by clicking its ID. Make a note of its public IP; you’ll need that to adjust your Okta application settings.

In your Okta Developer Portal, go to Applications > Applications and click the name of the application you created using the CLI. Edit the General Settings and add Sign-in and Sign-out redirect URIs using the public IP address of your running task, following the same pattern as before.

Now, if you navigate to https://YOUR_FARGATE_TASK_PUBLIC_IP, you’ll see you are the proud owner of a fully functional chat application, secured with Okta, running in an ECS Fargate container.

Recap

Whew, that was a ride! Good job on making your new chat application. What can we take away from this?

Serverless is good for HTTP request/response, but other protocols need something different.
Containers are more lightweight than virtual machines but come with their own challenges.
A Dockerfile consists of the instructions to build your application, launch it, and declare which ports to expose.
Fargate makes hosting containers easier since you don’t have to manage the host machine infrastructure.
SignalR makes real-time communication easier by abstracting most of the heavy lifting.
Vue can be used for state management without taking on an additional build and development pipeline.
Okta makes it easier to secure any type of .NET web application. There is no reason to have an insecure site!
Use an AWS Application Load Balancer (ALB) + Nginx configured to use TLS with valid certificates in production.

Check the code out on GitHub here.

Learn more about AWS, .NET, and authentication

If you are interested in learning more about security and .NET, check out these other great articles:

- The Most Exciting Promise of .NET 5
- ASP.NET Core 3.0 MVC Secure Authentication
- 5 Minute Serverless Functions Without an IDE
- Create Login and Registration in Your ASP.NET Core App
- Build Secure Microservices with AWS Lambda and ASP.NET Core
- Build a CRUD App with ASP.NET Core and Typescript
- Build a GraphQL API with ASP.NET Core
- Maintaining Transport Layer Security all the way to your container

Want to be notified when we publish more awesome developer content? Follow @oktadev on Twitter, subscribe to our YouTube channel, or follow us on LinkedIn. If you have a question, please leave a comment below!


KuppingerCole

3 Reasons Why Now is the Best Time to Book Your EIC Ticket

by Michel Liebscher Given the current state of the world, any decision about whether to attend a conference needs careful consideration. Here are three reasons why you should book your hybrid ticket for the European Identity and Cloud Conference sooner rather than later. Reason #1: Prime Discount Expires on January 31st  We all like winter sales, Black Friday deals, special offers, and e

by Michel Liebscher

Given the current state of the world, any decision about whether to attend a conference needs careful consideration. Here are three reasons why you should book your hybrid ticket for the European Identity and Cloud Conference sooner rather than later.

Reason #1: Prime Discount Expires on January 31st 

We all like winter sales, Black Friday deals, special offers, and early bird discounts, because seizing an opportunity and long-term planning always pay off! Sign up for EIC 2022 by January 31st to take advantage of the Prime Discount and save your company a lot of money. The Prime Discount applies to all tickets, which means you get 66% off your hybrid ticket and nearly 80% off your virtual ticket. If you feel that making the decision now is too early, read on.

Reason #2: 100% Money Back Guarantee Until April 9

If you have Zero Trust with regard to the pandemic, there is no need to worry: we guarantee a 100% refund until April 9, 2022, so you can book your tickets risk-free. We also guarantee that EIC will be a safe place in 2022! We constantly keep our hygiene concept up to date with the latest developments in infection control.

Reason #3: Bring a Friend for a Higher Discount

Did you know that you get an extra discount on top of the Prime Discount if you book multiple tickets at the same time? Well, now you know! Book more than one hybrid or virtual ticket and get an additional group discount of up to 25%. The group discount will be applied at the end of the checkout process.

If you are still unsure about which type of ticket to choose for the European Identity and Cloud Conference 2022, you can contact us at any time. We will be happy to help you make the right decision.

Take care and see you in Berlin – online or on-site!

Thursday, 16. December 2021

Radiant Logic

Good Digital Customer Care Requires Identity Unification

Digital interactions are now at the forefront of the customer service experience. The post Good Digital Customer Care Requires Identity Unification appeared first on Radiant Logic.

Indicio

Indicio Wins British Columbia Code With Us Challenge to Upgrade Hyperledger Indy

The post Indicio Wins British Columbia Code With Us Challenge to Upgrade Hyperledger Indy appeared first on Indicio Tech.
Indicio will upgrade Hyperledger Indy to support “did:indy” DID Method Specification, which will enable greater interoperability across Indy-based networks

By Trevor Butterworth

Indicio has won the BC Gov “Code With Us” Challenge to develop the did:indy DID Method for the Hyperledger Indy project, the open-source project housed at the Hyperledger Foundation that provides tools, libraries, and reusable components for digital identities rooted on blockchains or other distributed ledgers.

The challenge has two goals. The first is to expand the types of content that can be written to DIDDocs, documents that instruct users on which ledger they need to find and read so that they can use verifiable credentials.

Second, the new method will align Hyperledger Indy with the World Wide Web Consortium (W3C) Decentralized Identifier (DID) Specification. Most of Hyperledger Indy’s development occurred prior to the completion of the standard DID Specification by the W3C and, as a result, identifiers written to one network are currently not resolvable on other networks. A new did:indy DID Method will fix that and make it easier for decentralized identity products and services to interoperate across different Indy networks.

“Given the increasing use of Indy Networks, we are grateful for this Code With Us challenge and to BCGov for sponsoring this important contribution,” said Heather Dahl, CEO of Indicio. “As a company, we are committed to open-source development, and we applaud BCGov for its leadership in sponsoring critical development needs that will make interoperability easier and lead to increased adoption. We’re excited to work with BCGov on this—and it means a lot given that several of Indicio’s developers have been contributing to the Hyperledger Indy project since its inception.”

Winning the challenge comes as Indicio is about to kick off a series of Hyperledger Community Training Workshops on using the Hyperledger Aries codebases to design and develop decentralized identity solutions. Indicio has made training on all aspects of decentralized identity a key part of its mission and product offerings. Places in the free Hyperledger workshops filled rapidly, so keep an eye out for future events.

To learn more about decentralized identity training, or how to start developing decentralized identity products on the Indicio TestNet, or how to use our products, services, and hosting to create trusted data ecosystems, please contact us.

The post Indicio Wins British Columbia Code With Us Challenge to Upgrade Hyperledger Indy appeared first on Indicio Tech.


Safle Wallet

Delay in SushiSwap Listing. Security Breach Averted!

We have had to delay the Listing Announcement on SushiSwap due to a breach in our Eth. token contracts. The details of the contract breach have been mentioned below. Bridging contract for Poly <> Eth. worked with a lock/unlock, burn/mint functionality. If the user wants to bridge their tokens from the Ethereum chain to the Polygon chain, they will have to call the [deposit()](<https://

We have had to delay the Listing Announcement on SushiSwap due to a breach in our Eth. token contracts. The details of the contract breach have been mentioned below.

Bridging contract for Poly <> Eth. worked with a lock/unlock, burn/mint functionality.

If a user wants to bridge their tokens from the Ethereum chain to the Polygon chain, they have to call the [deposit()](<https://github.com/getsafle/bridging-contract/blob/main/contracts/FxBaseRootTunnel.sol#L77>) function in the FxBaseRootTunnel contract. The deposit() function calls the [burn()](<https://github.com/getsafle/bridging-contract/blob/main/contracts/eth%20token/Safle.sol#L34>) function in the Ethereum token contract. The burn(address, amount) function accepts the address from which tokens are to be burnt and the amount of tokens to burn. It should also include a condition check allowing only the FxBaseRootTunnel contract to call it, but that check was not present.
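The missing check can be illustrated with a small, language-agnostic sketch (Python here; the actual contract is Solidity, and the names below are illustrative rather than the real contract code):

```python
class Token:
    """Toy model of the bridged token; illustrative only, not the real Solidity contract."""

    def __init__(self, bridge_address):
        # The only address that should be allowed to burn (the FxBaseRootTunnel).
        self.bridge_address = bridge_address
        self.balances = {}

    def mint(self, addr, amount):
        self.balances[addr] = self.balances.get(addr, 0) + amount

    def burn_unchecked(self, caller, addr, amount):
        # Vulnerable version: no check on the caller, so anyone can
        # destroy tokens held by any address (as happened to the pool).
        self.balances[addr] -= amount

    def burn(self, caller, addr, amount):
        # Fixed version: only the bridge tunnel contract may burn.
        if caller != self.bridge_address:
            raise PermissionError("only the bridge tunnel may call burn()")
        self.balances[addr] -= amount
```

With the unchecked variant, any caller can shrink the pool's balance and distort the pool price, which is exactly the attack described below; the fixed variant rejects every caller except the bridge.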

On Monday, January 17, 2022, we deployed test liquidity in the SushiSwap liquidity pool. Within minutes, the attacker burnt SAFLE tokens in the SushiSwap liquidity pool in multiple transactions, draining 480,853 SAFLE.

https://etherscan.io/tx/0xf7ea4e662a664e7e0451fffcd61de94456f4958e858b12c3d4bfa568750e04e3
https://etherscan.io/tx/0xeadde0c3097f35aadca90b534affdc56ebba05b236a6b60c2e80e7235bc619e9
https://etherscan.io/tx/0xb138df86c55a82cd46d15e890924101c3a8a47793c52fa5282ef022542a46011
https://etherscan.io/tx/0xfe7a1b4408df1256dcba685970aa42c806fce462eed211b58ed260c9d0013194

This inflated the price of SAFLE, and the attacker then swapped SAFLE for WETH in a single transaction. Since the tokens had been burned, the attacker was able to convert 56.88 SAFLE into 16.04 WETH.

Here are more details of the incident as captured by the blockchain explorer: https://etherscan.io/tx/0xd457aeb845985c415decb5e1bec2c90a8ce8e3191a54f9e85168a608c84d1ef4

https://etherscan.io/tx/0xd457aeb845985c415decb5e1bec2c90a8ce8e3191a54f9e85168a608c84d1ef4 — Transaction of the Exchange

https://etherscan.io/address/0x7b1088a749c868017f8ba34ea10e761288c6a509 — Address of The Attacker

https://etherscan.io/tx/0xa015c1af7ad9a297b1e0b93cc28c0bc25037e10958f415cdb1ff1151c00ead3f — Seeding Money on the Attacker Account

https://etherscan.io/tx/0xf7ea4e662a664e7e0451fffcd61de94456f4958e858b12c3d4bfa568750e04e3 — Burn Call

Figure- Bridging-contract/contracts/eth token/Safle.sol.

This is the burn method used in the attack. Due to the lack of a caller check, it can be called by anyone.

There was a lapse in the audit by Smart State, which failed to flag the vulnerability in the token contract; that part of the code was not covered in their report, so we consider this a communication-level issue.

The good news is that the vulnerability has now been resolved. More details are given here: https://github.com/getsafle/bridging-contract/pull/8.

Figure- The vulnerability resolved.

Even though the SushiSwap liquidity pool deployment was never officially announced or promoted through our channels, we apologise for any inconvenience caused in the process. Incidents like this can be avoided with more eyes on the code, as well as a rethink of our developer procedures and peer review. We assure our community that we will investigate further and compensate any personal losses.

Figure- Hacker’s Track

We’ll announce a new date for the SushiSwap listing soon; it won’t take too long! Thanks again for your patience and support. There’s only one way forward.

Team Safle


KuppingerCole

Claroty – Visibility into Vulnerability

by Graham Williamson Organizations are facing a brave new world in which governments are taking a proactive role in constraining cybersecurity risks. Companies with operational infrastructure that is deemed ‘critical’ to social stability can expect legislation to ensure they are adequately protecting their OT infrastructure, and monitoring and responding to cyber-threats and compromise events. Ger

by Graham Williamson

Organizations are facing a brave new world in which governments are taking a proactive role in constraining cybersecurity risks. Companies with operational infrastructure that is deemed ‘critical’ to social stability can expect legislation to ensure they are adequately protecting their OT infrastructure, and monitoring and responding to cyber-threats and compromise events. Germany has recently made significant changes to critical infrastructure legislation which will become a benchmark for other jurisdictions to follow.

Imageware

Arizona DOC Chooses Imageware to Supply Streamlined Inmate Intake Solution

Imageware was chosen by the Arizona Department of Corrections (AZ DOC) to provide the Imageware Law Enforcement Platform to improve and streamline the current processes for capturing biometric and biographic information upon inmate intake at correctional facilities The post Arizona DOC Chooses Imageware to Supply Streamlined Inmate Intake Solution appeared first on Imageware.

auth0

Add OpenID Connect to Angular Apps Quickly

This tutorial shows you how to add authentication to your Angular App with Auth0

Send Slack Message After New User Sign Up with Auth0 Actions

Use Auth0 Actions with webhooks to send a Slack message after a new user signs up.

Wider Team

Why do you care about identity?

I have always given a damn about digital identity. Hi. I’m Phil Wolff. And this is a personal note to start my year. I love that digital identity tries to answer deep philosophical questions. What does it mean to be human? How do we trust ourselves and each other? What is the reality we agree … Continue reading Why do you care about identity?

I have always given a damn about digital identity.

Hi. I’m Phil Wolff. And this is a personal note to start my year.

I love that digital identity tries to answer deep philosophical questions. What does it mean to be human? How do we trust ourselves and each other? What is the reality we agree on? What is personal integrity? What helps civilization and a civil society endure? What is fairness and justice?

I love that it’s more than a technical discussion. It’s laws. Sociology. Politics and civics. Commerce. Psychology. Ethics. History. Economy. All the humanities.

Identity touches everything. It always will. Computation and communication continue to pervade every part of life, and identity spreads with them.

Identity ships. I like that a few standards folks have the power to turn grand ideas into covenants and into engineering protocols. To settle things for everyone for a while. To version something pragmatic that others can build from and evolve.

Shaping identity is a human endeavor. I like the geeky and artistic communities that emerge for a while around problems and opportunities. We show the best of ourselves. Comradery while tackling authenticity and the unidentified. Courage taking on concentrations of power. Conviction while building safety and security for all. Grit for persevering through long slogs to make stuff work.

So I’m passionate for digital identity’s formative process, its purpose, and its impact on our world.

That’s why I organized online for data portability as a human right (DataPortability Project). Why I literally marched on Oakland City Hall for privacy and joined police in working out a surveillance policy (Oakland Privacy). Why I promoted open data and open source in civic projects at more than 150 civic hack nights (OpenOakland). Why I fostered identity-centric startups (PDEC). Why I consult on identity as transformational strategy here with Wider Team.

Since 2021, I’ve worked with standards bodies on digital identity in clinical IoT, trust governance, decentralization and fiduciaries, and biometrics. I hope to continue these efforts along with the growing cadre who focus on the intersection of technology, society, and enterprise.

Have a healthy and impactful 2022, y’all.


KuppingerCole

Software Supply Chain Security: Don’t Get Your Code Tampered

by Martin Kuppinger Recent events such as the SolarWinds and Kaseya incidents have demonstrated the need to focus significantly more on software supply chain security. Thus, avoiding code tampering by external attackers and internal parties is essential. This whitepaper looks at how to increase security throughout the Software Lifecycle and implement a multi-layered, defense-in-depth code tamperin

by Martin Kuppinger

Recent events such as the SolarWinds and Kaseya incidents have demonstrated the need to focus significantly more on software supply chain security. Thus, avoiding code tampering by external attackers and internal parties is essential. This whitepaper looks at how to increase security throughout the Software Lifecycle and implement a multi-layered, defense-in-depth code tampering prevention and detection strategy.

BeyondTrust Cloud Privilege Broker

by Paul Fisher BeyondTrust Cloud Privilege Broker is designed for multi-cloud environments common in today’s organizations to broker entitlements, privileges, and permissions. BeyondTrust has taken its longstanding expertise in PAM to create a single tool that provides continuous discovery of entitlements, centralized management and reporting capabilities.

by Paul Fisher

BeyondTrust Cloud Privilege Broker is designed for multi-cloud environments common in today’s organizations to broker entitlements, privileges, and permissions. BeyondTrust has taken its longstanding expertise in PAM to create a single tool that provides continuous discovery of entitlements, centralized management and reporting capabilities.

WSO2 Asgardeo

by John Tolbert WSO2 has been a leader in CIAM innovation for years. WSO2’s primary design and deployment models have been to leverage open source for on-premises installations. With Asgardeo, WSO2 launches a complete multi-tenant SaaS solution for CIAM, aimed at developers. Asgardeo features an all-new architecture to speed CI/CD for customers. WSO2 embraces technical standards to promote interop

by John Tolbert

WSO2 has been a leader in CIAM innovation for years. WSO2’s primary design and deployment models have been to leverage open source for on-premises installations. With Asgardeo, WSO2 launches a complete multi-tenant SaaS solution for CIAM, aimed at developers. Asgardeo features an all-new architecture to speed CI/CD for customers. WSO2 embraces technical standards to promote interoperability, and many out-of-the-box API integrations are available in their marketplace to facilitate integration with customer applications.

Jolocom

A brief history of SSI: Where does it come from? A timeline. 

Picture above: SSI workshop at Decentralized Web Summit 2018, San Francisco Mint, (left to right) Kim Duffy, Christopher Allen, Jonathan Holt, Daniel Buchner, Christian Lundquist, Markus Sabadello, Eugeniu Rusu, Rouven Heck. At this point, we know what self-sovereign identity (SSI) is, but where does it come from? A short timeline highlights just how quickly SSI ... The post A brief history of S

Picture above: SSI workshop at Decentralized Web Summit 2018, San Francisco Mint, (left to right) Kim Duffy, Christopher Allen, Jonathan Holt, Daniel Buchner, Christian Lundquist, Markus Sabadello, Eugeniu Rusu, Rouven Heck.

At this point, we know what self-sovereign identity (SSI) is, but where does it come from? A short timeline highlights just how quickly SSI has developed, and traces the evolution of internet identity. From centralized to federated to user-centric identity, self-sovereign identity is seen as the most recent and urgently needed step in this evolution. It is independent of any individual silo and provides three essential elements: individual control, security, and full portability.

The source goes back to the 1990s 

Christopher Allen dates the birth of SSI to the early 90s (cf. Allen 2016). In 1991, PGP offered a first hint of what could become self-sovereign identity when it introduced the 'Web of Trust'. This was an example of decentralized trust management, but it focused only on email addresses. PGP can therefore be seen as one of the first systems to show a different way, in which identity can be supported by peers instead of centralized authorities. As another early contribution, Allen points to a paper by Carl Ellison, published in 1996, that examined how digital identity was created ('Establishing Identity Without Certification Authorities'). Ellison's main argument was that there was a need for a method of establishing identity without using certificates from trusted certification authorities (Ellison 1996).

These early approaches could mark the beginning, but it is safe to say that SSI really took off in the 21st century, along with the development and further spread of the internet. 

Since then, things have changed, of course. Essentially, self-sovereignty is about the individual being able to control what happens to the things others (trusted parties) say about them. The community as a whole no longer relies on web-of-trust infrastructure for SSI, but rather connects SSI technology with the existing trusted certification authorities (i.e. the state).

Taking off: SSI in the 21st century 

In the early 2000s, the Augmented Social Network provided the groundwork for a new digital identity. Its most significant advance was 'the assumption that every individual ought to have the right to control his or her own online identity' (Jordan 2003).

One major contribution paving the way for SSI's development is the Internet Identity Workshop, which started back in 2005 and introduced a new term: user-centric identity. It is interesting to note that the term self-sovereign identity itself only came into increased use in the late 2010s. At Jolocom, we used the term 'autonomous identity' before SSI caught on. One of the first references is Devon Loffreto's blog post from 2012, in which he wrote about 'Sovereign Source Authority' (Loffreto 2012).

Phil Windley later described self-sovereign identity as an 'Internet for identity' (Windley 2012), highlighting three major virtues: no one owns it, everyone can use it, and anyone can improve it. Ever since, SSI has grown into an ecosystem aiming to put users in control of their own personal information and data. Identity management can be expected to become increasingly significant due to the rise in digital interactions; already, self-sovereign identity is becoming a large industry.

The late 2010s 

A year to remember is 2018. Back then, the emerging SSI community agreed for the first time on a high-level definition of the term SSI, in the Identity Position Paper by the Bundesblock. INATBA's position paper 'What's at stake' in 2020 built on exactly this foundation and furthermore highlighted possible development scenarios for SSI. These efforts helped make SSI a priority in the digitalization process within Germany and the EU. In Germany specifically, the SDI (secure digital identities) projects are a significant effort to start an SSI ecosystem.

On a European level, developments towards an SSI-friendly legal framework were proposed by the European Commission in June 2021. The initial examples of successful SSI pilots and projects across EU member states, along with international momentum on the topic, have found their way into the proposal for an updated eIDAS (electronic Identification, Authentication and trust Services) regulation. This proposal is currently being debated by the European Council and the European Parliament, and foresees the introduction of European Digital Identity Wallets for all European citizens. The proposed regulation takes inspiration from self-sovereign identity principles and is a great chance for Europe. Its main potential lies in the provision of a legally binding Trust Framework, as well as associated technical specifications for a coherent technical infrastructure. As a result, if the European Commission has its way, all public services should be available online by 2030 and 80 percent of EU citizens should use an eID solution.

As Jolocom, we have been contributing to this process at the German, European, and global levels in different capacities over the past years, and we will do our best to ensure that our vision of an open and decentralized infrastructure is achieved.

 
To find out where we stand, read about Jolocom here: https://jolocom.io/  

Sources: 

http://www.lifewithalacrity.com/2016/04/the-path-to-self-soverereign-identity.html  

http://asn.planetwork.net/asn-archive/AugmentedSocialNetwork.pdf 

https://www.moxytongue.com/2012/02/what-is-sovereign-source-authority.html
https://www.windley.com/archives/2012/05/moving_toward_a_relationship_network.shtml
https://www.usenix.org/conference/6th-usenix-security-symposium/establishing-identity-without-certification-authorities
https://inatba.org/wp-content/uploads/2020/11/2020-11-INATBA-Decentralised-Identity-001.pdf
https://jolocom.io/wp-content/uploads/2018/10/Self-sovereign-Identity-_-Blockchain-Bundesverband-2018.pdf
https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/europes-digital-decade-digital-targets-2030_de
https://ec.europa.eu/commission/presscorner/detail/de/IP_21_2663

The post A brief history of SSI: Where does it come from? A timeline.  appeared first on Jolocom.


Tokeny Solutions

What does our new strategic partnership with Inveniam mean for issuers?

The post What does our new strategic partnership with Inveniam mean for issuers? appeared first on Tokeny Solutions.

This content is taken from January 2022 Tokeny Insights newsletter

What does our new strategic partnership with Inveniam mean for issuers?

We are off to a great start this year as we teamed up with Inveniam to unlock asset liquidity via tokenization. The partnership comes alongside a €5m investment from Inveniam, Apex Group, and K20 Fund.

Assets tokenized and priced 

The most exciting part about this partnership is that Inveniam’s data solution complements the T-REX ecosystem by providing tokenized assets with a valuation reference, which will fully unlock private market asset liquidity.

Covering the entire value chain of tokenization

As a result of this collaboration, securities tokenized through the ERC-3643 standard and the T-REX suite will be enriched with timestamped valuation data linked to their underlying assets. The result? High functioning securities that will unlock liquidity:

- Issuance: We provide the enterprise-grade white-label tokenization platform, the T-REX Platform, to help asset managers compliantly bring assets to the blockchain.
- Asset management: Tokenized assets and investors are managed via our T-REX Platform. In addition, tokens are enriched with real-time asset valuation data delivered by Inveniam.
- Secondary trading: Peer-to-peer transfers are available through our compliance framework. With trusted pricing references provided by Inveniam, investors can make fair deals with confidence.

Accelerating the tokenization adoption

By addressing the two biggest issues in private markets, which are pricing discovery and compliance, we are now making private market assets more accessible and liquid together with our partner Inveniam.

Additional value to the market will accrue as the partnership will ultimately enable Apex Group, Inveniam’s fund administration partner, to deliver end-to-end cutting-edge services to a broad ecosystem of private asset owners. For more info, please read the full announcement: click here.

Tokeny Spotlight

PRESS RELEASES

Tokeny Receives A Strategic Fundraise From Inveniam, Apex Group, and K20 Fund

Read More

PRESS RELEASES

BitMEX Partners With Tokeny To Launch BMEX

Read More

TOKENY’S TALENT

Accounting Administrator Barbel’s Story

Read More

Market Insights

Aave launches its permissioned DeFi platform Aave Arc

Aave has launched its permissioned DeFi platform for institutions. Fireblocks is the first whitelister of the platform.

The Block

Read More

JPMorgan: 2022 Could Be ‘Year of the Blockchain Bridge’ 

“If 2021 was the year of the NFT, we see 2022 as possibly the year of the blockchain bridge…or the year of financial tokenization,”

Blockworks

Read More

PayPal confirms that it is ‘exploring’ a stablecoin

The Block reported last May that PayPal was holding exploratory conversations about a stablecoin.

The Block

Read More

Compliance In Focus

European Markets Regulator Seeks Feedback on Regulation of Tokenized Securities

ESMA wants to explore whether existing regulatory standards need to be amended.

Coindesk

Read More

Subscribe Newsletter

A monthly newsletter designed to give you an overview of the key developments across the asset tokenization industry.


The post What does our new strategic partnership with Inveniam mean for issuers? appeared first on Tokeny Solutions.


SWN Global

SWN Global and MetaMUI became a Pioneer Member of the Blockchain Research Institute

SWN Global & MetaMUI x Blockchain Research Institute We are excited to announce that SWN Global and MetaMUI became a Pioneer member of the Blockchain Research Institute(BRI). BRI is a globally well known and most active blockchain think tank to be the catalysts of the blockchain transformation. BRI has more than 100 research projects underway to explore, understand, document and inform leaders
SWN Global & MetaMUI x Blockchain Research Institute

We are excited to announce that SWN Global and MetaMUI became a Pioneer member of the Blockchain Research Institute(BRI).

BRI is a globally well-known and highly active blockchain think tank that acts as a catalyst for the blockchain transformation. BRI has more than 100 research projects underway to explore, understand, document, and inform leaders about blockchain strategies, market opportunities, and implementation challenges. All research and programs at BRI are funded by its members, including enterprise organizations, governments, universities, colleges, and blockchain startups from around the world.

Here is a greeting from Don Tapscott, Executive Chairman of the Blockchain Research Institute.
“Digital identity and central bank digital currencies are two emerging technologies that will bring about profound changes to businesses, governments, and societies around the world. At the Blockchain Research Institute, we are pleased to collaborate with companies like Sovereign Wallet Network and MetaMUI, who are paving the way for digital transformation using these technologies,”

We are proud to be a member of the BRI, and we are excited to conduct joint research with the BRI and maintain a strong connection with each other as we build a new era of digital currency.

BRI announcement link:
https://www.blockchainresearchinstitute.org/blog/2022/01/17/swn-metamui/

About MetaMUI

MetaMUI is the world’s first identity-based blockchain to truly enable regulated peer-to-peer (P2P) transactions. It can be used as the cornerstone for most financial offerings, with its high versatility and support for custom-developed use cases that enable easy implementation and issuance of Central Bank Digital Currencies (CBDCs) on the MetaMUI CBDC platform.
MetaMUI is built around Self-Sovereign Identity (SSID) technology. By combining SSID with a blockchain token mechanism, MetaMUI was able to create a new identity-based CBDC system that can protect users’ privacy while providing an identity-based transfer system that satisfies travel rules and other regulatory requirements such as Know Your Customer (KYC), Anti-Money Laundering (AML), and Countering the Financing of Terrorism (CFT).

Website/Linkedin/Twitter/Telegram/Youtube/Medium


PingTalk

What is Best-of-Breed-Technology? | Ping Identity

"Best-of-breed" technology is a term in the tech community that refers to individual solutions that are really good in their specific area. Actually, best-of-breed refers to the best solutions out there?based on customer reviews and tech ratings?compared to a single platform that can do everything, but may not do each function as well as a single solution dedicated to that specific capability.

"Best-of-breed" technology is a term in the tech community that refers to individual solutions that are really good in their specific area. Actually, best-of-breed refers to the best solutions out there?based on customer reviews and tech ratings?compared to a single platform that can do everything, but may not do each function as well as a single solution dedicated to that specific capability.

 


Indicio

Senior React Native Mobile Software Engineer

The ideal candidate will be responsible for the technical design and implementation of new products and enhancements in React Native. They will work in all phases of the development cycle: concept to implementation. As you'll work remotely... The post Senior React Native Mobile Software Engineer appeared first on Indicio Tech.

The ideal candidate is a self-motivated, multi-tasker, and demonstrated team-player. You will be a lead developer responsible for the development of new React Native mobile (iOS and Android) apps in the Decentralized Identity industry. You should excel in working with large-scale applications and frameworks autonomously. As you’ll work remotely you must have outstanding communication and leadership skills.

 

We will ask for a code sample if you are short listed.

 

Responsibilities

- Writing clean, high-quality, high-performance, maintainable code in React Native
- Develop and support mobile software including applications, database integration, interfaces, and new functionality enhancements
- Coordinate cross-functionally to ensure the project meets business objectives and compliance standards
- Support test and deployment of new products and features
- Participate in code reviews

Requirements

- Bachelor’s degree in Computer Science (or related field)
- Expert in React Native
- Expert in JavaScript
- Expertise in coding for iOS and Android apps
- 3+ years of relevant work experience
- Ability to multi-task, organize, and prioritize work

Bonus qualifications

- TypeScript
- Experience with Agile or Scrum software development methodologies
- Slack super user

The post Senior React Native Mobile Software Engineer appeared first on Indicio Tech.


Identosphere Identity Highlights

Identosphere 65 • Building DID with MS • EU DGA meets TOIP (& MyData) • Devon Loffreto still not Moxie Marlinspike

Your weekly digest of the latest news, events, and developments related to creating a system for online identity that gives users ownership of their personal information and eliminates the need for access control lists.
Thanks to Patrons like you. Consider supporting this publication with a monthly payment via Patreon ← click here

…or reach out to Kaliya, and she will invoice by your preferred means!

Read previous issues and Subscribe : newsletter.identosphere.net

Contact \ Content Submissions: newsletter [at] identosphere [dot] net

Correction: Moxy Tongue (Devon Loffreto, still not Marlinspike)

Last week we referenced Moxie Marlinspike’s first impressions of web3 (he really gets it!), recently posted on his blog.

We also shared an image about Human Authority found on Devon Loffreto’s website Moxy Tongue, misattributed (again) to Moxie Marlinspike; a longstanding misattribution that was only recently, “once and for all,” publicly corrected.

TL;DR: no relation (apart from the now-famous misattribution).
- Devon Loffreto came up with the original idea of Sovereign Source Authority, which became Self-Sovereign Identity.
- Moxie Marlinspike created Signal, a privacy-focused messaging app.

Upcoming

Build Your Identity Solution Using Hyperledger Aries 1/20

How To: Own Your Identity 1/25 UFOstart (IAMX)

Speaking: Markus Sabadello, Michael Shae, Tim Brückmann, Tim Heidfeld

Hyperledger Indy Technical Deep Dive 2/3

Data Space Launchpad MyData • Until 2/22

Build using Decentralized Identities with Microsoft

Introduction to DID 01/13

Introduction to DID (continued) 01/20

Introduction to Decentralized Identity 01/24

Build an app that uses Verifiable Credentials 01/25

Setting up your DID infrastructure 01/25

Hackathon: Let’s build something cool with DID 01/27

High Level #SSI101: An Introductory Course on Self-Sovereign Identity Spherity

Outside of a few philosophers, social scientists, and a tiny minority of specialized technologists, however, most people feel uncomfortable making any definitive or authoritative statements about identity. 

Digital Identity Trends and Predictions for 2022 Signicat

What will be the buzzwords of 2022?

Identity wallet

Decentralized identity

Web 3.0

Passwordless

Kim Cameron  Remembering Kim Cameron Vittorio Bertocci

Kim might no longer update his blog, nudge identity products toward his vision or give inspiring, generous talks to audiences large and small, but his influence looms large in the identity industry – an industry Kim changed forever. 

Memories of Kim Cameron Ian Glazer

Reification. I learned that word from Kim. In the immediate next breath he said from the stage that he was told not everyone knew what reify meant and that he would use a more approachable word: “thingify.” And therein I learned another lesson from Kim about how to present to an audience.

SSI Standards Community Resources - DID Primer Credentials Community Group

decentralized public key infrastructure (DPKI) could have as much impact on global cybersecurity and cyberprivacy as the development of the SSL/TLS protocol for encrypted Web traffic (now the largest PKI in the world).

Why we need DIDComm  IdentityWoman

This is the text of an email I got today from a company that I had a contract with last year [...] I was reminded quite strongly why we need DIDComm as a protocol to enable the secure transport of all sorts of things not just signed VCs but intermediate uses

NGI - ESSIF LAB reports Adding SSI to internet communications using Sylk Suite by Bloqzone

The project SSIComms adds SSI to internet communications by adding SSI wallets to the renowned SYLK Suite, an award winning ensemble of communications solutions with the SIP protocol at its core.

SSI Mandate Service by Visma Connect

The SSI mandate service is a generic and holistic approach to provide and request mandates. Mandates are SSI credentials signed by the dependent that can be requested by either the dependent or authorized representative. These credentials can be used to prove to a verifier that the authorized representative is authorized to act for specific actions on behalf of the dependent.
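A minimal sketch of how such a mandate might be issued and checked. The credential layout, field names, and the HMAC-based "signature" below are simplifying assumptions for illustration only; the actual service would use W3C Verifiable Credentials with asymmetric signatures and DID resolution.

```python
# Toy mandate credential: the dependent signs a statement authorizing a
# representative for specific actions; a verifier checks signature and scope.
import hashlib
import hmac

def sign(secret: bytes, payload: str) -> str:
    """Stand-in for the dependent's cryptographic signature (HMAC for brevity)."""
    return hmac.new(secret, payload.encode(), hashlib.sha256).hexdigest()

def issue_mandate(dependent_secret: bytes, dependent: str,
                  representative: str, actions: list[str]) -> dict:
    """The dependent issues a mandate naming a representative and allowed actions."""
    payload = f"{dependent}|{representative}|{','.join(sorted(actions))}"
    return {
        "dependent": dependent,
        "representative": representative,
        "actions": actions,
        "proof": sign(dependent_secret, payload),
    }

def verify_mandate(dependent_secret: bytes, mandate: dict, action: str) -> bool:
    """A verifier checks the proof and that the requested action is covered."""
    payload = (f"{mandate['dependent']}|{mandate['representative']}|"
               f"{','.join(sorted(mandate['actions']))}")
    valid = hmac.compare_digest(mandate["proof"],
                                sign(dependent_secret, payload))
    return valid and action in mandate["actions"]

secret = b"dependent-key"
m = issue_mandate(secret, "did:ex:alice", "did:ex:bob", ["file_tax_return"])
assert verify_mandate(secret, m, "file_tax_return")
assert not verify_mandate(secret, m, "sell_house")
```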

OnboardSSI by Quadible

The concept of SSI was designed with the citizen and privacy in mind. However, existing implementations lack user-friendliness (e.g. showing hash codes to users), creating potential barriers in users’ adoption. OnboardSSI focuses on providing a secure and user-friendly wallet solution creating an easier way for citizens to manage their identity. 

Data Privacy Kids PRIVCY ACT Me2BA

Strengthen the legal protections afforded to children under COPPA, and extend COPPA protections to adolescents ages 13 to 17, who have long gone without privacy protections online, while creating new rights for families. 

Prohibit surveillance advertising and other harmful uses of data on all digital services likely to be accessed by children, rather than limiting protection to ineffectual notice and consent on “child-directed sites.”

2022: LOOKING AT THE YEAR AHEAD MyData

As MyData Global saw in our reflection on 2021, the transformation towards a human-centric personal data economy is underway. This transformation is driven by two forces: first, the dominant unethical approaches to personal data are starting to show how unsustainable they really are.

EU DATA GOVERNANCE ACT MEETS TOIP FRAMEWORK

The DGA defines an “intermediary” that facilitates processing and sharing of data for individuals and organizations to “…increase trust in data intermediation services and foster data altruism across the EU”. In the MyData framework for user-controlled data sharing, intermediaries are called MyData Operators and there is a certification program in place. 

Organizational Update How LEI datasets can enhance global sustainability initiatives and climate-aligned finance GLEIF

During COP26, GLEIF announced a partnership with Amazon and OS-Climate to add LEI datasets to Amazon’s Sustainability Data Initiative [...] We have since caught up with Ana Pinheiro Privette, Global Lead for ASDI, to discuss how the partnership is working to improve global sustainability data modelling, mapping and calculations, and the expected impact on climate finance risk and opportunity evaluations.

Guidance on the Acceptable Use of Biometrics DIACC SIG

• Biometrics SHOULD only be used where its use is demonstrably necessary and […] is proportionate to basic human rights, privacy laws, and justifiable to the benefits gained.
• The biometric technology MUST require suitable accuracy, minimize data collection
• The evaluation of proportionality SHOULD include evaluating […] sensitivity, necessity, proportionality, effectiveness, and minimal intrusiveness.

Use Case A MAJOR - AND OVERDUE - POWER SHIFT IS COMING TO TRAVEL Phocuswire

it will be a transformative change, shifting power from travel suppliers to travelers themselves and giving travelers more choice, better personalization, lower friction and more security.

Desire to store less digital identity data stokes travel’s SSI brushfire BiometricUpdate

The potential use cases for self-sovereign identity to transform the travel industry are almost limitless, particularly with the impending arrival of decentralized identifier communications, also known as DIDComm

Can Verifiable Credentials Make Life Better for Refugees? Affinidi

Let’s say Mr.X is forced out of his country due to war and he reaches the neighboring country but doesn’t have any physical document to prove his identity such as name, address, educational qualifications, work experience, etc.

Web3 What is Web3 and Why It Matters Dion Hinchcliffe OpenSea, Web3, and Aggregation Theory Stratechery

what gives Aggregators their power is not their control of supply: they are not the only way to find websites, or to post your opinions online; rather, it is their control of demand. People are used to Google, or it is the default, so sites and advertisers don’t want to spend their time and money on alternatives; people want other people to see what they have to say

Web3 and Digital Embodiment Phil Windley

Web3 will make a difference for all of us if it enables people to become digitally embodied, able to recognize, remember, and react to other people and organizations online—without the need to be in someone else's database.

Web 3.0 - How to get started! Tech3.0

I will go through how to create your own DID (ID on the blockchain), and how you can use it today with Profile. @tryProfile @ElastosInfo

What is Web3Auth? Web3Auth partners with Polygon Studios to bring seamless logins to the Polygon ecosystem

Web3Auth aggregates OAuth (Google, Twitter, Discord) logins, different wallets, and existing key management solutions, and provides dApps/wallets a familiar experience that fits every user. Mobile, web, and blockchain agnostic

Metaverse A Digital Identity Fit For The Metaverse Forbes

The SSI model has an individual’s value — be it crypto, in-game items, or other NFTs — directly tied to their identity. It will be accessible with a simple click for physical services, like an Uber, as well as digital ones

Avatars May Use SSI In Metaverse To Prove Identity Hypersign ID

The tech got a surge especially after Facebook decided to change its name to Meta. In this blog […] focus will be on explaining why decentralized digital identities are an important tool for Metaverse to replicate the real world.

Self Sovereign Identity and Web3: From the metaverse to real life Talao

SSI makes it easier to rely on traditional economic actuators (Brands) online and off-line to develop traffic and business on decentralized platforms and the Metaverse

Public Sector MetaMUI and Sovereign Yidindji Government launched 1st self-sovereign identity-based National ID system Cointelegraph

We are delighted to announce that our first E-Government pilot program with the Sovereign Yidindji Government has been successfully completed on Jan 7, 2022.

Decentralized Identity & Government Evernym

a few notable government-led projects, such as Aadhaar (India), Verify (UK), eIDAS (EU), and the Ontario Digital Identity Program (Canada) - What decentralization means for portability, scalability, flexibility, and privacy - How governments and commercial organizations can enhance existing federated identity systems with verifiable credentials

the potential of Self-Sovereign Identity with representative use cases    

Our Self-Sovereign Identity initiative with SwissSign, the canton of Aargau and cardossier clearly shows: Even if there are still uncertainties regarding technical maturity and governance – SSI is happening and brings major advantages in data protection and cross-organizational digitization.

Companies SURF: Technical exploration Ledger-based Self Sovereign Identity Identity Economy DE  

the privacy-friendly nature of SSI, end-user control over disclosure of personal information, and the SSI trust model aligned well with the public values typically found in R&D. The platform we used (based on Hyperledger Indy) allowed us to successfully run all use cases.

SSI initiative open to new players adnovum Procivis launches SSI+

Composed of the desk, wallet and gateway, SSI+ offers a complete solution for issuers, holders and verifiers of verifiable credentials (VCs) to get started with self-sovereign identity projects today. 

Hello, User: Episode 13 with Katryna Dow

Katryna discusses Meeco’s mission to enable everyone on the planet access to equity and value in exchange for the data and information they share.

International Thank You Day Jolocom

We at Jolocom reflect on amazing projects that became possible by joining forces with partners such as T-Labs, Bundesdruckerei, Stacks, and TIB – the Technical Information Library Hanover. 

Research Non-human Personas: Including Nature in the Participatory Design of Smart Cities Martin Tomitsch, Joel Fredericks, Dan Vo, Jessica Frawley, Marcus Foth

we introduce a framework for developing and employing non-human personas. As a key element of the framework, we describe a middle-out approach for forming a coalition that can speak on behalf of the non-human species that are impacted by design decisions.

Identity not SSI  Nat Sakimura delves into Financial-Grade API ubisecure

“The data economy needs a secure and interoperable data network. And we are finally getting there with FAPI and eKYC standards. So, you guys need to get ready for the ride. It’s the time. You need to start acting, start preparing for that.”

The Rise of the Identity Data Fabric Radiant Logic

Every new identity project takes much longer than anticipated, demands huge costs in customization, presents a huge burden on staff across the enterprise, and reveals security gaps due to the complexity and inflexibility of the legacy infrastructure. 

Thanks for Reading!

Read more \ Subscribe: newsletter.identosphere.net
Support this publication: patreon.com/identosphere
Contact \ Submission: newsletter [at] identosphere [dot] net

Monday, 17. January 2022

KuppingerCole

Cloud Backup and Disaster Recovery

by Mike Small

The KuppingerCole Market Compass provides an overview of the product or service offerings in a certain market segment. This Market Compass covers solutions that provide backup, restore and disaster recovery of IT service data into the cloud in the context of the hybrid IT service delivery environment that is now commonly found in medium to large organizations.

IDnow

Austria’s Know Your Customer Regulations Allow Fully Automated, Biometric Identification Procedures

The world is moving towards digitalisation, and it's having both a positive and a negative impact on businesses. On one hand, with the rise of cybercrime (cybercrime damages cost the world roughly $6 trillion in 2021), regulators and authorities are looking into solutions that can ensure the highest security standards. On the other hand, completely automated processes that can be used remotely are a new requirement amongst users.

The Austrian Financial Market Authority (FMA) took an important step and changed its online identification ordinance. The Austrian Anti-Money Laundering Act (Austrian AMLA) has been amended to allow the use of a fully automated biometric process for remote identity verification. The new process allows banks and other AML-obligated companies to comply with Know Your Customer (KYC) requirements using a variety of remote identity verification methods. This is a positive development not only for users, but also for companies, which can now offer quicker processes and achieve better conversion rates.

According to the proposed amendment, biometric verification criteria must meet the current state of the art and offer the same level of security as an "in person" identification. The procedure involves technical requirements ranging from ID document security feature checks to biometric capture and liveness detection, which can be verified via video recording. From January 2023 onwards, under the new amendment, a photo ID must additionally be verified by reading its electronic security chip (NFC chip).

IDnow constantly works with authorities, regulators, and European standardisation working groups, such as the FIDO Alliance working groups, on topics ranging from identity verification to European digital identity. IDnow is also a member of Hyperledger, W3C, and the Linux Foundation, working towards a safe, modern identity ecosystem and a superior platform for all compliance needs.


KuppingerCole

Adding Bread to the Sandwich: Beyond MITRE D3FEND

by Martin Kuppinger

Commissioned by HCL Software

Over the past years, various frameworks and models for defending against cyber-attacks have been published. A popular one is the NIST CSF (Cybersecurity Framework); another is MITRE D3FEND™. Both overlap in some areas and differ in others. But when looking at these approaches, there are also missing elements that are required for a comprehensive approach.

Comparing NIST CSF and MITRE D3FEND™

While NIST CSF consists of the five stages Identify – Protect – Detect – Respond – Recover, the MITRE approach has Harden – Detect – Isolate – Deceive – Evict as its five main stages. When comparing these in a matrix, it looks like this:

NIST CSF | MITRE D3FEND™
---------|-------------------------
Identify |
Protect  | Harden
Detect   | Detect
Respond  | Isolate, Deceive, Evict
Recover  |

Table 1: Mapping NIST CSF to MITRE D3FEND™ shows the overlaps and differences between the two approaches.

Also, while the MITRE approach is more a list of technical steps and technologies, NIST also focuses on continuous improvement. It details the actions by mapping the functions (such as "Identify") into categories, which are split into subcategories that then reference established standards such as ISO/IEC 27001:2013 or COBIT 5.
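The stage mapping discussed above can be written down as a small lookup table; the empty entries make the "missing bread" visible programmatically. This is merely a convenience encoding of Table 1, not part of either framework.

```python
# Stage mapping from Table 1: NIST CSF functions to MITRE D3FEND stages.
# Empty lists mark NIST stages with no MITRE D3FEND counterpart.
NIST_TO_MITRE = {
    "Identify": [],
    "Protect": ["Harden"],
    "Detect": ["Detect"],
    "Respond": ["Isolate", "Deceive", "Evict"],
    "Recover": [],
}

def uncovered_stages() -> list[str]:
    """NIST CSF stages that MITRE D3FEND does not (yet) address."""
    return [stage for stage, mitre in NIST_TO_MITRE.items() if not mitre]

assert uncovered_stages() == ["Identify", "Recover"]
```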

When analyzing the MITRE framework, one could also argue that some essential steps are lacking, such as hardening the operating system, patching applications, or hardening by enforcing consistent access controls at the system and application level. In addition, Access Governance for enforcing the least-privilege principle, or Privileged Access Management (PAM), should be considered along with endpoint-management-related capabilities, which are absent. Notably, all of this could easily be added to the framework.

The even more interesting challenge becomes apparent in Table 1: the bread around the sandwich is missing, with the phases Identify and Recover not (yet) being part of the MITRE framework. NIST CSF starts with Asset Management as the first category within the first stage, "Identify". Without understanding which systems are in place, which software is running on them, and how they are configured, it becomes difficult to know which systems to harden and how, to analyze the state of security, and to respond.

You can’t protect what you don’t know

Experience from practice shows that, in case of incidents, many organizations struggle with the simple fact that they don't know exactly which systems (be it clients, servers, or services) they have in place that could be affected. Identifying these systems takes valuable time, and the right endpoint management systems can assist in discovery and identification.

Asset management and software inventory are essential capabilities for every defensive approach, capturing information about the state of IT. Such a repository is the foundation for automating hardening and protection activities as well as response.
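As a sketch of why such a repository matters, a minimal asset inventory turns the incident-response question "which of our systems run the affected software?" into a one-line query. Field names below are illustrative.

```python
# Minimal asset repository: records systems and their installed software so
# that affected machines can be found immediately during an incident.
from dataclasses import dataclass, field

@dataclass
class Asset:
    hostname: str
    kind: str                                               # "client", "server", or "service"
    software: dict[str, str] = field(default_factory=dict)  # package name -> version

class AssetInventory:
    def __init__(self) -> None:
        self._assets: list[Asset] = []

    def register(self, asset: Asset) -> None:
        self._assets.append(asset)

    def affected_by(self, package: str, vulnerable_version: str) -> list[str]:
        """Which systems run the vulnerable package version?"""
        return [a.hostname for a in self._assets
                if a.software.get(package) == vulnerable_version]

inv = AssetInventory()
inv.register(Asset("web-01", "server", {"log4j": "2.14.1"}))
inv.register(Asset("db-01", "server", {"postgres": "14.1"}))
assert inv.affected_by("log4j", "2.14.1") == ["web-01"]
```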

Response is not enough – keep your business up and running

While protecting or hardening, detection, and responsive actions such as isolation, deception, and eviction are essential, there is also a need for recovery. These activities must also be well prepared for rapid reaction in case of an incident. This might include, for example, recovering both endpoints and data after a ransomware attack to return to normal work as quickly as possible. In many other types of attacks as well, the ability to recover or restore to a known state is an essential capability.

Thus, a comprehensive approach must look beyond the technical response and forward to getting back to work from a known, safe state. A key capability here is the ability to recover systems, including endpoints, rapidly, which is a common capability of Endpoint Management solutions available today.

Operate and automate

Last but not least, it is also about automation and efficient IT operations. Neither steps such as identification and hardening, nor response and recovery, can be based on manual activities alone. Efficient protection across all systems and services requires a high degree of automation, where, again, Endpoint Management comes into play. Rapid reaction equally requires automation, to be fast in, e.g., restoring systems and data.

IT operations is the glue that ensures that all these activities can be executed fast, efficiently, and with a high degree of automation.

Taking the best of both worlds: Bread, mustard, cheese, and more

Both frameworks, the one from NIST as well as the one from MITRE, deliver very valuable input for organizations implementing their cyber defense, as do other references such as ISO/IEC 27001:2013 or NIST SP 800-53 Rev. 5. The analysis of these frameworks also shows that there is no single, simple answer to all challenges. The depth MITRE delivers when it comes to technical activities is impressive, while NIST CSF takes a broader perspective.

For being well prepared against cyber-attacks, a broad perspective is essential, from the identification of both risks and the assets to be protected, to the ability to recover quickly from attacks. Technologies such as UEM, Asset Management, and solutions that support recovery from both a data perspective (backup and restore) and a system perspective (UEM again) must therefore become part of a comprehensive approach to defense against cyber-attacks.

The KuppingerCole Recommended Risk Framework:

The Sandwich | When To Use       | The New Recommended Framework
-------------|-------------------|------------------------------
Bread        | Before the Attack | Identify
Mustard      |                   | Harden
Cheese       |                   | Protect
Ham          | During the Attack | Detect
             |                   | Respond: Isolate, Evict, Patch
             | After the Attack  | Recover
Bread        |                   | Improve

In the end, it is like a good sandwich: it is not only about the ham or cheese, the mustard or other sauces; it is also about the bread and putting it all together the right way. MITRE D3FEND™ is a very valuable approach, providing depth at the technical level that other models don't deliver. Other approaches such as the established NIST CSF add another perspective and broader coverage across the whole cycle of cyber-attack resilience.


Ocean Protocol

SmartPlaces partners with Ocean Protocol to unlock data monetization for Web3 social interaction app

SmartPlaces Protocol is partnering with Ocean Protocol to enable users who contribute to social interactions through the SmartPlaces app to securely monetize their data while preserving their privacy.

The mission behind SmartPlaces Protocol is to provide users with a new way of social interaction based on the benefits of blockchain technology. SmartPlaces Protocol makes it possible for people to interact and connect in real-world situations. This is done through a geo-location based mobile application that allows them to make their thoughts and interests visible for others in a direct real-life environment.

With the “1-click-wingman”, people can easily seize everyday opportunities and immediately connect with others. The hyper-localized application will launch later in 2022 and, based on strategic partnerships, will be made available in approximately 40 countries. In addition, SmartPlaces will provide use cases for the Metaverse, as it allows users to earn an identity and trust label based on the volume and quality of real-life interactions with others.

The SmartPlaces app data will be shared and monetized on Ocean Market. Ocean’s Compute-to-Data technology aligns perfectly with SmartPlaces Protocol’s needs, as it provides a means to exchange data while preserving privacy: the data stays on-premise with the data provider, and data consumers can run compute jobs on the data, for example to train AI models.
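The Compute-to-Data idea can be illustrated with a toy model: the provider keeps the raw rows and executes the consumer's job locally, returning only the aggregate result. This is a conceptual sketch, not the Ocean Protocol API.

```python
# Toy Compute-to-Data: the dataset never leaves the provider; the consumer
# submits a job and receives only its result.
from typing import Callable

class DataProvider:
    def __init__(self, private_rows: list[dict]):
        self._rows = private_rows  # stays on-premise with the provider

    def run_job(self, job: Callable[[list[dict]], float]) -> float:
        """Execute the consumer's job locally and return only the result."""
        return job(self._rows)

provider = DataProvider([
    {"user": "a", "visits": 3},
    {"user": "b", "visits": 7},
])

# Consumer-side job: computes an aggregate, never sees individual rows.
average_visits = provider.run_job(
    lambda rows: sum(r["visits"] for r in rows) / len(rows)
)
assert average_visits == 5.0
```

A production system would additionally sandbox the job and vet its output so that it cannot exfiltrate individual records.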

Bruce Pon, Co-Founder of Ocean Protocol, commented:

The current data economy is not inclusive, transparent, nor does it allow for easy entry and access. The collaboration between Ocean and SmartPlaces Protocol is adding to the opportunities for people to take control over their data and monetize–this is the social media of tomorrow, fair, responsible, open, beneficial for everyone who is a part of it.

Hung Luu, CEO of SmartPlaces Protocol added that:

The digitalised world lacks real-life interactions between people. Smart Places Protocol’s vision is to create a new era of social interaction by engaging people to connect with others. I’m sure that our values and vision have the potential to align decentralized technologies with everyone on a real social level. We believe that Ocean Protocol could play a major role in enabling this ambitious but realistic plan in a distributed way. We envision that our cooperation will build a redistributed social data ecosystem, by being a solid proposition to our society.

“We’re aware that the data produced by the user’s activities, like geolocation, consumption behavior, and other metadata, is private and must be protected. With Ocean Protocol we’re sure to have the right partner to combine privacy requirements with opportunities for monetizing these data in favor of the users,” says Björn Heinze (COO & Legal), an expert and specialist in data protection and GDPR.

About Ocean Protocol

Ocean Protocol is a decentralized data exchange platform spearheading the movement to unlock a new Data Economy, break down data silos, and open access to quality data. Ocean’s intuitive marketplace technology allows data to be published, discovered, and consumed in a secure, privacy-preserving manner. By giving power back to data owners, Ocean resolves the tradeoff between using private data and its public exposure.

About SmartPlaces Protocol

SmartPlaces’ mission is to provide users with a new way of social interaction based on the benefits of blockchain technology and to build a decentralized ecosystem. As a result, SmartPlaces Protocol helps to (re)connect people in real life and creates unique user experiences.

SmartPlaces Protocol introduces the “connect2earn” approach and rewards all users who contribute by interacting with other users. SmartPlaces Protocol will also allow users to earn an NFT-based identity and trust label that covers various use cases within the Metaverse.

SmartPlaces partners with Ocean Protocol to unlock data monetization for Web3 social interaction… was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 16. January 2022

KuppingerCole

Analyst Chat #108: Privacy and Consent Management

"Privacy and Consent Management" is an exciting topic in a continuously changing market. Annie Bailey has just completed her latest Leadership Compass, which researches this market segment. To mark the release of this document, she joined Matthias for an Analyst Chat episode where she talks about the innovations and current developments.

In A Nutshell

In the episode 108 “Privacy & Consent Management” Matthias hosts Anne Bailey.

Q: “From a definition point of view, what do we need to think of when we talk about privacy and consent management?”

Anne: “Yeah. So this is one of those terms where you could spin it in a lot of different ways, you know, privacy is so much in the public discourse that it doesn't really have a concrete definition anymore. So I thought it might be useful to get us all on the same page before we talk any more about it. So the way at least I have defined privacy and consent management in this most recent report. It's, of course, considering organizations and it's their administrative and governance capabilities over data privacy within their organization and of course, the tools and the solutions that are there to make that happen. So you could think of it then in a simplified manner about the capabilities that such a tool or a solution would have to the first group of capabilities, would then to be able to manage any incoming signals about privacy and consent. So these are things like being able to manage cookies and trackers that are on websites, being able to accept and then implement those consent or preference choices that an end user would make. And that would be over the range of different channels. So on a smart TV, on a mobile device, on a website, over the phone, via email in person interactions as well, should be considered. So that's all about managing the incoming signals. But what's also very important as well is the organization's ability to take care of their own internal management of privacy. So being able to govern sensitive data, which is in the organization and private data, being able to document their steps towards compliance and something which is a buzzword in this most recent report is being able to operationalize privacy.”
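The scan-then-act automation Anne describes can be sketched as two steps: detect fields that look like private data, then anonymize them in place. The email-pattern detector below is deliberately simplistic and purely illustrative.

```python
# Sketch of operationalized privacy: scan a record for private data, then
# leverage automation to anonymize the detected fields.
import hashlib
import re

EMAIL = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")  # naive private-data detector

def scan(record: dict) -> list[str]:
    """Return the field names whose values look like private data."""
    return [k for k, v in record.items()
            if isinstance(v, str) and EMAIL.fullmatch(v)]

def anonymize(record: dict) -> dict:
    """Replace detected private fields with a stable pseudonym."""
    out = dict(record)
    for k in scan(record):
        out[k] = hashlib.sha256(record[k].encode()).hexdigest()[:12]
    return out

row = {"id": "42", "contact": "alice@example.com", "plan": "basic"}
assert scan(row) == ["contact"]
assert "@" not in anonymize(row)["contact"]
```

Real solutions combine many detectors (classifiers, dictionaries, data lineage) and a range of actions beyond hashing, such as masking, tokenization, or deletion.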

Q: “Recently, you published an updated version of your Leadership Compass report, which compares providers and services. What are the changes in the market that you can observe that you want to share with us?”

Anne: “Yeah. So this is an especially dynamic market area. Things are always changing. And so we can see some pretty big market changes between the report which was published 18 months ago or so and the one which just came out this week. And that's in the types of vendors that were interested in participating. So what we saw in the last report were a lot of vendors that really focused on being able to manage those incoming signals, so being very focused on cookie management, on being able to collect consents and preferences and make sure that those are all able to be implemented in the many different connected systems within an organization and all the downstream vendors that it may impact. Very focused on this incoming flow of information from end users. And what we saw, which was different in this report, is that there were more vendors that are really focused on data governance and using that as a foundation for privacy. So being able to operationalize and take action within the organization to further their privacy goals. And so we could think of that as an example. So being able to identify a privacy weakness of some sort in a process and then from that same administrative screen, then be able to do something to address that weakness. I guess we could go into more concrete details on what that could be. So, you know, if there was a scan done on a database and that scan returns the notification that there is private information in this database, there would then be the chance to leverage automation to go and anonymize those sensitive fields. So you're then connecting information about the status of privacy in the organization with an action to then improve it. So that was something that we noticed among several of the vendors that they're moving more in this direction. And that also does connect back to the relationship between the end user and the organization. 
So there was a big focus on being able to provide support for data subject requests and being able to process those. So in the same way of operationalizing privacy, if a consumer then submits a data subject request, the administrator would then be able to scan and automatically compile a report containing their personal information rather than needing to do that manually.”
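The database-scan example Anne describes — detect private information, then automatically anonymize it — can be sketched in a few lines. Everything below (the field names, the regex-based detector, the redaction marker) is a hypothetical illustration, not any vendor's actual product or API.

```python
import re

# Toy sketch of "operationalizing privacy": scan a record for fields whose
# values look like personal data, then anonymize the flagged fields.
EMAIL_RE = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

def scan_for_pii(record: dict) -> list[str]:
    """Return the field names whose values look like personal data."""
    return [k for k, v in record.items()
            if isinstance(v, str) and EMAIL_RE.search(v)]

def anonymize(record: dict, fields: list[str]) -> dict:
    """Replace flagged fields with a redaction marker."""
    return {k: ("<redacted>" if k in fields else v) for k, v in record.items()}

record = {"id": 17, "note": "contact alice@example.com", "amount": "42.00"}
flagged = scan_for_pii(record)      # only "note" is flagged
clean = anonymize(record, flagged)  # "note" is redacted, other fields untouched
```

A real privacy-management tool would of course use far richer classifiers than a single regex; the point is the coupling of a scan result to an automated remediation action.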

Q: “Vendors offer products and services globally. Do you think they can catch up with changing privacy and consent requirements?”

Anne: “Mm-Hmm. Yeah. And frankly, this is really hard to stay up to date with because given our very globalized presence on the internet and connection with consumers all around the world, many organizations do have to stay up to date with the regulations that are not just for their own jurisdiction and in the region where they reside, but they have to pay attention to where their customers are, where any of their downstream suppliers or, you know, MarTech partners may reside and where this data is moving. So they have to be aware of a much wider legal domain than they've been used to before. And as I mentioned before, this is a really dynamic space. And part of that is because there are many privacy regulations which are being released all around the world. So this is something that we've identified as a really key capability in privacy and consent management tools, is that having some basis, some support from legal experts in-house to be able to keep up with all of these changing regulations and be able to pass that knowledge down to their customers is a really valuable thing.”

 



Friday, 14. January 2022

Indicio

Sounding Off: A Major – And Overdue – Powershift Is Coming To Travel

PhocusWire The post Sounding Off: A Major – And Overdue – Powershift Is Coming To Travel appeared first on Indicio Tech.

Safle Wallet

Safle Announces its Farming Partnership Program with Dfyn


We are very excited to announce that Safle has partnered with Dfyn, to bring Safle’s community members an amazing farming program. Safle is premiering this program with a very lucrative launch pool with high rewards for people providing liquidity for the SAFLE-USDC pair. The current estimated APY for the launch pool is juicy to say the least — Grab it while you can!

Built with the intent of facilitating safe and simple access to the world of Web3, Safle does all it can to ensure a seamless experience for both users and developers. In the retail segment, Safle is a decentralised blockchain identity wallet that enables secure private key management and a seamless experience for dApps, DeFi and NFTs. For developers, Safle provides multi-chain wallets as a service as well as node infrastructure support to communicate with blockchains. To make its mission a success, good liquidity is the cherry on top, and this is exactly where Dfyn comes into the picture.

The initial launch farm will be followed by exciting trading competitions and additional farms will also be opened to reward community members providing liquidity.

Steps to start farming:

1. Create a SAFLE & USDC pool with 50% SAFLE / 50% USDC here: https://info.dfyn.network/pair/0x62f31f24c0e7987f4a92f3c20d872a577a5b0d2c
2. Once created you will receive your LP tokens.
3. Now go to: https://exchange.dfyn.network/#/launch-farms/0x04b33078Ea1aEf29bf3fB29c6aB7B200C58ea126/0x2791Bca1f2de4661ED88A30C99A7a9449Aa84174/v1
4. Click on “Deposit LP Tokens”, connect, and confirm.
5. Reap the rewards and claim them as per the release schedule. Rewards can be claimed over a period of 8 months in five tranches: Day 0 (last day of staking period), Day 60, Day 120, Day 180 and Day 240.
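The five-tranche release schedule above can be sketched as a small calculation. The staking end date and total reward amount below are hypothetical placeholders, not figures from the program.

```python
from datetime import date, timedelta

# Sketch of the stated release schedule: rewards claimable in five equal
# tranches on Day 0, 60, 120, 180 and 240 after the staking period ends.
def claim_schedule(staking_end: date, total_reward: float):
    tranche = total_reward / 5
    return [(staking_end + timedelta(days=d), tranche)
            for d in (0, 60, 120, 180, 240)]

# Hypothetical example: 1,000 reward tokens vesting from 14 Jan 2022
schedule = claim_schedule(date(2022, 1, 14), total_reward=1000.0)
# five (claim_date, 200.0) entries, the last one on day 240
```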

Both communities aim to create a collaborative environment for DeFi users to thrive in; the farming program is another step towards our unified mission. 👨🏽‍🚀 ✨

About Safle

A next-generation non-custodial wallet, self-sovereign identity protocol and Web 3.0 infrastructure provider for the decentralised ecosystem, governed by the community. Safle is a decentralised blockchain identity wallet that enables secure private key management and a seamless experience for dApps, DeFi and NFTs. In order to maintain a balance between developers and retail users, Safle intends to develop a wallet infrastructure in a completely non-custodial fashion using Open Governance Mechanisms via the SafleDAO, coordinated and maintained by the Safle token economy. The native $SAFLE token will not only enable token holders to propose and vote on changes (governance privileges) to functionalities and feature sets of the wallet and node services, but will also create a self-sustaining token economic model where value is generated by providing access to finance and identity in the decentralised digital world.

Website | GitHub | Discord | Twitter | Instagram | Telegram Ann | Telegram Chat

About Dfyn

Dfyn is a multi-chain AMM DEX currently functional on the Polygon network and Fantom. Dfyn nodes on various chains act as liquidity entry and exit points into the cross-chain liquidity super mesh that is being enabled by Router Protocol.

Website | Discord | Twitter | Instagram | Telegram Ann


Ontology

Staking vs Candidate Node Ownership

To stake or not to stake? Or even better, should I choose to stake or should I create my own candidate node instead? Is it worth it? What are the benefits? This and more you can find answered in this article, brought to you by your Telegram admin, Ontology Harbinger and proud owner of candidate node “CZ/SK Ontology community node”, DuMonT. Which one is better for me?

In order to make a correct decision, you should know what staking is, what it means to run a candidate node, and what are the main differences. To put it simply, staking means that you assign your ONTs to a candidate or consensus node. For doing this, you earn ONG rewards based on the distribution ratio set by the node owner. On the other hand, as a node owner, the main advantage is that you manage the distribution ratio. Let’s see the main pros and cons of both approaches.

Staking

(+) Low limit - you can stake as little as 1 ONT, while the upper limit depends on how much the specific node can accept from stakers.

(+) Higher flexibility - you can decide to cancel the full stake, or part of it, at any time. Your tokens are unlocked when the consensus round ends (in the case of unstaking from candidate nodes).

(+) No entry barrier - no extra fees applied, only a transaction fee of 0.05 ONG.

(-) Trust in the node owner - your rewards depend mainly on the node’s distribution ratio; the risk is that the node owner can change it or even cut it to 0, which may force you to stake to a different node. To minimize the risk, it is recommended to either stake with nodes that have the stable fee sharing ratio checkmark at node.ont.io, or regularly check whether the node you stake with has changed the ratio to unfavorable values.

(-) Switching nodes - if you decide to change a node, cancelling the stake and staking to a different node will result in losing 1 consensus round of rewards.

Candidate node owner

(+) You manage the distribution ratio - you decide on how much % of generated rewards you keep, and how much is distributed to your stakers.

(+) Potentially higher rewards - explained in the next part of this article.

(+) A way to promote yourself/your company.

(-) Higher entry limit - minimum amount to start a candidate node is 10,000 ONT.

(-) Lower flexibility — as a node operator you can not reduce your stake below 10,000 ONT, or unstake such an amount that would result in user stake being more than 10x higher than your remaining node stake (example: your node stake is 15K ONT, user stake is 100K ONT, at max you can unstake 5K ONT). In case you would need to unstake more, you must cancel your node entirely.

(-) Entry fee - to start a node you need to pay a fixed 500 ONG fee, + 0.05 ONG transaction fee.
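The unstaking constraint described above can be checked with a few lines. This is a sketch of the rule as stated (10,000 ONT minimum node stake, and the user stake may not exceed 10x the remaining node stake), not official Ontology tooling.

```python
MIN_NODE_STAKE = 10_000  # ONT

def max_unstake(node_stake: int, user_stake: int) -> int:
    """Largest amount a node operator can unstake without cancelling the node:
    the remaining node stake must stay at or above both 10,000 ONT and
    one tenth of the user stake."""
    floor = max(MIN_NODE_STAKE, -(-user_stake // 10))  # ceil(user_stake / 10)
    return max(0, node_stake - floor)

# The article's example: 15K node stake, 100K user stake -> at most 5K ONT
assert max_unstake(15_000, 100_000) == 5_000
```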

Summary

Staking is a great way to get passive rewards for holding your ONTs. Moreover, it is suitable for everyone as you can stake from as little as 1 ONT, and it also provides more flexibility than when you run a node. You can learn more about staking in the Ultimate Ontology Staking Guide. On the other hand, if you have more than 10,000 ONTs, and you are a long term holder that wants to maximize your rewards, it is a good time to consider creating your own candidate node.

I want to run a candidate node!

So you decided to run a candidate node? In the next part, I will cover how to create a candidate node, the requirements, and possible strategies.

How does it work?

Ontology candidate nodes are nodes which do not validate transactions. They can become active validators if they get into the top 15 by stake size, which would turn them into consensus nodes. A basic candidate node is something like a staking pool, which creates room for stakers. Therefore, once you register a candidate node and you are not in the top 15, you are not required to have your device online 24/7. This is all you need to start generating ONG rewards. In addition, you could run an online node, which synchronizes the blocks generated by consensus nodes. This would require your device to be online but does not bring you any extra rewards, as you do not actively participate in consensus. If you still want to go in that direction, you can find documentation at https://docs.ont.io/ontology-node/abstract, but it also requires some technical skill.

How to launch a candidate node?

The main tool you need is OWallet on your PC. It is a desktop wallet application developed by Ontology. Download it only from the official GitHub source: https://github.com/ontio/OWallet/releases.

Once you have OWallet installed, you can create a new wallet, import your existing wallet, or use a Ledger hardware wallet. Last but not least, you need at least 10,000 ONT in your wallet address, + at least 500.05 ONG. Ontology published a step-by-step guide on how to proceed within OWallet, you can find the guide here: https://ontology-1.gitbook.io/ontology-node-staking-docs/guides/sign-up-to-run-an-ontology-node

Once you launch the node within OWallet, and set important settings, you just need to wait until a new consensus round begins. At that point, your node generates rewards which will be available for you to claim after each consensus round ends.

Explanation of important settings in OWallet

There are a few important settings which you definitely should not miss, because they define reward distribution as well as how stakers see you.

Allowed stake unit — located on the “User Stake Authorization” page; determines how many ONT your node can accept from stakers. By default it is set to 0, which means stakers can not stake at your node. The maximum value you can set is 10x your initial node stake, and you can also see this value in the table below. So, if your node stake is 10K ONT, you can set a maximum of 100K ONT.

Fee sharing ratio — located on the “User Stake Authorization” page; determines the distribution ratio of rewards between the node operator and its stakers. You set the ratio for the node’s initial stake and the user stake separately. Remember, in OWallet you set the reward percentages which you, as node operator, decide to keep. The remaining rewards are automatically distributed to stakers. The first value represents the % of rewards generated by the node initial stake, while the second value represents the % of rewards generated by the user stake. If you set it to 100/100, you keep all rewards and stakers get 0 (so nobody will stake with you). On the other hand, if you set it to 0/0, all rewards go to the stakers and you get 0. The trick is to find the ideal setting which fits your strategy. Some popular ratios are 90/10, or even more generous to stakers, 90/5. This would be displayed to stakers at node.ont.io as 10/95. Keep in mind that any change of ratio takes effect in the consensus round after the next one. (Example: you change the ratio in OWallet during round 150. When round 151 begins, it will be displayed to stakers at node.ont.io as the next round ratio, and it becomes effective when round 152 begins.)

Node info — a separate page where you can fill in all the information about your node. For instance, you can change the node name, set a logo, etc. If you want to obtain a “Verified” checkmark next to your node at node.ont.io, you also need to fill in the “Email to contact with Ontology”.
You can learn more about checkmarks at https://ontology-1.gitbook.io/ontology-node-staking-docs/node-checkmark

When will I get rewards?

Reward distribution frequency is the same as when you are a staker. You do not get rewards for the round during which you launch a node, but you get your first rewards for the next consensus round. Rewards are added to your profit section each time when a consensus round ends. You can claim them after each round, or keep them accumulating and claim anytime you like. (Example: you launch the node during consensus round 150, you will get your first rewards when round 151 ends and the next rewards when round 152 ends, etc.)
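The fee sharing ratio settings described earlier map directly onto a small calculation. The sketch below follows the OWallet convention as explained above — the operator sets the percentages they keep, separately for node-stake and user-stake rewards — and the reward amounts are hypothetical.

```python
def split_rewards(node_rewards: float, user_rewards: float,
                  keep_node_pct: float, keep_user_pct: float):
    """Divide one round's ONG rewards between node operator and stakers.
    keep_node_pct / keep_user_pct are the OWallet values: the share the
    operator keeps of node-stake and user-stake rewards respectively."""
    operator = (node_rewards * keep_node_pct / 100
                + user_rewards * keep_user_pct / 100)
    stakers = (node_rewards + user_rewards) - operator
    return operator, stakers

# A 90/5 setting (shown to stakers at node.ont.io as 10/95), with
# hypothetical per-round rewards of 100 ONG (node stake) and 400 ONG (user stake)
op, st = split_rewards(node_rewards=100.0, user_rewards=400.0,
                       keep_node_pct=90, keep_user_pct=5)
# operator keeps 90 + 20 = 110.0 ONG; stakers receive 390.0 ONG
```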

Candidate node possible strategies

Operator only — some candidate nodes decide not to allow user staking to their node, so they participate only with their node initial stake. In this case, they keep all rewards generated by the node initial stake, but lose the potential of getting additional rewards from user stake. The difference from normal staking is that they do not need to worry about changing the distribution ratio, because they are the node owners.

Self promotion — some nodes may decide to use their node for promotional purposes, such as the Unifi Protocol node, which distributes 100% of all rewards to stakers. The node operator gets 0 rewards but in return is well visible in the community, which can help achieve the operator's long term goals in the future.

Win/win — this strategy means that you set a distribution ratio which gives you, as node operator, higher rewards than you would get with the “operator only” strategy, and at the same time gives good rewards to stakers, which motivates them to stake with you. Success of this strategy depends on the node operator’s ability to attract stakers, because the goal of such a node is to achieve 100% filling of the staking capacity. An example may be Harbinger nodes, which are represented by outstanding members of the Ontology community. Stakers can decide to stake with those Harbingers to support them and their work, instead of some company or corporation.

Anything else — you can develop your own strategy which fits you best :)

In the end, I hope this article helped you with your decision making process. Now it is your turn, whether you decide to stake or run a candidate node, that fully depends on your own preferences. Should you pick any of them, I welcome you onboard!

If you have any questions, or to report missing or wrong content, or if you want to tip my wallet to buy me a cup of coffee, please reach me on Telegram or Twitter. See you!

Want more Ontology?

You can find more details on our website for all of our decentralized solutions across identity and data, or keep up with us on Twitter. Our Telegram is for discussion, whereas the Telegram Announcement is designed for news and updates if you missed Twitter!

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

Staking vs Candidate Node Ownership was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ocean Protocol

OceanDAO Round 14 is Live


200,000 OCEAN available for Web3 data economy projects!

Hello, Ocean Community!

OceanDAO is a grants DAO to help fund Ocean community projects, curated by the Ocean community. Anyone can apply for a grant. The community votes at the beginning of the month. The Ocean ecosystem becomes self-sustainable as the builders of the Web3 data economy leverage Ocean Protocol to create products, services, and resources that the community finds valuable and fosters value creation for the Ecosystem.

The OceanDAO website has up-to-date information on getting started with OceanDAO.

OceanDAO’s next funding round — Round 14 — has 200,000 OCEAN available. Submissions are due February 1st, 2022; we encourage you to submit early and reach out to the Project-Guiding Work Group for feedback and guidance. The rest of this post has details about Round 14.

Thank you to all of the OceanDAO participants, voters, and proposers.

OceanDAO Round 14 Outline

Funding Amount

There is 200,000 OCEAN in grant funding available in Round 14.

Grant Funding Categories

Building / improving applications or integrations to Ocean
Community or developer outreach (grants don’t need to be technical in nature)
Unleashing data
Building or improving core Ocean software
Improvements to OceanDAO itself

Submit your Proposal

As a builder, you can check out the OceanDAO home page to get onboarded and create your proposal. Once completed, you can submit your proposal to our forum: Port.

As a voter, Port is where you can see all proposal details and cast an informed vote.

Proposals that meet the basic criteria are then submitted to Snapshot.

Proposals that receive support from the DAO receive funding in OCEAN.

Project Standing

If you have previously received a grant, you must update your Grant Deliverables inside each proposal to remain in Good Standing. This enables eligibility in upcoming Funding Rounds. Follow the instructions on the Project Standing Dashboard to learn more.

Funding Tiers

To incentivize completion and outcomes we are continuing Funding Tiers, which can be achieved by Teams delivering on their grant promises.

The funding tiers are:

1. New Project

Funding Ceiling: $3,000 USD
Requires: No one in your project has ever received a grant from OceanDAO. Open to all.
Benefits: Earmarked. Receive feedback during the application process. Introduced to related projects. Receive support during voting.

2. Existing Project

Funding Ceiling: $10,000 USD
Requires: You have completed 1 or more grants.
Benefits: Same as above. Receive promotion via Newsletter, Twitter, and other channels. Receive support during voting.

3. Experienced Project

Funding Ceiling: $20,000 USD
Requires: You have completed 2 or more grants.

4. Veteran Project

Funding Ceiling: $35,000 USD
Requires: You have completed 5 or more grants.

Above are the Funding Tiers accessible by projects for OceanDAO Grants Round 14. The amount requested is in USD; the amount paid is in OCEAN tokens. The conversion rate is the market price on the Proposal Submission Deadline.
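The USD-to-OCEAN conversion works out to a single division. The OCEAN price below is a hypothetical placeholder; the actual rate is the market price on the Proposal Submission Deadline.

```python
def grant_in_ocean(usd_amount: float, ocean_usd_price: float) -> float:
    """Convert a USD grant request into the OCEAN amount actually paid out,
    at the market price on the Proposal Submission Deadline."""
    return usd_amount / ocean_usd_price

# e.g. a $3,000 New Project grant at a hypothetical price of $0.60 per OCEAN
payout = grant_in_ocean(3000, 0.60)  # about 5,000 OCEAN
```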

Earmarks

“Earmarks” means that there are funds available exclusively to the first three groups listed below, without having to compete. For example, New Teams (Outreach) have 12,000 OCEAN available without having to compete against other projects. Beyond that amount, they have to compete.

24,000 OCEAN for New Teams (non-outreach category)
12,000 OCEAN for New Teams (outreach category)
30,000 OCEAN for Core Tech Initiatives (listed below)
134,000 OCEAN for remaining General Grants

A total of 200k OCEAN is available.

Core Tech Earmark & Initiatives

If you are interested in contributing to Ocean Protocol there are earmarked Core Tech Initiatives that are being continuously expanded upon. You can view them inside our wiki.

Please find us in the Core-Tech Working Group discord channel and review your proposal ahead of time to qualify for the Core-Tech Earmark.

Unleash Data Category

In Round 13 we revised the Unleash Data Category to follow the same rules as the rest of the Funding Categories. Unleash Data projects can again qualify for funding of up to 35,000 OCEAN, on the condition that they meet the following:

“Unleash data” has actually always had a broad intent, towards driving Data Consume Volume. It is intended for projects like: becoming a data broker to get data owners’ data into Ocean Market, networking / “hitting the phones” to increase consumption on Ocean Market, integrating Filecoin & publishing data on Ocean Market, etc. Here are further examples. It was never meant to be solely for constructing / maintaining a small number of datasets. Whatever is proposed should be high value-add to the Ocean ecosystem. Example of poor value-add: requesting a lot of OCEAN to construct / maintain a small number of datasets to sell with expectation of profit. We expect and encourage OceanDAO voters to downvote such proposals.

Burning becomes Recycling

Starting with Round 13, any funds remaining inside an Earmark are recycled into General Grants rather than burned. If any funds remain at the end of the round, they are moved back into the Treasury.

For intuition on how burning relates to voting in OceanDAO.

You can find all the steps below:

1. [Earmarked Grants] winners are counted first.
2. Projects that do not win an [Earmarked Grant] are then considered for funding under the [General Grant] category.
3. Any remaining funds inside an Earmark are then moved towards [General Grants].
4. [New & Core Tech] teams are thus eligible to be funded from both [Grant Categories]. Returning teams are eligible to be funded from [General Grants].
5. Proposals with 50% or more “Yes” votes receive a grant.
6. Any remaining funds inside the [General Grants] category are moved back to the treasury.
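The recycling rule amounts to a simple waterfall: fund earmark winners first, roll any unused earmark funds into General Grants, and return whatever is left to the treasury. The request totals in the sketch below are hypothetical.

```python
def settle_round(earmark_budget: int, earmark_requested: int,
                 general_budget: int, general_requested: int):
    """Sketch of the earmark-recycling waterfall described above."""
    earmark_funded = min(earmark_budget, earmark_requested)
    # Unused earmark funds recycle into the General Grants pool
    general_pool = general_budget + (earmark_budget - earmark_funded)
    general_funded = min(general_pool, general_requested)
    to_treasury = general_pool - general_funded
    return earmark_funded, general_funded, to_treasury

# Hypothetical: a 30,000 OCEAN earmark sees only 20,000 requested, so 10,000
# recycles into the 134,000 OCEAN General Grants pool
result = settle_round(30_000, 20_000, 134_000, 140_000)
# -> (20000, 140000, 4000): 4,000 OCEAN goes back to the treasury
```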

Quadratic Voting

We are planning to ship Quadratic Voting for Round 14.

This will include a portal where participants will be able to register, receive a voting boost, and increase their total voting power.

Please keep an eye on Twitter and Discord for announcements.
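As background on the mechanism: quadratic voting schemes typically weight a voter by the square root of the tokens they commit, so large holders gain influence sub-linearly. OceanDAO's exact formula and the registration boost are announced separately, so the boost factor below is purely an illustrative assumption.

```python
import math

def voting_power(ocean_tokens: float, boost: float = 1.0) -> float:
    """Quadratic-voting sketch: power grows with the square root of tokens.
    `boost` stands in for a hypothetical registration bonus."""
    return boost * math.sqrt(ocean_tokens)

# 100x the tokens yields only 10x the voting power
small = voting_power(100)     # 10.0
whale = voting_power(10_000)  # 100.0
```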

Working Groups

Head to the Ocean Discord and dive into each Working Group. There you will find each Working Group’s respective documents, stewards, members list, meeting times, and more:

Project Guiding Working Group
Parameters & Roadmap Working Group
Core Tech Working Group
OceanDAO Engineering Working Group
Data Consume Volume Working Group
Ambassadors Working Group

Working Group Outline

Each Working Group has its own public discord sub-channel, and can leverage real-time discord channels (voice/video). Each Working Group has its own membership and working process. Each Working Group has a weekly or bi-weekly call. Each Working Group has Stewards (who are accountable for running the group) and Members (people who participate and contribute).

Town Hall

Town Hall has been running weekly since the beginning of OceanDAO. The DAO is expanding, and the Town Hall will grow with it.

Town Hall will become a structured presentation and expand its scope to cover the entire Ocean Ecosystem, with each Working Group presenting slides on its current progress and updates.

Deadlines

Proposal Submission Deadline is February 1st at midnight GMT
Add OCEAN to your Voting Wallet by the Proposal Submission Deadline
A 2-day proposal Due Diligence Period ends February 3rd, 2022 at 23:59 GMT
Voting starts on February 3rd, 2022 at 23:59 GMT
Voting ends on February 7th, 2022 at 23:59 GMT
Funding deadline on February 21st, 2022 at 23:59 GMT

Starting with Round 13, grant requests via request.finance have been deprecated. Instead, you can now claim the grant directly from the OceanDAO website. Please find instructions on claiming your grant here.

OceanDAO Ecosystem

Visit Ocean Pearl to learn more about each project, track updates, get an overview of the grant process, and follow the Voting Leaderboard.

You can also find all Funding History & Round Data here!

Thank you!

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

OceanDAO Round 14 is Live was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 13. January 2022

KuppingerCole

Are You Ready for Security Automation?

Security Orchestration, Automation & Response (SOAR) tools are the latest in the evolution of automated cyber defenses and are set to become the foundation of modern Security Operations Centers (SOCs). But SOAR is not only for large enterprises. The benefits for smaller organizations should not be overlooked.





Coinfirm

Why DeFi Startups Need to Implement AML

What is DeFi? DeFi is an acronym for Decentralised Finance, a concept representing the shift away from intermediaries within financial product offerings to direct peer-to-peer interactions. To get rid of intermediaries, DeFi protocol developers involve the use of smart contracts – programmable financial instruments launched on accommodating blockchains and made available to everyone on the...

Torus (Web3 DKMS)

Web3Auth partners with Polygon Studios to bring seamless logins to the Polygon ecosystem


The partnership between Web3Auth and Polygon Studios arises as a natural extension of both companies’ vision to bridge the gap between Web2 and Web3 for both developers and users.

With this partnership, we’re excited to announce Web3Auth’s full support for all Polygon projects. Developers building on Polygon will now have full access to Web3auth’s plug and play auth suite to provide seamless user logins for both mainstream and Web3.0 users.

If you are a game developer or an NFT creator on Polygon, what does this mean for you?

Web3Auth aggregates OAuth (Google, Twitter, Discord) logins, different wallets, and existing key management solutions, and provides dApps/wallets a familiar experience that fits every user. Mobile, web, and blockchain agnostic, Web3Auth fits right into your application or wallet.

Developers can integrate Web3Auth into their application with a couple of lines of code and allow users to connect to any blockchain with their method of choice. With web and mobile SDKs, dApps can fully white-label the onboarding process and take charge of the UI/UX.

We have been powering dApps and wallets such as Skyweaver by Horizon Games, Mycryptoheroes, Kash, Ubisoft’s launch on Kukai, and 300+ other applications, in the process securing over eight million user keys.

Web3Auth committed to supporting every Polygon Studio developer

As more and more developers look to build their dream dApps on Polygon, Web3Auth commits full support to turning those dreams into reality by facilitating a ridiculously simple and seamless onboarding experience. We are certain that with this partnership, Web3Auth will become a de facto part of every Polygon Studios developer’s starter kit.

How to start building?

Integration is simple. You can plug and play Web3Auth here, and reach us on telegram for any dev support or book a demo with our team here.

About Web3Auth

Web3Auth is pluggable infrastructure that enables Web3 wallets and applications to provide seamless user logins for both mainstream and Web3.0 users. By aggregating OAuth (Google, Twitter, Discord) logins, different wallets, and existing key management solutions, Web3Auth provides dApps/wallets a familiar experience that fits every user.

Stay in touch with Web3Auth

Website | Docs | Twitter | Telegram | Discord | LinkedIn

About Polygon Studios

Polygon Studios is the Gaming and NFT arm of Polygon focused on growing the global Blockchain Gaming and NFT Industry and bridging the gap between Web 2 and Web 3 gaming through investment, marketing, and developer support. The Polygon Studios ecosystem comprises highly loved games and NFT projects like OpenSea, Upshot, Aavegotchi, Zed Run, Skyweaver by Horizon Games, Decentraland, Megacryptopolis, Neon District, Cometh, and Decentral Games. If you’re a game developer, builder, or NFT creator looking to join the Polygon Studios ecosystem, get started here.

Stay in touch with Polygon Studios

Twitter | Facebook | Instagram | Telegram | LinkedIn

Web3Auth partners with Polygon Studios to bring seamless logins to the Polygon ecosystem was originally published in Web3Auth on Medium, where people are continuing the conversation by highlighting and responding to this story.


Finicity

Open Banking Drives Growth of Secure, Real-Time Payments


In the world of real-time ACH payments, open banking instant account verification is quickly making the old, manual methods look like the rotary phone. Why use an unreliable relic when there’s a better, more secure, faster innovation? For years, financial institutions have relied on voided checks, manual entry of routing and account numbers by the consumer, and microdeposits. These methods take days or weeks to complete and are prone to error and fraud.

More recently, data consortiums were used to confirm customer-submitted account details, checking them against existing databases. This assisted with clearing Anti-Money Laundering (AML) and Know Your Customer (KYC) requirements. Mixed-solution authenticators provided a fraud risk rating for the account being verified, but their information was sometimes based on older data. At best, these methods yielded only a sensible prediction of risk. With FinicityPay, failed payment risk is mitigated by obtaining account owner and balance insights prior to processing.

Previous methods of verification were rife with processing delays and lost revenue due to failed payments and fraud. Open banking APIs and platforms are the next evolution of payments. Faster, more secure and more transparent.

Open Banking Platforms Create a Real-Time Network of Connections to Financial Institutions

Trusted financial data aggregation platforms facilitate secure access to consumer-permissioned data via traditional connections (enriched with bank-level security) and APIs. Tokenized access to account credentials is granted to third-party financial services innovators, utility providers and investment platforms, opening up consumer options. Open banking is making third-party ACH payments faster and more secure, maximizing savings, rewards and investment opportunities. A simple account balance check smooths the pre-funding path and protects consumers and merchants alike from failed payments and potential fees.

Using secure, consumer-permissioned access, financial institutions and vendors now get instant account balances and data from a consumer’s bank. Owner details, addresses, account and routing numbers and real-time balances are all immediately available. They can be scrutinized before authorizing payments on an open banking platform. Account access credentials are packaged into a “token” that can be passed to a third party. It yields no meaningful data if intercepted and hacked. This new level of security is spawning a much larger, more niche-oriented market of financial services providers and app developers. Open banking applications allow users to move money at the speed of the market. 

Open Banking Opens Up Payment Enablement and Authentication

Innovation across all data-sharing stakeholders in the ecosystem expands what you can do to verify data and customize it to fit your payment use cases. New technologies like AI are now available to integrate into your user experience. Move customers through your account setup process quickly and intuitively, while gaining permission to access curated data that fits your needs. Once an account is authenticated, payments from it can be issued instantly (depending on market/account providers) throughout your suite of products. Financial management, investment, utilities and recurring payments are all open for immediate money movement.

All of this has been revolutionary for consumers. Their mobile phones are full of financial services options powered by open banking, and they’re adopting them at a rapid pace. With a few taps on the screen, a user can grant permission to third-party apps and platforms, without having to type in account or routing numbers. The app or service the user wants to access is up and running with a full package of information pre-loaded. Secure, real-time ACH payments can be made within moments of installing and setting up an app. With the richer level of insight that’s available through consumer-permissioned data, developers are giving consumers a wealth of new options for managing, spending, and borrowing money. This is industry-changing innovation. It’s giving consumers more flexibility and ownership of their financial position than ever before. 

Open Banking Simplifies the Payments Experience for Consumers

Customers aren’t asked to supply sensitive information, just permission to access their accounts. There are no more waiting periods for further verification, such as microdeposits or database checks. The customer’s data is pulled in milliseconds and rolled into the app they’re using, making real-time payments a breeze. Financial management apps with AI integration pull together a holistic picture of the consumer’s moment-to-moment financial well-being.

Authenticating account details, ownership, and current balance not only streamlines your payment process; it also reduces the hassle of setting up accounts and making payments for your customers, enhancing their experience at a point when setting up payment is one of their first interactions with you.

Learn more about Finicity’s data solutions for verifying account details and ownership and checking balances here.

The post Open Banking Drives Growth of Secure, Real-Time Payments appeared first on Finicity.


UNISOT

This is great news!



On January 24, 2022, major transaction processors in the Bitcoin SV network are upgrading to accept and handle Bitcoin block sizes up to 4 Gigabyte!

“The upgrade to 4GB is a breakthrough that will help everyone in the BSV ecosystem. TAAL is here to support the community and to help every BSV transaction processor to mine bigger and bigger blocks. That’s why we’re encouraging all miners to immediately increase their excessive block size limit to 4GB. They’ll be able to take full advantage of the node upgrade to expand their revenues and client base,” said Lars Jorgensen, TAAL Chief Operating Officer.

The original Satoshi Nakamoto Bitcoin protocol does not have a limit on the block size (!); it is up to the decentralized transaction processors (aka “Miners”) to decide how big a block they accept, based on their own capacity.

The market decides. When there is a market demand for more transactions and thereby larger blocks, the transaction processors will invest in hardware, network and energy contracts to be able to capture and handle more transactions efficiently and earn the transaction fees on the transactions they process and store.

The old term “Crypto Mining” becomes irrelevant as the initial “block reward” diminishes and becomes negligible compared to the massive volume of (very low) transaction fees. Transaction processors are not “mining crypto”; they are processing transactions and storing data in the distributed blockchain.

Now that Bitcoin blocks are becoming larger and can store many millions of transactions, the energy cost per transaction also becomes negligible, especially as the Energy Return on Investment (EROI) becomes very high due to the practical use of Bitcoin blockchain technology, for example in making global supply chains more efficient.

Global supply chains waste at least 80%! That is, their end-to-end aggregated efficiency is less than 20%. If we can increase global supply chain efficiency by even a few percentage points using Bitcoin blockchain technology, the Energy Return on Investment (EROI) will be enormous, not to mention the enormous positive environmental and sustainability effects that will bring us.

Stephan Nilsson – CEO UNISOT

4GB On Its Way! TAAL Upgrades Its Operations and Doubles Capacity

The post This is great news! appeared first on UNISOT.


auth0

Skyscanner Wins KuppingerCole Customer Authentication Award with Auth0

Global travel marketplace uses Auth0 to eliminate passwords and block suspicious login activity for millions of customers

Tokeny Solutions

Tokeny Receives A Strategic Fundraise From Inveniam, Apex Group, and K20 Fund


Tokeny, the Luxembourg-based tokenization platform, announced today a partnership with Inveniam Capital Partners, Inc. (“Inveniam”), a SaaS company delivering trust, transparency and completeness of data to private market assets. The partnership includes a €5m investment by Inveniam, Apex, and K20 Fund.

Inveniam works with private market asset owners and managers to deliver trusted valuation and pricing data using distributed ledger technology. This data underpins private market digital assets and gives these assets integrity upon which market participants can establish price discovery. The benefit to asset owners and managers is a more liquid asset, which can unlock value.

Tokeny offers a comprehensive and institutional-grade white-label solution for digital assets that allows asset owners and managers to efficiently and compliantly issue, transfer, and manage digital assets. All processes, from the client onboarding process including KYC/AML checks to the administration required to manage investor subscriptions, capital tables, distributions, capital calls, etc., are streamlined on the Tokeny platform.

The partnership aims to fully unlock asset liquidity via tokenization, which will be facilitated by providing all technical solutions private asset owners need, covering the entire value chain of tokenized assets underpinned with trusted data regarding asset valuation and pricing. These synergies can return significant value to each company’s clients.

We’ve been watching the Tokeny team’s progress and product evolution for more than two years and know they're building next-gen tokenization systems the efficient and compliant way. Totally altering the global trading of private market assets will only work on the institutional level if the experience is seamless, the technology is top-notch, and the right regulatory structures and business networks are in place. This partnership addresses all those requirements for success.

Patrick O'Meara, Chairman and CEO of Inveniam

While Tokeny’s solutions provide a compliance infrastructure enabling asset managers to easily bring nearly every kind of real-world asset to blockchain, Inveniam allows their investors to access trusted data and asset valuation. Liquidity will be realized as investors can conduct peer-to-peer transfers using Tokeny’s compliance framework with a fair and transparent price reference provided by Inveniam.

Tokeny’s capabilities pick up right where Inveniam’s end, and vice versa, addressing the two biggest obstacles in private markets, pricing data and compliance, on a hyper-efficient infrastructure. In tandem with this very synergistic partnership, the investment by Apex, K20, and Inveniam will allow us to further improve our solutions and accelerate the adoption of tokenization with best-in-class technology.

Luc Falempin, CEO of Tokeny Solutions

Additional value to the market will accrue as the partnership will ultimately enable Apex Group, Inveniam’s fund administration partner, to deliver end-to-end cutting-edge services to a broad ecosystem of private asset owners.

For more information, visit https://inveniam.io.

Read additional information About Inveniam

Inveniam is a blockchain-based fintech company, headquartered in Miami, Florida, with offices in New York City and Novi, MI. Founded in 2017, Inveniam has built Inveniam.io, a powerful technology platform that utilizes big data, AI, and blockchain technology to provide not only surety of data, but high-functioning use of that data in a distributed data ecosystem. Through Inveniam’s platform, users can obtain real-time pricing of private, infrequently traded assets, accelerate diligence, accurately price assets, and identify buyers for those assets. Inveniam’s platform credentials data to commute trust throughout the global financial system. Inveniam holds numerous patents pertaining to the ingestion of data into smart contracts. As of January 2022, there are $5.7 billion in assets on the platform.

About Tokeny  

Tokeny allows financial actors operating in private markets to compliantly and seamlessly issue, transfer, and manage securities using distributed ledger technology, enabling them to improve asset liquidity. Due to disconnected and siloed services that are currently used to enforce trust, private markets experience poor asset transferability with little to no liquidity. By applying trust, compliance, and control on a hyper-efficient infrastructure, Tokeny enables market participants to unlock significant advancements in the transferability and liquidity of financial instruments. Tokeny is the leader in its field and in 2020 was named one of the top 50 companies in the blockchain space by CB Insights. The company is backed by Euronext.

The post Tokeny Receives A Strategic Fundraise From Inveniam, Apex Group, and K20 Fund appeared first on Tokeny Solutions.


KuppingerCole

SentinelOne Singularity Platform


by Alexei Balaganski

SentinelOne Singularity Platform is a security analytics platform for unified protection, detection, response, and remediation across heterogeneous IT environments powered by an autonomous AI technology.

OWI - State of Identity

The Compliance Required


Are there effective means to catch fraudsters, money launderers, and terrorists? On this week's State of Identity podcast, host Cameron D'Ambrosi is joined by Tony Petrov, Chief Legal Officer at Sumsub, to discuss image-based automated KYC, liveness detection across countries, anti-fraud instruments, watchlists, and screenings when building AML procedures compliant with various data protection laws. Hear what biometric data provides to online service providers, and how regulations differ when starting a fintech in Singapore, the UK, or Austria.


KuppingerCole

Mar 01, 2022: Enabling Full Cybersecurity Situational Awareness With NDR

Effective cyber defense depends on detecting, preventing, and mitigating threats not only on desktops, laptops and servers, but also on the network, in the cloud, and in OT, ICS and IoT, which is where Network Detection & Response (NDR) solutions come into play. Support for a security operations (SecOps) approach is essential as remote working becomes commonplace.

SWN Global

MetaMUI and Sovereign Yidindji Government launched 1st Self-Sovereign Identity-based National ID…

MetaMUI and Sovereign Yidindji Government launched 1st Self-Sovereign Identity-based National ID system

1st Self-Sovereign Identity-based App “Yidindji SSID” for Yidindji’s National ID system

We are delighted to announce that our first E-Government pilot program with the Sovereign Yidindji Government has been successfully completed on Jan 7, 2022.

The Sovereign Yidindji Government represents Sovereign Yidindji, a rainforest-based micronation comprising several clan groups in the area described as far north Queensland on the Australian continent, with a combined population of more than 150,000 Australian citizens and Yidindji citizens. Academics and scientists hold that the Yidindji people have been around for more than 60,000 years, making theirs one of the oldest living cultures on the planet. History was made on September 20, 2014, when the official government of the Yidindji nation came into being with the launch of its coat of arms. Since then, the SYG, as it is known, has created several departments, and its parliament continues to pass laws for the peace, order and good governance of the Yidindji territory. SYG currently has 17 ministers and 22 ministries, and plans to be the first government to become fully digitized in its move towards e-governance and a national ID system.

The purpose of this pilot program was to establish the world’s first fully blockchain-based e-government. During the six-month pilot, the Sovereign Wallet Network (SWN) team implemented the first self-sovereign identity-based national identity system, digitizing government services such as digital identity authentication, vaccination certificates, and digital administrative work.

We have just taken our first step toward making governments automated and fully interoperable with other countries on top of the MetaMUI Blockchain. MetaMUI is the world’s first blockchain on which every government and institution can issue its own digital currency while complying with all financial regulations. MetaMUI gives users their own Decentralized Identity (DID), and all of a user’s personal information stays on the user’s device. Ownership of assets is bound to the user’s identity, not to a private key, so assets can be recovered should the user lose the private key. This makes it practical for banks to adopt the MetaMUI Blockchain to build applications for their customers, allowing them to borrow, lend, and store digital currency and digital assets through the app.

The Yidindji SSID and MetaMUI SSID apps will soon support storing various digital assets, including NFTs, carbon credits, stocks, and even gold certificates. The two apps will also offer a feature called “Pairwise Trust”, a two-way cryptographic binding that lets users log in to any website with only their DID, without registering a new ID and password. Pairwise Trust also helps users avoid phishing websites: the two-way binding requires not only that users prove themselves to the website, but also that the website prove itself to the user.

More innovations will come throughout 2022; our journey has just begun. We are glad to deliver this wonderful news to our audience in the very first month of the year, and we will keep building and developing our technology to make a better world.

Cointelegraph article:
https://cointelegraph.com/press-releases/metamui-and-sovereign-yidindji-government-launched-1st-self-sovereign-identity-based-national-id-system

Sovereign Yidinji SSID Launch Video:
https://www.youtube.com/watch?v=qjX1OysGbk4&t=102s

About MetaMUI

MetaMUI is the world’s first identity-based blockchain to truly enable regulated peer-to-peer (P2P) transactions. It can serve as the cornerstone for most financial offerings, with high versatility and support for custom-developed use cases, enabling Central Bank Digital Currencies (CBDCs) to be implemented and issued easily on the MetaMUI CBDC platform.
MetaMUI is built around Self-Sovereign Identity (SSID) technology. By combining SSID with a blockchain token mechanism, MetaMUI created a new identity-based CBDC system that protects users’ privacy while providing an identity-based transfer system that satisfies travel rules and other regulatory requirements such as Know Your Customer (KYC), Anti-Money Laundering (AML) and Countering the Financing of Terrorism (CFT).

Website | LinkedIn | Twitter | Telegram | YouTube | Medium


Ontology

Ontology Opens New German Office To Expand Its Digital Identity Solutions and Web3 Infrastructure…

Ontology Opens New German Office To Expand Its Digital Identity Solutions and Web3 Infrastructure Across Europe

Ontology aims to contribute to the burgeoning European privacy ecosystem through new partnerships, community members, and employees

Ontology, the project bringing trust, privacy, and security to Web3 through decentralized identity and data solutions, has today announced the opening of a new European office in Berlin, Germany.

Ontology’s move reflects its alignment with Europe’s efforts to increase data privacy and protection through legislation such as GDPR. In addition, the European Commission’s endorsement of Digital Identity as a more secure and convenient means for data storage and exchange for citizens makes Germany an ideal environment for Ontology’s expansion. By opening its new office in Europe, Ontology aims to play a role in increasing privacy across the continent and highlight the benefits that its decentralized solutions can bring to users and regulators alike.

Increasing privacy, transparency, and trust, Ontology’s high-speed, low-cost blockchain is designed to give users and enterprises the flexibility to build blockchain-based solutions that suit their needs, while also ensuring regulatory compliance. ONT ID, Ontology’s decentralized digital identity application, which gives users full control of their digital identity, surpassed 1.5 million users in September 2021. Other products include the ONTO wallet, which lets users securely manage their identities, data, and digital assets.

Europe is one of the largest hubs for blockchain related services and Web3 innovation worldwide, with much of the action happening in Berlin, which has seen over $13 billion of investment since 2016. As such, Ontology has chosen Berlin as a key strategic location where it will aim to expand its reach and capitalize on the incredible talent and resources available.

Gloria Wu, Chief of Ecosystem Partnerships at Ontology, said: “Europe sits at the forefront of Web3 and technical innovation, and its continued focus on increasing user privacy and security aligns clearly with our mission at Ontology. By opening our new office in Berlin, we are excited to contribute to the continent’s ongoing efforts to create a more secure web. We look forward to growing our presence in Europe and contributing to the ecosystem through a host of new partnerships, community members, and employees. We are currently hiring for a number of roles.”

The opening of the office builds on Ontology’s long-standing roots in Europe, established through a host of partnerships. Following its earlier partnership with Mercedes parent Daimler Mobility to develop Welcome Home, an in-car system designed to transform driving experiences, Ontology recently partnered with bloXmove, a European mobility blockchain platform designed to simplify travel across multiple forms of transportation. The partnership will see bloXmove integrate Ontology’s decentralized digital identity protocol into its platform, giving users an identifier that lets them share verifiable credentials, such as driving licenses and passports, just once, in a way that is totally private, secure, and encrypted.

Specifically in Berlin, Ontology has partnered with the Hochschule für Technik und Wirtschaft (HTW) University to explore joint research and teaching initiatives, with a view to developing a number of bespoke blockchain applications.

Want more Ontology?

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Ontology Opens New German Office To Expand Its Digital Identity Solutions and Web3 Infrastructure… was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Okta

How to Create a Seamless Mobile SSO (Single Sign-On) Experience in iOS


On an iPhone, when we log in to an app, we tap a login button and a website pops up to verify our credentials. Once verified, the website redirects back to the app, and we are logged in. This familiar Single Sign-On (SSO) pattern is frequently referred to as the redirect flow for authentication. Using a web browser for auth in this way is considered a “Best Current Practice” for security and usability reasons.

However, in 2017, a new prompt appeared in the login flow, before you were taken to the website. The following screenshot came from an iOS app preparing to log you in with Facebook. The interface prompt informs you that a specific app, the Yelp app in this example, wants to use a specific website to sign you in, and it warns you about information sharing.

There are a couple of problems with this prompt:

Problem One (Where did this prompt come from?): Many people consider this a bad user experience because the prompt looks out of place, and the wording is confusing and might alarm end users.

Problem Two (Ambiguous UI message): If you implement logout functionality through the same flow, the prompt still says Sign In, even though the user may have already clicked a Logout button. This too is confusing to the end user, as shown in the following screenshot. This problem is reported as a bug on the AppAuth library, although it was designed as a security and privacy feature.

In this article, I’ll go into more detail on aspects of the iOS platform limitations. I’ll explain why this prompt is shown and how to get around it to build a more seamless user experience.

Specifically, this article will:

- explain the evolution of iOS over time, and how and why this prompt was introduced.
- describe various ways you can invoke a standalone or embedded browser on iOS, and explain the cookie-sharing behaviors.
- discuss several ways to eliminate the confusing user experience issues we’ve identified above.

A brief history of iOS evolution

The browser options available for authentication on an iOS device have changed significantly over the years, as Apple continues to add capabilities to enhance user privacy. To understand the behavior of the browser options provided on iOS, some historical context is helpful.

Authentication prior to iOS 9

Before the release of iOS 9 in 2015, there were only two ways to authenticate through a web browser, and neither one was optimal.

UIWebView. This approach shows web content in a UIView. The mobile app can intercept interactions with the UIView, hence it is not secure.

Redirect to an external browser. The mobile app can open a webpage in a separate browser app, and the browser app can redirect back to the mobile app after authentication. However, app switching is not a good experience for mobile users. In addition, the redirect back can fail if another app has registered the same URL scheme first.

Authentication in iOS 9 (2015)

In 2015, Apple introduced SFSafariViewController, which displays an embedded Safari browser in the app. As an in-app browser tab, this interface lets the user stay in the app. Once the user interaction with the website is no longer needed, the browser tab can be closed and dismissed.

This solution was groundbreaking, because it is secure (the native app can neither peek into the embedded browser, nor alter its state) and it provides a better user experience by avoiding app switching.

SFSafariViewController shares cookies with the standalone Safari browser. This gives developers a way to implement SSO (Single Sign-On) with shared cookies.
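As a minimal sketch of that pattern (the URL and class names here are illustrative, not from this article), presenting a login page in an in-app Safari tab might look like this:

```swift
import SafariServices
import UIKit

// Illustrative sketch: presenting a login page in an in-app Safari tab.
// On iOS 9-10 this shared cookies with the standalone Safari browser,
// which is what made cookie-based SSO possible.
final class LoginViewController: UIViewController, SFSafariViewControllerDelegate {

    func presentLogin() {
        // Placeholder URL for your identity provider's login page.
        let loginURL = URL(string: "https://id.example.com/login")!
        let safariVC = SFSafariViewController(url: loginURL)
        safariVC.delegate = self
        present(safariVC, animated: true)
    }

    // Called when the user taps Done. The native app never sees
    // what happened inside the embedded browser.
    func safariViewControllerDidFinish(_ controller: SFSafariViewController) {
        controller.dismiss(animated: true)
    }
}
```

Because the view controller is sandboxed, the app typically learns the outcome of the login via a redirect to a registered URL scheme or universal link, not by inspecting the page.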

iOS 11 changes (2017)

Apple changed SFSafariViewController behavior to address privacy concerns with the release of iOS 11 in 2017. SFSafariViewController no longer shares cookies with the standalone Safari browser. Thus, a website can no longer determine that the user on SFSafariViewController is the same user on Safari, even though the view controller and the browser are open on the same device.

Apple understood that the SFSafariViewController behavior change would break SSO, since single sign-on relies on the ability to share cookies. As a workaround, Apple introduced SFAuthenticationSession, which shares persistent cookies with Safari.

Iterative changes in iOS 12 (2018)

In 2018, Apple deprecated SFAuthenticationSession and introduced ASWebAuthenticationSession as its replacement.

Both SFAuthenticationSession and ASWebAuthenticationSession are designed specifically for OAuth 2.0 authentication, not for showing general web content. To avoid abuse, when SFAuthenticationSession or ASWebAuthenticationSession are used, Apple always displays the prompt we saw earlier. The prompt is designed explicitly for user sign-in, and indicates that cookies are shared.

Apple does not know whether an ASWebAuthenticationSession is invoked for sign in, sign out, or general web browsing. This is why the prompt text is generic. It only states that your app is trying to Sign In, regardless of the actual use case, which results in the ambiguity described earlier in Problem 2.

iOS 13 changes (2019)

In 2019, Apple introduced prefersEphemeralWebBrowserSession as an option on ASWebAuthenticationSession. If this option is set to true, ASWebAuthenticationSession does not show the prompt above and, as a consequence, does not share cookies with Safari. This gives developers a choice: either a better user experience with no confusing prompt but no SSO, or SSO along with the annoying prompt.
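The trade-off can be sketched as follows (a hedged example; the URL, callback scheme, and type names are illustrative, not from this article):

```swift
import AuthenticationServices
import UIKit

// Illustrative sketch of the OAuth redirect flow with ASWebAuthenticationSession.
final class AuthCoordinator: NSObject, ASWebAuthenticationPresentationContextProviding {

    var session: ASWebAuthenticationSession?

    func signIn() {
        // Placeholder authorization endpoint and callback scheme.
        let authURL = URL(string: "https://id.example.com/oauth2/authorize?client_id=demo")!
        let session = ASWebAuthenticationSession(
            url: authURL,
            callbackURLScheme: "com.example.myapp"
        ) { callbackURL, error in
            // Parse the authorization code (or an error) from callbackURL here.
        }
        session.presentationContextProvider = self

        // true  -> no consent prompt, but no shared cookies (no SSO)
        // false -> prompt is shown, cookies are shared with Safari (SSO works)
        session.prefersEphemeralWebBrowserSession = true

        self.session = session   // keep a strong reference while the flow runs
        session.start()
    }

    func presentationAnchor(for session: ASWebAuthenticationSession) -> ASPresentationAnchor {
        ASPresentationAnchor()
    }
}
```

Note that prefersEphemeralWebBrowserSession must be set before calling start(); flipping it afterwards has no effect on an already-running session.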

SFAuthenticationSession or ASWebAuthenticationSession behavior

The various browser options offer differing levels of cookie sharing in order to limit websites’ ability to track a user.

The cookie-sharing behavior is not well documented in Apple’s documentation. The SFSafariViewController doc mentions that, “In iOS 9 and 10, it shares cookies and other website data with Safari … If you would like to share data between your app and Safari in iOS 11 and later, … use ASWebAuthenticationSession instead.”

A few third-party websites extend Apple’s documentation to highlight that persistent cookies are shared between an embedded browser and Safari. Unfortunately, it is rarely known that session cookies can also be shared between different embedded browsers. If your application just needs SSO between your mobile apps, you do not have to use a persistent cookie; a session cookie is sufficient.
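The session/persistent distinction is visible in Foundation's HTTPCookie API: a cookie created without an expiry date is a session cookie. A small sketch (the domain and values are placeholders):

```swift
import Foundation

// A cookie with no Expires/Max-Age attribute is a session cookie.
let sessionCookie = HTTPCookie(properties: [
    .domain: "example.com",
    .path: "/",
    .name: "sid",
    .value: "abc123"
])!
assert(sessionCookie.isSessionOnly)

// Adding an expiry date makes it a persistent cookie.
let persistentCookie = HTTPCookie(properties: [
    .domain: "example.com",
    .path: "/",
    .name: "sid",
    .value: "abc123",
    .expires: Date(timeIntervalSinceNow: 86_400)
])!
assert(!persistentCookie.isSessionOnly)
```

So if your identity provider only needs SSO between your own mobile apps, setting a cookie like sessionCookie above is sufficient.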

The following table summarizes the complete sharing behavior in iOS 11 or later. I’ve omitted SFAuthenticationSession for brevity because it is deprecated, but it behaves the same as ASWebAuthenticationSession. I’ve also omitted the prefersEphemeralWebBrowserSession option for ASWebAuthenticationSession because when it is set, cookies are not shared at all. I also use SVC as shorthand for SFSafariViewController and WAS as shorthand for ASWebAuthenticationSession.

|                   | App1 + SVC | App1 + WAS | App2 + SVC | App2 + WAS | Safari | Other browsers (e.g., Chrome) |
|-------------------|------------|------------|------------|------------|--------|-------------------------------|
| Session cookie    | ❌         | ✅         | ❌         | ✅         | ❌     | ❌                            |
| Persistent cookie | ❌         | ✅         | ❌         | ✅         | ✅     | ❌                            |

In the following iOS cookie behavior demo video, you can see how the session cookie and the persistent cookie are shared. The session cookie sharing behavior is not intuitive. Even when App1 is closed (which should clear all session cookies), opening App2 would make it possible to see App1’s session cookie.

It is worth noting that in iOS 14 and later you can specify another browser, such as Chrome, as the default. This does not affect the sharing behavior. SFSafariViewController and ASWebAuthenticationSession always use Safari under the hood, so they will never share cookies with browsers other than Safari, even if they are set as the default. The term system browser, often used in online articles, is synonymous with Safari.

Solution: how to remove the extra prompt

Now that we’ve reviewed the evolution of iOS browser behavior, and explored the rationale for the changes, let’s look at solutions to improve the user experience.

Solution to problem two

Solving problem two only – eliminating the ambiguity of the prompt message – is straightforward: Use the browser for sign-in only; do not use the browser to sign out. You can sign out of your application directly by revoking the access token and the refresh token. Okta provides a revoke API that you can call directly.
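A minimal sketch of such a revoke call (per RFC 7009), assuming an Okta org authorization server; oktaDomain, clientId, and refreshToken are placeholder variables defined elsewhere:

```swift
import Foundation

// Hedged sketch: revoke a refresh token so the native app is signed out locally.
var request = URLRequest(url: URL(string: "https://\(oktaDomain)/oauth2/v1/revoke")!)
request.httpMethod = "POST"
request.setValue("application/x-www-form-urlencoded", forHTTPHeaderField: "Content-Type")
request.httpBody = "token=\(refreshToken)&token_type_hint=refresh_token&client_id=\(clientId)"
    .data(using: .utf8)

URLSession.shared.dataTask(with: request) { _, response, _ in
    // An HTTP 200 means the token is no longer usable.
    // Any web session in the browser remains alive.
}.resume()
```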

Revoking a token works as a solution if you are okay with signing the user out of the native app only. The user may still have a login session in the browser, and if the native app wants to log in again, the browser will not ask the user for credentials before granting an access token.

This is the recommended approach if your web session may be supporting many different native apps. Facebook's logout guidance, for example, specifically recommends not logging out of the web session.

However, if you require a stronger privacy and security posture, for instance for a banking app, keeping the web session alive may not be an option.

Solutions to problem one

There are two potential approaches to solve problem one. Both solutions use browser components that do not show a prompt. Both have the drawback that no cookie can be shared, so SSO will not work. If your app requires SSO, you’ll have to find a new way to share login sessions. Fortunately, Okta recently introduced Native SSO, which allows native apps to implement single sign-on without cookies. See our blog post on SSO between mobile and desktop apps for a full example. The following code shows how to remove the prompt, and it assumes your app either does not require SSO or uses Native SSO.

First, you can use the prefersEphemeralWebBrowserSession option for ASWebAuthenticationSession. If you are using the Okta OIDC iOS library, you can configure the noSSO option as follows:

let configuration = OktaOidcConfig(with: {YourOidcConfiguration})
if #available(iOS 13.0, *) {
    configuration?.noSSO = true
}

Under the hood, the noSSO option sets the prefersEphemeralWebBrowserSession flag. Note that this flag is only available in iOS 13 and above.

Second, if you need to support older iOS versions, you can use SFSafariViewController as the browser that presents the login session. The following demonstrates how to launch SFSafariViewController if you are using the AppAuth iOS library.

AppAuth iOS supports a concept of external user agent, where you can use any browser to present the sign-in webpage. Normally, you invoke the following method to bring up a browser:

OIDAuthState.authStateByPresentingAuthorizationRequest:presentingViewController:callback:

This method will invoke a default user agent. Alternatively, you can use the following method, which gives you an option to plug in any external user agent, as long as it conforms to the OIDExternalUserAgent protocol.

OIDAuthState.authStateByPresentingAuthorizationRequest:externalUserAgent:callback:

There is already an implementation of using SFSafariViewController as an external agent. You can download it and plug it into your project. The following code snippet shows how to create and pass in an external user agent OIDExternalUserAgentIOSSafariViewController.

let appDelegate = UIApplication.shared.delegate as! AppDelegate
let externalAgent = OIDExternalUserAgentIOSSafariViewController(presentingViewController: self)
appDelegate.currentAuthorizationFlow = OIDAuthState.authState(byPresenting: request, externalUserAgent: externalAgent) { authState, error in
    if let authState = authState {
        self.authState = authState
        print("Got authorization tokens. Access token: " +
            "\(authState.lastTokenResponse?.accessToken ?? "nil")")
    } else {
        print("Authorization error: \(error?.localizedDescription ?? "Unknown error")")
        self.authState = nil
    }
}

Summary

Understanding iOS and Android platform-level constraints is a prerequisite for developing great mobile experiences. I hope this article will help you understand the browser options available on iOS and their cookie sharing behaviors.

At Okta, we are adamant about optimizing end-user experience. Every extra click needs to be eliminated if possible, and every extraneous prompt should be avoided. We develop solutions such as Native SSO to help you overcome mobile constraints. I hope our solution helps you improve the experiences you are building for your end users. We’d love to hear more about your needs and how we can build solutions together. Feel free to add your questions and suggestions about this solution or future topics in the comments below.

If you enjoyed reading this article, you can keep up with our content for developers by following us on Twitter and subscribing to our YouTube channel.


Holochain

How Did DevCamp 8 Go?

Holochain Dev Pulse 110

A few keen-eyed readers pointed out that I never wrote a follow-up article on last year’s DevCamp. They’re absolutely right; all I wrote since the start was this small update halfway through the course. It’s time to write something!

As you may have heard, over 1800 people signed up for the DevCamp, which amazed us all. This was a huge increase from the previous DevCamp’s roughly 200 registrants. More and more people are hearing about Holochain, among them developers who may go on to create some really wonderful apps.

I’m so grateful to the DevCamp organisers. I think they did a fantastic job presenting a pretty demanding subject to a huge group of people, all online and spread out across time zones. They facilitated with enthusiasm, structure, patience, and compassion. This was a big project, and they faced it courageously and skillfully. (Did you know that they were all volunteers, doing it simply because they love the project?)

The organisers did have some selection criteria to make sure it would be a rewarding experience for participants — this was, after all, meant to train people to build Holochain apps, so it probably would only be useful to people who intended to build Holochain apps. Roughly 1000 of the registrants were selected for participation, and roughly 400 enrolled once they were accepted.

Over the course of the DevCamp’s two-month schedule, we did see a drop in participation. Live session attendance dropped by 67% between the beginning and the end of the course. This might sound disappointing, but it’s actually quite good — various studies report online-course dropout rates of 85% to 97%. This shows that an above-average number of people committed themselves to learning a pretty demanding subject.

I don’t know why individual participants decided to leave, but I have some guesses: some may have found that the course load or meeting times didn’t work for their schedules, others may have decided that Rust or Holochain development wasn’t for them, and still others may simply have joined because they wanted to learn a little bit about the project, not develop an app.

Some Holo team members helped with the DevCamp too, which gave us the opportunity to do some research. We learned a few important things:

- People don’t get interested in Holochain simply because it solves a technical problem for them. Instead, they get involved because they believe in the values it represents — agency, accountability, privacy, cooperation, voluntary participation. They want to join a movement, not just use a piece of tech.
- Holochain’s community is unique among open-source communities — it seems stronger, more closely knit and supportive, with passionate people who are willing to give their time to help newcomers. (The DevCamp organisers are a case in point!) Reputation revolves around commitment to helping others grow, rather than crowing about one’s accomplishments. It’s not just one ‘community’ either — there are many communities who value slightly different things or are focused on different aspects of the project. There are a lot of different perspectives about what matters. In a way, the tech and the social structures mirror each other.
- DevCamps are a special way to experience the health of the community — it might be that the community is the most valuable part, because it creates a supportive space for learners and helps them become part of the Holochain community at large. The Discord server, in particular, was a great addition to DevCamp 8, and resulted in a lot more active conversations than the forum had had in previous DevCamps.
- There’s a steep learning curve. Not only is Rust a new language for most, but Holochain is quite different from either cloud or blockchain. Many DevCamp participants had already taken part in a previous DevCamp but wanted to reinforce or refresh their knowledge. The strong, supportive community also helps people gather the courage to meet this challenge.
- Many participants felt there was still so much to learn by the end of the course. This reinforces what we learned about the learning curve, and suggests that a community, ongoing events, and self-serve resources for alumni would be useful.
- Not everybody can participate in a DevCamp or online learning community. Time constraints — a busy life or time zone differences — are the biggest barriers. Most people participated asynchronously, replaying the recorded session videos and chatting on Discord. We would do well to develop more self-serve resources for those people. (Fortunately, the organisers have published all of DevCamp 8’s session videos on YouTube, which is a great start!)

Session 1 of the DevCamp

Holochain 0.0.122: big gossip performance improvements

Most of the recent Holochain releases have been maintenance releases.

But today’s release brings big New Year news. I’ve been looking forward to sharing this one!

Recently one of the core developers has been troubleshooting a performance bottleneck we saw in the Elemental Chat tests. Gossiped DHT ops were being integrated at a rate of only 40 per second, which seemed unusually slow given that integration doesn’t involve a lot of processing.

If you’re not familiar with the mechanics of the DHT, here’s how it works: when an agent commits an element to their source chain, it results in the creation of a number of DHT operations. These operations are messages, sent to peers on the DHT, that update their local DHT shard. Here’s an example of the ops produced by a ‘create app entry’ element:

- A StoreElement operation stores the header in the neighbourhood of the new header’s hash.
- A StoreEntry operation stores the entry, and a copy of the header, in the neighbourhood of the new entry’s hash.
- Finally, a RegisterAgentActivity operation stores a copy of the header in the neighbourhood of the author’s public key, allowing those authorities to detect source chain forks or serve up source chain segments to other validators.

Each of these operations has to be ‘integrated’ by the authorities in the neighbourhood — that is, validated and stored in their DHT shard. This happens initially, at publish time, when the author first sends out these operations. But it also happens on an ongoing basis, as new peers join the DHT and start building their shard, or as peers disappear and cause remaining peers to increase their coverage. These peers share this information through gossip, which happens at regular intervals — more frequently for new data, less frequently for historical data.

In the Elemental Chat tests, the historical data gossip loop was slowing down HoloPorts. Part of this was because sharding wasn’t yet turned on, which meant that everybody gossiped everything with everybody else. But that ended up being a great way to catch this performance problem.

The solution was to optimise the table that stores the operations, along with the queries that manipulate it (#1167). Now, operations are integrated at a rate of 1000 per second — a 25× increase!

There are other changes you should know about as a developer:

- The conductor API lets you manually insert elements into a source chain. This will be used to build source chain redundancy for Holo hosting, as well as source chain backup and restore for natively installed Holochain instances. (#1166)
- Breaking: The Path abstraction for working with links has changed. (#1167)
- Breaking: You can now query a source chain by ranges — either between two sequence numbers, between two hashes, or the most recent element plus a certain number of preceding elements. All range queries are inclusive. (#1142)

The above breaking changes mean that Holochain 0.0.122 is compatible with HDK 0.0.118.

Read the full changelog.

That’s the news from the start of 2022. I’m looking forward to sharing more this year as our projects move towards beta status and see wider adoption among developers!

Cover photo by Tegan Mierle on Unsplash

Wednesday, 12. January 2022

Shyft Network

Announcing the Shyft Network Design Competition Winners!

From GIFs on reputation to stickers on $SHFT, here are the winning entries from our latest community campaign.

It’s always nice to kick off a new year with your community. Following the close of last year’s Doodle NFT giveaway campaign, which received over 10,000 entries, we’re now sharing the winners of our next community campaign: a design competition on Discord.

As part of the design competition, we put out a call to the Shyftoshi community to create memes, GIFs, stickers, and unique images focused on Shyft Network and our vision to bring trust, identity, and validation in blockchain data. We were thrilled by the response and creative ways our Discord community members flexed their knowledge of our platform in their work.

After 70+ entries from our Discord community, we’re proud to share the winning designers (and designs!). Entries were voted on by the Shyft Network Discord community, and winners were awarded 5,000 $SHFT tokens following the competition’s close. First place came down to a tie, and rewards were split evenly between those two entries; in lieu of a third-place prize, we awarded the second-place winner.

Big thanks to everyone who submitted their work and shared the energy of this campaign. We’ll be back soon with some new exciting community initiatives in the coming weeks and months!

— Shyft Network

🥇 1ST PLACE
— Discord User Neha #8358 / @SCreator123 Click here to play the video on Twitter!
“Thank you @shyftnetwork for making a design competition. #ShyftNetwork #Shyft $SHFT. It’s a fair and funny contest. I have enjoyed this contest very much. I will hold this winning amount of yours for 5 years so that I too can become the richest person.”
🥇 1ST PLACE
 — Discord User yogiehiqqie#5287 / @yogiehiqqie
“The real victory is when it is useful for others, thank you @shyftnetwork for making this competition” #ShyftNetwork #Shyft $SHFT
🥈 2ND PLACE
— Discord User leroyjenkins | YMH#8606 @capj84​​
“The contest was a lot of fun. So many great designs and very artistic people in the community! Everybody did a great job!”
About Shyft Network

The Shyft Network aggregates and embeds trust into data stored on public and private ecosystems, allowing an opt-in compliance layer across all systems. The key pillar for Shyft is user consent, allowing users to track the usage of their data. Therefore, no one can use personal data without consent from the owner. Shyft Network allows and gives incentives to individuals and enterprises to work together to add context to data, unlocking the ability to build authentic digital reputation, identity, and credibility frameworks.

Website / Telegram / Twitter / Medium / Discord


Evernym

January 2022 Release Notes

This month's update covers Evernym's acquisition by Avast, the launch of cheqd, progress on interoperability, and a roundup of improvements to our core products. The post January 2022 Release Notes appeared first on Evernym.



Elliptic

UniCC – the Largest Dark Web Vendor of Stolen Credit Cards – Retires After Raking in $358 Million in Crypto

UniCC – the leading dark web marketplace of stolen credit cards – has announced its retirement. Elliptic analysis shows that $358 million in purchases were made through the market since 2013 using cryptocurrencies.



IBM Blockchain

Harnessing the power of data and AI to operationalize sustainability


Companies are under mounting pressure from regulators, investors, and consumers to progress toward more sustainable and socially responsible business operations — and to demonstrate these measures in a robust and verifiable way. In fact, corporate responsibility and environmental sustainability risks tied as the third highest concerns for organizations, as ranked by large corporations in a […]

The post Harnessing the power of data and AI to operationalize sustainability appeared first on IBM Supply Chain and Blockchain Blog.


Global ID

Trusted Airdrops with GlobaliD


We’re thrilled to announce the launch of Trusted Airdrops on GlobaliD.

Every crypto project faces the same challenge: How to strategically or fairly distribute tokens to their users.

A common practice is called an airdrop — giving away tokens to users who qualify for certain terms or accomplish certain tasks. For example, Uniswap, a decentralized exchange, created one billion governance tokens and airdropped 15% of them to anyone who had used the protocol up to that point — 400 UNI tokens in total, at the time, worth about $2000.

Airdrops are a great way to distribute tokens to your most active and engaged users while also helping to promote and market the project.

But as airdrops have gained in popularity and prominence — and with significant money at stake — these distributions are increasingly vulnerable to people gaming the system, undermining their equity and effectiveness. When bots sabotage your airdrop, it compromises the strategic intent of the token distribution.

For any project, the goal is to reach engaged users that are real people. Naturally, that extends to NFT drops, DeFi platforms, and DAOs (decentralized autonomous organizations) as well.

Trusted Airdrops

By introducing self-sovereign identity to airdrops, you can now guarantee, in a privacy-preserving way, that tokens go to real people rather than bots. It’s the natural symbiosis between decentralized finance and decentralized identity.

With GlobaliD’s Trusted Airdrops, developers can quickly and easily verify that wallets belong to real people.

Users that receive airdrops can conveniently verify themselves and participate in exciting new projects without compromising ownership or control over their identity or their data. GlobaliD never sees users’ private data, which is encrypted and decentralized.

Learn more about why self-sovereign identity matters

The process is simple and straightforward:

1. Developers create a signup link for their airdrop through GlobaliD
2. Recipients follow the link to verify their identity
3. Recipients add a wallet and get their drop

You can learn more about the process here:

Trusted Airdrops with GlobaliD

Projects building on the XRPL can leverage GlobaliD’s native integration with XUMM Wallet, the most popular XRP wallet in the ecosystem, boasting over 200,000 active users.

How to verify your XUMM Wallet with GlobaliD

It’s also incredibly cool to see the many projects come up with hacky solutions for custom needs. After all, GlobaliD is an open platform that we’ve designed to empower developers of all kinds.

If you want to incorporate decentralized identity into your project, check out our developer docs:

- Developer documentation
- API documentation

We encourage you to reach out to us directly if you can’t find what you’re looking for or if you’re interested in collaborating.

The GlobaliD vision

At GlobaliD, we’re committed to figuring out how self-sovereign identity can enable safe, secure and efficient inclusive finance.

The origin of this project was actually fueled by organic market demand — when the Strategy Engine project leveraged GlobaliD for their own airdrop.

With the success of that third party project, we wanted to quickly figure out a way to make Trusted Airdrops more convenient and accessible for developers. In just the last few weeks, we’ve seen dozens of projects choose GlobaliD through word of mouth.

In order to meet and address that demand, we’re actively working toward improving our documentation and developer support. That starts with the Trusted Airdrops landing page, but it’s just the beginning.

It’s also just the tip of the iceberg for what’s possible with GlobaliD and the entire suite of tools available for developers. That includes GlobaliD Groups for building communities as well as the GlobaliD Wallet for payments.

We’ve also been working with a handful of DeFi organizations to address some of their biggest challenges — all while maintaining the decentralized spirit of their platform.

That includes:

- ID verification for compliance (Green DeFi or XeFi)
- Elimination of bots, scams, and bad actors on the network
- Protecting their master nodes and protocols

So stay tuned for more exciting releases in the very near future (including a noncustodial wallet) as we look to make self-sovereign identity more useful for more people.

Relevant:

How GlobaliD enhances trust for XRPL Labs’ XUMM Wallet

Trusted Airdrops with GlobaliD was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Infocert

Now available for consultation the Paper: “An integrated approach for electronic identification and central bank digital currencies”


The paper, written by four international experts, was recently published in the Journal of Payments Strategy & Systems, vol. 15, no. 3, by Henry Stewart Publications. Luca Boldrin, Innovation Manager at InfoCert, is one of them.

Download the Paper for free

The Journal of Payments Strategy & Systems has published the paper “An integrated approach for electronic identification and central bank digital currencies”.

This paper outlines a proposal for how to implement Central Bank Digital Currencies (CBDC) based on open banking standards and supports both account-based and token-based CBDC models, transacting online and offline with immediate finality, while recognising the European PSD2 requirements, including (multi-factor) strong customer authentication (SCA).

Authors:

- Michael Adams – Founder, Quali-Sign, UK
- Luca Boldrin – Innovation Manager, InfoCert, Italy
- Ralf Ohlhausen – Founder, PayPractice, Germany
- Eric Wagner – Group Product Owner Compliance Advanced Analytics, Erste Group Bank, Austria

Download the Paper for free

The post Now available for consultation the Paper: “An integrated approach for electronic identification and central bank digital currencies” appeared first on InfoCert.


Indicio

Businesses’ desire to store less digital identity data stokes travel’s SSI brushfire

Biometric Update The post Businesses’ desire to store less digital identity data stokes travel’s SSI brushfire appeared first on Indicio Tech.

KuppingerCole

Integrated Risk Management Platforms

by Paul Fisher The KuppingerCole Market Compass Integrated Risk Management (IRM) focuses on platforms designed to manage risk within an organization including IT assets, business, vendor, and compliance-based risk.


IDnow

Becoming Compliant in Ontario’s iGaming Market


The online sports betting market in Ontario, Canada is seeing rapid expansion after lawmakers passed a bill (C-218) in June 2021, removing the prohibition of single-game betting in Canada.  

This bill allows players to make single bets, rather than the previously allowed combination-only betting.  

The result of this bill is hugely increased flexibility for both operators and bettors, as single bets make up a sizable portion of gambling activity and are some of the simplest to both place and offer.  

Sports betting regulation in Canada comes down to local governments, with each province having authority over the regulations in its own jurisdiction, and Ontario is no different.

Key iGaming Regulations in Ontario 

In July 2021, shortly after the passing of C-218, Ontario created iGaming Ontario (iGO) – a body to conduct and manage iGaming in Ontario, serving directly as a subsidiary of the pre-existing Alcohol and Gaming Commission of Ontario (AGCO).  

The aim of iGaming Ontario is to enter into agreements with operators, allowing them to offer online gaming experiences to residents of Ontario, so long as they are compliant with the new legal code and provincial law. One of iGO and the AGCO’s four market objectives is consumer protection, under which the iGO website lists three bullet points:

- Ensure safe and responsible play, and game integrity
- Prevent underage access
- Ensure compliance by private operators with applicable laws, including compliance with anti-money laundering rules and regulations and compliance with relevant privacy and information security laws

A full list of Registrar’s Standards for Gaming details what is expected of operators and what kind of rules and regulations are being enforced for iGaming in Ontario. These are largely based on existing standards of Ontario’s regulatory body, AGCO, and there are some key things to note for operators.  

Canada’s iGaming sector is also subject to the Proceeds of Crime (Money Laundering) and Terrorist Financing Act (PCMLTFA), which is regulated by the Financial Transactions and Reports Analysis Centre (FINTRAC). iGaming Ontario, however, has established a strict set of regulations that go beyond the basic FINTRAC requirements.

Most importantly, there are a strict set of anti-money laundering (AML) regulations and know your customer (KYC) requirements. Operators are required to maintain internal AML operating procedures that comply with Canadian federal regulatory requirements, which include a wide range of rules and regulations, such as reporting suspicious transactions, prohibited players, and much more. 

Additionally, there are detailed KYC requirements in place, such as in-depth identity verification standards. This means players will have to provide official government documents, recent photos (or selfies), and liveness (such as a video) to verify their identity during the registration process.  

Becoming Compliant With Ontario’s iGaming Regulations 

Considering the depth and complexity of iGaming requirements in Ontario, entering the region as an operator may prove to be difficult. Operators will have to meet strict PCMLTFA and FINTRAC standards, as well as go above and beyond those to meet iGaming Ontario-specific regulations.  

Key barriers will also include meeting iGO’s AML and KYC program requirements, which could require operators to completely overhaul their data storage, tracking, and verification systems. This has already been a hot topic around the world, with more and more regulators calling for stricter iGaming systems and regulations. A good example is the set of changes for operators in Germany, which were challenging to say the least. Having experienced partners in a situation like this will be key to success.

One solution is to work with an identity verification partner like IDnow. As an identity verification specialist, IDnow has the experience and tools to help operators enter new markets. This is done with the help of Autoident, an AI-powered verification tool that quickly and securely verifies users’ identities while remaining fully AML compliant.

Autoident is well suited to the FINTRAC standards applied in Ontario, already supporting verification processes that include document verification, selfie capture, and liveness detection. It is also backed by a team with years of global experience and success in the iGaming industry.

Of course, there are other ways to become compliant with Ontario’s regulations, such as adjusting the onboarding process, reworking AML reporting, and addressing the many other requirements set by iGO. Depending on an operator’s current systems, this could be an expensive and time-consuming process.

Whichever way operators choose to tackle Ontario’s regulatory climate, becoming compliant in a reasonable timeframe will be a complicated and tough battle.

However, working with an identity verification partner to smooth the bumps in the road and provide a guiding hand will prove to be a significant advantage. KYC and AML should not be a burden but an enabler for your success.

By

Roger Redfearn-Tyrzyk
Director Global Gambling & Sales UK at IDnow
Connect with Roger on LinkedIn


Coinfirm

Coinfirm Welcomes Senior Leadership Additions and Changes

London, 12 January 2022 – 2021 was an astounding year of growth for Coinfirm and we kick off 2022 with a bang! The firm is excited to have new faces join the organisation’s senior leadership team in the effort to create a safer blockchain economy. Coinfirm welcomes Vincent van Maasdijk and Ovi Berindea to the...

Indicio

Indicio CMO and VP of Governance Join Leadership of Linux Foundation Projects

Helen Garneau joins the Hyperledger Foundation as Vice Chair of the Marketing Committee and Trevor Butterworth joins the Steering Committee of the Covid Credentials Initiative.

By Tim Spring

The Hyperledger Foundation, an open source collaborative effort created to advance cross-industry blockchain technologies hosted at the Linux Foundation, has welcomed Helen Garneau, Chief Marketing Officer of Indicio as Vice Chair of its Marketing Committee.

This is the second leadership role for Indicio in recent months at a Linux Foundation Project. In December, Vice President of Governance, Trevor Butterworth, was elected to the Steering Committee for the COVID Credentials Initiative (CCI), a part of Linux Foundation Public Health (LFPH), the leading organization for hosting and nurturing open-source technology for the benefit of public health initiatives.

Garneau has an extensive track record in implementing successful marketing strategies for a range of Hyperledger Foundation projects, including Hyperledger Indy, Hyperledger Aries, and Hyperledger Ursa. She also played a key role in project launches and the growth of Hyperledger communities.

As Vice Chair of the Marketing Committee, Garneau is committed to broadening Hyperledger’s membership base during her one-year term.

“It is an honor to be named vice-chair of the Hyperledger Foundation Marketing Committee alongside Chair Duncan Johnston-Watt, CEO & Co-founder at Blockchain Technology Partners (BTP),” said Garneau. “One of the things I enjoy most in the Hyperledger community has been seeing the technology, use cases, and contributions from other members and companies. The dollars that companies put into their Hyperledger Foundation membership are valuable and I’ve seen the return in business growth first hand. The Hyperledger marketing team does an incredible job and I look forward to helping articulate that value proposition in my role as Vice-Chair.”

Butterworth was already involved in Linux Foundation Public Health as Chair of the Steering Group for the Cardea Project, an open-source ecosystem built on Hyperledger Indy  and Hyperledger Aries for digital health credentials developed by Indicio and SITA and donated to LFPH.

“At Indicio, we are at the front end of building and implementing digital health credentials with commercial partners—and our success in this area, in open-sourcing the results, and in creating a community of developers to extend functionality is a win for both the Hyperledger Foundation and Linux Foundation Public Health,” said Butterworth. “Joining the Covid Credentials Initiative will help us learn from its experts, build better products, and drive adoption of technology that is sustainable and solves fundamental problems in privacy and security.”

Indicio is both a member of and an active contributor to the Hyperledger Foundation and Linux Foundation Public Health and has taken a significant leadership role in the open sourcing of a number of technologies. Among others, Indicio has helped lead the development of and contributions to several projects at the Linux Foundation, including:

Hyperledger Indy Node Monitor (August 2020): A set of tools for monitoring the status of Hyperledger Indy networks.
Hyperledger Aries Mediator (Dec 2020): An essential cloud-based connection point to send and receive requests between mobile holders and issuers and verifiers.
Hyperledger Aries Bifold (Feb 2021): An open source digital wallet for the exchange of verifiable digital credentials.
LFPH, Cardea (May 2021): A complete open-source ecosystem for the exchange of privacy-preserving digital health credentials.

Indicio is committed to supporting the Hyperledger Foundation and Linux Foundation Public Health, and the open-source technology movement as we advance decentralized identity solutions around the globe.

The post Indicio CMO and VP of Governance Join Leadership of Linux Foundation Projects appeared first on Indicio Tech.


Shyft Network

Malcolm Wright Joins Shyft Network as Head of Strategy for its Global Regulatory & Compliance…

Malcolm Wright Joins Shyft Network as Head of Strategy for its Global Regulatory & Compliance Solutions BitMEX’s former Chief Compliance Officer Malcolm Wright will oversee the expansion of Veriscope, Shyft’s FATF Travel Rule solution & Travel Rule interoperability tooling, along with strategy and growth for upcoming core public network products including institutional DeFi, NFTs & The Shyft Network’s DAO.

Malcolm Wright, a veteran of the blockchain industry, is departing BitMEX as Chief Compliance Officer to join Shyft Core, the core development team of the Shyft Network ($SHFT). Malcolm, whose experience spans over three decades across strategy, technology, operations, and risk and compliance, will join Shyft Network to lead Veriscope while also acting as the Head of Strategy for Global Regulatory & Compliance Solutions.

He will be responsible for the expansion of Veriscope, the open source Travel Rule protocol built on the Shyft network, for leading strategy and growth for upcoming ecosystem infrastructure that includes Shyft’s Cross-chain Institutional DeFi System & the Shyft DAO’s regulatory and compliance solutions, and more.

BitMEX to Shyft

Malcolm is joining the Shyft Network Core team from BitMEX, one of the world’s leading crypto derivatives exchanges. Malcolm has been instrumental in global policy work and advocacy on behalf of the digital asset ecosystem, and is the founder of InnoFi Advisory, a consulting firm supporting innovation and growth in DeFi, DAOs, and NFTs.

Malcolm is currently the co-lead of the anti-money laundering (AML) working group at Global Digital Finance (GDF), a virtual asset industry body that works with policymakers to develop supportive legislation and helps the industry understand and comply with regulation. He was also instrumental in the development of open data-sharing standard requirements such as IVMS101.

Malcolm said, “I have long supported Shyft’s approach to resolving the FATF Travel Rule as one that provides the least friction for crypto firms’ customers whilst maintaining the decentralized community-driven ethos that underpins the blockchain and crypto industries.

“With DeFi, DAOs, and NFTs reaching critical mass over the past 12 months, Shyft is trailblazing the way in addressing multiple challenges for financial services firms and governments that go far beyond the Travel Rule. This is very much aligned with my own desire to provide the governance foundations upon which responsible innovators can design impactful, transformational, successful solutions.”

BitMEX Compliance Chief Malcolm Wright Jumps to Shyft Network (coindesk.com)

Shared beliefs

Joseph Weinberg, Co-founder of Shyft Network said: “Over the last 3+ years a group of incredible individuals have been working to guide policy and educate regulators across this space. Malcolm and I were among several of that group who share a belief that decentralization and maintaining the ecosystem is critical to its foundational success.

“As Malcolm and the policy side of Shyft had already been working together, both in his capacity at BitMEX, and through policy development across regulators, bringing on Malcolm to drive this next phase of institutional development & growth was a 1+1 = 6 equation.
“We began working on Shyft Network & its public infrastructure out of a belief that we can apply the open innovations built from within the ecosystem to bridge the gap between the traditional world’s regulatory & compliance requirements, identity & decentralized economies without eroding privacy, decentralization & openness.
“As these two worlds collide and we see rapid acceleration in both regulatory requirements, as well as adoption, we also need the best and brightest people who continue to bring balance for the ecosystem to be driving critical infrastructure like Veriscope & other public utilities for the new market participants who need identity-based compliance capabilities to operate across all verticals. Malcolm’s deep expertise across the space, from policy makers, to Open Standards, as well as his vast experience prior to entering crypto, makes him an excellent person to lead all compliance related aspects of the public utility smart contracts that the Shyft Core team developed for the Shyft network.”
Leading role

In this role, Malcolm will lead the Veriscope teams both inside Shyft Core and across all VASPs and Travel Rule solutions globally that utilize or contribute to the public network, while also managing the open governance task force responsible for proposing, through the Shyft Network DAO, policy changes to the counterparty processes that are universal to Veriscope.

Along with this, Malcolm will lead all regulatory compliance and institutional strategy for upcoming extension systems of Veriscope, including universal cross-chain layers for institutional DeFi, NFTs, DAOs, and more.

As Shyft Network is an open source project and its infrastructure is being deployed as open source, Malcolm’s work will span regulators, institutions currently joining Veriscope, the Shyft DAO, and continued open standards development across the cryptocurrency ecosystem.

Malcolm will be tasked with designing and documenting the frameworks and critical policy requirements across Shyft Network’s products and maintain his already impressive status as an industry thought leader recognized by virtual asset service providers (VASPs) and financial institutions.

On behalf of the Core team, the Shyft Community, Veriscope’s open network of VASPs and institutions, and all developers building atop the Shyft Network, we are excited to welcome Malcolm to the community.

About Shyft Network

The Shyft Network aggregates and embeds trust into data stored on public and private ecosystems, allowing an opt-in compliance layer across all systems. The key pillar for Shyft is user consent, allowing users to track the usage of their data. Therefore, no one can use personal data without consent from the owner. Shyft Network allows and gives incentives to individuals and enterprises to work together to add context to data, unlocking the ability to build authentic digital reputation, identity, and credibility frameworks.

Website / Telegram / Twitter / Medium / Discord


Coinfirm

Meet the Team: Jagna Nieuważny

皆さんこんにちは! Hi everyone! I’m Jagna Nieuważny and I have been at Coinfirm for almost 4 months! Those months have proved a steep learning curve as before joining the team I possessed only a very general knowledge of how blockchains work and an even fuzzier knowledge of crypto markets and regulations. But thanks to the help...

auth0

Auth0 Public Sector Index Shows Governments Struggle to Provide Trustworthy Online Citizen Services

New global report highlights how identity is foundational in helping governments improve existing services, and launch new ones faster and more securely

Torus (Web3 DKMS)

Web3Auth raises $13M Series A to drive mass adoption on Web3 applications and wallets via simple…

Web3Auth raises $13M Series A to drive mass adoption on Web3 applications and wallets via simple, non-custodial authentication infrastructure Led by Sequoia Capital India with participants such as Union Square Ventures and Multicoin Capital, the Series A funding round advances our mission to eliminate seed phrases and make blockchain authentication decentralized and accessible to all.

We are excited to announce that we have raised $13M led by Sequoia Capital India, Union Square Ventures, Multicoin and others, to drive mass adoption in Web3 through simple, secure, and non-custodial auth infrastructure for apps and wallets. The round follows demonstrable success, with our infrastructure securing over 1M monthly users and 8M keys on wallets and applications such as Binance Extension Wallet, Ubisoft, Kukai, Skyweaver, and more. We are also proud to reveal Web3Auth, which kickstarts the decentralization of the Torus Network.

The Series A was raised with additional participation from FTX, Bitcoin.com, DARMA Capital, Chainstry, Hash, KOSMOS Capital, Kyros Ventures, LD Capital, Minted Labs, P2P Capital, Phoenix VC, Staking Facilities, YBB Capital, Moonwhale Ventures, and Decentralab.

For those who don’t know us, the Torus team (now better known as Web3Auth) has been working on improving onboarding and non-custodial key management in the crypto space, a critical problem that has quantifiably contributed to the loss of nearly 20% of all Bitcoin in circulation. Fear of loss and unfamiliarity with keys and seed phrases present a significant barrier to entry that deters non-technical users from exploring Web3 and participating in the ecosystem.

Introducing Web3Auth

Web3Auth is simple, non-custodial auth infrastructure for Web3 wallets and applications. Crypto-native users feel right at home: they can connect with the key management of their choice, be it Metamask, Phantom, Ledger, and so on. For new users, developers can use Web3Auth to build intuitive login flows, such as single sign-on with Google and Twitter, eliminating the need for users to directly interface with vulnerable public-private key pairs.

Under the hood, Web3Auth’s mainstream UX is powered by the Torus Network (which the name Torus now refers to exclusively). It enables users to access their wallet through multiple factors, leveraging things a user already has to secure those factors: their social login, existing devices, backups/passwords, and in the future even family. This immediately improves on the existing key management standard of seed phrases, which is essentially a single point of failure.
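
The multi-factor idea can be illustrated with a toy secret split: the key only exists when the factors are combined, so no single share is a point of failure. This is a minimal 2-of-2 XOR sketch for intuition only, not the Torus Network’s actual protocol, which distributes shares across a network of nodes with threshold reconstruction.

```python
# Toy 2-of-2 split: each share alone is uniformly random and reveals nothing.
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split `key` into two shares; both are needed to reconstruct it."""
    share_a = secrets.token_bytes(len(key))               # e.g. held by a login factor
    share_b = bytes(k ^ a for k, a in zip(key, share_a))  # e.g. held on a device
    return share_a, share_b

def reconstruct(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the shares back together to recover the original key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = secrets.token_bytes(32)
a, b = split_key(key)
assert reconstruct(a, b) == key   # both factors together recover the key
```

A production scheme would use threshold sharing (e.g. 2-of-3) so that losing any one factor, such as a device, does not lock the user out.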

For those who have known Torus before Web3Auth: Web3Auth consolidates our existing products and draws a clear distinction between the products we build and the underlying infrastructure, allowing us to take further steps towards decentralizing that infrastructure. Moving forward, the Torus Network will refer to the underlying infrastructure that supports our products, and Web3Auth will refer to the one-stop pluggable product that Web3 apps/wallets know & love.

Developers can integrate Web3Auth into their application with just a couple of lines of code and allow users to connect to any blockchain. You can start here.

Apart from the initial connection, Web3Auth also provides:

Customizable UI, even fully whitelabel
Plugging in your own existing auth and user base
Fiat-to-crypto aggregator, so users can choose the best fiat-to-crypto provider for your application and the user’s region
Incremental security and better redundancy, which drastically reduces support tickets for mainstream users
Broad platform support, including SDKs for iOS, Android, React Native and Unity

What will happen to existing users/developers and the Torus Network?

Existing users and developers on our current products (i.e. CustomAuth, OpenLogin, Torus Wallet) do not have to do anything; they can rest assured that these products will continue to be developed and supported as components of Web3Auth. No action is required from developers or end-users.

We will continue working on the underlying infrastructure to improve efficiency and security, but the priority for us is to further decentralize the Torus Network. This means major improvements to the capability for new nodes to join and leave the Torus Network, as well as additional features that formalize the interactions between dApps and nodes. Stay tuned for more updates on this over the next few months!

As usual, we’re always hiring and if you’re interested in solving some of the hardest problems in this industry, we welcome you to apply. If you’re an application or wallet looking to use Web3Auth, reach out to us at hello@web3auth.io, and let’s discuss!

Web3Auth raises $13M Series A to drive mass adoption on Web3 applications and wallets via simple… was originally published in Web3Auth on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Ontology Weekly Report (January 1–10, 2022)

Highlights

The Ontology 2022 Roadmap has been officially released. Our goal for 2022 is to make Ontology’s high-speed, low-cost public blockchain the blockchain of choice for Web3 applications. We are working hard to develop a more diverse and prosperous Ontology ecosystem. Ontology’s efforts in creating interoperability between different public chains, including Ethereum with the upcoming Ontology Ethereum Virtual Machine (EVM) launch, will provide the opportunity to incubate more high-quality Web3 applications in the new year, across a multitude of industries and services. Ontology will continue to vigorously develop ONT ID, ONTO, and other products, making Web3 available to everyone. Ontology is proud to be part of this rapidly changing landscape and is excited to share more updates in the future. Here’s to another year exploring the future together.

Latest Developments

Development Progress

We have completed the launch of Ontology’s EVM TestNet. Developers are welcome to conduct various application tests. Related documents can be obtained at the Ontology Developer Center.
We are 100% done with the survey on the improvement of the ONT and ONG decimals and 75% done with testing.
We are 72% done with the survey on the RISC-V simplified instruction set architecture.
We are 53% done with the Truebit challenge design survey.
We are 22% done with the Rollup VM design.

Product Development

ONTO App v4.1.5 was released, bringing support for Avalanche and NFTs on OEC, as well as FIO address registration and binding.
ONTO is hosting its very own NFT ONTOvaganza. This event rewards NFT holders from previous ONTO events with more NFTs. Each NFT comes with a different ONG airdrop quota and rights. Top winners can get up to 1,000 ONG from this event. The event is in progress, and the claiming of NFTs will end at 21:00 (Singapore time) on January 15th. Don’t forget to open ONTO and claim your NFT!
ONTO is hosting a campaign with Parking Infinity. Participants who complete related tasks have the opportunity to earn rewards. The event is in progress, with more than 1,100 participants.
ONTO hosted an NFT Giveaway campaign with MetaYoka. Participants who completed related tasks had the opportunity to earn NFT rewards. The event has successfully concluded, with more than 2,000 participants.
ONTO hosted a campaign with HyperJump. Participants who completed all the tasks had the opportunity to earn rewards. The event has successfully concluded, with more than 1,000 participants.

On-Chain Activity

123 total dApps on MainNet as of January 10th, 2022.
6,856,727 total dApp-related transactions on MainNet, an increase of 18,420 from last week.
16,928,849 total transactions on MainNet, an increase of 75,643 from last week.

Community Growth

5,539 new members joined our global community this week. Over the past four years, Ontology Harbingers have always been our most dedicated supporters, driving the growth of the global community. We welcome more new members to join us to experience and promote a more decentralized future together with Ontology.
We held our weekly Discord Community Call, with the theme “EVMs”. The Ontology EVM will be launched in Q1 2022. The EVM aims to establish seamless interoperability between Ontology and the Ethereum platform, and offer an inclusive experience to developers and users. The new VM will join a suite of existing VMs, which include the Native (Ontology) VM, NeoVM, and WasmVM. Community members believe that this is a crucial step in the realization of the Ontology Multi-VM. It adds to the opportunities provided by the WasmVM, which allows more traditional developers to deploy apps on the blockchain.
We held our weekly Telegram Community Discussion, led by Benny, an Ontology Harbinger from our Asian community. He talked with community members about ENS, including its application scenarios, how to participate in the ecology, and more. Community members discussed the differences between Ontology DID and ENS. For example, ENS currently focuses more on associated addresses, while DID focuses on a wider range of information and is more inclined to personal data.
As always, we’re active on Twitter and Telegram, where you can keep up with our latest developments and community updates.

Ontology in the Media

Gizmodo — “What Is Web3 and Why Should You Care?”

“In recent months, you may have come across a phrase growing in popularity: Web3. You might be wondering what it is, what it will mean for the future, and how exactly the third-generation internet differs from the first two. And now the dawn of Web3 is upon us. People define it in a few different ways, but at its core is the idea of decentralization, which we’ve seen with cryptocurrencies (key drivers of Web3). Key to this decentralization is blockchain technology, which creates publicly visible and verifiable ledgers of record that can be accessed by anyone, anywhere. The blockchain already underpins Bitcoin and other cryptocurrencies, as well as a number of fledging technologies, and it’s tightly interwoven into the future vision of everything that Web3 promises. The idea is that everything you do, from shopping to social media, is handled through the same secure processes, with both more privacy and more transparency baked in.”

Ontology has worked on bringing trust, privacy, and security to Web3 through providing decentralized identity (DID) and data solutions. From the very inception of the project, the aim has been to build the infrastructure required for users to interact with this new generation of decentralized applications, and that aim has not changed. 2021 draws to a close and 2022 promises to bring us closer to Web3 than ever before. Four years of work has positioned Ontology as a leader in the space, through the development of products focused on DID and data, and by taking a Multi-VM approach.

Want more Ontology?

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Ontology Weekly Report (January 1–10, 2022) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ocean Protocol

OceanDAO Round 13 Results


Below you can find the conclusion of our latest round of funding.

OceanDAO Grants

Hello, Ocean Community!

The OceanDAO is pleased to share the results of the 13th round of our community grants initiative:

A total of 200,000 OCEAN was available.
Conversion rate of 0.72 USD/OCEAN.
A final amount of 144,000 USD was available for grants.

All funds that were not granted will be recycled back into the treasury as part of our initiatives to continue leveraging treasury funds into greater outcomes.

Round 13 included 17 first-time projects and 14 returning projects requesting follow-up funding.

200,000 $OCEAN has been granted.
0 $OCEAN will be recycled back into the treasury.

Congratulations to all grant recipients! These projects have received an OceanDAO grant in the form of $OCEAN tokens.

For the full vote results please see the Voting Page.

You can view the expanded proposal details on the Round 13 Ocean Port Forum!

OceanDAO Round 14 and announcements will be live shortly. Ocean Protocol is dedicated to ever-growing resources for continued growth, transparency, and decentralization. Keep an eye out on Twitter @oceanprotocol and our blog for the full announcement and new highlights.

For up-to-date information on getting started with OceanDAO, we invite you to get involved and learn more about Ocean’s community-curated funding on the OceanDAO website.

We encourage proposers who did not win to re-apply and for every participant to vote for projects who applied for a grant.

Thanks to all proposers, participants, and voters who engaged in Round 13!

OceanDAO Round 13 Results

You can find the full overview on our Round 13 — Votes page.

Round 13 Rules

Proposals with 50% or more “Yes” votes received a grant, in descending order of votes received, until the “Total Round Funding Available” was depleted.
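
The allocation rule can be sketched as a small greedy pass over the ballot. The proposal names and vote counts below are made up for illustration; partial funding of the last project mirrors the “funded or partially funded” outcomes reported in the results.

```python
# Sketch of the round rule: >= 50% Yes votes qualifies; fund in descending
# order of Yes votes until the round budget runs out.

def allocate(proposals, budget):
    """proposals: list of (name, yes_votes, no_votes, amount_requested)."""
    eligible = [p for p in proposals
                if p[1] >= 0.5 * (p[1] + p[2])]        # 50% or more "Yes"
    eligible.sort(key=lambda p: p[1], reverse=True)    # most Yes votes first
    funded = []
    for name, yes, no, amount in eligible:
        grant = min(amount, budget)                    # partial fund if budget is low
        if grant > 0:
            funded.append((name, grant))
            budget -= grant
    return funded, budget                              # leftover is recycled

# Hypothetical ballot: A passes easily, B fails the 50% threshold,
# C qualifies but is only partially funded from the remainder.
funded, leftover = allocate(
    [("A", 900, 100, 50_000), ("B", 400, 600, 10_000), ("C", 300, 100, 80_000)],
    budget=100_000,
)
```

Any leftover budget corresponds to the OCEAN “recycled back into the treasury” in the round summary.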

Claiming your Grant

If your Proposal was voted to receive a grant, please note that starting with this round we implemented a new way for you to claim your grant.

Grant requests via request.finance have been deprecated.

You can find instructions on claiming your grant here.

There will still be a funding deadline of 2 weeks to claim your grant.

Funding Tiers — All other Categories (max per team):

1. New Project

Funding Ceiling: $3,000 USD
Requires: No one in your project has ever received a grant from OceanDAO. Open to all.
Benefits: Earmarked. Receive feedback during the application process. Introduced to related projects.

2. Existing Project

Funding Ceiling: $10,000 USD
Requires: You have completed 1 or more grants.
Benefits: Same as above. Receive promotion via Newsletter, Twitter, and other channels.

3. Experienced Project

Funding Ceiling: $20,000 USD
Requires: You have completed 2 or more grants.

4. Veteran Project

Funding Ceiling: $35,000 USD
Requires: You have completed 5 or more grants.
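
The tiers reduce to a simple lookup from the number of grants a team has completed to a per-proposal ceiling; a direct transcription of the table above:

```python
# Map completed OceanDAO grants to the tier's funding ceiling (USD).
def funding_ceiling_usd(completed_grants: int) -> int:
    if completed_grants >= 5:
        return 35_000   # Veteran Project
    if completed_grants >= 2:
        return 20_000   # Experienced Project
    if completed_grants >= 1:
        return 10_000   # Existing Project
    return 3_000        # New Project (no prior grants)
```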

Earmarks

“Earmarks” means that funds are available exclusively to the first three groups listed below, without having to compete, as an incentive to apply.

24,000 OCEAN for New Teams (non-outreach category)
12,000 OCEAN for New Teams (outreach category)
30,000 OCEAN for Core Tech Initiatives
134,000 OCEAN for remaining General Grants

The grant proposals from the snapshot ballot that met these criteria were selected to receive their $OCEAN Amount Requested to foster positive value creation for the overall Ocean ecosystem.

Voting opened on January 6th at 23:59 GMT.
Voting closed on January 10th at 23:59 GMT.

Proposal Vote Results:

31 proposals submitted
18 funded or partially funded
126 Unique Wallets Voted
3,902,721.359 $OCEAN voted Yes on proposals
110,620.8438 $OCEAN voted No on proposals
4,013,342.2028 $OCEAN tokens voted across all proposals (each OCEAN token was only able to be used once in voting)
200,000 $OCEAN has been granted
0 $OCEAN will be returned to the treasury

Proposal Details

General Grants

Data Whale: Build and Launch a Data Token mobile application integrated with Ocean Protocol Market. Develop a greater awareness about the Ocean Data Market through a new website.

Evotegra: Extend the round 8 promotional 100k traffic dataset with annotations for licence plates and faces to enable the creation and evaluation of automated anonymization algorithms and conduct a data economy poll among members of the German AI association.

Algovera: Making it easier for data scientists to build data science apps on Ocean by integrating with existing frameworks, running weekly hacking sessions for app building and inviting grant proposals for data science projects on Ocean.

Lynx: Building a privacy-preserving backend data management system for real-time biometric user data collected through interfaces and wearables, with access to a marketplace of algorithms for analysing this data, combined with safe two-way interactions with third parties (from metaverse experiences to “digital medicine”) and the potential for tokenized rewards.

RugPullIndex is helping Ocean Protocol users to invest in data safely.

mPowered: Improving Search Engine Discoverability for the current Ocean Data Marketplace in order to easily find datasets and algorithms that have been published in main net or in bespoke Data Marketplaces that other teams have developed.

Ocean Missions is helping to onboard data scientists and developers to Ocean Protocol. We connect via outreach activities and guide them through the on-boarding process, creating data assets from on-chain data and publishing them to Ocean Market.

Coral Market is integrating automated decentralized file storage workflows for IPFS and Filecoin with Ocean Market data publishing features.

Athena Data Brokerage Project is building a data brokerage whose primary objective is to drive inflows of data and datasets to the Ocean platform through partnerships with data producers and data scientists, while spreading awareness through professional networking and social media.

HealthClaims is building a real-time digital health verification record on Ocean: an intelligent healthcare platform that would eliminate medical prescription and insurance claim fraud through real-time on-chain digital verification.

VideoWiki is developing a video data union of stock media that creators will upload and sell to educators to include in their videos. This project brings a decentralized approach to content ownership and sharing.

New Outreach

[The Data Economist] “The Ocean Minute” Video Series: an ongoing series of brief videos that expands awareness of Ocean Protocol by summarizing recent happenings in the Ocean Ecosystem and engages the community to curate and consume content for these videos.

The Phoenix Guild: Increasing Diversity in the Ocean Ecosystem. The project aims at increasing women developer participation in the Ocean Ecosystem.

New Entrants

FitCoral is developing a framework for a data-staking mechanism for fitness data logged on a mobile application. This is a one-stop solution for fitness, wellbeing, and community involvement on a platform free of the inconvenience of traditional web2 solutions.

DIAM is building a web-platform dedicated to building a decentralized marketplace for invisual art by leveraging Blockchain technology and by tokenizing data, thus allowing an open, collaborative, non-discriminating economic environment for all participants in the invisual art market.

FELToken is creating a decentralized and more secure solution for federated learning while anonymizing data providers. The platform will connect owners of the data with scientists to train their machine learning models while preserving the privacy of the data.

ITRMachines is developing a wrapper library that will allow direct integration between the Ocean marketplace and the TensorFlow.js library for artificial intelligence. The library will allow AI developers to quickly access datasets from the Ocean marketplace and integrate them using TensorFlow.js to implement their custom AI models seamlessly.

DataBounty is building a platform that allows data buyers to quickly and efficiently find the data they need. DataBounty will be built on top of Ocean Protocol, so all data transactions are secure and transparent.

OceanDAO Ecosystem

Continue to support and track progress on all of the Grant Recipients inside of Ocean Pearl or view our Funding History page to learn more about the history of the grants program.

Much more to come — join our Town Halls to stay up to date and see you in Round 14. Thank you!

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

OceanDAO Round 13 Results was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Imageware

Biometric Security in Airports: How Will It Change the Airport Space?


With millions of people boarding flights every day, customer data and experiences are the most important elements to the travel industry. As airports increasingly ramp up to meet the digital age, they are more prone to cyberattacks and data breaches. One such incident occurred when Heathrow airport was fined £120,000 when they lost sensitive data stored on a memory stick, which included the Queen’s travel history and passport numbers.

To prevent such data leaks and avoid heavy fines, airports are increasingly adopting biometric security.

Why Are Biometrics Used in Airports?

The technology associated with air travel has advanced considerably in recent years, and biometrics have given passengers a faster, safer end-to-end experience. It may go unnoticed at first, but biometric technology is used widely in air travel, from filling in a visa application from the comfort of your home to automated barrier control at the airport.

Biometrics can provide a contactless airport experience using facial recognition, both for passenger convenience and to address growing health concerns around contact-based methods. Biometric technology identifies individuals by measuring physical attributes such as fingerprints, voice, and facial features.

Biometrics for airports can be used for security checks, luggage, boarding, and check-in facilities. Implementing biometrics at airports helps automate the verification process, and sharing a digital identity can expedite the immigration process.

Benefits of Using Biometric for Airports

Biometric authentication and its use cases in the travel industry and airports provide several advantages including:

High Security

Biometrics give airlines greater assurance that a person is truly who they claim to be, because identification is based on tangible, real-world traits tied to the passenger. By contrast, nearly any username, password, or PIN can be compromised, and a data breach can leak that sensitive information to hackers.

Improved User Experience

The internal implementation of biometric technology may be difficult, but it provides a smooth experience for passengers at the airport. Placing a finger on a scanner to gain border access is much easier and faster than going through a long physical verification process.

Non-Transferable and Unique

Every individual has a unique set of biometrics that is practically impossible to duplicate with current technology. Biometrics are also non-transferable: presenting biometric data at an airport requires the individual to be physically present.

4 Biometric Technologies Used at Airports

There are currently four biometric technologies being used at airports. These include:

Facial Recognition

Facial recognition identifies facial features such as distance between eyes, eye shape, nose shape, and more, through stored facial patterns and data to find a match. Facial recognition enhances security, is contactless, and is easy to integrate. 

Fingerprint Recognition

Fingerprint recognition automatically identifies someone based on a comparison between a stored and live fingerprint. Fingerprint recognition is one of the most popular biometrics and is being implemented at airports all over the world. 

Iris Recognition

Iris scanning or iris recognition is the process of using infrared light to take a high contrast image of your iris. Similar to fingerprints, iris patterns are unique to each individual and provide exceptional accuracy. Iris recognition is another contactless biometric method which is considered to be one of the most accurate.

Palm Recognition

Palm recognition evaluates the unique patterns and characteristics in the palms of an individual’s hands. The systems use a scanning device that processes data from a digital photo of an individual’s hand and compares it to the stored biometric template in the database. Many airports have started deploying palm recognition as a part of their authentication processes. 

Biometric Solutions for Airports with Imageware

Imageware develops identity solutions that can provide comprehensive and easy-to-use applications to create inviolable photo identification and access credentials. With Imageware’s solutions, you can strengthen your biometric security for airports by:

- Tracking employee data
- Identifying and verifying credential holders through biometrics
- Providing access control systems for seamless flow

The post Biometric Security in Airports: How Will It Change the Airport Space? appeared first on Imageware.


Torus (Web3 DKMS)

How Binance Chain Extension Wallet Removes Seedphrases with Web3Auth

Web3Auth has allowed Binance Smart Chain users to seamlessly create and manage their extension wallets with their social accounts.

About Binance Chain Extension Wallet (BEW)

Binance Chain Extension Wallet is the official Binance cryptocurrency wallet for accessing Binance Smart Chain, Binance Chain, and Ethereum. The extension allows users to securely store crypto and connect to thousands of projects across different blockchains. It currently supports Chrome, Firefox, and Brave.

The Story Behind the Integration

Key management is a long-standing problem in crypto. Users have lost crypto worth over $150bn (more than the GDP of several nations) due to poor management of their private keys.

The Binance Smart Chain (BSC) ecosystem knew how important it was to make key management and onboarding easy for mainstream crypto adoption. Especially with the increasing adoption of decentralized apps, mismanagement of private keys was rampant, leading to significant financial loss.

The search for a perfect non-custodial key management solution

To fix this, the BSC community started evaluating multiple private key management solutions. A few had made significant progress in streamlining the user onboarding experience but had long been plagued by the trilemma of sacrificing security or redundancy for convenience. Many of these systems also had points of centralization that reduced censorship resistance.

They were then introduced to Web3Auth.

Onboarding thousands of users through Web3Auth

Binance Chain Extension Wallet (BEW) introduced a new seamless and secure user onboarding process through Web3Auth in early 2021 and has since onboarded hundreds of thousands of users.

Designed with a simple, straightforward experience for mainstream users, the onboarding flow combines OAuth logins with key management that reduces account loss while retaining the non-custodial guarantees that the decentralized ecosystem requires.

How Web3Auth handles users’ private keys

Web3Auth is a threshold key management model that solves the trilemma without sacrificing user experience, end-user autonomy, or security. The model implements Shamir's Secret Sharing (SSS) across the user's device, a private input, and the wallet service provider.

The user's private key is split into multiple factors, and these factors are used to reconstruct the original secret key. As with 2FA systems, as long as the user has access to 2 of their 3 factors, they can retrieve their private key and gain access to the wallet.
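The 2-of-3 reconstruction described above can be sketched with a toy Shamir's Secret Sharing implementation. This is illustrative only: the field size and share handling are simplified assumptions, and a production system such as Web3Auth's would use audited cryptographic libraries rather than this demo code.

```python
# Toy 2-of-3 Shamir's Secret Sharing over a prime field (demo only).
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo secret

def split(secret, n=3, k=2):
    """Split `secret` into n shares; any k of them reconstruct it."""
    # A random degree-(k-1) polynomial with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = split(123456789)
assert reconstruct(shares[:2]) == 123456789           # device + private input
assert reconstruct([shares[0], shares[2]]) == 123456789  # device + provider
```

Any two shares suffice, so losing one factor (say, the device) does not lock the user out, yet no single party holds the full key.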

BEW — Web3Auth pave the way for crypto’s future

BEW's partnership with Web3Auth ensures that no user will lose access to their valuable crypto just because they forgot or misplaced their private keys. This approach paves the way for mainstream adoption of crypto across the industry. Reach out to Web3Auth to learn more about how you can integrate this seamless user onboarding experience into your application!

How Binance Chain Extension Wallet Removes Seedphrases with Web3Auth was originally published in Web3Auth on Medium, where people are continuing the conversation by highlighting and responding to this story.


How Kukai Reached a Million User Logins with Web3Auth SDK

Kukai: A go-to wallet for leading NFT projects on Tezos

About Kukai

Kukai is the longest-running web-based Tezos wallet. It allows you to manage your tokens, NFTs and digital assets in a single easy-to-use and secure platform. As a Tezos wallet, Kukai includes the ability to store, transfer and delegate your Tezos (XTZ) tokens, so you can easily receive XTZ rewards through staking. Kukai supports the fast-growing Tezos ecosystem with features like an NFT gallery, DeFi token support and more. Kukai has been pivotal to the success of Ubisoft, HicEtNunc, OneOf, and other notable NFT launches.

The Origin

The origin of Kukai goes back to 2018, when it started as an open-source community project. The team spent the initial days figuring out the technical aspects and focused on security. Once the tech was sorted and the wallet started gaining traction, the team realized how integral the onboarding experience was to their goal of acquiring millions of mainstream users.

That’s when Web3Auth came into the picture.

Deciding on Web3Auth

Kukai discovered Web3Auth at the perfect time when they decided to redesign the wallet user experience to drive user adoption. The ease of onboarding provided by Web3Auth’s social logins and key management solution made a lot of sense. The primary reasons they chose to integrate Web3Auth SDK were:

- The decentralised, non-custodial nature of the private key handling
- Social logins to onboard blockchain non-native users
- Web3Auth's Lookup feature to send crypto/NFTs directly to social accounts
- The single-click setup, the documentation, and the dev support throughout

The Kukai team was evaluating other solutions as well but did not proceed with them because of their inflexibility, low maturity, and centralized method of managing keys.

“We have been so happy with Web3Auth, and whenever we do our regular reviews of other solutions we continue to be convinced we made the right choice for Kukai users”
— Klas Harrysson, Co-Founder, Kukai

Winning the NFT space with Web3Auth

Kukai took less than two weeks to integrate Web3Auth SDK. The detailed documentation and the code examples made the integration process straightforward and easy to complete. With Web3Auth developer support readily available, our engineers worked closely with the Kukai team to build the final product within a short developmental time frame.

It’s been one year since Kukai launched with Web3Auth handling their authentication and key management, and the SDK has processed more than a million social logins thus far.

“We never faced any roadblocks during integration because the Web3Auth team was always available at every step”
— Klas Harrysson, Co-Founder, Kukai

Kukai has onboarded hundreds of thousands of non-native crypto users with ease and attributes that success to the Web3Auth SDK, which makes the app's onboarding feel like any other web2 app's. The Tezos community loves Kukai, and Web3Auth is excited to support the Tezos ecosystem through Kukai 2.0 and the upcoming mobile wallet, where the team expects to triple its user base.

How Kukai Reached a Million User Logins with Web3Auth SDK was originally published in Web3Auth on Medium, where people are continuing the conversation by highlighting and responding to this story.


Let's Talk about Digital Identity

Nat Sakimura delves into Financial-Grade API (FAPI) – Podcast Episode 59

Let’s talk about digital identity with Nat Sakimura, Chairman at the OpenID Foundation.

In episode 59, Nat returns to the podcast to discuss Financial-Grade API (FAPI), the base security protocol for UK Open Banking, the Australian Consumer Data Standard, and Brazil's Open Banking. He discusses why and how FAPI was formed; what exactly FAPI is, including its technical characteristics; how FAPI is used today; and future plans for the specification, as well as how it connects to GAIN.

[Transcript below]

“The data economy needs a secure and interoperable data network. And we are finally getting there with FAPI and eKYC standards. So, you guys need to get ready for the ride. It’s the time. You need to start acting, start preparing for that.”

Nat Sakimura is a well-known identity and privacy standardisation architect and the representative partner of NAT Consulting. Besides being an author/editor of such widely used standards as OpenID Connect, FAPI, JWT (RFC 7519), JWS (RFC 7515), OAuth PKCE (RFC 7636), ISO/IEC 29184, and ISO/IEC 29100 Amd.1, he helps communities organise themselves to realise ideas around identity and privacy.

As chairman of the board of the OpenID Foundation, he streamlined its processes, bolstered its IPR management, and greatly expanded the breadth of the Foundation, which now spans over 10 working groups whose members include large internet services, mobile operators, financial institutions, governments, and more. He is also active in the public policy space, serving on various committees in the Japanese government, including the Study Group on the Platform Services of the Ministry of Internal Affairs and Communications and the Study Group on Competition in the Digital Market of the Fair Trade Commission of Japan.

Find Nat on Twitter @_nat_en and LinkedIn.

Nat also appeared in episode 54 of Let’s Talk About Digital Identity, discussing how OpenID Connect took over the world.

We’ll be continuing this conversation on Twitter using #LTADI – join us @ubisecure!


Podcast transcript

Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla.

Oscar Santolalla: Hello and welcome to the first episode of Let's Talk About Digital Identity for this new year, 2022. We have a very special guest who appeared in short clips in a very special episode we had recently, in October: a storytelling episode called How OpenID Connect Took Over the World. Today's guest was there. I'm talking about our super special guest, Nat Sakimura, one of the creators of the OpenID Connect standard.

Nat Sakimura is a well-known identity and privacy standardisation architect and a representative partner of NAT Consulting. Besides being an author and editor of such widely used standards as OpenID Connect, FAPI, JWT, and OAuth PKCE, among others, he helps communities organise themselves to realise ideas around identity and privacy. As the chairman of the board of the OpenID Foundation, he streamlined the process, bolstered the IPR management, and greatly expanded the breadth of the Foundation, spanning over 10 working groups whose members include large internet services, mobile operators, financial institutions, governments, etc. He has been serving on various committees in the Japanese government, including a Study Group on the Platform Services of the Ministry of Internal Affairs and Communications and a Study Group on Competition in the Digital Market of the Fair Trade Commission of Japan.

Hello, Nat.

Nat Sakimura: Hi, Oscar. Thanks for inviting me.

Oscar: Welcome. It’s a great pleasure talking with you, Nat. And well, Happy New Year. And let’s talk about digital identity.

Nat: Likewise, yeah.

Oscar: Fantastic. And I think we know a lot of your involvement, of course, you are leading one of the most important standardisation organisations in the digital space. We’d like to hear a bit about yourself, something personal, of course, how life led you to this world of digital identity.

Nat: OK. So you wanted to know, how was my journey to the world of identity?

Oscar: Yes.

Nat: Yeah. There are three reasons. They all converge to identity. In the middle of the 1990s, I was working on VPNs as an alternative to dedicated leased lines for securing communications. At that time, we were using dedicated leased lines in the financial institutions. And that actually amounts to the identification and authentication of machines and people, as well as integrity protection and encryption of the communication channel. So there you have identity.

Then the second reason was that I found out the hard way that access to our personal data is not granted. My daughter had a failed surgery when she was three years old, and for the re-operation I needed to access her medical record quickly, but access was not granted. So it was not useful for us. That was a surprise for me. We had neither the right nor the technology in place to access our own data in a meaningful timeframe. So I started working on the combination of identity and privacy then.

The third one was research commissioned by the Japanese Postal Agency, which has since been privatised but was still a government agency at the time, on the future of mail. As part of the research, I got acquainted with Drummond Reed, who dragged me into the standardisation part of the identity arena.

All of these combined paved the way for me to get into the digital identity standardisation work.

Oscar: I see, it's very interesting you mentioned these three points. One very technical, solving the security problem of that time with VPNs in the 1990s. Then you needed to access your daughter's personal data, right? And even though I'm sure it was stored somewhere, you were not able to access it in a timely manner, as you said.

Nat: Right.

Oscar: Yeah. And then I think the opportunity came, as you mentioned, at the Japanese Postal Agency to start working on the standard. So, yeah, super interesting. We have talked recently and you said you wanted to talk about FAPI, one of the main, well, children of the OpenID Foundation. We also interviewed Don Thibeau a bit more than one year ago. He talked a little bit about that standard and was very enthusiastic about it, and I'm sure you have much more to tell us. But please tell us, where did it begin? What were the challenges the OpenID Foundation found that led to the creation of FAPI?

Nat: Yeah. So, right after we finished OpenID Connect in 2014, I think I started preparation for FAPI in 2015. OAuth and OpenID Connect are designed to scale from the point of view of security. If you use only the very basics, you can achieve entry-level security quite simply. Since you're not using options, it was not that hard to achieve interoperability.

However, when you wanted to achieve more security, you had many options to choose from. And the different combinations not only led to different security properties, but also a lack of interoperability. We needed to bolt the options down so that we have known security properties as well as interoperability.

Oscar: Yeah, indeed, I think interoperability is a word that always comes up with any standard, because there are so many possibilities and so many different needs across industries and places in the world. FAPI is Financial-grade API, correct? And I understand that from the beginning, as the name says, it was for financial transactions. Please tell us a bit, what is FAPI in practice?

Nat: Yeah, FAPI is a general-purpose security profile of OAuth and OpenID Connect to protect APIs. It started off from the financial use case, which is where the letter F comes from. But later it became clear that it is general purpose, so we changed the name from Financial API to Financial-grade API. It can be used by, for example, healthcare, transportation, you name it.

So there are two notable characteristics of FAPI. Number one is that all the communication is integrity protected, so that it's tamper-evident; all messages are authenticated. The second characteristic is that no bearer token is used.

Oscar: OK.

Nat: Yeah. So even if the token is stolen, it cannot be used. You know, a bearer token is like a metro ticket: if you drop it and somebody picks it up, it can be used, right? Instead, in the case of FAPI, we decided to use something called sender constrained tokens. A sender constrained token is much like an international airline ticket, I mean the boarding pass. When you use it, it's got your name, and you need to show your passport, for example, or travel card, so that the person at the gate can confirm that you are the rightful user of that boarding pass. So that's the sender constrained token. In FAPI, we have stopped using bearer tokens completely and went all the way to sender constrained tokens.
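One concrete way a token is sender-constrained in FAPI deployments is mutual-TLS certificate binding (RFC 8705): the access token carries the SHA-256 thumbprint of the client certificate in its `cnf` claim, and the resource server recomputes the thumbprint from the certificate presented on the connection. The sketch below illustrates just that check; the certificate bytes and claim values are placeholders, not from the podcast.

```python
# Sketch of the RFC 8705 certificate-bound access token check: the token
# can only be used by the client that holds the matching certificate.
import base64
import hashlib

def x5t_s256(cert_der: bytes) -> str:
    """Base64url-encoded SHA-256 thumbprint of a DER-encoded certificate."""
    digest = hashlib.sha256(cert_der).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

def token_bound_to_cert(token_claims: dict, cert_der: bytes) -> bool:
    """Accept the token only if its cnf thumbprint matches the presented cert."""
    bound = token_claims.get("cnf", {}).get("x5t#S256")
    return bound is not None and bound == x5t_s256(cert_der)

cert = b"placeholder-der-bytes"  # stand-in for a real DER certificate
claims = {"sub": "client-123", "cnf": {"x5t#S256": x5t_s256(cert)}}

assert token_bound_to_cert(claims, cert)           # rightful sender
assert not token_bound_to_cert(claims, b"other")   # stolen token, wrong cert
```

A stolen token fails the check because the thief cannot present the bound certificate's private key on the TLS connection. (FAPI 2.0 also allows DPoP, a different proof-of-possession mechanism with the same effect.)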

So another way to look at it is that it fulfils four authentication properties. In many cases, when we talk about these things, we tend to focus on user authentication. But there are other kinds of authentication which are really important. Number one is strong server authentication. Number two, strong client authentication. Number three, strong message authentication. Combined with strong user authentication, you have a pretty complete security picture. So that's the security part.

And also, we wanted to address interoperability. To improve interoperability, we also needed to provide a testing framework. That is now provided by the OpenID Foundation as FAPI Certification. There are now over 150 certified implementations, and the number is increasing quite rapidly.

Oscar: And there is some sort of self-certification, correct?

Nat: Yes, it's self-certification. The test suite is also available as open source, so you can, for example, use it for continuous development; some banks are actually using it like that. When you are done with development and have passed all the tests, you can submit the result to the OpenID Foundation, and it will be published on the OpenID Foundation site, where it can be checked by your peers as well, so you know you have a conformant implementation.

Oscar: Excellent. Yeah. One thing I was not aware of until now: FAPI is Financial-grade API. When I heard that term for the first time, to me it sounded like, yeah, it's financial. But now you've clarified that initially it was Financial API and now it's financial-grade, so as secure as a financial transaction, I guess that's the idea, right? So now it's a more general-purpose, very secure profile of OpenID Connect and OAuth 2. That's interesting.

Nat: Yeah, that's correct. I think it was by 2017 or '18 that we had requests from both the healthcare sector and the international travel sector, you know, airlines, that wanted to use it. But their request was to change the name, because with it labelled as financial, it was going to be harder for them to drive industry-wide adoption. So we searched for a name. By then the acronym FAPI was quite popular, so we didn't want to change the acronym; we wanted to generalise the F part. And we couldn't come up with anything better than financial-grade. But the intent is that it is for any industry, and it's secure. That's what we wanted to communicate.

Oscar: Yeah. Thank you for that clarification. I think it's very important for those who are not so familiar with the term yet. Also, you mentioned that in FAPI, as a profile of OpenID Connect, you get rid of the bearer token. That already says a lot about how secure it is: the bearer token is simply not a possibility in this profile.

Nat: Yeah. So we have removed the bearer token completely in favour of the sender constrained token, and other communications, even those through the browser, are integrity protected. That's not the case in the usual OAuth or OpenID Connect.
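The "signed, so it cannot be tampered with" property Nat describes for messages passed through the browser (as in FAPI's JWT-secured responses) can be illustrated with a minimal JWS-style sign/verify. Note this demo uses a shared-secret HS256 signature purely for self-containment; FAPI itself mandates asymmetric algorithms such as PS256 or ES256.

```python
# Minimal JWS-style signing sketch: any change to the signed message
# invalidates the signature, making browser-relayed messages tamper-evident.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict, key: bytes) -> str:
    """Produce header.payload.signature, HMAC-SHA256 over the first two parts."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify(token: str, key: bytes) -> bool:
    header, body, sig = token.split(".")
    expected = b64url(hmac.new(key, f"{header}.{body}".encode(),
                               hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

key = b"shared-secret"
token = sign({"code": "abc", "state": "xyz"}, key)
assert verify(token, key)  # intact message verifies

# Flipping even the tail of the signature breaks verification.
tampered = token[:-2] + ("AA" if not token.endswith("AA") else "BB")
assert not verify(tampered, key)
```

The content is still visible unless encrypted, exactly as Nat notes next, but the receiver can detect any modification.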

Oscar: Yeah, that is correct.

Nat: Yeah.

Oscar: Sounds very interesting. I'll have to try it myself in the browser, to see what I can and cannot see. It's super interesting.

Nat: So unless the sender and the receiver are using the encryption mode, you can still see it, but you cannot tamper with it.

Oscar: Oh, yes.

Nat: Yeah. Yeah.

Oscar: Exactly.

Nat:  It’s signed. Yeah.

Oscar: Exactly. Exactly. The integrity, you cannot modify it. OK, excellent. Please tell us, you mentioned that initially it was for financial, then healthcare and aviation or transport were interested. What are the main use cases nowadays? Are those still the main ones, or is it even broader?

Nat: Yeah, as I've said, it's general purpose, so it can be used for any use case that requires a high level of security. Having said that, since we started off from the financial world, they are leading in implementation; they had about two years of head start. So the most popular implementation scenario currently is banking: accessing banking data, making payments, and things like that.

Oscar: OK. And I read some examples from the FAPI documentation, which is available. They mention that FAPI has already been made part of standards or regulations in a few different countries, UK Open Banking being the best known. Could you tell us about those? How has FAPI been implemented in different countries?

Nat: Yeah, so as you mentioned, the first mainstream usage was UK Open Banking. The nine biggest banks were mandated by the Competition and Markets Authority, the CMA, to open up their APIs, and FAPI was chosen as the protocol to secure them. I understand the first reason for that was to make banking data portable. But subsequently, PSD2 kicked in, and now it has expanded into payments.

And then after that, the Australian Consumer Data Standard adopted FAPI. As the name suggests, it's much wider than just banking; it's for any consumer data. Again, they started from banking, and if I remember correctly, they are now expanding into electricity and things like that.

And then Brazil followed. I actually had the opportunity to talk with the Vice Governor, I think, of the Brazilian Central Bank in London, just before the COVID outbreak. And we agreed that adopting a standardised approach, with testing and certification coming together, was going to be really vital for widespread and quick adoption. I think that was taken up by the Brazilian Central Bank, and they are now mandating it to the banks and TPPs, the third parties like FinTechs.

And then more recently, FDX in the United States announced its adoption, as did the Russian FinTech Association; both have been public about it. While FDX made a written statement in a blog post on their website, in the case of the Russian FinTech Association it's still verbal, but they've been speaking about it at conferences. So I think it's public knowledge now.

Of course, there are a few tweaks required for the Russian use cases. In current FAPI, for the purpose of narrowing down the options, we have specified a few encryption algorithms and signature algorithms to be used. But in the case of Russia, they need to use their own GOST standards for those algorithms. So we need to modify and extend it a bit. But that can be done as an extension. We are talking about that right now.

And then most recently, we have started joint workshop with the Berlin Group, which is very prominent in continental Europe to seek a way to converge. So, it’s getting good traction now.

Oscar: Berlin Group, I’m not so familiar. It’s also into financial or what it covers, Berlin Group?

Nat: So Berlin Group is mainly… how can I put it? It’s an association of mainly banks. And then there are advisory groups which include FinTechs and other industries as well. But although the name says Berlin, it’s not only Germany – members come from many other countries as well. So it’s quite widespread in continental Europe. And also, I heard some of the Berlin Group banks are now implementing FAPI, or have already implemented FAPI. So that’s an interesting development as well.

Oscar: And it’s also possible that institutions in other countries can implement it independently, right – without a regulation or a main association that, say, forces them to use it?

Nat: Sure. Sure. It’s a standard, and standards – unless required by regulation, as in some of those cases – are totally optional, totally voluntary. But at the same time, you know, anybody can actually use it. So you don’t have to wait for regulators to come in and mandate you to use it.

Oscar: So it’s very likely that in the new year, and in the coming months and years, we will hear of more associations like this – in banking or even other industries – and more regulations in other geographies embracing FAPI.

Nat: Yeah, so one of the reasons FDX adopted FAPI, I understand, is to move ahead. Well, I don’t know if this is correct in other countries, but in many countries, if an industry or industry body actually self-regulates, government regulations do not kick in and you get more freedom. So I think it’s actually better for the industry to adopt something like this without being mandated by the government.

Oscar: And what about GAIN, the project you are also heavily involved in, which was launched a few months ago? Does GAIN also require – I’m sorry – FAPI? Is FAPI part of GAIN? Could you tell us?

Nat: So GAIN itself is technology neutral, so…

Oscar: OK.

Nat: It’s not mandating any particular technology. It’s open, but FAPI can also be used. And certainly, that is envisaged by some of the participants in GAIN as the first step. So certainly, you can use FAPI and there are people who are using FAPI in the GAIN POC that they’re self-organising right now.

Oscar: Excellent. And could you tell us – leaving a bit outside of the main topic of FAPI, but could you tell us a bit what was also coming from in GAIN, what is happening lately and what is coming in the next months?

Nat: So, there are two distinct paths – at least two, or maybe, depending on how you count, three. One is a technical working group, which is hosted by the OpenID Foundation, and the others are the legal and business working groups, which are hosted by the IIF and the Open Identity Exchange. IIF stands for the Institute of International Finance.

And as to the technical part, we are currently working on a legal participation agreement so that we can create a safe space for all the participants who are taking part in the GAIN POC. We’re in the final phase of drafting that legal agreement, so that everyone can participate in the POC feeling safe.

Oscar: Excellent. So yes, the POCs are coming for GAIN. Excellent. And tell us – the core FAPI specification work has ended, but I know you’re still working on more. So tell us, what is coming next for the FAPI Working Group?

Nat: So the FAPI Working Group is very far from being done, right? We have ever more work to do. In the beginning, we didn’t create a profile for dynamic client registration, or for consent or intent grants – every jurisdiction calls it something slightly different, but grant management would be the core. We had enough work on our plate then, so we kind of postponed that work.

However, after the UK, EU, Australia and Brazil went online, it became quite clear – painfully clear – that there are compatibility and security headaches because there is no standardised way of doing it. So we decided to start work to address them. A new technical specification called Grant Management for OAuth is being drafted, and it has successfully gone through the first Implementer’s Draft process, so that it can be implemented without fear of being sued or something like that. People are just ironing out the kinks in the draft. We’ll probably need multiple rounds of this to get to finalisation, but it’s a good start, right?

And then, just in the last month or so, we have started on the dynamic client registration part. It’s been done slightly differently in the UK, Australia and Brazil as well, and, you know, we are gaining a lot of knowledge from that. So we are trying to codify the best practices around those. And hopefully that’s going to be very good guidance for other jurisdictions to follow.

Oscar: So that relates to the enrolment of customers or identity for customers.

Nat: Well, actually, no – that’s KYC, right? Dynamic client registration is this: end users use software to interact with, for example, banks, in the case of open banking, right? And this software needs to be registered with those banks, and we have to exchange keys and things like that. That can, of course, be done manually, but that doesn’t scale. So we needed to have some kind of programmatic way of doing it. That’s called dynamic client registration. And again, we have a lot of options there, and we need to narrow them down for interoperability and to have predictable security. So that’s what we are doing.
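The registration flow Nat describes is standardised in RFC 7591 (OAuth 2.0 Dynamic Client Registration), which the open-banking ecosystems build on. As a rough illustration – the metadata values below are invented examples, not taken from any particular ecosystem or from the episode – a registration request body might be assembled like this:

```python
import json

def build_registration_request(redirect_uris, jwks, software_statement=None):
    """Assemble an RFC 7591-style client registration payload.

    Open-banking ecosystems layer extra requirements on top of this
    (e.g. a signed software statement issued by a central directory).
    """
    payload = {
        "redirect_uris": redirect_uris,
        "token_endpoint_auth_method": "private_key_jwt",
        "grant_types": ["authorization_code", "refresh_token"],
        "response_types": ["code"],
        "jwks": jwks,  # public keys the client will use to sign requests
    }
    if software_statement is not None:
        payload["software_statement"] = software_statement
    return json.dumps(payload)

# A FinTech's software would POST this body to the bank's registration
# endpoint and receive a client_id in the response.
request_body = build_registration_request(
    ["https://tpp.example.com/callback"],
    {"keys": []},
)
```

Doing this over a protocol, rather than exchanging keys manually, is exactly the scaling point Nat makes above.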

When it comes to the KYC part, KYC is a big field, and it’s a headache for all the financial institutions. It hasn’t been too effective to date, but we’re trying, and one way to ease the pain there is to create a protocol which makes it easier and more secure to exchange that KYC data among participants, like in GAIN. That’s not being worked on by the FAPI Working Group, but by the adjacent eKYC and IDA Working Group.

Actually, if you stay on the FAPI Working Group call, without dropping off and dialling another number or going to a new URL, you will be seamlessly taken into the eKYC and IDA Working Group – it happens at the same time, in the same location; they are connected to each other. And that working group is working on the data and metadata side: that is, how to express verified attributes and how they were verified. For these attributes, in the first phase we are sorting out natural persons, but down the road we will be doing legal entities as well. Just asserting that this person has been verified as, for example, Nat Sakimura isn’t good enough – you really can’t trust it, right? You need more information: for example, how it was verified, why it was verified, which evidence was used, under which legal framework, when, and so on and so forth.

So this metadata, as we call it, needs to be sent with the data itself. The specification called OIDC for Identity Assurance – OpenID Connect for Identity Assurance – deals with that. Using that standard, you can get the data and metadata together in a signed format. So the combination of that standard together with FAPI will be very, very powerful.
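For a sense of what that data-plus-metadata bundle looks like, here is a minimal sketch of the verified_claims structure defined by OpenID Connect for Identity Assurance. The values are illustrative only, and exact field names vary between spec versions:

```python
# Sketch of an OIDC for Identity Assurance "verified_claims" payload:
# the verification block carries the metadata (how, when and under which
# legal framework the claims were verified) alongside the claims themselves.
verified_claims_payload = {
    "verified_claims": {
        "verification": {
            "trust_framework": "de_aml",     # legal framework used
            "time": "2022-01-25T10:30:00Z",  # when verification happened
            "evidence": [
                {
                    "type": "document",      # which evidence was used
                    "method": "pipp",        # physical in-person proofing
                }
            ],
        },
        "claims": {
            "given_name": "Nat",
            "family_name": "Sakimura",
        },
    }
}
```

In practice this object is delivered inside a signed ID token or UserInfo response, which is what makes the claims-plus-metadata bundle trustworthy.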

Oscar: Yeah, I can see it’s needed definitely. It’s powerful as you said. So yeah, fantastic job that the FAPI Working Group and this other adjacent group is doing together. Nat, just one final question, for all business leaders who are listening to this conversation we had, what is the one actionable idea that you would like them to write on their agendas today?

Nat: OK. So while it’s been talked a lot about – the data economy, that it will be over 5% of GDP, it’s yet to be seen. Why? The data economy needs a secure and interoperable data network. And we are finally getting there with FAPI and eKYC standards. So, you guys need to get ready for the ride. It’s the time. You need to start acting, start preparing for that.

Oscar: Yes. And that will affect all industries.

Nat: Yeah, it’s not only financial institutions. Yeah, it’s for everybody.

Oscar: Yeah, I couldn’t agree more. Thanks a lot, Nat, for this very interesting conversation – we got to know a lot more about FAPI and all the great work you do. And for people who would like to continue the conversation with you, or know more about the work you are doing, what are the best ways to find you on the net and get in touch?

Nat: OK. So, I’m on LinkedIn – if you search for Nat Sakimura, you can find me there. I’m on YouTube as well, so you can reach me there too, as well as Twitter. And of course, if you’re technical, I highly recommend you join the FAPI Working Group so we can share technical ideas in a safe IPR environment.

Oscar: Excellent. Again, thanks a lot Nat for this interview and all the best and Happy New Year!

Nat: Thanks, much obliged and likewise.

Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at ubisecure.com/podcast or join us on Twitter @ubisecure and use the #LTADI. Until next time.

[End of transcript]


Torus (Web3 DKMS)

Accelerating Blockchain Adoption Amongst Mainstream Users with Skyweaver

Sequence Wallet Implements Familiar Onboarding by Web3Auth Skyweaver — A Truly Immersive Game on the Blockchain

Skyweaver’s launch brings an extremely well crafted and beautifully designed NFT trading card game to the Ethereum blockchain. Built on Polygon’s L2, players can expect exhilarating gameplay and can build up their decks as seamlessly as in any AAA game title currently on the market. Most blockchain jargon is abstracted away from mainstream users, who are only introduced to NFT marketplaces when they look to trade their earned assets.

The game was years in the making, with over a year spent in meticulous beta testing, and for good reason. Horizon Blockchain Games, the team behind this masterpiece, wanted to ensure the best user experience for their players once they dived into Skyweaver’s enchanting universe. To realise that vision, Horizon discovered Web3Auth and integrated the key management solution as one of the signing keys of the Sequence Multi-Key Wallet used by Skyweaver players. The combination of Web3Auth and Sequence’s Multi-Key Wallet gives players the best user experience, security and access control to their wallets.
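To see why distributing key shares helps, here is a toy n-of-n XOR secret-sharing sketch – far simpler than the threshold DKG Web3Auth actually uses, but it demonstrates the core property that no single share reveals the wallet key:

```python
import secrets
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int):
    """Split key into n shares; all n are required to reconstruct.
    (Toy n-of-n scheme for illustration only.)"""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    shares.append(reduce(_xor, shares, key))  # last share closes the XOR sum
    return shares

def combine(shares):
    """XOR of all shares recovers the original key."""
    return reduce(_xor, shares)

key = secrets.token_bytes(32)
shares = split_key(key, 3)
assert combine(shares) == key       # all shares together recover the key
assert combine(shares[:-1]) != key  # any strict subset looks like random noise
```

A production DKG uses threshold schemes (e.g. t-of-n Shamir sharing) so that losing one share, such as a social-login-derived key, does not lock the user out of their wallet.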

Web3Auth Made User Onboarding Ridiculously Simple

“We onboard users via the Sequence Smart Wallet, a multi-key Web3 wallet that feels like Web2. Web3Auth’s Distributed Key Generation (DKG) powers the social login as one of those keys to make the login experience feel easy and familiar.”
— Peter Kieltyka, Chief Architect, Horizon Blockchain Games

Powered by Web3Auth, Sequence Wallet is able to offer its user base familiar passwordless email logins or social logins via Google, Facebook, Discord, Twitch or Apple ID. This greatly reduces user onboarding time, cuts down the support tickets needed to resolve confusion, and lets users plunge straight into playing Skyweaver.

Don’t Lose Out on Users Because of Poor UX

Sequence Wallet, built on top of Web3Auth, is one of the many clients that have incorporated user-friendly onboarding flows to accelerate blockchain adoption among mainstream audiences, with reduced friction and gentler learning curves. Curious minds aligned with our vision can reach out to Web3Auth and explore the tools we have built to onboard the next billion users into decentralised ecosystems.

Accelerating Blockchain Adoption Amongst Mainstream Users with Skyweaver was originally published in Web3Auth on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

The Ultimate Guide to Risk-based Authentication | Ping Identity


Risk-based authentication, otherwise known as RBA, is when an authentication system assesses the risk associated with each unique profile attempting to gain access to the network (or application). It analyzes the likelihood of an account compromise or other type of data breach with each login attempt, based not just on who is trying to log in, but on other information surrounding the circumstances of that login attempt (more on that in a minute).
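As a rough illustration of the idea – the signal names, weights and thresholds below are invented for the example; a real RBA engine such as Ping's weighs many more signals, often with machine learning – a risk score can drive the authentication decision like this:

```python
def risk_score(signals: dict) -> float:
    """Combine contextual login signals into a 0.0-1.0 risk score."""
    score = 0.0
    if signals.get("new_device"):
        score += 0.3
    if signals.get("unfamiliar_location"):
        score += 0.3
    if signals.get("impossible_travel"):
        score += 0.4
    if signals.get("anonymizing_proxy"):
        score += 0.2
    return min(score, 1.0)

def decide(score: float) -> str:
    """Map a risk score to an authentication outcome."""
    if score < 0.3:
        return "allow"          # low risk: plain login is enough
    if score < 0.7:
        return "mfa_challenge"  # medium risk: step up authentication
    return "deny"               # high risk: block the attempt

# A login from a known device and location sails through; a riskier
# one is stepped up or blocked.
assert decide(risk_score({})) == "allow"
assert decide(risk_score({"new_device": True})) == "mfa_challenge"
```

The point is that the same username and password produce different outcomes depending on the circumstances of the attempt.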

 


Okta

Using Azure Cognitive Services in a .NET App


Azure Cognitive Services is a collection of cloud-based AI products from Microsoft Azure to add cognitive intelligence into your applications quickly. With Azure Cognitive Services, you can add AI capabilities using pre-trained models, so you don’t need machine learning or data science experience. Azure Cognitive Services has vision, speech, language, and decision-making services.

In this article, you will learn how to use the Vision Face API to perform facial analysis in a .NET MVC application and store user profile pictures in Azure Blob Container Storage. You’ll also authenticate with Okta and store user data as custom profile attributes.

At the end of this post, you’ll be able to upload a profile picture in your app and get information about image error conditions, such as when zero or more than one face is detected or when your facial features don’t match a new picture.
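The error conditions described above boil down to a simple rule on the list of face detections the analysis returns. The tutorial implements this in .NET; the hypothetical helper below is just a language-agnostic sketch of that decision logic:

```python
from typing import Optional

def validate_profile_photo(faces: list) -> Optional[str]:
    """Given the list of face detections an image analysis API returns
    for an uploaded picture, return an error message, or None if the
    photo is acceptable."""
    if len(faces) == 0:
        return "No face detected in the uploaded picture."
    if len(faces) > 1:
        return "More than one face detected - upload a photo of just you."
    return None  # exactly one face: accept the picture

# One detected face passes; zero or many produce an error message.
assert validate_profile_photo([{"faceId": "abc"}]) is None
```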

Table of Contents

Create a new ASP.NET project
Secure your app with Okta
View user details
Edit your Okta profile
Add Azure Storage
Add Azure Cognitive Services
Perform facial analysis
Resetting profile pictures
Learn More About Entity Framework Core, ASP.NET Core, and Okta

Prerequisites

Basic knowledge of C#
.NET 5.0 runtime and SDK, which includes the .NET CLI
Your favorite IDE that supports .NET projects, such as Visual Studio, Visual Studio for Mac, VS Code, or JetBrains Rider
Okta CLI
A Microsoft Azure account (Azure free account)

Create a new ASP.NET project

Let’s create a new ASP.NET project. We’ll use the .NET CLI to scaffold the project.

Run the following commands to create the solution, the project, and add the project to the solution.

dotnet new sln -n OktaProfilePicture
dotnet new mvc --auth none --framework net5.0 -o OktaProfilePicture
dotnet sln add ./OktaProfilePicture/OktaProfilePicture.csproj

Now you can open the project in your favorite IDE and run the app to ensure everything runs correctly. If you are using the command line to build and serve, use the following commands.

dotnet build
dotnet run --project OktaProfilePicture

Secure your app with Okta

You’ll use Okta to secure the application quickly with Okta SDKs, so you don’t have to spin up an identity provider and deal with the tricky details of authentication.

Before you begin, you’ll need a free Okta developer account. Install the Okta CLI and run okta register to sign up for a new account. If you already have an account, run okta login. Then, run okta apps create. Select the default app name, or change it as you see fit. Choose Web and press Enter.

Select Other. Then, change the Redirect URI to https://localhost:5001/authorization-code/callback and use https://localhost:5001/signout/callback for the Logout Redirect URI.

What does the Okta CLI do?

The Okta CLI will create an OIDC Web App in your Okta Org. It will add the redirect URIs you specified and grant access to the Everyone group. You will see output like the following when it’s finished:

Okta application configuration has been written to: /path/to/app/.okta.env

Run cat .okta.env (or type .okta.env on Windows) to see the issuer and credentials for your app.

export OKTA_OAUTH2_ISSUER="https://dev-133337.okta.com/oauth2/default"
export OKTA_OAUTH2_CLIENT_ID="0oab8eb55Kb9jdMIr5d6"
export OKTA_OAUTH2_CLIENT_SECRET="NEVER-SHOW-SECRETS"

Your Okta domain is the first part of your issuer, before /oauth2/default.

NOTE: You can also use the Okta Admin Console to create your app. See Create a Web App for more information.

Okta CLI will configure a new OIDC application and save some secret properties in the solution folder in a file called .okta.env. We’ll use the values in the app.

Some of the values in the .okta.env are secret, and ideally, they should be in a trusted store or in Secrets Manager for local development. Since you’ll be making updates to configuration through this post, we’ll use App Settings so you can set up your configuration as you go along.

Open appsettings.Development.json to add a new section, Okta, with authentication details of your app as shown below. The Domain is the URL of the Issuer value without the path segments.

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  },
  "Okta": {
    "ClientId": "{clientId}",
    "ClientSecret": "{clientSecret}",
    "Domain": ""
  }
}

Let’s add authentication to the app. Open a terminal window, then run the following command to add Okta’s authentication middleware package.

dotnet add OktaProfilePicture package Okta.AspNetCore --version 4.0.0

To set up authentication services and register Okta’s authentication middleware services, open Startup.cs and add the following code in the ConfigureServices method before the services.AddControllersWithViews();.

services.AddAuthentication(options =>
{
    options.DefaultAuthenticateScheme = CookieAuthenticationDefaults.AuthenticationScheme;
    options.DefaultSignInScheme = CookieAuthenticationDefaults.AuthenticationScheme;
    options.DefaultChallengeScheme = OktaDefaults.MvcAuthenticationScheme;
})
.AddCookie()
.AddOktaMvc(new OktaMvcOptions
{
    OktaDomain = Configuration.GetValue<string>("Okta:Domain"),
    ClientId = Configuration.GetValue<string>("Okta:ClientId"),
    ClientSecret = Configuration.GetValue<string>("Okta:ClientSecret")
});

Next, in the Configure() method, add a call to use authentication before adding authorization. Your code will look like this:

app.UseAuthentication();
app.UseAuthorization();

If your IDE is having trouble figuring out references, add the following using statements:

using Microsoft.AspNetCore.Authentication.Cookies;
using Okta.AspNetCore;

Next, you’ll add a controller that handles login, logout, and protected data. Let’s add the methods for login and logout.

Create a controller named AccountController in the Controllers folder. Add the two methods to support login and logout. Your AccountController code will look like the following.

namespace OktaProfilePicture.Controllers
{
    public class AccountController : Controller
    {
        public IActionResult LogIn()
        {
            if (!HttpContext.User.Identity.IsAuthenticated)
            {
                return Challenge(OktaDefaults.MvcAuthenticationScheme);
            }

            return RedirectToAction("Index", "Home");
        }

        public IActionResult LogOut()
        {
            return new SignOutResult(
                new[]
                {
                    OktaDefaults.MvcAuthenticationScheme,
                    CookieAuthenticationDefaults.AuthenticationScheme,
                },
                new AuthenticationProperties { RedirectUri = "/Home/" });
        }
    }
}

We’ll create an empty view for the Account controller as a placeholder. Create a new folder named Account in the Views directory. The Account folder is where the views to display and edit your profile will go. Add the file Profile.cshtml to the new Account folder, and paste in the following code.

@{
    ViewData["Title"] = "User Profile";
}
<h1>Your Profile</h1>

Next we can update the initial view to add login and logout. Open Views/Shared/_Layout.cshtml and add the following markup above the <ul class="navbar-nav flex-grow-1"> tag:

@if (User.Identity.IsAuthenticated)
{
    <ul class="nav navbar-nav navbar-right">
        <li><p class="navbar-text">Hello, @User.Identity.Name</p></li>
        @* <li><a class="nav-link" asp-controller="Account" asp-action="Profile" id="profile-button">Profile</a></li> *@
        <li>
            <form class="form-inline" asp-controller="Account" asp-action="LogOut" method="post">
                <button type="submit" class="nav-link btn btn-link text-dark" id="logout-button">Sign Out</button>
            </form>
        </li>
    </ul>
}
else
{
    <a>@Html.ActionLink("Sign In", "LogIn", "Account")</a>
}

The parent div for the element we added needs some updates to the styles. Replace the

<div class="navbar-collapse collapse d-sm-inline-flex justify-content-between">

with

<div class="navbar-collapse collapse d-sm-inline-flex flex-sm-row-reverse">

If you run the app now, you can log in and log out, but there’s nothing to see yet.

View user details

We’ll use the Okta .NET management SDK to get and update user data. Add the Nuget package by running the following command.

dotnet add OktaProfilePicture package Okta.Sdk --version 5.3.1

The Okta management SDK requires an API token that we manually generate. Log in to the Okta admin dashboard. Then navigate to Security > API > Tokens tab and press the Create Token button. Enter a name for the token (I will use “OktaProfilePicture”) and press the Create Token button to finish creating the token. Make sure you copy the token value because you won’t be able to view it again:

Now we can add the token to the settings file. Open appsettings.Development.json, and add a new property named ApiToken inside the Okta section. Add your token value. Your settings file should look like this.

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  },
  "Okta": {
    "ClientId": "{clientId}",
    "ClientSecret": "{clientSecret}",
    "Domain": "",
    "ApiToken": "{token}"
  }
}

We want to add the Okta management SDK client to the Dependency Injection (DI) system to use it in the controller. Open Startup.cs and add the following code to ConfigureServices method before the method call services.AddControllersWithViews();.

services.AddSingleton((serviceProvider) => new OktaClient(new OktaClientConfiguration()
{
    OktaDomain = Configuration.GetValue<string>("Okta:Domain"),
    Token = Configuration.GetValue<string>("Okta:ApiToken")
}));

Add the following using statements if your IDE didn’t already add them for you.

using Okta.Sdk;
using Okta.Sdk.Configuration;

Now we can use the service in the controller. Open Controllers/AccountController.cs and add a private field of type OktaClient and add a constructor to set the field to the AccountController class. The DI system injects an instance of OktaClient that you will use. Your code will look like this.

private readonly OktaClient _oktaClient;

public AccountController(OktaClient oktaClient)
{
    _oktaClient = oktaClient;
}

In the AccountController class, create a private method to interact with Okta management SDK and retrieve user information.

private async Task<IUser> GetOktaUser()
{
    var subject = HttpContext.User.Claims.First(claim => claim.Type == JwtRegisteredClaimNames.Sub).Value;
    return await _oktaClient.Users.GetUserAsync(subject);
}

If your IDE is confused about which packages to use, add the following using statements.

using Microsoft.IdentityModel.JsonWebTokens;
using Okta.Sdk;

Let’s get your profile info. Create a new public method in AccountController named Profile to handle interacting with the Profile view. You’ll use the user model from Okta. Your code will look like the following.

public async Task<IActionResult> Profile()
{
    var user = await GetOktaUser();
    return View(user);
}

We want to protect the Profile method against non-authenticated calls. With the Okta middleware configured, you can add guards. Add the [Authorize] attribute directly above the Profile method.

Let’s update the profile view with user information. Open Views/Account/Profile.cshtml and replace the existing code with the following, which displays your user info and a profile pic. You won’t have a profile pic to show yet, but that’s coming up soon!

@model Okta.Sdk.IUser
@{
    ViewData["Title"] = "User Profile";
}
<h1>Your Profile</h1>
<div class="card" style="width: 36rem;">
    <div class="card-header d-flex justify-content-center">
        @if (ViewData["ProfileImageUrl"] != null)
        {
            <img class="rounded-circle border border-info" src="@ViewData["ProfileImageUrl"]" alt="Profile picture" width="300px"/>
        }
    </div>
    <div class="card-body">
        <h2 class="card-title h3">@Model.Profile.FirstName @Model.Profile.LastName</h2>
        <p class="card-subtitle h6 mb-2 text-muted">@Model.Profile.Email</p>
        @if (!string.IsNullOrEmpty(@Model.Profile.City) || !string.IsNullOrEmpty(@Model.Profile.CountryCode))
        {
            <p class="mt-3 card-text">
                <span>📍</span><span class="ml-2">@Model.Profile.City, @Model.Profile.CountryCode</span>
            </p>
        }
        <div class="d-flex justify-content-end">
            @* @if (ViewData["ProfileImageUrl"] != null) *@
            @* { *@
            @*     <a class="btn btn-outline-danger mr-4" asp-action="DeleteProfilePic">Delete Profile Picture</a> *@
            @* } *@
            @* <a class="btn btn-primary" role="button" asp-action="EditProfile">Edit Profile</a> *@
        </div>
    </div>
</div>

Now that we have a Profile method, we can update the main navbar. Open Views/Shared/_Layout.cshtml and uncomment the line of code that links to the “Profile” view.

You can view your profile, but the edit button doesn’t work yet.

Edit your Okta profile

We’ll add the support to edit your profile info, except for your profile picture. We need a new model that supports the form fields for editing profiles. Add a new file named UserProfileViewModel.cs to the Models directory. Copy and paste the following code into the file.

#nullable enable
using System.ComponentModel.DataAnnotations;
using Microsoft.AspNetCore.Http;

namespace OktaProfilePicture.Models
{
    public class UserProfileViewModel
    {
        [Required]
        public string? FirstName { get; set; }

        [Required]
        public string? LastName { get; set; }

        [Required]
        [EmailAddress]
        public string? Email { get; set; }

        public string? City { get; set; }

        [Display(Name = "Country Code")]
        public string? CountryCode { get; set; }

        [Display(Name = "Profile Image")]
        public IFormFile? ProfileImage { get; set; }
    }
}

Next open the AccountController class. We need to add the methods to support editing your profile. First, we need a method to populate the view for editing, and we need another method to handle the profile update. Because both methods relate to editing the profile, we’ll name them the same and create a method overload.

Add a public method EditProfile to AccountController to populate the view and guard it against unauthenticated users. The method looks like the code below.

[Authorize]
public async Task<IActionResult> EditProfile()
{
    var user = await GetOktaUser();
    return View(new UserProfileViewModel()
    {
        City = user.Profile.City,
        Email = user.Profile.Email,
        CountryCode = user.Profile.CountryCode,
        FirstName = user.Profile.FirstName,
        LastName = user.Profile.LastName
    });
}

Next, we’ll overload the EditProfile method with the user profile model. This method calls Okta management SDK to apply updates. Add the following method to AccountController.

[Authorize]
[HttpPost]
[ValidateAntiForgeryToken]
public async Task<IActionResult> EditProfile(UserProfileViewModel profile)
{
    if (!ModelState.IsValid)
    {
        return View(profile);
    }

    var user = await GetOktaUser();
    user.Profile.FirstName = profile.FirstName;
    user.Profile.LastName = profile.LastName;
    user.Profile.Email = profile.Email;
    user.Profile.City = profile.City;
    user.Profile.CountryCode = profile.CountryCode;
    await _oktaClient.Users.UpdateUserAsync(user, user.Id, null);
    return RedirectToAction("Profile");
}

Let’s add the view for editing. Create a new view named EditProfile.cshtml in Views/Account folder. Copy the code below and replace the contents of EditProfile.cshtml with it.

@model UserProfileViewModel
@{
    ViewData["Title"] = "Edit Profile";
}
<h1>Edit Profile</h1>
<div class="row">
    <div class="col-md-4">
        <form asp-action="EditProfile" enctype="multipart/form-data">
            <div asp-validation-summary="ModelOnly" class="text-danger"></div>
            <div class="form-group">
                <label asp-for="FirstName" class="control-label"></label>
                <input asp-for="FirstName" class="form-control" />
                <span asp-validation-for="FirstName" class="text-danger"></span>
            </div>
            <div class="form-group">
                <label asp-for="LastName" class="control-label"></label>
                <input asp-for="LastName" class="form-control" />
                <span asp-validation-for="LastName" class="text-danger"></span>
            </div>
            <div class="form-group">
                <label asp-for="Email" class="control-label"></label>
                <input asp-for="Email" class="form-control" />
                <span asp-validation-for="Email" class="text-danger"></span>
            </div>
            <div class="form-group">
                <label asp-for="City" class="control-label"></label>
                <input asp-for="City" class="form-control" />
                <span asp-validation-for="City" class="text-danger"></span>
            </div>
            <div class="form-group">
                <label asp-for="CountryCode" class="control-label"></label>
                <input asp-for="CountryCode" class="form-control" />
                <span asp-validation-for="CountryCode" class="text-danger"></span>
            </div>
            <div class="form-group">
                <label asp-for="ProfileImage" class="control-label"></label>
                <div class="custom-file">
                    <input asp-for="ProfileImage" class="custom-file-input" id="customFile">
                    <label class="custom-file-label" for="customFile">Choose file</label>
                </div>
                <span asp-validation-for="ProfileImage" class="text-danger"></span>
            </div>
            <div class="form-group">
                <input type="submit" value="Save" class="btn btn-primary" />
            </div>
        </form>
    </div>
</div>
<div>
    <a asp-action="Profile">Back to Profile</a>
</div>
@section Scripts {
    @{await Html.RenderPartialAsync("_ValidationScriptsPartial");}
    <script type="text/javascript">
        const fileInput = document.querySelector('.custom-file-input');
        fileInput.addEventListener('change', (event) => {
            const fileName = event.target.value.split("\\").pop();
            document.querySelector('.custom-file-label').textContent = fileName;
        });
    </script>
}

Lastly, open Views/Account/Profile.cshtml and uncomment a line of code towards the bottom of the file. Uncomment the line that displays the button to edit your profile - the line with <a class="btn btn-primary" role="button" asp-action="EditProfile">Edit Profile</a>.

Now you should be able to edit your profile information except for your profile picture. Next we’ll make it so your app persists your profile picture.

Add Azure Storage

To make your profile picture viewable, we need to persist the image somewhere. We’ll use Azure Storage to do this and use a custom attribute in Okta to store a unique identifier associated with the file in Azure. You’ll be able to upload and display a profile picture in the app.

First, we’ll create the custom attributes in Okta using the Okta admin dashboard.

Log in to the Okta admin dashboard. Then navigate to Directory > Profile Editor. You should see an item named User (default) in the list, which is Okta’s default user profile template. Select User (default) to open the Profile Editor. Press the +Add Attribute button.

You’ll add two string attributes, profileImageKey and personId:

Use “Profile Image Key” as the display name and profileImageKey as the variable name. Use “Person Id” as the display name for the second attribute and personId as the variable name.

See the image below for the “Profile Image Key” attribute inputs.

Create another attribute named personId. If you look at the “User (default)” user type attribute list, you should see your two new attributes listed at the bottom.

Next, we’ll create Azure Storage. If you don’t already have an Azure subscription, make a free account before you begin. We’ll walk through the steps using the Azure Portal, but if you are an Azure pro, feel free to use the Azure CLI or Azure PowerShell.

Once you have an Azure subscription, open the Azure Portal. Open the menu by pressing the hamburger menu on the left, and select Storage accounts. Press +Create to create a storage account. In the Create a storage account view, create a Resource Group if you don’t already have one for test projects. I named my Resource Group “OktaDemo”. Enter oktaprofilepicture as the storage name, select a region (Azure’s default selection is fine here), then select Standard performance and Locally-redundant storage (LRS) for redundancy. Press the Review + create button at the bottom as the default options for the remaining selections are acceptable. Press Create to complete creating the storage account.

Next, we need to get the access keys for integrating the storage account SDK in the app. In the oktaprofilepicture storage account, open Access keys below the Security + networking section in the nav menu. Then press Show keys and copy the first Connection string field. Now we can move to the code!

We’re going to add a new section for Azure resources to the appsettings.Development.json file in the solution to connect to the Azure resource. Open appsettings.Development.json and add a new key named “Azure” with a property called “BlobStorageConnectionString” like the following code snippet.

{
  ... Logging and Okta sections here,
  "Azure": {
    "BlobStorageConnectionString": "DefaultEndpointsProtocol=https;AccountName=oktaprofilepicture;AccountKey=eWlJUj.....A==;EndpointSuffix=core.windows.net"
  }
}

Next, we’ll need to add the packages and set up the service. Add two NuGet packages: one for Azure Storage, and one for integrating Azure clients into the DI system. Add the packages by running the following commands in the terminal.

dotnet add OktaProfilePicture package Azure.Storage.Blobs --version 12.9.1
dotnet add OktaProfilePicture package Microsoft.Extensions.Azure --version 1.1.1

Open Startup.cs and add the following code to the ConfigureServices method before services.AddControllersWithViews();.

services.AddAzureClients(builder =>
{
    builder.AddBlobServiceClient(Configuration.GetValue<string>("Azure:BlobStorageConnectionString"));
});

We can inject the blob service client into the AccountController and connect to the blob container. We’ll create the blob container that we’ll use in the constructor. Open Controllers/AccountController.cs and add the blob container client as shown below.

private readonly OktaClient _oktaClient;
private readonly BlobContainerClient _blobContainerClient;

public AccountController(OktaClient oktaClient, BlobServiceClient blobServiceClient)
{
    _oktaClient = oktaClient;
    _blobContainerClient = blobServiceClient.GetBlobContainerClient("okta-profile-picture-container");
    _blobContainerClient.CreateIfNotExists();
}

Update the Profile() and EditProfile(UserProfileViewModel profile) methods to upload and view the blob container image. First, let’s update the Profile() method.

The blob container’s access level is private, so we need to generate a shared access signature that creates a read-only, temporary URL that we can use in the view. We’ll also use the custom attribute profileImageKey to retrieve an existing image if one exists.

Replace the existing Profile() method with the following code.

[Authorize]
public async Task<IActionResult> Profile()
{
    var user = await GetOktaUser();
    var profileImage = (string)user.Profile["profileImageKey"];

    if (string.IsNullOrEmpty(profileImage))
    {
        return View(user);
    }

    var sasBuilder = new BlobSasBuilder
    {
        StartsOn = DateTimeOffset.UtcNow,
        ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(15)
    };
    sasBuilder.SetPermissions(BlobSasPermissions.Read);

    var url = _blobContainerClient.GetBlobClient(profileImage).GenerateSasUri(sasBuilder);
    ViewData["ProfileImageUrl"] = url;
    return View(user);
}

The EditProfile(UserProfileViewModel profile) method changes to the following.

[Authorize]
[HttpPost]
[ValidateAntiForgeryToken]
public async Task<IActionResult> EditProfile(UserProfileViewModel profile)
{
    if (!ModelState.IsValid)
    {
        return View(profile);
    }

    var user = await GetOktaUser();
    user.Profile.FirstName = profile.FirstName;
    user.Profile.LastName = profile.LastName;
    user.Profile.Email = profile.Email;
    user.Profile.City = profile.City;
    user.Profile.CountryCode = profile.CountryCode;

    if (profile.ProfileImage != null)
    {
        await UpdateUserImage();
    }

    await _oktaClient.Users.UpdateUserAsync(user, user.Id, null);
    return RedirectToAction("Profile");

    async Task UpdateUserImage()
    {
        var blobName = Guid.NewGuid().ToString();
        if (!string.IsNullOrEmpty((string)user.Profile["profileImageKey"]))
        {
            await _blobContainerClient.DeleteBlobAsync((string)user.Profile["profileImageKey"]);
        }

        await _blobContainerClient.UploadBlobAsync(blobName, profile.ProfileImage?.OpenReadStream());
        user.Profile["profileImageKey"] = blobName;
    }
}

If you run the app, you can select a picture from local files to add to your profile and see your profile picture on the “Profile” page.

Add Azure Cognitive Services

Finally, we get to check out Azure Cognitive Services for facial analysis. We’ll use facial analysis in two different ways — for face detection and face verification.

First, we need to create the Azure resource and get the access keys. Open the Cognitive Services Face resource page in the Azure portal. Press + Create to open the Create Face view. Select “OktaBlog” as the Resource group (or a Resource group of your choosing) and name the instance “OktaProfilePicture”. Since this is for demo purposes, I used the “Free F0” pricing tier. Press Review + create to create the resource. Open the “OktaProfilePicture” Face service instance and open Keys and Endpoint. You will need both the key and the endpoint.

Open the appsettings.Development.json file and add two new fields in the Azure section named SubscriptionKey and FaceClientEndpoint. Copy and paste the key and endpoint values from the Azure Keys and Endpoint view. The Azure section of your appsettings.Development.json will now look like the following.

{
  ... Logging and Okta sections here,
  "Azure": {
    "BlobStorageConnectionString": "DefaultEndpointsProtocol=https;AccountName=oktaprofilepicture;AccountKey=eWlJUj.....A==;EndpointSuffix=core.windows.net",
    "SubscriptionKey": "{FaceResourceKey}",
    "FaceClientEndpoint": "https://{FaceResourceName}.cognitiveservices.azure.com/"
  }
}

To add the packages, run the following command in the terminal.

dotnet add OktaProfilePicture package Microsoft.Azure.CognitiveServices.Vision.Face --version 2.8.0-preview.2

Let’s add the face service to the DI system to use it in the controller. Open Startup.cs and add the following code to the ConfigureServices method before the call to services.AddControllersWithViews();.

services.AddSingleton((serviceProvider) => new FaceClient(
    new ApiKeyServiceClientCredentials(Configuration.GetValue<string>("Azure:SubscriptionKey")))
{
    Endpoint = Configuration.GetValue<string>("Azure:FaceClientEndpoint")
});

Now we can incorporate the service into the controller. Open Controllers/AccountController.cs, inject the face service in the constructor, then connect it to a private instance that we’ll use throughout the controller class. Your code will look like this:

private readonly OktaClient _oktaClient;
private readonly BlobContainerClient _blobContainerClient;
private readonly FaceClient _faceClient;

public AccountController(OktaClient oktaClient, BlobServiceClient blobServiceClient, FaceClient faceClient)
{
    _oktaClient = oktaClient;
    _faceClient = faceClient;
    _blobContainerClient = blobServiceClient.GetBlobContainerClient("okta-profile-picture-container");
    _blobContainerClient.CreateIfNotExists();
}

The first thing we want to do with the cognitive service is ensure there’s only one face in an image. To do so, we’ll use a built-in method to detect faces.

In the EditProfile(UserProfileViewModel profile) method, we’ll add code to open the image file, detect the number of faces, and set an error if there isn’t exactly one face in the image. Replace the if (profile.ProfileImage != null) { await UpdateUserImage(); } statement with the following code.

if (profile.ProfileImage == null)
{
    await _oktaClient.Users.UpdateUserAsync(user, user.Id, null);
    return RedirectToAction("Profile");
}

var stream = profile.ProfileImage.OpenReadStream();
var detectedFaces = await _faceClient.Face.DetectWithStreamAsync(stream,
    recognitionModel: RecognitionModel.Recognition04,
    detectionModel: DetectionModel.Detection01);

if (detectedFaces.Count != 1 || detectedFaces[0].FaceId == null)
{
    ModelState.AddModelError("", $"Detected {detectedFaces.Count} faces instead of 1 face");
    return View(profile);
}

await UpdateUserImage();

RecognitionModel.Recognition04 is the most accurate recognition model currently available, and DetectionModel.Detection01 is a model that avoids detecting small and blurry faces.

Now, if you try running the app and upload an image with you and your friends, you’ll see an error.

Perform facial analysis

Next, let’s add the facial analysis. We need to handle two scenarios.

1. You’re adding a profile picture for the first time. This sets the baseline for the facial features to use in future comparisons.
2. You’re updating your profile picture. Facial analysis runs against the face in this image and compares it to the baseline.

Let’s handle the first scenario.

The Face API only stores the extracted facial features for 24 hours by default, so we need to set a baseline that we can refer to beyond the 24-hour window. We can store facial features within the “OktaProfilePicture” Azure Cognitive Services resource, which we’ll do for the first upload scenario.

In the EditProfile(UserProfileViewModel profile) method, replace the line of code that calls await UpdateUserImage(); with the code snippet below.

var personGroupId = user.Id.ToLower();
if (string.IsNullOrEmpty((string)user.Profile["personId"]))
{
    await _faceClient.PersonGroup.CreateAsync(personGroupId, user.Profile.Login,
        recognitionModel: RecognitionModel.Recognition04);

    stream = profile.ProfileImage.OpenReadStream();
    var personId = (await _faceClient.PersonGroupPerson.CreateAsync(personGroupId, user.Profile.Login)).PersonId;
    await _faceClient.PersonGroupPerson.AddFaceFromStreamAsync(personGroupId, personId, stream);
    user.Profile["personId"] = personId;
    await UpdateUserImage();
}
else
{
}

The PersonGroup is a container object for a group of people. We’re using your Okta user ID as the unique identifier for this group, so each group is truly individualized; other options could be group or department IDs. The PersonGroupPerson object is an individual within the group, so there is usually a 1:n relationship between PersonGroup and PersonGroupPerson. We’re also associating a single face with each PersonGroupPerson, although you can add multiple images of an individual to train the service to analyze that individual better. We’re keeping to a single image per person here to showcase how the service works.
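If you wanted to enroll more than one image per person, the sketch below shows one way it could look. EnrollAdditionalFacesAsync is a hypothetical helper, not part of this tutorial’s code; it assumes the injected _faceClient and an already created PersonGroup/PersonGroupPerson pair like the ones above.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.CognitiveServices.Vision.Face;

// Hypothetical controller helper (an assumption, not the tutorial's code):
// enroll extra photos of the same person so the Face service has more
// facial data for this PersonGroupPerson.
private async Task EnrollAdditionalFacesAsync(
    string personGroupId, Guid personId, IEnumerable<Stream> photos)
{
    foreach (var photo in photos)
    {
        // Each call extracts the facial features from one more image and
        // stores them as a persisted face for the person.
        await _faceClient.PersonGroupPerson.AddFaceFromStreamAsync(
            personGroupId, personId, photo);
    }

    // Training incorporates the newly added faces into group-level
    // operations such as Identify.
    await _faceClient.PersonGroup.TrainAsync(personGroupId);
}
```

This is only a sketch under the stated assumptions; a production version would also need error handling and limits on how many faces are stored per person.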

Now we want to handle the second scenario, where we perform facial analysis against the baseline facial features. We only want to allow the image to be updated when we have some measure of confidence that the new face image matches the baseline. Add the following snippet inside the else block from the code above.

var faceId = detectedFaces[0].FaceId.Value;
var personId = new Guid((string)user.Profile["personId"]);
var verifyResult = await _faceClient.Face.VerifyFaceToPersonAsync(faceId, personId, personGroupId);

if (verifyResult.IsIdentical && verifyResult.Confidence >= 0.8)
{
    await UpdateUserImage();
}
else
{
    ModelState.AddModelError("", "The uploaded picture doesn't match your current picture");
    return View(profile);
}

We used a confidence threshold of 80% for a positive match since we only enrolled one face image for comparison. With more enrolled images and training, you can raise the threshold.
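If you do enroll multiple images per person, you can wait for the PersonGroup to finish training before relying on the richer baseline. The following is a hedged sketch, not part of the original tutorial; it assumes the injected _faceClient and the personGroupId variable used earlier.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.CognitiveServices.Vision.Face;
using Microsoft.Azure.CognitiveServices.Vision.Face.Models;

// Hypothetical sketch: after adding new faces, kick off training and poll
// until it completes before performing further comparisons.
await _faceClient.PersonGroup.TrainAsync(personGroupId);

TrainingStatus trainingStatus;
do
{
    await Task.Delay(TimeSpan.FromSeconds(1));
    trainingStatus = await _faceClient.PersonGroup.GetTrainingStatusAsync(personGroupId);
} while (trainingStatus.Status == TrainingStatusType.Running);

// With several enrolled faces per person, a stricter check such as
// verifyResult.Confidence >= 0.9 becomes a reasonable choice.
```

The 0.9 figure is an illustrative assumption; the right threshold depends on how many faces you enroll and your tolerance for false rejections.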

Now you can upload a picture of yourself and then upload a photo of a friend to see what happens.

Resetting profile pictures

Sometimes you need to clear everything out and start over. I uploaded a picture of myself wearing oversized sunglasses and a hat, and the face service no longer recognized me. 😎

Let’s add in the delete functionality so you can reset everything.

In AccountController we’ll add a new method called DeleteProfilePic(). This method deletes the image from container storage, removes links to stored facial features, and clears out the custom attributes set in your Okta profile. Copy the code below and add it to the end of the AccountController class.

[Authorize]
public async Task<IActionResult> DeleteProfilePic()
{
    var user = await GetOktaUser();
    await CleanAzureResources();

    user.Profile["profileImageKey"] = null;
    user.Profile["personId"] = null;
    await _oktaClient.Users.UpdateUserAsync(user, user.Id, null);
    return RedirectToAction("Profile");

    async Task CleanAzureResources()
    {
        // remove image from blob storage
        await _blobContainerClient.DeleteBlobAsync((string)user.Profile["profileImageKey"]);

        // remove face data from the Face service
        var personId = Guid.Parse((string)user.Profile["personId"]);
        await _faceClient.PersonGroupPerson.DeleteAsync(user.Id.ToLower(), personId);
        await _faceClient.PersonGroup.DeleteAsync(user.Id.ToLower());
    }
}

Next, open Views/Account/Profile.cshtml and uncomment the remaining commented-out code. Doing so displays the “Delete Profile Picture” button.

Now you can reset your profile picture and upload that picture of you wearing a hat and sunglasses to test out the facial comparison ability.

Learn More About Entity Framework Core, ASP.NET Core, and Okta

I hope the tutorial was interesting and enjoyable for you. You can get the full source code of the project from GitHub. For more ASP.NET Core and Okta articles, check out these posts:

Rider for C# - The Best Visual Studio Alternative IDE
How to Secure PII with Entity Framework Core
Okta .NET management SDK

Be sure to follow us on Twitter and subscribe to our YouTube Channel so that you never miss any excellent content!

Tuesday, 11. January 2022

Finicity

The Rise of Open Banking: Five Open Banking Benefits Driving Consumers to Fintech


How people pay bills, shop online, open accounts and experience their finances is evolving quickly. Fintech innovations are altering the way consumers think about and relate to money. The COVID-19 pandemic has only accelerated the digital transition. The shifting reality of how finances are managed is underway. Bank branches and paper checks are an afterthought to a growing number of consumers. Eight in 10 Americans are linking their bank accounts digitally, using these connections to automate everyday tasks like paying credit card and utility bills, according to Mastercard’s new study, The Rise of Open Banking.

 

The sea change in consumer expectations is already in motion. With every leap in speed, security, and ease of use, open banking apps and services usher in the next generation of finance. The advent of the internet has trained the human mind to process far more information than in the past, and to do it quickly. When it comes to their personal finances, people want real-time data whenever they want it. Save them time and money, and enhance their financial health, or they won’t adopt your platform.

1. Saving More Time, Creating Less Work

Just a handful of years ago, a group of friends out to dinner would have to all reach into their wallets and purses, chip in cash, write a check or stack a pile of credit cards on top of the bill, and wait for the server to swipe them all. With the massive adoption and growth of P2P payment apps, these antiquated processes are quickly becoming a thing of the past. 

In real time, payments can be split between friends with a few taps on the mobile screen. Digital wallets allow a busy shopper to tap their phone against a payment terminal and breeze through checkout at retail stores. Encrypted credit card info auto-populates, saving time and reducing user error when shopping online. The countless hours these new processes save are the biggest driver for adoption: 59% of study respondents cited this as their number one motivator.

2. Saving Money

65% of Americans don’t know how much money they spent last month. Saving doesn’t come naturally to many people, and our emotionally-charged relationship with money certainly doesn’t help. So much emphasis is placed on new and better ways to spend that saving becomes just a lurking afterthought. Open banking innovators are addressing this huge segment of consumers with a rich slate of apps and services. AI and machine-learning engines do the heavy lifting of savings calculation, goal-setting and projection, raising the level of users’ financial literacy. 42% of Americans surveyed wanted help saving their money, and trust technology to give them the advantage they need.

Open banking technology powers some of the most effective fintech apps for saving money, using artificial intelligence and machine learning to take a deep dive into spending habits and cash flow. Some even gamify finances, injecting the process with fun and some healthy self-competition. 

3. Improving Financial Health

Open banking and AI make a powerful duo. Open banking connections to third-party financial service providers are flipping the data experience to favor the consumer. Anyone can download a financial management app, grant permission to access their bank accounts, and be guided easily through opening accounts, investment suggestions, and loan applications. This happens in moments, not hours or days. 

Fintech AI systems process massive amounts of data in milliseconds. App and service developers can leverage this power to analyze a consumer’s subscription payments, utility bills, direct deposits and loan obligations near-instantaneously. Machine learning and AI engines can use this rich, real-time data to suggest smart financial planning and investing options. It’s a growing expectation that has to be accounted for in app development. Exposure to the open banking ecosystem is raising the level of consumer financial literacy, and it’s happening quickly. As a result, financial decision-making can improve dramatically. Consumers are seeing the positive results of adoption in their bank balances: 35% of respondents to the survey say improved financial health is why they use fintech. 

4. Automate It!

No one leaps out of bed in the morning in wild anticipation of paying the electricity bill. Mundane, time-consuming financial tasks don’t actually have to absorb anyone’s time. Open banking allows easy setup and maintenance of connections to financial institutions, allowing consumers to set a variety of monthly payments and subscriptions to auto-pay. With the average consumer spending $273 per month on a mixed bag of small payments, automation saves a considerable amount of time, and it ensures payments aren’t missed. U.S. households average nine separate payments per month just for entertainment subscriptions. Add gym memberships, retail subscriptions and utilities to an ever-growing list of small, recurring transactions, and the benefit of utilizing technology to manage these payments becomes obvious. 26% of North Americans surveyed are attracted to technology to help them automate these little chores, freeing up their time for more important concerns. 

5. Getting a Better Holistic View of Finances

Gone are the days when consumers used file drawers stuffed with manila folders, neatly tabbed, in an attempt to keep checking the pulse of their financial health. Open banking data has spawned a wealth of apps and services that give consumers and small businesses an up-to-the-moment picture of their finances. Open banking connections across accounts and across financial institutions power the apps that consumers are using to take their financial temperature. The growing expectation is that individuals should be able to see what their money is doing and where it’s going, 24 hours a day. 18% of North American users said that tracking their money was a major driver for fintech adoption. 

 

To drill down even further into the use cases driving consumers to fintech, read the full study here.

 

The post The Rise of Open Banking: Five Open Banking Benefits Driving Consumers to Fintech appeared first on Finicity.


Monetha

The new Monetha app is now live: Shop online and earn rewards today!


We’re excited to announce that the new Monetha app has just been released on iOS and Android!

What’s in there:

Purchase from 100+ e-shops based on your interests and preferences. Accumulate high rewards. The more you complete your profile, the higher rewards you get! Fully encrypted and secure user profile. No one else, including Monetha, has access to it.

The most significant update in the new version is a marketplace with 100+ merchants where you can shop and earn rewards. Every time you shop with Monetha, your purchases will be automatically registered and turned into reward points. As soon as the merchant approves your purchase, the reward points earned become spendable.

What could you do with the rewards? We are working on three main ways to spend them that will be available in upcoming updates:

shopping with Monetha MTH tokens charity donations.

We continue to expand the Monetha marketplace with new stores. Don’t see your favorite shops? Let us know what they are, and we will do our best to get them featured in upcoming Monetha app updates! Email us at affiliates@monetha.io or contact us on Facebook, Twitter, Reddit, LinkedIn, or Telegram, whichever platform you use.

Stay tuned for more! Subscribe to our newsletter and follow us on social media to hear our latest news.

The Monetha team


Global ID

GiD Report#194 — The party is over

GiD Report#194 — The party is over

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. Check out last week’s report here.

This week:

The party is ending sooner than expected
Chart of the week — 10-year note edition
Regulatory wrangling continues
Chart of the week — Crypto fraud edition
The end of data retention
Stuff happens

1. The party is ending sooner than expected

Photo: Petr Vdovkin

Last week, we talked about the 6 stories that matter in 2022 — one of them being the end of the Fed’s easy money party.

That story is playing out sooner than expected. Here’s Axios:

Things change fast in a pandemic. And a rapidly changing economy has the Federal Reserve playing catch-up, Axios’ Neil Irwin writes.
Less than a month ago, the Fed made an abrupt pivot toward a more hawkish monetary policy stance. By the end of last week, the data was pointing toward an even faster withdrawal of stimulus.
Why it matters: Cheap money has become baked into the economy, so the Fed’s moves to take it away will bring risks of abrupt swings in markets that could spill back over into the economy.
By the numbers: Friday’s jobs data are Exhibit A. While initial headlines focused on soft growth in payroll numbers, the report points toward a labor market that has become exceptionally tight, contributing to already high inflation.
The unemployment rate is down to 3.9%. In the last economic cycle that level was not reached until May 2018 — at which point the Fed had already raised interest rates six times (now: zero). Wages are not only rising, but rising at an accelerating pace. Average hourly earnings rose 4.7% over the entire course of 2021, but at a 6.2% annual rate in the final three months of the year.
The bottom line: The labor market has gotten tighter, faster than most people, including at the Fed, thought possible just a few months ago. Now, policy is on track to follow suit.

Goldman Sachs now predicts the Fed will hike rates four times next year.

Once the Fed starts hiking rates, they’ll also start shedding their balance sheet, essentially sucking easy money out of the system.

When that money gets sucked out, it gets sucked out of riskier assets like certain equities and crypto.

We’re now entering that part of the cycle.

Relevant:

Goldman predicts the Fed will hike rates four times this year, more than previously expected
Goldman Now Expects Four Fed Hikes, Sees Faster Runoff in 2022
Federal Reserve puts wheels in motion for balance sheet reduction
2-year Treasury yield hits an almost 2-year high as Fed’s Bullard says first rate hike could come as soon as March
Fed officials discussed raising rates sooner and faster in 2022.

2. Chart of the week — 10-year note edition

Axios:

3. Regulatory wrangling continues

One of the other 6 stories that matter in 2022 was regulators.

The big story last week is that the CFTC fined Polymarket, concluding an investigation that was first reported back in October.

Here’s Coindesk:

Polymarket is a crypto betting service that allows users to pick one of at least two options on given trades, such as who might win the 2020 presidential election. According to the order published by the CFTC, Polymarket offered at least 900 such markets over the last 18 months.
These markets are swaps under federal law.
Polymarket cooperated with the investigation, according the CFTC’s press release, leading to a reduced fine. The company will stop offering markets by Jan. 14 and commit to making all funds available to users by Jan. 24, according to the order. Polymarket will also cease and desist any further violations of the CEA, though it doesn’t appear the company itself will be shut down.
Elsewhere, Berlin-based Neufund is preemptively shutting down a “viable security token business” due to lack of regulatory clarity (via /gregkidd):
Our concept-proving case — Greyp Bikes — made the full cycle, from issuing tokenized shares for retail investors, through corporate governance on blockchain, to the exit to Porsche and proceeds distribution via ERC20 tokens. Effectively, there were never any compliance issues, technical problems or security breaches. A European tech company fundraised through the issuance of securities using a decentralized technology. And more than 1,000 investors from dozens of countries participated. How cool.
Yet, we are closing the Neufund business.
Why? Because today, more than two years after Greyp fundraised, we still are unsure whether regulation allows us to repeat the Greyp fundraising model with other similar companies. Despite engaging with regulators for years, we didn’t manage to get out of the limbo of legal uncertainty.
And, I dare say, no DeFi (decentralized finance) company, aiming for regular investors on a bigger scale, has ever made it so far.

Not the most encouraging updates. As much as this is a technology game, it’s still going to be decided by regulations.

Relevant:

Via /gregkidd — Why We’re Shutting Our Successful Fundraising Platform
CFTC Fines Crypto Betting Service Polymarket $1.4M for Unregistered Swaps
Former SEC Chairman-Turned-Crypto Advisor Defends Move to Private Sector — Blockworks
US Congress to Hold Oversight Hearing on Crypto Mining: Report

4. Chart of the week — Crypto fraud edition

Axios:

From their report:

Illicit activities like cybercrime, money laundering and terrorist financing made up only 0.15% of all crypto transactions conducted in 2021, according to a new report from Chainalysis, a blockchain data platform.
Why it matters: This is a sign of crypto’s growing mainstream popularity — and a rebuke to critics who say digital currency is mainly for criminals.
Yes, but: Crypto crime is at an all-time high in absolute terms, the report found. It’s just that the growth rate of legitimate activity far outpaced the growth rate for illicit activity, Kim Grauer, director of research at Chainalysis, tells Axios.

Relevant:

Report: Illicit activity actually a tiny part of cryptocurrency use
Crypto Crime Hit All-Time High of $14B in 2021 as Prices Climbed: Chainalysis
Crypto Security Is Biggest Concern for Institutional Investors

5. The end of data retention

Here’s the German Minister of Justice (via /m):

“I reject data retention without any reason and would like to remove it from the law once and for all. It violates fundamental rights. If everyone has to expect that a lot about their communications will be stored without cause, then no one will feel free anymore”, said the Federal Minister of Justice in an interview with the Funke Mediengruppe.
“That’s why [German] courts have repeatedly stopped the use of data retention without a specific reason.”

Relevant:

Via /m — Germany: Data retention to be abolished once and for all.
German bulk data retention law isn’t legal — CJEU adviser — TechCrunch

6. Stuff happens

Via /easwee — NFT Project Bored Ape Yacht Club Spawns ‘Left-Facing’ Copycats
Bitcoin at the Bank: Mainstream Lenders Dabble in Crypto Outside the U.S.
Via /easwee — Today on Sick Sad World: How The Cryptobros Have Fallen
Aave Arc to Provide 30 Financial Institutions Access to Private Pools of DeFi Liquidity — Blockworks
Australian Open Serves Up NFTs Linked to Live Matches — Blockworks
Quentin Tarantino Pushes Forward with ‘Pulp Fiction’ NFT Plans Despite Lawsuit from Miramax — Blockworks
There’s an internet debate raging over whether Web3 will be a dud or game-changer
JPMorgan Says Ethereum’s DeFi Dominance at Risk Due to ‘Sharding’ Delays
Tencent Adds Digital Yuan Support to WeChat Pay Wallet: Report
Crypto Browser Brave Passes 50M Monthly Active Users
Meme stock traders’ next chapter
Scoop: NFT marketplace OpenSea in talks to buy Dharma Labs
WSJ News Exclusive | GameStop Entering NFT and Cryptocurrency Markets as Part of Turnaround Plan
The Investor’s Guide to DeFi 2.0 — Blockworks

GiD Report#194 — The party is over was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Quick Tips and Tricks for Auth0 Actions

Here’s a collection of Actions that you can write quickly to perform useful tasks.

Dock

NFTs and Verifiable Credentials: Using Dock to Support NFTs for Vintage Watches

Dock’s technology can be used alongside NFTs to reduce fraud and preserve the privacy of NFT owners.

Dock is a bespoke blockchain designed and built to be used for Decentralized Identifiers (DIDs) and Verifiable Credentials. One of the many important use cases where Dock’s technology can be used is alongside NFTs to reduce fraud and preserve the privacy of NFT owners. To help illustrate how this could work, we are sharing a specific use case with world-renowned vintage watch dealer, Fog City Vintage.

Before we dive into the use case, it’s important to understand the differences between NFTs and Verifiable Credentials, and how they can work together.

Verifiable Credentials contain cryptographically signed pieces of information that can be instantly verified and authenticated based on the identity and trustworthiness of the issuer of the Verifiable Credential. Information contained in the Verifiable Credential is specific to the holder of the credential, which can be an individual, entity, or even an object, and Verifiable Credentials aren’t considered transferable or able to change ownership. In addition, Verifiable Credentials are stored off-chain, typically in a holder’s wallet, and the only data on-chain is the DID and a hash pointing to the credential ensuring data privacy. Some common use cases for Verifiable Credentials include credentials used to prove identity, skills, and achievements.
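The off-chain/on-chain split described above can be sketched in a few lines. This is an illustrative sketch only, assuming Python and made-up DIDs and field values; it shows the general pattern (full credential in the holder's wallet, only a hash anchored on-chain), not Dock's actual implementation.

```python
import hashlib
import json

# A minimal Verifiable Credential held off-chain in the holder's wallet.
# Field names loosely follow the W3C VC data model; the DIDs and claim
# values here are hypothetical.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:dock:issuer123",          # hypothetical issuer DID
    "credentialSubject": {
        "id": "did:dock:holder456",          # hypothetical holder DID
        "achievement": "Certified Watch Appraiser",
    },
}

# Only a hash of the credential is anchored on-chain; the credential
# itself never leaves the holder's wallet.
canonical = json.dumps(credential, sort_keys=True).encode()
on_chain_hash = hashlib.sha256(canonical).hexdigest()

# A verifier recomputes the hash from the presented credential and
# compares it to the on-chain value to detect tampering.
assert hashlib.sha256(canonical).hexdigest() == on_chain_hash
```

Because only the hash is public, the chain can prove integrity without exposing any of the personal data inside the credential.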

Non-Fungible Tokens (NFTs) are blockchain-based tokens that contain digitally unique data and identifiers. They are designed to be transferable and can contain digital artwork, collectibles, or representation of real-world assets. Most NFTs today exist on the Ethereum blockchain and so their ownership is tied to a public Ethereum address. Dock has integrated with Ethereum to enable NFTs to be created directly on the Dock blockchain.

How can NFTs & Verifiable Credentials work together?

Although they are fundamentally different, there are some very interesting use cases for Verifiable Credentials to be used alongside NFTs. Verifiable Credentials can provide a privacy preserving yet cryptographically provable identity linked to an NFT, and therefore provide tamper-proof information about the identity of an NFT owner, creator, or buyer.

Fraud is rampant both within and outside of the blockchain space, and NFTs are no exception with scammers setting up fake accounts and selling fake NFTs. A Verifiable Credential can be used to verify the identity of all parties involved in creating, buying, and selling NFTs, to ensure the reputation and authenticity of everyone involved without divulging private information. For example, an NFT creator could share a Verifiable Credential that demonstrates a history of creating legitimate NFTs, or a seller can have a Verifiable Credential that shows a reputation score based on their previous NFT sales. Verifiable credentials issued via Dock use Zero-Knowledge Proofs to ensure data privacy, so the parties involved in the NFT transaction can prove their legitimacy without actually revealing their real-world identities or private information.
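As a toy illustration of that idea (not Dock's actual zero-knowledge scheme), the sketch below uses an HMAC as a stand-in for a real digital signature such as Ed25519: an issuer binds a reputation score to a pseudonymous DID, and any verifier can detect tampering without the credential ever naming a real-world identity. All names and keys are hypothetical.

```python
import hmac
import hashlib

# Stand-in for the issuer's private signing key (a real system would
# use asymmetric signatures, e.g. Ed25519, not a shared HMAC key).
ISSUER_KEY = b"issuer-secret-key"

def issue_reputation_credential(subject_did: str, score: int) -> dict:
    """Attest to a seller's reputation score, bound to a pseudonymous DID."""
    payload = f"{subject_did}|score={score}".encode()
    signature = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"subject": subject_did, "score": score, "signature": signature}

def verify(credential: dict) -> bool:
    """Recompute the signature and check it against the presented one."""
    payload = f"{credential['subject']}|score={credential['score']}".encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

cred = issue_reputation_credential("did:dock:seller789", 98)
assert verify(cred)       # untampered credential verifies
cred["score"] = 100
assert not verify(cred)   # any tampering breaks the signature
```

The buyer learns only that a trusted issuer vouched for the score attached to this DID; the seller's legal identity stays private.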

Another way that Verifiable Credentials can be used alongside NFTs is to prove qualifications in order to access an NFT or an NFT marketplace. For example, an artist may want their NFTs to only be accessible by their fans or specific members of a community who meet certain criteria. In this instance, Verifiable Credentials could be issued to community members demonstrating they meet the criteria and can then present the Verifiable Credential to enable them to receive the NFT. There could also be an NFT marketplace that is only appropriate for certain ages or populations; a Verifiable Credential could be used to prove that someone meets this criteria without compromising their identity in order to be able to transact in the NFT marketplace.

Dock and Fog City Vintage Watches

Fog City Vintage is a world-renowned vintage watch dealer who is using NFTs to address specific problems that exist in the $20 billion vintage watch industry. Vintage watches are inherently unique, highly tradeable, and very expensive, but the market for these watches is highly susceptible to fraud and there are no quality standards. Most trades are based on loose connections and it’s not uncommon for sellers to be defrauded out of tens of thousands, sometimes hundreds of thousands, of dollars. Tim Bender, CEO of Fog City Vintage, estimates that well over $1 billion is lost annually by buyers of fraudulent vintage watches.

Fog City Vintage is planning to use NFTs to represent the authenticity of the watch in order to prevent fraud and enhance the tradability of the watches. Fog City will be issuing NFTs that represent the vintage watches and provide an expert evaluation of the condition of the watches based on a new standard they are developing. By issuing NFTs, which can be done on the Dock blockchain, Fog City can provide watch buyers with the assurance that they are purchasing legitimate watches from reputable buyers.

Dock’s technology can be used to enhance Fog City’s watch marketplace in several ways. Verifiable Credentials can be issued to marketplace buyers and sellers that reflect their purchase history to eliminate fraudsters and ensure only legitimate parties are involved in the transactions. In addition, Fog City, as well as other vintage watch dealers, can have a Verifiable Credential that proves their own identity as reputable experts who are qualified to evaluate the condition of the watches. The watch dealers would issue certificates of authenticity for the watches based on their evaluations to ensure only genuine watches are being traded and therefore provide further trust and transparency in the marketplace.

Since Dock provides technology to issue anonymous credentials, all of the Verifiable Credentials shared in the transactions would protect the real identity of the marketplace participants without compromising the integrity of the marketplace. Verifiable Credentials and NFTs are stored in the holder's own digital wallets rather than in a centralized database, which removes the risk of holding customer data while giving customers control of their own identity and assets. Combining Verifiable Credentials with NFTs is a powerful and secure way to add trust and authenticity to marketplaces while ensuring the privacy of everyone participating.


Hello User - Pingidentity

Episode 13 with Katryna Dow

Description: Welcome to lucky episode number 13! Your new host Aubrey Turner, Executive Advisor at Ping, is thrilled to welcome Katryna Dow, CEO & Founder of the award-winning data platform Meeco. Katryna discusses Meeco’s mission to enable everyone on the planet access to equity and value in exchange for the data and information they share. She talks about why she saw a need for Meeco’s services, what we need to know as we approach a more “physigital” world, and how her vision all started with a Tom Cruise film.

Key Takeaways:
[1:34] Katryna talks about her journey of founding Meeco, and how she was inspired by Tom Cruise’s movie Minority Report. In early 2012 she sat down and wrote a manifesto, and asked the question: what would happen if everyday people had the power to make really good decisions on data, the way that social networks, government, and enterprise do? How can we create meaningful value and make better decisions with our data?
[8:12] Katryna shares some of her concerns around modern privacy and where she sees things evolving, both good and bad.
[9:35] Technology is neutral. It’s what we do with it that gives it bias and can make it either creepy or cool.
[11:33] What does Katryna mean when she says it starts with trust by design?
[17:22] The next wave may be just starting to bring people and things into the direct value chain, through wearables or IoT devices for example.
[18:31] How can we create better digital onboarding for employees, knowing that even post-COVID-19 our world will not go back to how it was in December 2020? One thing that Katryna is sure of is that we must lean into innovation rather than doing nothing and waiting to see.
[36:13] We must make sure we are paying attention to the misalignment between law and technology, especially when it comes to ethics and the safety of children growing up in a digital-forward world.
Quotes:
“I think the challenge for any kind of technology and regulation is a lag factor, not a lead factor.” —Katryna
“The line between creepy and cool is one of the things we are always trying to address from a technology point of view.” —Katryna
“There isn’t really the option to not find better ways of digitally engaging.” —Katryna

Mentioned in This Episode: Ping Identity, Aubrey Turner, Katryna Dow, Meeco, “How COVID-19 has pushed companies over the technology tipping point—and transformed business forever”

KuppingerCole

Mar 10, 2022: Eliminate Passwords with Invisible Multi-Factor Authentication

A high proportion of data breaches and ransomware attacks exploit stolen credentials. Eliminating passwords with multifactor authentication is an effective way to reduce the risk of unauthorized access to company networks, systems, SaaS applications, cloud infrastructure, and data. But not all MFA systems are created equal.

Feb 24, 2022: The Role of Identity Security in Zero Trust

“Zero Trust” is now a regular topic of conversation for most CISOs. At its core, Zero Trust is about the principle of continuous and careful access control, enforced at multiple points, for all users accessing network and system resources as well as data. This is nothing new in itself, but it brings a fresh focus to the question of what IT security and identity security to implement, and how: access must be controlled and monitored more extensively, in greater detail, and better.

Jolocom

International Thank You Day


Today, January 11, is the International Day of Gratitude. What better time then, to tell partners that you cherish their work and thank them for the inspiration they have given you? We at Jolocom reflect on amazing projects that became possible by joining forces with partners such as T-Labs, Bundesdruckerei, Stacks, and TIB – the Technical Information Library Hanover.  

It’s been a privilege to work with you and we hope to continue our work together. 

About Us 

Jolocom is a leading steward of self-sovereign identity (SSI) in Europe with a long-term commitment to open source and open standards. We bring a full SSI tech stack, including library, software development kit (SDK), SSI protocol and the Jolocom SmartWallet application, as well as our technical expertise, to the community. Our aim is to accelerate the deployment of DIDs and verifiable credentials, ensuring interoperability between available solutions and increasing the self-sovereignty of users.  

We’ve had the chance to engage with multiple partners in the past year, allowing us to broaden and deepen our SSI knowledge and, what’s more, partnering up in projects that mark a significant turn in the SSI world and increase engagement from both users and developers. 

 
Thank You to.. 

T-Labs/T-Systems (Deutsche Telekom) 

T-Labs is the Research & Development Department of Deutsche Telekom, focusing on bringing new technology trends to the market and delivering tangible results into Deutsche Telekom’s innovation portfolio.  

Starting in November 2017, Jolocom partnered with T-Labs, which brought together a group of blockchain startups to build a prototype operating stack, the idea being that enterprises can build a decentralized back-end in a matter of minutes. Jolocom is developing a modular solution that enables individuals, organizations and companies to get a self-sovereign digital identity. In the T-Labs prototype, Jolocom provides the tool to create and securely verify claims about identity. This allows for the modeling of trustworthy and complex relationships between entities (e.g. ownership structures of IoT devices with organizations/individuals). Modeling these relationships with a self-sovereign solution enables a frictionless bridge between different ecosystems and network environments. What we began with T-Labs in 2018, we are happy to continue from early 2021 into the middle of 2022 with T-Systems and others as part of our current project portfolios.

 
Bundesdruckerei 

Bundesdruckerei (BDR) is home to the digital experts supplying Germany’s ID cards and passports. The federal government’s security company works with a range of ID management solutions and technologies. One of these approaches made use of Jolocom’s identity library and wallet. In 2019, the BDR demonstrated the potential of decentralized self-sovereign identity solutions in NAME OF PILOT, where identity information remains under the complete control of the citizen via the Jolocom SmartWallet downloaded onto their mobile device.

More precisely, researchers at BDR merged the Jolocom software for decentralized identity and access management with the existing government IT infrastructure for verification and identification. The goal: creating one of the earliest proof-of-concepts (POCs) for a decentralized digital ID in Germany. 

This particular POC was an early example of an issuing and verification authority merging its existing technology with self-sovereign identity software. Jolocom’s software is fully open source. The part we played was adjusting our app interface to support BDR’s work.  

With more governments exploring the advantages of blockchain and, more specifically, self-sovereign identity solutions for common identity and access management challenges, we look forward to offering our open source library more widely. 

Furthermore, we have engaged with the BDR in another main project of which you might have heard: ONCE.  

Bundesministerium für Wirtschaft: ‘Showcase Secure Digital Identities’ projects ONCE, ID-Ideal and SDIKA 

The project is part of the competitive innovation program “Showcase Secure Digital Identities” (SDI) funded by Germany’s Federal Ministry for Economic Affairs and Energy (BMWi), in which a dozen consortia were invited to present their concepts in an innovation competition and four projects qualified for a fully funded three-year implementation phase.

Jolocom is a partner in three of these four SDI implementation projects, to which we will lend our expertise in self-sovereign identity, years of experience in developing digital identity wallets and the open-source Jolocom stack. 

ONCE is the first of these projects to launch, followed by ID-Ideal and SDIKA. A few words about each: 

ONCE aims to harmonize eID with SSI in user-controlled wallets. ID-Ideal has the goal of harmonizing trust providers and consumers from multiple networks and legal setups under one coherent trust framework. Finally, SDIKA pursues the realization of cross-use case identities in open ecosystems with widespread adoption. We thank our new and old partners for joining us on this path. 

All of these projects are funded and supported by the BMWi, and we are grateful for this huge opportunity to bring SSI to the individual citizen.

Technical Information Library Hanover 

The Technical Information Library in Hanover is the German specialist library for technology as well as architecture, chemistry, computer science, mathematics and physics. TIB is the world’s largest specialist library for technology and natural sciences. What’s more, TIB and Jolocom have both been early adopters and pioneers of distributed ledger technologies (DLTs) and the Web 3.0 approach in Germany. Multiple encounters within the DLT community have led to a growing desire to join forces so we can further develop interoperability, open standards and open access. Together, we created the Conference Digital Identifier Integration (conDIDi) project. The idea was to implement the SSI approach in the academic sector, while also increasing global collaboration and interoperability due to the gradual integration of different solutions (i.e. ORCID, ConfIDent and conDIDi) into one single system.  

Simply put, conDIDi seeks to reduce the management burden for both participants and organizers: automating the process of verifying that participants fulfilled requisite criteria, such as having paid fees, met requirements for paper submission, agreed to chair a panel, checked into the venue, etc., would reduce a large part of the bureaucratic, time-consuming and costly overhead of conference management. At the same time, automated exchange of verifiable participation credentials reduces the waiting time for academics to build their CVs and paper portfolios, and – more recently – for medical professionals to collect proofs of training and education requirements. ConDIDi is now an ongoing pilot project within TIB and we are excited to see what it will grow into and to support its continuous development in future collaborations with our partners at TIB.

Stacks 

Jolocom also partnered with Stacks, integrating the Stacks blockchain into Jolocom’s SSI solution.

The integration will enable any developer to easily add support for Stacks-based identities to their SSI service, and to build new SSI services (e.g. issuers, verifiers, mediators) with identities secured by the Stacks blockchain. Furthermore, the developed tools will also allow any user who owns a BNS name to easily derive a valid, globally resolvable DID and use it with various SSI services. Finally, the Jolocom SmartWallet will allow regular users to easily interact with SSI services secured by DIDs anchored on the Stacks network. 
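The name-to-DID derivation described above could look roughly like the following sketch. The `did:stacks-example` method string and the encoding are assumptions made for illustration; the real Stacks DID method defines its own identifier format and resolution rules.

```python
# Illustrative sketch of deriving a globally resolvable DID from a
# Stacks BNS name. "did:stacks-example" is a made-up placeholder
# method, not the actual Stacks DID method.
def bns_name_to_did(bns_name: str) -> str:
    """Map a fully qualified BNS name like 'alice.id' to a DID string."""
    if "." not in bns_name:
        raise ValueError("expected a fully qualified BNS name, e.g. 'alice.id'")
    name, namespace = bns_name.rsplit(".", 1)
    return f"did:stacks-example:{namespace}:{name}"

# A wallet holding the BNS name 'alice.id' could present this DID to
# any SSI service that resolves identities against the Stacks network.
print(bns_name_to_did("alice.id"))  # did:stacks-example:id:alice
```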

For instance, real-world activities providing any verifiable information for renting a bike or hotel room would happen in a much more seamless, secure, and privacy-preserving way. While the benefits of this technology will be globally available and relevant, we are applying to the Stacks Foundation with a time-sensitive opportunity to reach the European market with pilot projects that can really scale. 

We at Jolocom are looking forward to continuing our great partnerships in current and upcoming projects, as well as building many more partnerships in 2022.  

Wish to read more? Find more information on the projects here:

T-Labs: https://jolocom.io/de/press-release-t-labs-deutsche-telekom-announces-project-with-major-blockchain-startups/  

Bundesdruckerei:https://jolocom.io/de/blog/jolocom-self-sovereign-identities-at-work-in-bundesdruckerei-proof-of-concept-for-e-government/  

TIB: https://labs.tib.eu/condidi/  

The post International Thank You Day appeared first on Jolocom.


Ontology

BUIDL with Ontology Together!

There Are Many Opportunities for You to Join Our Global Team to Welcome the Tomorrow of Web3

At Ontology, we work hard to bring trust, privacy, and security to Web3. We believe that identity and data should be decentralized and self-sovereign. If you share our vision, to provide decentralized identity (DID) and data solutions to the new web we are building together, then we want to hear from you.

As the world moves towards the next iteration of the internet, we continue with our efforts to build the infrastructure required for a new generation of decentralized applications. Ontology aims to deliver a seamless experience for both developers and the users who interact with their projects. To do so we are looking for talented people, with the vision to match their abilities.

We are thrilled to have compiled a thorough list of all the positions we are looking to fill with awesome new team members! Whether you are looking for technical positions in development, or positions in marketing, business development, human resources… chances are, we have an opening for you.

If you or anyone you know is a perfect fit for any of the roles above, please feel free to contact us at careers@ont.io for more details. Our global Harbingers are also glad to help you onboard. Looking forward to meeting you and having you join us in our mission to bring true decentralization to identity & data practices!

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

BUIDL with Ontology Together! was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Indicio

How SSI Eliminates Friction, Adds Control for Travelers

PhocusWire The post How SSI Eliminates Friction, Adds Control for Travelers appeared first on Indicio Tech.

Monday, 10. January 2022

KuppingerCole

EIC Blog | Interview with Felix Magedanz


Single-factor authentication, such as a password alone, is considered bad practice. Passwordless authentication, done right, is not only more secure but also more convenient.

Learn more about the increasing demand, regulations, and use cases. Martin Kuppinger enjoyed a delightful conversation with Felix Magedanz from Hanko at EIC 2021 about the future of authentication.




Elliptic

Crypto Regulatory Affairs: DeFi Crypto Prediction Platform Reaches $1.4million Settlement With the CFTC

After a short pause during the holiday season, Elliptic’s Global Policy and Research Group is back to provide you with the latest news and analysis on all things crypto policy and regulation. We’d like to take this opportunity to wish a prosperous New Year to our readers and subscribers.



Nuggets

2022 is set to be an exciting year!


2022 is set to be an exciting year!

Last year was an incredible one for Nuggets. The need for a persistent, portable digital identity is now undeniable. Which sets us up for an even bigger 2022!

We joined the Blockchain Alliance

Nuggets is now a member of the Blockchain Alliance. This is a public/private coalition, made up of more than 30 blockchain companies, along with over 30 law enforcement and regulatory agencies worldwide.

The group provides a forum for open dialogue between these three groups. Their shared goal: to make the blockchain ecosystem more secure, and promote further development of this transformative technology.

We’re in good company. Other members include organisations like Blockchain.com, Circle, Coinbase, Kraken, Ripple, and many more.

Find out more here.

Trusted Transactions

Last year, tech giant Panasonic confirmed it had been hacked over several months, leaving customer information vulnerable. It’s just the latest in the almost endless list of such data breaches. Omnichannel solutions, trust and identity are urgently needed.

Enter Nuggets. We’ve published a report on how we can help companies enable trust in data and technology from the outset — through an approach called Trusted Transactions.

In the report, you’ll find:

The benefits for various verticals, including financial services, healthcare, gaming and travel
How Trusted Transactions create value at all levels, by building trust with customers and embedding that trust into all forms of data
The benefits in privacy, security, compliance, efficiency, and time to market
How we can massively reduce fraud and false positives

Trust is paramount. And verified self-sovereign digital identities are a silver bullet. Discover how Trusted Transactions can transform your organisation: read the full report here.

DCMS Trust Framework

We’ve been taking part in alpha testing for the UK Digital Identity and Attributes Trust Framework, and collaborating with DCMS on the development of the Framework.

This is a huge validation for Nuggets, and our market readiness. Read more here.

Myidentity

In Q1 of this year, Nuggets is going live across the UK on the Myidentity hub: the digital identity trust scheme for buying and selling homes.

The scheme allows buyers and sellers to verify their identities just once, and then share that identity as needed throughout the course of the transaction. The result: a faster, more secure, more convenient process for everyone.

Find out more about Myidentity here.

FSTech Awards

We’ve already been shortlisted for three awards this year! These are ‘Cyber Security Solution of the Year’, ‘Blockchain Project of the Year’, & ‘Risk Management Software of the Year’ at the FStech Awards 2022!

Now in their 22nd year, these awards celebrate excellence and innovation in UK and EMEA financial services.

It’s great to see our work recognised alongside market-leading brands like IBM, Lloyds Banking Group, Deutsche Bank, J.P. Morgan and Natwest. A big thank-you to the judges, and many congratulations to our fellow nominees.

See the full shortlist here.

Digital IDs don’t have to impinge on civil liberties and privacy

In his latest article for Helpnetsecurity, our CEO Alastair Johnson looks at how the world has moved overwhelmingly towards digital technology.

The pandemic has forced technological leaps on all fronts, and incumbent technologies are struggling to hold back a deluge of fraud and cybercrime.

Big tech and even governments have repeatedly shown they can’t be trusted with this type of information, or control over such a system. The answer is a truly self-sovereign ID, or SSI, giving users complete control over their data. This is made possible through decentralized digital IDs, supported by a wide range of emerging technologies and techniques.

Alastair says: “…the problems of physical ID systems need to be addressed. They can be addressed in multiple ways, but everyone should be wary of any solution that is controlled and monitored by businesses or governments. The potential for abuse is far too high.”

Read the full article here.

A Metaverse built on privacy and trust — with Nuggets

The Metaverse is a new frontier for privacy, trust, and identity. To enter and traverse every online world, you need a Nuggets self-sovereign identity, from which you can manage all your different personas and avatars.

Our blockchain-based interoperable credential platform is live and ready for integration by Web3 teams. Talk to us now.

Read more about Nuggets and the Metaverse here.

A Digital Identity Fit For The Metaverse.

In this new article for Forbes, Alastair explores the coming rise of the Metaverse representing a new frontier for privacy, trust, and identity.

“In the rapidly approaching world of Web 3.0 applications and services that underpin the Metaverse, it’s more important than ever to have a portable and composable digital identity that preserves privacy and provides security. One that won’t just offer proof of who you are and what you can access but also serves as a non-custodial cache for your virtual assets.”

Check out the full article here.

So much is happening, and it’s only January! Stay tuned for more exciting announcements by signing up for our newsletter.

2022 is set to be an exciting year! was originally published in Nuggets on Medium, where people are continuing the conversation by highlighting and responding to this story.


Indicio

Into 2022 with the Indicio Network

The post Into 2022 with the Indicio Network appeared first on Indicio Tech.
In just a year, the Indicio Network has grown into the leading platform for enterprise-grade decentralized identity products and services, and a community of ground-breaking companies building decentralized and Web3 identity solutions.

What a year.

Last January, we launched the Indicio MainNet, the keystone of the Indicio Network, a global platform designed to support mission critical uses of decentralized identity built on the open-source Hyperledger Indy code base.

The MainNet joined its older TestNet sibling, launched in 2020, and — in the summer of 2021 — both were joined by the DemoNet.

Together, they enabled companies to build and test their products (TestNet), demonstrate their products in a safe environment (DemoNet), and then deploy their products in a global marketplace with the assurance that they were backed by a robust, professionally staffed, global MainNet.

That assurance is critical. We know how long it takes to test and demo products. We know what enterprises need to create and sell products. We saw the need, we heard, repeatedly, from our earliest clients that’s what they needed; and so we made it a reality.

Now, in January 2022, our growing global network of nodes is supported by twenty-six diverse companies on five continents. Each is an innovator. All are committed to building a better way to manage digital identity and to create the trust in data needed for the machine, spatial, and Web3 age. Indicio is a global network for a technology that brings value to everyone. 

What makes the Indicio Network the best network for decentralized identity solutions?
Decentralized identity networks are often likened to utilities for identity. So what do people typically want from utilities? Dependability, affordability, and simplicity.

Dependability. There can’t be outages, and if technical problems occur, they need to be fixed fast. The Indicio Network meets these challenges first through scale: each node on the network supports a copy of the ledger, so unlike Facebook going down, the copies held by the network’s other nodes make sure your product, company, or organization isn’t left in the dark.

Second, our network is staffed and monitored by experts using state-of-the-art monitoring tools. This may sound obvious (surely every network has this?), but it isn’t and they don’t. We build our own solutions on the Indicio Network, so we have skin in the game: we need the network to work dependably and to be the best-functioning utility for identity out there. This means our staff are your staff in this endeavor. We’re here for you, all the time. Your success is our success.

Affordability. Writes to the ledger are one way to monetize decentralized identity. We believe in making fees for writing to our ledger simple, cost effective, and predictable. We also believe that these fees should not be an obstacle to implementing the right architecture for a solution. No decentralized identity system should cut corners to save money or from fear of unexpected future costs. We see affordability driving reliability, and reliability driving and sustaining adoption. 

Simplicity. It’s no secret that decentralized identity can be tough to understand if you’re delving into it for the first time (and especially if you’re not an engineer). We get it! We want to make everything about decentralized identity as simple as possible, from onboarding a node to providing wizards to simplify technical processes and enabling machine-readable governance.

Still too complicated? We will host and run a node for you.

We also have the largest range of hands-on training workshops for all aspects of decentralized identity—and we were chosen by Hyperledger to develop and run technical training programs.

This is what we built in 2021—and it’s amazing to look back at the year and all we — and our node operators — achieved. But it was even more exciting to see what people began building and releasing on this foundation, projects such as the Aruba Health Pass, LAVCE, and Passmate.

As we move toward a more trustworthy digital world, 2022 is going to be one long, energetic build-athon. And if that excites you, we should talk.

The post Into 2022 with the Indicio Network appeared first on Indicio Tech.


Imageware

[On-Demand Webinar] Introducing Imageware’s New Law Enforcement Platform

Did you know that your organization can now have complete control over who can access your resources, and engage in remote collaboration of any kind without putting sensitive data or resources at risk? The post [On-Demand Webinar] Introducing Imageware’s New Law Enforcement Platform appeared first on Imageware.

In this webinar, we will present Imageware’s new Law Enforcement solution built with an intuitive user interface, a Cloud-based deployment option to access it from a police car, a tablet in the field, or in a station, and pre-integrated with passwordless biometric login security.

Duration – 45 Minutes

In this webinar, we discuss:

• Why Imageware built this new solution
• The Law Enforcement platform components: Capture, Identify, and Authenticate
• Benefits of the new features
• Demo of the new solutions
• What else is coming

The post [On-Demand Webinar] Introducing Imageware’s New Law Enforcement Platform appeared first on Imageware.


Affinidi

Can Verifiable Credentials Make Life Better for Refugees?

Did you know that, as of 2021, 82.4 million people had been forcibly displaced from their homes by wars and conflicts? That’s almost the size of Germany’s population!

Yet, they are struggling for basic human needs such as food, clothing, and shelter.

Why?

An Identity Crisis

Much of their woes stem from the fact that they do not have the documents to prove their nationality or for that matter, even their refugee status.

Unfortunately, a majority of the documents that prove identity and nationality are in the form of physical documents such as a passport or driver’s license that can get lost, damaged, or stolen while fleeing from their country. Sometimes, it may even be left behind for security reasons.

As a result, many of these people become stateless, are unable to get refugee status or its benefits, and for much of their lives languish in camps and work only in low-paying jobs.

The sad part is that this trickles down to their children as well: the next generation of refugees is unable to study or find jobs because the parents don’t have the required documents.

In this sense, the state of refugees today is an identity crisis as they do not have documents to prove their identity and nationality.

What is the Solution?

From the above discussion, it’s clear that every refugee requires an identity that would enable them and their family members to get the refugee benefits offered by the host country.

Ideally, the document must be:

• Verifiable, so the refugee can get access to financial and non-financial benefits under treaties and host nation programs.
• Easy to access from anywhere, so an individual can digitally prove his or her educational credentials and job experience, making it easier to find work in the host nation.
• Stored in a safe place.
• Tamper-proof.
• Issued by a competent authority that anyone in any part of the world can trust, for example, the government of a nation.

While these conditions may sound daunting at first, the truth is we already have such an identity in place. It’s called Self-Sovereign Identity (SSI) and this is implemented through Verifiable Credentials (VCs).

The existing format of these VCs simply has to be extended to meet the requirements of the host nation, so every refugee can prove his or her identity and credentials in no time.

How can Verifiable Credentials Help?

VCs meet all of the requirements mentioned above.

A VC is machine-verifiable and tamper-proof, and carries the digital signature of the entity authorized to issue it.

Here is how a VC looks.

As you can see, the cryptographic proof in the VC proves that it has come from an authorized entity while the identification details show the credentials of the holder. Furthermore, it cannot be tampered with or censored.
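To make the structure concrete, here is a minimal sketch modeled loosely on the W3C Verifiable Credentials data model. The issuer DID, credential type, subject fields, and the HMAC used as a stand-in for a real public-key signature are all illustrative assumptions, not part of any production system:

```python
import hashlib
import hmac
import json

# Hypothetical issuer signing key. Real VCs use public-key signatures
# (e.g. Ed25519) so anyone can verify; the shared-secret HMAC here is
# only a stand-in to show how a proof binds to the claims.
ISSUER_KEY = b"demo-issuer-secret"

def issue_credential(subject: dict) -> dict:
    """Build a minimal VC-shaped document and attach a proof over its claims."""
    claims = {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential", "RefugeeStatusCredential"],  # example type
        "issuer": "did:example:host-nation-foreign-office",           # example DID
        "credentialSubject": subject,
    }
    payload = json.dumps(claims, sort_keys=True).encode()
    proof = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {**claims, "proof": {"type": "DemoHmacProof2022", "value": proof}}

def verify_credential(vc: dict) -> bool:
    """Recompute the proof over the claims; any tampering makes it fail."""
    claims = {k: v for k, v in vc.items() if k != "proof"}
    payload = json.dumps(claims, sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, vc["proof"]["value"])

vc = issue_credential({"id": "did:example:mr-x", "status": "refugee"})
assert verify_credential(vc)                    # untampered credential verifies

vc["credentialSubject"]["status"] = "citizen"   # tamper with a claim
assert not verify_credential(vc)                # verification now fails
```

The point of the sketch is only that the proof is computed over the claims, so editing any claim invalidates it; a real verifier would resolve the issuer’s DID to a public key rather than share a secret.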

Let’s understand this with a real-world example.

Real-world Example

Let’s say Mr. X is forced out of his country by war. He reaches a neighboring country but doesn’t have any physical document to prove his identity details such as name, address, educational qualifications, and work experience.

Thankfully, he has them all in the form of a verifiable credential. So, he visits the Foreign Office of the host nation, logs into the office’s computer, generates a verifiable presentation, and shares the same with the government officials.

In turn, the officials scan the QR code and learn all about Mr. X. They check whether he qualifies for refugee status and, if so, grant it through a VC.

Now, Mr. X is all set to move around the host nation, collect refugee allowances, study, find a job, and get on with life.

As you can see, the entire process takes only a few minutes and gives Mr. X a chance to start his life all over again. That’s the true power of VCs.
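The exchange Mr. X goes through can be sketched as a simple issue → present → verify flow. Everything here is illustrative: the DIDs, the claim names, and the trusted-issuer list are assumptions, and a real verifier would also check each credential’s cryptographic proof and revocation status, typically over a protocol such as DIDComm rather than a plain dictionary:

```python
# Toy issue -> present -> verify flow for the refugee scenario above.

# Hypothetical registry of issuers the host nation trusts.
TRUSTED_ISSUERS = {"did:example:home-country-registry"}

def make_presentation(holder_did: str, credentials: list) -> dict:
    """The holder bundles selected credentials into a verifiable presentation."""
    return {
        "type": ["VerifiablePresentation"],
        "holder": holder_did,
        "verifiableCredential": credentials,
    }

def check_refugee_eligibility(presentation: dict) -> bool:
    """The host-nation official inspects the presented credentials."""
    for vc in presentation["verifiableCredential"]:
        if vc["issuer"] not in TRUSTED_ISSUERS:
            continue  # ignore credentials from unknown issuers
        if vc["credentialSubject"].get("nationalityConfirmed"):
            return True
    return False

identity_vc = {
    "issuer": "did:example:home-country-registry",
    "credentialSubject": {"id": "did:example:mr-x", "nationalityConfirmed": True},
}
vp = make_presentation("did:example:mr-x", [identity_vc])
assert check_refugee_eligibility(vp)   # trusted issuer + required claim
```

Note how the decision hinges on who issued the credential, which is exactly why revocation by a hostile home government, discussed next, is a real concern.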

Before we start dreaming about a pain-free world, note that VCs may not work well in all situations.

Are There Any Downsides?

There are a few problems that come with using VCs to handle refugee identities.

Let’s say Mr. X is going to his neighboring country because of political persecution. Knowing this, his country revokes all the VCs issued for him. As a result, he can no longer use the VCs to prove his identity, though he can use VCs issued by his employer and educational institutions, provided they are not revoked. In this sense, VCs can be used for political revenge.

The second issue is that VCs can become inaccessible if Mr. X forgets the password of the digital wallet where the credentials are stored.

So, is VC a solution at all?

Yes, in most cases. VCs can be a life-saver for most refugees, as they prove identity, work experience, and skills.

Curious to know how else VCs can be used across different industries and situations? Reach out to us on Discord.

Also, check out our Dev Portal that has a ton of resources on SSIs, VCs, and their applications.

Can Verifiable Credentials Make Life Better for Refugees? was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 09. January 2022

KuppingerCole

Analyst Chat #107: From Log4j to Software Supply Chain Security

A new year, and 2022, like 2021, again begins with a look back at a far-reaching security incident. Cybersecurity Analyst Alexei Balaganski and Matthias take the topic of Log4j as an opportunity to look at code quality and cyber supply chain risk management. They also mention Mike Small's excellent blog post, which can be read here.





Identosphere Identity Highlights

Identosphere 64 • Global Verifiable Credential Adoption • did:indy $70,000 Coding Challenge • Marlinspike on Web3

Happy New Year from Identosphere! Your weekly digest of the latest news, events, and development related to creating a system for ID online that gives users ownership of their personal info
Happy New Year & Welcome Back!!!

We’re very excited about the coming year, and look forward to providing even more value than ever!!!

Consider supporting this publication with a monthly payment via Patreon ← click here

Read previous issues and Subscribe : newsletter.identosphere.net

Contact \ Content Submissions: newsletter [at] identosphere [dot] net

Upcoming

The "Completing the Framework" Open Call eSSIF-Lab (until 1/15/22)

Build Your Identity Solution Using Hyperledger Aries 1/20

Hyperledger Indy Technical Deep Dive 2/3

Data Space Launchpad MyData • Until 2/22

Funding Add support for "did:indy" to Hyperledger Indy Node and Indy VDR </>Code With Us

Accepting Applications until 1/10, 4:00 PM PST

The total funding for the challenge is $70,000 CAD, divided into 4 phases. The first 3 phases require Python work on the Indy Node and Indy Plenum repos, while the 4th phase requires Rust development in the Indy VDR repo.

The Asymmetry of Open Source Matt Holt

Many people view funding open source as a moral or ethical problem at its core: essentially, companies should pay for what they use (if a project accepts payment) because not doing so is exploitation. I sympathize with this perspective, but I believe a more helpful one is of economics and incentives, because we can reason about money more objectively and constructively this way.

Financing Open Source Software Development with DAO Governance Tokens Kyle Den Hartog

One of the biggest problems in open source software development today is that the majority of open source software is written by developers as side projects on their nights and weekends. Of the developers who do produce software on their nights and weekends, only a small sliver receive any funding for their work.

Collaborative Resource Global Verifiable Credential Adoption Trinsic (Notion)

🔥 This is a community resource for tracking the adoption of verifiable credentials around the world. Please have a look around and join 10+ others who have contributed!

Explainer Trusted Third Parties vs Self-Sovereign Identity Affinidi

All of us have multiple identities at any point. We are sons, daughters, brothers, sisters, parents, partners, friends, colleagues, and more to different people.

A Future Built on Decentralized Identity Bloom

Decentralized identity is an emerging concept becoming more popular for online consumers by eliminating the need to pass personal identifiable information (PII) to an ever-increasing number of companies. However, in practice, decentralized identity has only existed for a handful of years, and its potential is still being discovered. So how did we get here?

Webinar: The Future of Self Sovereign Identity Patientory 

Self-sovereign identity (SSI) is a movement that claims digital identity should be just as legitimate and nuanced as a person’s human identity, while being accessible to all, privacy-preserving, and not reliant on a single government or corporation.

Identity management is key to increasing security, reducing fraud and developing a seamless customer experience Identity Praxis

Identity management is an iterative process with three core elements – initial identification, authentication (re-identifying the individual) and verification (ensuring the individual is who they claim to be)

Enterprises employ a vast array of technologies to execute these processes which are growing in scope and complexity

Understanding why identity management is necessary to enterprises and how this creates opportunities for vendors

Thoughtful Super Apps Or Smart Wallets? David G.W. Birch

There's plenty of talk of super apps around at the moment as a variety of players attempt to become the western equivalent of the Asian app giants such as Alipay, Gojek and Kakao. But how do you get from a digital wallet to a super app? 

Human Authority Moxie Marlinspike SelfSovereignIdentity_memes @SSI_by_memes

Is anonymity good? - let's ask this wife

Companies Magic Product Updates: December Edition MagicLabs

Since our last product update, we’ve launched a multifaceted set of capabilities that enable you to do more with Magic.

Lessons From the School of Cyber Hard Knocks Podcast IDRamp

Passwords and zero-trust and pink locker rooms, oh my! In this episode, Mike discusses IdRamp, what self-sovereign identity is, why we still have passwords today, zero-trust, what the near future holds, pink locker rooms(!), his path to IdRamp, and, as always, his toughest lesson learned.

Swisscom partners with Orell Füssli for identity Ledger Insights

This isn’t Swisscom Blockchain’s first identity partnership. It also has a relationship with Adresta, which developed a digital identity solution for watches.

Interview with IAMX -Self-Sovereign Identity SSI Spicy Dumpling Show

Insights on what IAMX does, how users and telcos can benefit from it, and why users would be interested in and trust IAMX. Later, we talked about recording and securing biometric information. IAMX is having an ISPO.

Use Case Is the Self-Sovereign digital identity the future digital business registry? GORAN VRANIC, ANDREJA MARUSIC

An automatized Identity and Access Management system for IoT combining SSI and smart contracts Montassar Naghmouchi, Hella Kaffel, and Maryline Laurent

This paper proposes a blockchain-based identity and access management system for IoT – specifically smart vehicles- as an example of use-case, showing two interoperable blockchains, Ethereum and Hyperledger Indy, and a self-sovereign identity model.

Development DTDL models - Azure Digital Twins | Microsoft Docs

MSFT does know how to do JSON-LD; they just pretend not to.

DTDL is based on JSON-LD and is programming-language independent. DTDL isn't exclusive to Azure Digital Twins, but is also used to represent device data in other IoT services such as IoT Plug and Play.
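For illustration, here is roughly what a minimal DTDL v2 interface looks like, wrapped in a Python snippet with a few sanity checks. The `dtmi:com:example:Thermostat;1` identifier and the thermostat fields are made-up examples, not taken from the Microsoft docs:

```python
import json

# A minimal DTDL v2 interface for a hypothetical thermostat device.
# DTDL documents are JSON-LD; "dtmi:dtdl:context;2" names the DTDL v2 context.
thermostat_model = json.loads("""
{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:com:example:Thermostat;1",
  "@type": "Interface",
  "displayName": "Thermostat",
  "contents": [
    { "@type": "Telemetry", "name": "temperature", "schema": "double" },
    { "@type": "Property", "name": "targetTemperature", "schema": "double", "writable": true }
  ]
}
""")

# Sanity checks mirroring the kind of constraints a DTDL parser enforces.
assert thermostat_model["@context"] == "dtmi:dtdl:context;2"
assert thermostat_model["@type"] == "Interface"
assert all(c["name"] for c in thermostat_model["contents"])
```

Because the model is plain JSON-LD, the same document can describe a digital twin in Azure Digital Twins or a device in IoT Plug and Play without modification.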

The human impact of identity exclusion in financial service Caribou Digital

we spoke to a range of participants who are or who have felt excluded from financial systems for different reasons and we’ll be sharing these stories over the next few months. This research is the foundation for Women in Identity to build an Identity Code of Conduct — a set of guiding principles and a framework for inclusive ID-product development.

Public Sector Data: A New Direction — But Which Direction? Alan Mitchell

This is the fifth and final blog in our series about the UK Government’s proposals for data protection reform — “Data: A New Direction”. Previous blogs focused on the thinking behind the proposals. This blog summarises what the main proposals are. 

Joining forces towards European digital credentials European Commission

Who Do You Trust With Your Wallet? State of Identity (not ssi)

Sweden's Freja eID is a pinnacle example of government-approved digital identity, all from the convenience of your mobile wallet. Join host Cameron D'Ambrosi as he kicks off 2022 with Kristofer von Beetzen, Chief Product Officer of Freja eID. They dive into the burning questions for eID, including who should control identity, and why and when you shouldn't host identity data yourself.

Organization LEGO & Learning Economy: Gearing up with Super Skills ID Foundation

The Super Skills app combines a custodial wallet (Torus) and Ceramic VC/storage tooling to give children private, exportable, future-proof achievement records – a self-sovereign educational credentialing system in miniature.

Announcing the 2022 OpenID Foundation Individual Community Board Member Election

Board participation requires a substantial investment of time and energy. It is a volunteer effort that should not be undertaken lightly. Should you be elected, expect to be called upon to serve both on the board and on its committees. You should have your employer’s agreement to attend two or more in-person board meetings a year, which are typically collocated with important identity conferences around the world.

VIIVI LÄHTEENOJA APPOINTED AS MYDATA GLOBAL CHAIR

Previous Chair, Antti “Jogi” Poikola commented: I am delighted to be succeeded by Viivi Lähteenoja as MyData Global’s Chair. […] Viivi’s experience both in and out of the MyData community make her excellently qualified to open up new dialogues on how personal data can empower people and communities.

Web3 My first impressions of web3 Moxie Marlinspike

This guy gets it ^^^^^

Given the history of why web1 became web2, what seems strange to me about web3 is that technologies like ethereum have been built with many of the same implicit trappings as web1.
[…]
Personally, I think enough money has been made at this point that there are enough faucets to keep it going, and this won’t just be a blip. If that’s the case, it seems worth thinking about how to avoid web3 being web2x2 (web2 but with even less privacy) with some urgency.

Defining the web3 stack Edge & Node

This post will be a living document that I keep up with as I learn, experiment, and gather feedback from developers building in web3.

Using Reputation in DAO Governance Ontology

a16z, the famous venture capital company, is entering the DAO field through investments into DAO projects, such as the recent investments in social DAO project, FWB.

3 Types of Passwordless Authentication for Web 3.0 MagicLabs

Passwordless authentication is a fundamental shift in how people will access their tools and information online, and it will provide more security, prevent billions in losses, and create greater transparency.

The dark side of COVID vaccine Criminals Rake in at Least $100k in Bitcoin for Fake Covid Vaccine Passes Elliptic

Scores of illicit vendors are capitalising on vaccine scepticism in Europe and North America by selling counterfeit Covid-19 vaccination and test certificates for Bitcoin.

Thanks for Reading! Support this publication: patreon.com/identosphere Read more \ Subscribe: newsletter.identosphere.net Contact \ Submission: newsletter [at] identosphere [dot] net

Saturday, 08. January 2022

Europechain

Playing For Value: The Play-To-Earn Model And Top 5 Blockchain Games

The gaming industry is entering a new era, with blockchain gaming growing faster than ever. But what is play-to-earn, and how does it benefit players?

If there were a sheriff in gaming country, he’d better be worried, because there’s a new player in town. Most gamers will be familiar with pay-to-win (P2W), an insidious game mechanic that enables faster progress for those who fork out real cash. Players can acquire more powerful weapons or equipment, extra health allocation, etc., for a price, giving these ‘premium’ players an unfair advantage...

Source


Infocert (IT)

Important notice for SPID InfoCert ID users

InfoCert informs all SPID – InfoCert ID customers that the certificates for the InfoCert SPID metadata are being updated in the SPID system.

Public and private services are adopting the new certificates in their systems, but between Monday 10 and Tuesday 11 January some websites may experience authentication malfunctions.

If a malfunction occurs, we recommend waiting a few hours and trying again, to allow all systems to update.

Thank you,

The InfoCert Team

The post Important notice for SPID InfoCert ID users appeared first on InfoCert.

Friday, 07. January 2022

Caribou Digital

The human impact of identity exclusion in financial services

Stories from the UK and Ghana

In 2021, together with Habitus Insight, we explored stories on the human impact of identity exclusion for Women in Identity. This is the first of two blogs on that research — the following blog will share the findings from the UK and Ghana.

As part of that work, we spoke to several people who have experienced ID exclusion. Va-Bene is a transwoman — or as she calls herself, a transvatar — based in Kumasi, Ghana. She faces challenges every time she goes to a physical bank branch. Her IDs (and particularly the photos on them) don’t reflect who she is now.

“If I have to withdraw money with my ATM card, that is fine. But any other thing I will do directly in the bank. Believe me, either we are going to fight in the bank or I’m just going to be delayed for several hours without any money.” Being denied access to her own money not only has a direct financial impact on Va-Bene. It also has an emotional impact: “it is very frustrating when I am excluded. Very, very frustrating. Very depressing.”

“Know Your Customer” and existing challenges of exclusion

In her book In Pursuit of Proof, Tarangini Sriraman documents how “identification” began as a need to recognise individuals for governance, but has become a process heavily shaped by social norms. Why is gender an important characteristic for identification? Who decides how it is categorised (male/female)? What happens when gender goes beyond binary, as in Va-Bene’s case?

“Know Your Customer” (KYC) is a legal requirement to comply with anti-money laundering regulations (AML) in most countries. Verification requirements to access financial services often include database checks, identity document verification, and, increasingly, biometric checks. For proof of address, utility bills or bank statements can often serve as acceptable documentation. By verifying a customer’s identity and intentions when they open an account, and then understanding transaction patterns, financial institutions can more accurately pinpoint potential account takeover or other suspicious activities.

However, many people get stuck when trying to prove who they are (like Va-Bene) or where they live. Obtaining an ID can be a time-consuming and expensive process, made more challenging for those who are female, from an ethnic or racial minority background, live in a rural area or with a disability, or are refugees or migrants. Youth, too, are emerging as a demographic facing barriers to ID. Often these issues include not having a proof of address (being without a fixed address, having recently arrived in the country, or for other reasons) or not being able to easily prove who one is (not having foundational ID documentation). These challenges can become even harder to address without knowledge of where to go or what to do next.

Interview with Payal, a recently arrived migrant in northwest London, UK, who faced challenges opening a bank account without proof of address.

Next steps

In addition to Va-Bene and Payal, we spoke to a range of participants who are or who have felt excluded from financial systems for different reasons and we’ll be sharing these stories over the next few months. This research is the foundation for Women in Identity to build an Identity Code of Conduct — a set of guiding principles and a framework for inclusive ID-product development.

Nowhere is this more necessary than in financial services, where ID is needed to protect users (e.g., from financial crime), but at the same time provide access to specific products. Service providers need to know who users are, but how does that process happen, why, and when does it become problematic? The Identity Code of Conduct will establish a set of guiding principles around inclusion, building on the broader Digital ID Principles. It will offer a practical set of tools to address inclusion at every stage of identification.

Note from Women in Identity:

At Women in Identity we believe identity solutions should be inclusive, built for all, by all. But we also believe this does not happen by chance.

The aim of our initial research and videos is to work towards this Code of Conduct: a practical guide for product designers to gain deeper empathy for the challenges experienced at the user end of the ID lifecycle and the knowledge to take these challenges into account when designing products.

Other sectors already take this approach to product development. The pharmaceutical industry, for instance, has an ongoing Code of Practice based on principles of care, fairness, honesty, and respect that impact agreed standards of product development (e.g., clear and transparent information on packaging).

We believe we can learn from other industries and create an Identity Code of Conduct which ensures inclusion is built into identity product design, thus making these products better for the organisations that rely on them and for the people that use them.

We look forward to you joining us on this journey and welcome feedback and suggestions. Please reach out to us on @WomenInID on Twitter or @WomenInIdentity on LinkedIn!

Read more about this project at ID Code of Conduct and follow us at @womeninID @WomenInIdentity @CaribouDigital @habitusinsight #DiversityByDesign #ForAllByAll #IDCodeofConduct #IdentityExclusion

The human impact of identity exclusion in financial services was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


ValidatedID

Awarded with an exhibitor place at Mobile World Congress Americas

We are selected by the Spanish Ministry of Energy, Tourism and Digital affairs to present our e-Signature and Digital Identity services internationally at MWCA

See you at MWC2019 and 4YFN!

In this edition, we’d be presenting our services for electronic signature and digital identity. Latest developments among the worldwide leaders in innovation.

Validated ID and DocuWare sign strategic cooperation agreement

DocuWare, provider of cloud-based solutions for document management and workflow automation, introduces new electronic signature service

Awarded as Top 10 Patient Engagement Solution Provider in Europe

Validated ID’s efforts to improve the doctor-patient relationship through the electronic signature service VIDsigner have been recognized with the award

More than 100 trees were planted for our partners

More than 3M documents were digitally signed this 2019. We are making a difference and we decided to plant 100 new trees, one for each partner to...

Security and hidden costs of digital signature pads

The use of digital signature services instead of digital signature pads is not only a technological or price decision, but also one of guarantees.

New strategic partnership agreement with DocVisie

This collaboration has allowed DocVisie to develop an integration that facilitates and simplifies the signature processes, getting better control of documents

Digital signature for informed consent

Advantages of electronically signing informed consent and sensitive documents and how to choose a valid signature system

Electronic signatures with DocuWare

With the advent of remote work, we need a better way to manage documents. We propose the Docuware platform integrated with VIDsigner.

Working from home with digital signatures

The integration of an electronic signature service allows your team and customers to sign and send documents conveniently from anywhere.

Validated ID has a new worldwide client: United VARs

This collaboration will help United VARs to digitally sign all the annual contracts among their 50 partners and over 90 countries.

Blockchers contest finalists, Validated ID and Factory Matters

Our project about the use of Blockchain in the education sector is now classified in the final of the European competition Blockchers

SSI in the age of a global pandemic: Covid Credentials Initiative

At Validated ID we are developing privacy and data protection technologies through the Covid Credentials Initiative.

Sign your documents from Dynamics 365 with VIDSigner

If you are a Microsoft Dynamics 365 user, from now on you can send PDF documents and sign them digitally thanks to our VIDsigner connector.

Validated ID raises € 2M in financing round

The new financing is led by Randstad Innovation Fund, Caixa Capital Risc, and Cuatrecasas Ventures

Validated ID wins the second edition of Cuatrecasas Acelera

Winner of the second edition of Cuatrecasas Acelera with the project VIDchain, a decentralized self-sovereign digital identity solution based on blockchain.

Improve the patient experience with electronic signatures

Electronic signatures are the perfect solution for compliance-based information processing.

Validated ID is now part of the AS4EDI2020 consortium

Validated ID participates in the AS4EDI2020 project for the implementation of the CEF eDelivery AS4 profile in Europe.

The importance of the legal evidence in electronic signatures

A court in California has ruled inadmissible a document signed through a large eSignature company. Validated ID collects evidence so that this doesn't happen.

Strategic partnership agreement with Saqqara Informatic

This collaboration has allowed Saqqara Informatic to offer its clients the signing of documents managed from the Sage ERP.

The tourism sector opts to use electronic signatures

Electronic signatures are the most efficient solution to the traditional operational problems of tourism companies

Sponsors at the IV Ibero-American Congress of Public Innovation

Validated ID sponsors the IV Ibero-American Congress of Public Innovation (NovaGob 2017) to be held between October 18th and 20th in La Laguna (Tenerife).

Validated ID and m.Doc GmbH enter into a strategic partnership

Digital health document management becomes easy and secure thanks to digital signature integration that simplifies analogue signature processes.

The digital transformation of the education sector

The electronic signature improves the experience in education for students, teachers, parents, guardians and other school staff.

Different types of electronic signatures, breaking myths

Types of electronic signatures: biometric signature, remote signature and centralised signature; their legal status and correct names.

GDPR capture of consent using Biometric and Advanced Signature

At Validated ID we offer a set of options for signing GDPR consent, ranging from biometric signatures for face-to-face scenarios to remote solutions.

Self-Sovereign Identity: fasten your seatbelts

Identity has been an unresolved issue since the beginnings of the Internet. With the explosion of social media, users' digital activity has grown exponentially, building a new digital identity.

How to implement digital signatures in your company?

If you are evaluating electronic signature solutions for your company, keep in mind these 5 steps to implement a digital signature solution.

VIDsigner electronic signatures integrated with Sage

Available for Sage X3, Sage Accounting, and Sage 200cloud. VIDsigner electronic signatures allow you to send documents to be signed without leaving Sage.

VIDchain, the future of digital identity

VIDchain's blockchain-based approach is based on aggregating different digital identity sources into an identity wallet of identity attributes.

We are new members of Lab Santé France

Lab Santé is a cooperation structure that connects public and private entities, bringing together those who offer innovative solutions and their future users.

New free service for mobile signature with the Spanish ID card

Validated ID develops a new service to sign easily, safely and free of charge with the new Spanish ID card, the DNI 3.0.

Ready for INATBA’S World Blockchain Congress

After joining the INATBA we’re preparing to be part of a forum that will highlight the great work done together with companies, universities, and organizations.

Improve productivity with electronic signatures and SAP

The integration of VIDsigner e-signatures with SAP allows you to sign your documents within SuccessFactors, Business One, By Design, S/4HANA and ECC.

What kinds of digital signatures can be used under the eIDAS?

Qualified signatures have very clear requirements: advanced signature + qualified certificate + secure signature creation device

e-Signatures: product or service? Responsibility is th