Last Update 9:13 PM January 16, 2021 (UTC)

Identosphere - Company Blog Feeds

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Saturday, 16. January 2021

Fission

Discussion with Rosano: Zero Data Apps, remote storage, and Funding Buttons

Rosano joined us as the invited speaker for the second weekly video chat of the year, on January 14th, 2021. What's top of mind for Rosano right now is federated systems, zero data, web apps, information architecture, publishing, and decentralized funding, and we covered all of them in our discussion.

rosano.ca

You can listen to the full discussion here (Boris being the main host voice you hear), or watch the video embedded at the end of this post.


We had a wide-ranging conversation starting with his Zero Data Apps initiative, explaining and promoting #ownyourdata principles:

• an app in which your data stays with you
• you control where the data is stored
• no spam, no captcha, no sign up, no passwords - bring your own identity
• using open protocols for flexibility and interoperability
• do what you want with your data at any time
• your data is accessible forever even if the app stops working

Rosano lists Fission on the page, and is waiting for us to have an app list that can be automatically included on the page. Working on it!

All of the apps are "zero data", whether they use remote storage, the Solid protocol, Unhosted, or otherwise uphold the principles listed.

These include Fission team member Steven's Diffuse music player. We interviewed Steven about Diffuse way back in June 2019, before he even joined the team.
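
To make the "zero data" idea concrete, here is a minimal, hypothetical sketch (not code from any of the apps listed): the application depends only on a storage interface, and the user decides which backend sits behind it.

```typescript
// Hypothetical sketch of the "zero data" pattern: the app never owns the
// data; it talks to a storage interface whose backend the user chooses.
interface UserStorage {
  read(path: string): Promise<string | null>;
  write(path: string, content: string): Promise<void>;
}

// One possible backend among many (remote storage, a Solid pod, etc.);
// here, plain browser localStorage keeps everything on the user's device.
class LocalStorageBackend implements UserStorage {
  async read(path: string): Promise<string | null> {
    return window.localStorage.getItem(path);
  }
  async write(path: string, content: string): Promise<void> {
    window.localStorage.setItem(path, content);
  }
}

// The app depends only on the interface, so the user can swap backends
// and keep their data even if the app itself stops working.
async function saveNote(storage: UserStorage, id: string, text: string) {
  await storage.write(`notes/${id}`, text);
}
```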

We had a great discussion that covered thoughts on how people use apps and what concepts are and aren't familiar to them. It was really inspiring to speak with Rosano about all of his apps, and the work that he's done with friends and others to introduce them to these zero data app concepts.

Links from Chat

Kommit, Rosano's flashcard app (inspired by Anki)

Localizing not just the interface, but also programming languages. Ramsey Nasser has created an Arabic programming language:

a programming language exploring the role of human culture in coding. Code is written entirely in Arabic, highlighting cultural biases of computer science and challenging the assumptions we make about programming. It is implemented as a tree-walking language interpreter in JavaScript.
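
A "tree-walking interpreter" evaluates the program's syntax tree directly, node by node, rather than compiling it first. As a hedged illustration of the general technique (a toy example, not Nasser's actual implementation), here is a miniature arithmetic interpreter:

```typescript
// Toy tree-walking interpreter: evaluate an AST by recursing over its nodes.
type Expr =
  | { kind: "num"; value: number }
  | { kind: "add"; left: Expr; right: Expr }
  | { kind: "mul"; left: Expr; right: Expr };

function evaluate(e: Expr): number {
  switch (e.kind) {
    case "num":
      return e.value;
    case "add":
      return evaluate(e.left) + evaluate(e.right);
    case "mul":
      return evaluate(e.left) * evaluate(e.right);
  }
}

// (2 + 3) * 4
const ast: Expr = {
  kind: "mul",
  left: { kind: "add", left: { kind: "num", value: 2 }, right: { kind: "num", value: 3 } },
  right: { kind: "num", value: 4 },
};
console.log(evaluate(ast)); // 20
```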

Privacy preserving product analytics by Brave (via Helder)

Anagora "The [[Agora]] is a [[distributed knowledge graph]] and [[experimental]] [[social network]]." (by TBD)

Rosano's Launchlet, for customizing websites with JavaScript and CSS. This was another area we talked about: how to enable people to customize apps, and whether there might be a specification to create. We started a wiki page on the forum about User Scripts and Styles.
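
For readers unfamiliar with the genre, a user script is just a small program injected into a page. The sketch below shows the general Greasemonkey-style pattern (metadata block plus a DOM tweak), not Launchlet's own format; the site and style are made up for illustration.

```typescript
// Generic user-script sketch: the metadata block tells the script manager
// where to run; the body injects a small CSS tweak into the page.
// ==UserScript==
// @name   Dim bright pages
// @match  https://example.com/*
// ==/UserScript==
(function () {
  const style = document.createElement("style");
  style.textContent = "body { filter: brightness(0.9); }";
  document.head.appendChild(style);
})();
```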

Discussion on forking/cloning, in the context of Fission's upcoming App Cloning feature. The Fork N Go concept for Github - "Forking a website repository on GitHub designed to be easily used by others." (via Helder)

The Fund Button

Since we didn't do a screenshare during the live talk, here is a short video of the fund button and flows from Rosano's HyperDraft note-taking app.

You can find links to the separate sections in the forum.

We talked about perhaps turning this "zero data funding" method into a specification. It could use any sort of payment or donation system – e.g. PayPal, Patreon, GitHub Sponsors, Open Collective, etc.

This is something that Verifiable Credentials could potentially be used for: a "hasPaidForHyperDraft: true" value could be set for the user.
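
As a rough sketch of how that claim could be expressed, following the W3C Verifiable Credentials data model (all identifiers below are hypothetical):

```typescript
// Hedged sketch of a payment credential; the DIDs are placeholders, and a
// real credential would also carry a cryptographic proof section.
const paymentCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential"],
  issuer: "did:example:hyperdraft-developer", // hypothetical issuer DID
  issuanceDate: "2021-01-16T00:00:00Z",
  credentialSubject: {
    id: "did:example:user",                   // hypothetical user DID
    hasPaidForHyperDraft: true,               // the claim discussed above
  },
};
```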

Video

Due to a mistake in settings, the video only pins to Rosano during the recording, so you just get Boris' and James' disembodied voices while we look at Rosano. The audio recording above is the same content as in this video.

Open video chats happen most Thursdays 9am PST / 12pm EST / 1800 CET on a variety of topics, and everyone is welcome. View the full event listing on the forum »


Web Native Database Community Kickoff

We held our first community meeting about interest in building out the distributed database capabilities of the Web Native File System (WNFS).

Here is the video from the January 7th, 2021 meeting:

James posted notes from the kickoff:

• As we had predicted, the group was a mix of people who are interested in using WNDB as app developers (with feature and API design interests) and those interested in the protocol / "plumbing" of WNDB
• There were not a lot of strong feelings about the nature of the API - rather, WNDB needs features and documentation on how to implement them
• There is a lot of interest / consensus around our initial event source-based design ideas. Brooklyn highlighted Datomic and DataScript.
• A lot of good (early) discussion around schema management: developers should be able to define schemas (but also share and re-use them), they should be applied as "views", and there is lots of interest in Project Cambria
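
To give a sense of what "event source-based" means here (a hypothetical illustration, not WNDB's actual design): the append-only log of events is the source of truth, and any current state, including schema-shaped "views", is derived by folding over it.

```typescript
// Hypothetical event-sourced store: the event log is the source of truth,
// and current state is derived by folding over it.
type DbEvent =
  | { kind: "added"; id: string; title: string }
  | { kind: "removed"; id: string };

// A "view" applies a shape to the raw log -- here, a simple id -> title map.
function currentTitles(log: DbEvent[]): Map<string, string> {
  const view = new Map<string, string>();
  for (const e of log) {
    if (e.kind === "added") view.set(e.id, e.title);
    else view.delete(e.id);
  }
  return view;
}

const log: DbEvent[] = [
  { kind: "added", id: "1", title: "Hello" },
  { kind: "added", id: "2", title: "World" },
  { kind: "removed", id: "1" },
];
console.log(currentTitles(log)); // Map { "2" => "World" }
```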

We've got a new Webnative DB category in the forum that you can subscribe to and follow along, as well as join the Fission Discord chat.

Friday, 15. January 2021

Evernym

A Call for Reciprocal Negotiated Accountability


Am I crazy if I am a privacy hawk, but I’m opposed to unfettered anonymity? I think people should be free to live their lives without Siri or Alexa or Google Assistant listening to private bedroom conversations. I find the surveillance economy repugnant. The Snowden revelations leave me convinced that government eavesdropping needs more constraints. […]

The post A Call for Reciprocal Negotiated Accountability appeared first on Evernym.


1Kosmos BlockID

2021 Top-5 Cyber Security Priorities


It’s already mid-January, and Google has been filled with references to web pages ranking what is supposedly the best to come in 2021 for a multitude of things. Spoiler alert: this blog isn’t about the best electric SUVs on the market in 2021. First, what do I know about SUVs? Second, I live in Houston, the world capital of oil refining, and if I want to keep being invited to my city’s high society social gatherings, I had better not promote e-cars in any shape or form.


Elliptic

3 Key Takeaways from our Conversation with Brian Brooks


We were delighted to welcome Brian Brooks, Acting Comptroller of the Currency, to the latest in our series of “Coffee and Crypto with Regulators” webinars. The discussion with Elliptic CEO Simone Maini was a fascinating and inspiring exploration of Brian’s passion for financial innovation and the positive transformations it can bring. They also delved into the crypto-specific work that he has spearheaded at the OCC - including a number of interpretive letters on cryptoasset activities and the licensing of crypto businesses as national banks.


Tokeny Solutions

Tokeny’s Talent|Nida’s Story

Nida joined the team at the beginning of Tokeny’s establishment as a UI/UX Designer and Frontend Developer.

Who are you?

I’m Nida Orhan, a remote UI/UX designer and frontend developer. I was born in Turkey and still live there. I studied Business at Bogazici University, and in my first years of college I did a couple of internships in the sales, marketing, and finance departments of various companies. In my third year, I started to learn graphic/web design to create visuals and interfaces for our startup project. Our business idea didn’t work out in the end, but from it grew a strong interest in web design and development, and I decided to pursue a career in the field. I took courses and did an internship as a graphic designer. After finishing college I worked as a frontend developer in Turkey for some time, then did freelance design and development for almost a year. After that I joined Tokeny in its early days and have been working here ever since.

How did you land at Tokeny Solutions?

I was looking for a permanent remote position and I got an offer from Tokeny. The company was only a few months old at the time, with a few employees, and the team was still trying to finish the first MVP. After interviewing with our CEO, I was impressed by Tokeny’s vision for blockchain technology, and it was obvious to me that the company would succeed in this area. So I wanted to be part of the team that would realize that vision.

How would you describe working at Tokeny Solutions?

Tokeny has been a remote-friendly company since the beginning. It is a flexible and well-connected organization. We have regular weekly online meetings to keep each other in sync about our work, and monthly meetings to learn about the company’s overall performance and strategy. I love the collaborative team culture and friendly atmosphere at Tokeny. I have been working remotely for almost 3 years now, and while I have never had the chance to visit our offices in Europe, I have always felt accepted, valued, and seen as a part of the team.

What are you most passionate about in life?

I’m most passionate about product design. I love creating applications from scratch that are beautiful and easy to use, and doing this by combining design and development. It has always been very satisfying for me to design an interface and bring it to life with code.

What is your ultimate dream?

I have many, but the ultimate one is to live a fun and happy life with my family. I became the mother of an adorable little son two years ago and he has been the center of my life since then. I want to see him grow and cherish him throughout his life.

What would you change in the world if you could?

I would like to give everyone the opportunity of having a high quality education so that everyone can achieve their dreams.

She prefers:

• Coffee over Tea
• Book over Movie
• Work from home over Work from the office
• Dogs over Cats
• Call over Text
• Salad over Burger
• Ocean over Mountains
• Beer over Wine
• Countryside over City
• Slack over Emails
• Casual over Formal
• Crypto over Fiat
• Morning over Night

Join the Tokeny Solutions family: we are looking for talent to join us, and you can find the open positions on the Tokeny website.

The post Tokeny’s Talent|Nida’s Story appeared first on Tokeny Solutions.


auth0

How to Control Hue Lights with JavaScript

Learn how to control Hue lights with JavaScript
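
The essence of the approach: a Hue bridge on your local network exposes a small REST API. The sketch below uses the well-documented v1 light-state endpoint, with a placeholder bridge address and app key; it may differ from the tutorial's exact code.

```typescript
// Minimal sketch against the Hue bridge's local REST API (v1). The bridge
// IP and app key are placeholders you'd obtain from your own bridge.
const BRIDGE_IP = "192.168.1.2"; // placeholder: your bridge's address
const APP_KEY = "your-app-key";  // placeholder: key created on the bridge

async function setLight(id: number, on: boolean, brightness?: number) {
  const body: Record<string, unknown> = { on };
  if (brightness !== undefined) body.bri = brightness; // 1-254
  await fetch(`http://${BRIDGE_IP}/api/${APP_KEY}/lights/${id}/state`, {
    method: "PUT",
    body: JSON.stringify(body),
  });
}

// Turn light 1 on at roughly half brightness.
setLight(1, true, 127);
```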

KuppingerCole

Digital Workplace Delivery Platforms


by Paul Fisher

The KuppingerCole Market Compass Workplace Delivery covers solutions that assist organizations in managing applications and data that end users access from a “single pane of glass” interface. These can run on existing PC workstations and remote devices including smartphones and tablets. As part of digital transformation these solutions should improve productivity, increase employee satisfaction, and create value in distributed cloud infrastructures. As the world shifts to greatly increased mobile and remote working during and after the global pandemic, Workplace Delivery Platforms will enable a richer working experience from wherever end users are situated. As ever, security of these platforms should be a primary consideration when buying.


Ontology

“Welcome Home” is now Approved!

Ontology and Daimler Accelerate Progress on ‘Welcome Home’ App

Enjoy full transparency and take control of your own data and privacy

We are happy to announce that the much anticipated “Welcome Home” application powered by Ontology has officially passed through the initial stages of review with Daimler Mobility. As a truly pioneering MVP product, Welcome Home combines the premium experience of Daimler’s mobility units with Ontology’s technical superiority, especially related to decentralized identities and data protection. All of this can be experienced in 6 simple steps, outlined in the videos below!

This marks an unprecedented step in the right direction, bringing your automotive experience a step closer to Ontology’s decentralized blockchain services.

Car operators who have access to the Welcome Home in-car system will have control over unique features, including:

• Your in-car and mobility service preferences are attached to your profile, not to any one particular vehicle; therefore you could transform any vehicle into your own, anytime, anywhere across the globe.
• You can access third-party service providers integrated into the system with one-time verification, yet stay anonymous to all providers after the services are completed.
• You will be authorizing your data for service providers to give personalized services, yet you retain full transparency and control of all access and usage of your data, which allows you to stay anonymous.

According to Harry Behrens, Head of Daimler Mobility Blockchain Factory:

“Customers using the Welcome Home which integrates with the Daimler Mobility Blockchain Platform would be greeted to their customized user settings and profile by any Mercedes they pair the app with. Fully sovereign in control of their data they can choose to share this experience with their friends and be safe at home with Mercedes — anywhere they go”

As we’ve written about previously, Welcome Home is designed to be a user-centric application granting us full control over when, where, and with whom we share data through our ONT ID.

We firmly believe that the control over which content to share should belong to the user, and is something achievable through our Decentralized Identity (DeID) protocol as well as our Distributed Data Exchange Framework (DDXF) — both powered by cryptographic algorithms on the blockchain.

At the end of your journey, you can share your trip with people in your social circle with just a couple of clicks. This allows the Welcome Home app to combine your mobility with social networking. By using Ontology’s unique framework for DID and highly secure data access we have combined usability with the highest standards for data sovereignty and privacy.

Looking to the future, we look forward to the gradual rollout of not just the Welcome Home app, but also further automotive innovations through Daimler Mobility and their embrace of the blockchain world.

Find Ontology elsewhere

Ontology website / Ontology GitHub / ONTO website / OWallet (GitHub)

Telegram (English) / Discord

Twitter / Reddit / Facebook / LinkedIn

“Welcome Home” is now Approved! was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


ArcBlock

ArcBlock Year in Review 2020

2020 was a transitional year for the blockchain industry. In this ArcBlock year in review, CEO Robert Mao looks back at some of the challenges and opportunities of the past year and offers an optimistic view of what’s to come in 2021.
ArcBlock CEO, Robert Mao — “A Year of Progress”

Whether it’s me or the ArcBlock team, 2020 has been a tough year, while also being a year of growth and love. I believe that one day in the future when we look back we will all realize that we accomplished a lot — both personally and professionally.

Similar to many of our friends and customers, ArcBlock was deeply impacted by the pandemic and saw the overall industry take a downward trajectory as many businesses and individuals began to realize the full extent of the virus.

So for ArcBlock, 2020 was a time to reflect, learn from our mistakes, and return to our roots by rediscovering ourselves and focusing on our mission to make decentralization a reality. We are putting past mistakes behind us, focusing on our strengths, and delivering products to our customers that they need, want, and can use.

2021 is no longer about unrealistic expectations, but rather embracing who we are and what we want to do in order to make ArcBlock a success. If you go back and look at the last year, you can see that despite the impact of the pandemic the ArcBlock team remained strong, resilient, and focused on building, improving, and delivering our products.

“One day in retrospect, the years of struggle will strike you as the most beautiful.”

Product Releases

ABT Node

• In July, ArcBlock officially released ABT Node RC1
• In September, ArcBlock officially released ABT Node 1.0 for production DApps and use cases
• In October, ArcBlock released an improved version of ABT Node that included various feature enhancements
• In December, ABT Node 1.1 was released with several new innovations and Blocklet updates

ABT Wallet

• In March, ArcBlock released ABT Wallet 2.5, improving on its industry-leading decentralized wallet and reimagining how users could control their data on the decentralized web

NFT

• ArcBlock’s DevCon in June included a wide range of NFTs, including tickets, certificates, and more

Token Swap

• In January, ArcBlock released its official Token Swap Service

DevCon

• In June, ArcBlock announced its first official Developer Conference, held live over a 48-hour period
• ArcBlock Devcon 2020 Day 1 Recap
• ArcBlock Devcon 2020 Day 2 Recap
• On June 28, ArcBlock held a week-long Hackathon that included unique NFTs, leading developer teams, and more

Experiences

• In February, ArcBlock released a live demo experience, “Try Identity Now”, that enabled anyone to experience decentralized identity, blockchain, and DApps in a few simple steps

Partnerships

• In April, ArcBlock joined the COVID-19 Initiative
• In June, ArcBlock officially announced a new partnership with the WTIA
• In November, Poulsat and ArcBlock officially announced a new strategic partnership to bring decentralized identity, blockchain, and tokenization to Africa
• In December, ArcBlock joined the internationally recognized decentralized identity organization MyData Global

Milestones

• In October, ArcBlock celebrated its three-year anniversary

2021 and Beyond

As ArcBlock continues to grow as a company and becomes more public with its product releases, ArcBlock will no longer be announcing its internal product roadmap. As a product company, the focus will remain entirely on the DApps Platform and customers, with every action going forward focused on bettering the ArcBlock experience.

Some notable things to be aware of for the first and second quarters of 2021 include:

• To improve the ArcBlock experience going forward, we will initiate and complete a full reverse Token Swap operation, returning to the free ERC20 state. In doing so, we will be able to improve the ArcBlock platform and experience for our developers and customers by removing much of our technical debt.
• We will introduce a new ABT asset chain based on the Ethereum Optimistic Rollup mechanism and sunset our previous Forge-based native asset chain approach. In doing so, we will enable our users to gain the benefits of both Ethereum and a high-performance native chain.
• Lastly, we will host our DevCon 2 Developers Conference in June, when we’ll partner with more developer partners to show the future of decentralization, DApps, and more.

While 2020 has been challenging, it has also created many new opportunities for ArcBlock. As the saying goes, “the pessimist is often right, the optimist is often successful”. In the unprecedented dilemmas and challenges of 2020, we have chosen to be optimistic, and we are excited to share everything we’ve been working on.

ArcBlock Year in Review 2020 was originally published in ArcBlock on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 14. January 2021

KuppingerCole

Effective Endpoint Security With Automatic Detection and Response Solutions


The realization that cyber-attacks are inevitable has led the cybersecurity industry to shift some of its focus to detection and response rather than prevention in recent years. Therefore, the market for solutions designed to detect attacks on endpoints and respond accordingly has grown significantly. These Endpoint Detection & Response (EDR) solutions look for evidence and effects of malware that may have slipped past Endpoint Protection (EPP) products. EDR tools also perform evaluation of threat intelligence, event correlation, and often allow interactive querying, live memory analysis, and activity recording and playback.




Evernym

Why We Support the Electronic Frontier Foundation


Privacy gets too little emphasis from some participants in the decentralized identity movement. They claim to value confidential interactions, yet advocate that individuals create public decentralized identifiers (DIDs) on the blockchain (ignoring legal warnings about DIDs being PII). They are okay with “phone home” verifications of credentials and revocation and capabilities. They think that selective […]

The post Why We Support the Electronic Frontier Foundation appeared first on Evernym.


Coinfirm

Is Cryptocurrency Legal in Singapore? Bitcoin Regulations 2021

Is cryptocurrency legal in Singapore? Short answer: yes. Singapore’s Bitcoin and crypto regulations and laws cover ICO, tax, AML/CFT and methods of buying/trading in virtual assets. Singapore is well-known for being a strict country in regards to laws and regulations. However, Singapore takes a balanced approach to cryptocurrency and Bitcoin regulations and is referred to...

OWI - State of Identity

IdRamp: The Digital Identity Transformation


State of Identity host Cameron D'Ambrosi sits down with Mike Vesey, CEO of IdRamp. As someone who has uniquely spent their full career immersed in identity, Vesey shares an insider perspective of how the industry has evolved during his tenure. This episode dives into the impacts of rapid digitalization worldwide, where identity is heading due to digital transformation, and the benefits that come with it.


KuppingerCole

SAP Enterprise Threat Detection


by Martin Kuppinger

In these days of ever-increasing cyber-attacks, organizations have to move beyond preventative actions towards detection and response. This no longer applies to the network and operating system level only, but involves business systems such as SAP S/4HANA. Identifying, analyzing, and responding to threats that occur within the application layer is a must for protecting the core business systems.


SecurEnds Credential Entitlement Management


by Richard Hill

Due to the potential impact of security risks arising from a lack of proper access governance controls, access governance has become a vital IAM technology for any organization. SecurEnds Credential Entitlement Management (CEM) simplifies user entitlement activities through automation and insightful analytics, giving organizations control of their user access governance.


Nyheder fra WAYF

WAYF now able to filter attribute values

Some of the user attributes that WAYF can pass on to services from user organisations have a very large value space, and in many cases only a small subset of the possible values is relevant to the service provider.

WAYF gets new HSM appliances

All data traffic from WAYF is digitally signed with a private key kept in a hardware security module (an 'HSM') so that it cannot be hacked.

WAYF's current HSMs, however, are end-of-life and in the process of being replaced by two new ones.

WAYF's new HSM appliances are Thales Luna S790 units, each capable of performing around 10,000 digital signings per second (with 2K keys). The current units of the same brand each have a capacity ceiling of 1,200 signings per second.

UbiSecure

Ubisecure CEO Review: 2020 roundup and looking into 2021

2020 in review

I think we can all agree that 2020 was not the year we were expecting! The pandemic has affected all of us in different ways and to different extents. Yet the silver lining for me was to see all the teams we have within Ubisecure adapt and continue regardless of last year’s challenges - communicating effectively, delivering first-rate services, and growing the company. This is a testament to each and every member of the Ubisecure team, who dealt with remote working and home schooling yet still grew Ubisecure as a leading identity specialist.

Our 2020 milestones include:

• Great growth from both our Customer Identity and Access Management (CIAM) software, Identity Platform, and our Legal Entity Identifier (LEI) service, RapidLEI. On top of our 100% revenue growth in 2019, we saw 50%+ revenue growth in 2020 (a CAGR of 70% over the last two years) despite the impact of the pandemic.
• RapidLEI went from being the fastest growing LEI provider to becoming the world’s #1 issuer of validated LEIs. RapidLEI also topped the charts for best data quality results amongst the GLEIF-accredited Local Operating Units (LOUs).
• We announced support for the new Global LEI Foundation Validation Agent (VA) model to support on-demand LEI issuance for banks, fintechs, CAs and business registries.
• We launched many new CIAM and LEI partnerships, including technology alliances with Hitachi (finger vein biometrics), Omada (Identity and Governance Administration), and Verimi (pan-European consumer digital identity platform).


2021 – Looking forward

As I write this in the first week of January 2021, our teams across the organisation have already hit the ground running. We see an active, rapidly growing market, further driven by the current pandemic and ever-increasing cyber-attacks. This increase in malicious activity is leading organisations to ramp up cybersecurity efforts, including reviewing legacy Identity & Access Management solutions. More remote working serves to further blur the lines between ‘employee’ and ‘customer’ identities, increasing the number of external identities an organisation manages. This has generated high demand for CIAM (Customer IAM) solutions, specifically built to streamline management of external identities, such as our Identity Platform.

Building on our role as an Organisation Identity Expert

Ubisecure’s plans for 2021 include further developments on our unique position as an Organisation Identity expert. We have strong recognition in this area, both from analysts and from fellow identity management firms. We will continue to build on this capability, joining our LEI and CIAM business arms to launch our innovative representation governance solution, the “Sign in with RapidLEI” service, which enables individuals to assert a right to represent their organisation in financial transactions, legal signatures and more.

Self-Sovereign Identity (SSI)

The concept of Self-Sovereign Identity (SSI) is in its infancy and, in the general case, as yet independent of classical identity systems. As organisations seek services that can intertwine these systems, we will provide interoperation and also enable the issuance of organisation credentials as the result of LEI issuance (as now used in our “Sign in with RapidLEI” service).

Identity Trust Anchors

I also plan to continue educating on the importance of Trust Anchors through various media – how they enable online interaction, and how we have been implementing them for more than 10 years. I spoke on the subject at the virtual Digital Identity and Digital On-boarding for Banking conference at the end of 2020 – watch the video and others on the Ubisecure YouTube channel.

Expanding use of national, local, and federated digital identities

The Ubisecure CIAM platform has long supported the industry’s widest range of digital identity schemes, both at a national and enterprise level. In 2021, we will offer expanded support for identity verification from both existing identity platforms, like Germany-based Verimi, and real-time identity proofing, thanks to partnerships with organisations like Onfido.


In Conclusion

Overall, while the ongoing impact of the pandemic on our personal lives in 2021 may continue, 2020 has shown us that Ubisecure is well-prepared to deliver business as usual whilst keeping our staff safely working from home. I look forward to talking more about our 2021 innovations and Ubisecure’s continued growth as the year pans out.

The post Ubisecure CEO Review: 2020 roundup and looking into 2021 appeared first on Ubisecure Customer Identity Management.


auth0

The 9 Worst Recent Data Breaches of 2020

There are valuable lessons to be learned from the cybersecurity stories of a challenging year.

PingTalk

Complete Guide to User Provisioning with Ping Identity

What is User Provisioning?

User provisioning, also known as user account provisioning, is an automated process for creating user accounts and managing access to IT resources. It goes beyond simply identifying whether an individual is who they say they are and extends into that person’s rights and permissions to specific enterprise applications and other resources. To use a simple banking analogy, identity management distinguishes the thieves from the customers and employees, allowing the latter groups to enter the bank, while user provisioning keeps the customers from dipping into other customers’ bank accounts.


Furthermore, user provisioning allocates user privileges and permissions automatically, based on criteria such as user role. This differs from identity governance and administration, which handles user identity lifecycle administration and does more than simple user provisioning. With user provisioning, you’re not relying on access requests. There’s no certification, no attestation, no audit trail—just the exact user information needed for access management.


User provisioning becomes increasingly critical the larger an enterprise grows. The more employees and positions within an organization, the more difficult it can be to determine access rights. User provisioning can help boost productivity by relieving the burden from IT having to manually create user accounts and arrange access for each new employee and application. For instance, as titles and departments change, IT can perform group updates. They can also easily provision new applications and deprovision old ones, further reducing the risk of unauthorized information falling into the wrong hands.


The Ping Provisioning Catalog

Ping solutions include numerous out-of-the-box integrations that allow you to automatically provision, update and deprovision users to a wide range of applications. Provisioning from cloud and on-premises HR application sources helps you maintain accurate, up-to-date user profile information, with full CRUD (create, read, update and delete) capabilities for user or group provisioning, so you can eliminate manual processes and profile synchronization challenges.


In keeping with Ping’s standards-based approach to IAM, our user provisioning is based on the System for Cross-domain Identity Management (SCIM) standard. SCIM was developed nearly a decade ago using protocols like REST and JSON in order to reduce complexity and provide a more straightforward approach to user management, and it enables easier, more powerful and standardized communication between identity data stores. 
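
SCIM's flavor is easy to see in a single request. Below is a hedged sketch of a SCIM 2.0 "create user" call (per RFC 7643/7644); the endpoint URL and token are placeholders, not a specific Ping endpoint.

```typescript
// Hedged sketch: provisioning a user via a generic SCIM 2.0 endpoint.
async function provisionUser(): Promise<void> {
  const newUser = {
    schemas: ["urn:ietf:params:scim:schemas:core:2.0:User"],
    userName: "jdoe@example.com",
    name: { givenName: "Jane", familyName: "Doe" },
    active: true,
  };
  await fetch("https://idp.example.com/scim/v2/Users", { // placeholder URL
    method: "POST",
    headers: {
      "Content-Type": "application/scim+json",
      Authorization: "Bearer <token>",                   // placeholder token
    },
    body: JSON.stringify(newUser),
  });
}
```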


Ping’s provisioning capabilities fall into two main categories: inbound HR provisioning and outbound application provisioning. 



Smarter with Gartner - IT

5 Key Predictions for Identity and Access Management and Fraud Detection


Security and risk management leaders are experiencing widespread disruption in identity and access management (IAM) solutions for many reasons, most notably because of the increased drive to customer-facing interactions on digital channels and the sudden and rapid expansion of the remote workforce because of the pandemic.

“IAM challenges have become increasingly complex,” says Akif Khan, Senior Director Analyst, Gartner, “and many organizations lack the skills and resources to manage effectively. Leaders must improve their approaches to identity proofing, develop stronger vendor management skills and mitigate the risks of an increasingly remote workforce.”

The five strategic planning assumptions that follow focus on current trends in decentralized identity, access management, IAM professional services and identity proofing.


Cybersecurity mesh will support more than 50% of IAM requests

The old security model of “inside means trusted” and “outside means untrusted” has been broken for a long time. Most digital assets and devices are outside the enterprise, as are most identities.

Read more: Gartner Top 10 Security Projects for 2020-2021

By 2025, cybersecurity mesh will support more than half of all IAM requests, enabling a more explicit, mobile and adaptive unified access management model. The mesh model of cybersecurity provides a more integrated, scalable, flexible and reliable approach to digital asset access control than traditional security perimeter controls.

Delivery of IAM services will increase via managed security service providers (MSSPs)

Organizations lack the qualified resources and skills to plan, develop, acquire and implement comprehensive IAM solutions. As a result, they’re contracting professional services firms to provide the necessary support, particularly where multiple functions need to be addressed simultaneously.

More and more, organizations will rely on MSSP firms for advice, guidance and integration recommendations. By 2023, 40% of IAM application convergence will primarily be driven by MSSPs that focus on delivery of best-of-breed solutions in an integrated approach, shifting influence from product vendors to service partners.

Identity proofing tools will be implemented within the workforce identity life cycle

Historically, vendor-provided enrollment and recovery workflows for multifactor authentication have incorporated weak affirmation signals, such as email addresses and phone numbers. As a result, implementing higher-trust corroboration has been left as an exercise for the enterprise.

Because of the massive increase in remote interactions with employees, more robust enrollment and recovery procedures are an urgent requirement, as it is harder to differentiate between attackers and legitimate users. By 2024, 30% of large enterprises will newly implement identity-proofing tools to address common weaknesses in workforce identity life cycle processes.

A global, portable, decentralized identity standard will begin to emerge

Centralized approaches to managing identity data — common in today’s market — struggle to provide benefits in three key areas: privacy, assurance, and pseudonymity. A decentralized approach uses blockchain technology to help ensure privacy, enabling individuals to validate information requests by providing the requestor with only the absolute minimum required amount of information.

By 2024, a true global, portable, decentralized identity standard will emerge in the market to address business, personal, social and societal, and identity-invisible use cases.

Demographic bias within identity proofing will be widely minimized

Bias with respect to race, age, gender and other characteristics gained significant attention in 2020, coinciding with the increased interest in document-centric identity proofing in online use cases. This “ID plus selfie” process uses face recognition algorithms to compare customers’ selfies with the photos in their identity documents.

There has always been awareness of possible bias in face recognition processes, with implications concerning customer experience, brand damage and possible legal liability. As a result, by 2022, 95% of organizations will require that identity-proofing vendors prove that they are minimizing demographic bias, a significant increase from less than 15% today.

The post 5 Key Predictions for Identity and Access Management and Fraud Detection appeared first on Smarter With Gartner.


auth0

Node.js and TypeScript Tutorial: Secure an Express API

Learn how to use TypeScript and Auth0 to secure a feature-complete Express.js API. Learn how to use Auth0 to implement authorization in Express.

Node.js and TypeScript Tutorial: Build a CRUD API

Learn how to use TypeScript to build a feature-complete Express API. Learn how to use TypeScript with Express to create, read, update, and delete data.

Wednesday, 13. January 2021

Civic

Now, Consumers and Small Businesses Can Manage Proof of Testing and Vaccination


Now that the United States is a few weeks into deploying the COVID-19 vaccine, many frontline workers have received at least their first round of shots, along with a digital medical record documenting this or a vaccination record card provided by the Centers for Disease Control and Prevention (CDC). The digital records and vaccine cards include medical information, including the manufacturer and date of administration.

As plans are unveiled for further vaccine distribution, a light at the end of the lockdown tunnel has begun to emerge, and planning for a “new normal” is underway. Businesses and governments are in the process of rethinking how they will manage social interactions throughout daily life. With part of the population vaccinated, occupancy limits will change and travel restrictions will be revamped, creating new opportunities to slow the spread of the virus and simultaneously expand in-person contact. 

COVID Health Key by Civic offers the most secure, most private and most compliant way to maintain safe spaces by securely verifying proof of vaccination status. We’ve designed our technology to allow for a more inclusive approach to rebuilding. With Health Key, individuals keep control over their personal information by using digital identity from the industry’s leading innovator. No personal data is stored or entered into any requestor’s database. 

Now, Civic is expanding its ability to help more organizations by providing small businesses and ad-hoc networks with the ability to create safe environments around the world — all without maintaining personally identifiable information or compromising the privacy of their patrons.

Consumers and small businesses may begin using COVID Health Key by Civic in Spring 2021 as a stand-alone, self-serve solution. Restaurants, gyms, daycares and more have struggled to resume business with mandates for low occupancy rates. Health Key helps enable the loosening of constraints in a safe and easy-to-use way. For customers, using Health Key means simply linking their digital medical records or uploading their vaccine card and self-attesting the document’s accuracy in the app.

For larger organizations, we’re offering a previously announced version of COVID Health Key by Civic that streamlines the management of entry points. This product helps organizations confirm that an individual has received a test result or vaccine, and that local entry point regulations have been followed. Civic links the medical record on the back end, for seamless review by the individual and receipt by requestors.

Civic is also building government solutions, available worldwide with the exception of a few sanctioned countries. Customized solutions can be linked to national healthcare systems, identity cards, passports and customs requirements. Health Key by Civic is a real-life application of our patented digital identity system. It’s also compliant with international GDPR, CCPA, and other data privacy regulations. 

Interested in finding out if Health Key by Civic is right for you? Get in touch with our team.

The post Now, Consumers and Small Businesses Can Manage Proof of Testing and Vaccination appeared first on Civic Technologies, Inc.


Evernym

A Better, More Secure, and More Private Approach to COVID Credentials


Not all digital credential solutions are created equal – here’s what makes Evernym’s solution safe, private, and open. Some legal, public health and identity leaders have expressed concerns about building high-stakes identity tools like COVID-19 digital health certificates. They point to immature standards and predict that scrutiny by governments and consumer advocates will reveal security, […]

The post A Better, More Secure, and More Private Approach to COVID Credentials appeared first on Evernym.


IBM Blockchain

Love at first block: Sustainability with the help of blockchain


Over the past 17 years, my team at Newlight has been working to turn air and greenhouse gas into a material found throughout nature called AirCarbon. There are a number of interesting things about AirCarbon: first, it is a meltable material, so as we look for solutions to help solve our ocean plastics problem, it […]

The post Love at first block: Sustainability with the help of blockchain appeared first on Blockchain Pulse: IBM Blockchain Blog.


Ontology

Ontology Weekly Report (January 4th- 11th, 2021)


Highlights

As the New Year gets well underway, we’ve hit the ground running with the launch of our 2021 roadmap, which introduced the development plan for our five core products: OScore, ONT ID, ONTO, Wing, and SAGA.

Latest Developments

Development Progress

Completed 10% of the Ontology EVM-integrated design, which will be fully compatible with the Ethereum contract ecology after completion.

Product Development

Released ONTO v3.6.7. This version supports Huobi ECO Chain (Heco), improves the Data and Help Center pages, and adds an asset type display as well as a blind box ad feature. More than 3,600 users opened blind boxes, of which 2,500 received rewards. ONTO added support for 5 Binance Smart Chain dApps and 5 Huobi ECO Chain dApps. It also added support for BakerySwap’s official liquidity mining activities, as well as a news feed from CHAINNEWS on the information page.

dApps

• 110 dApps launched in total on MainNet
• 6,257,818 dApp transactions completed in total on MainNet

Community Growth

The Ontology team keeps on growing! This week we onboarded 1,639 new members across Ontology’s global communities.

Route Planning

Ontology released this year’s roadmap, Sense 2021. The main focus centered on decentralized identity (DeID) and data, with the continued promotion of decentralized governance and the introduction of more innovative use cases.

Global Events

This month Li Jun, the founder of Ontology, was invited to attend the first Blockchain Innovators Assembly hosted by Bibi News, where he participated in a discussion on public chain performance alongside the founder of TRON and the CTO of PlatON. Li Jun shared his views on pressing topics of the moment, including competition and collaboration with Ethereum and different methods of increasing on-chain scalability and privacy protection. Other subjects discussed at the assembly included improving interoperability between public chains and the development patterns of public chains in 2021. At the end of 2020, the Ontology team was invited to a blockchain discussion sponsored by Tuoniao Blockchain. This event saw a discussion of Ontology’s achievements in 2020 and development plans for 2021.

Industry News

Ontology’s Jun Li on credit-based DeFi and more real assets coming to blockchain in 2021

Li Jun was interviewed by Forkast.News, where he reflected on the most noteworthy developments for the industry in 2020 and provided a look ahead to the next 12 months.

Crypto adoption in 2021: Top trends and predictions on what may come

“The prospects for 2021 look bright as major forces driving adoption in 2020 will remain powerful.”

Erick Pinos, the Americas Ecosystem Lead at Ontology, also gave his thoughts on what 2021 might have in store for the crypto world.

Blockchain regulations to look for in 2021

Erick Pinos was also featured in App Developer Magazine commenting on trends in the blockchain and crypto industries for 2021.

“There will be attempts to pass regulations for blockchain and crypto, large financial institutions will move funds into Bitcoin, and DeFi will stay prominent throughout 2021.”

Dapp Industry Report 2020

2020 has proven to be a year of twists and turns. Despite these challenges, in the context of the global pandemic, blockchain applications and emerging technologies continue to thrive. As the dialogue on basic national income and the distribution of global database management systems changes, the global crisis appears to be drawing people’s attention to decentralized solutions.

Li Jun said: “An important achievement for us this year is the launch of Wing, an innovative cross-chain credit-based lending platform built on the Ontology blockchain. Wing improves the productivity of users’ assets through lending and borrowing and reduces users’ risks through insurance.

In 2021, our focus will be to further expand our applications and help more dApps solve problems related to user identity and data. Our goal is to further optimize the product experience and make it easier for users to use our cross-chain ONTO wallet, our decentralized credit-based lending platform Wing, and our data privacy product, SAGA.

We believe, as more and more players enter the field, DeFi will continue to develop, which will drive further innovation. Let’s see what 2021 holds!”

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / Naver Blog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (January 4th- 11th, 2021) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Authenteq

Security and compliance for any industry, device, language, and geolocation – January Product Update

The post Security and compliance for any industry, device, language, and geolocation – January Product Update appeared first on Authenteq.

Ocean Protocol

OceanDAO — Round One Grant Results

Highlights from OceanDAO’s first funding round

Introduction

Hello Oceaners!

OceanDAO Round One officially concluded on December 21, 2020.

For those unfamiliar with OceanDAO, you can learn more about Ocean’s community curated funding process and get involved here!

Thank you to all of the participants, voters & proposers. We are excited about the growth of Ocean Protocol in 2021 & beyond!

Highlights

OceanDAO Round One was announced on Nov 30, 2020.

The top five highest voted grant proposals from the snapshot ballot were selected to receive 10,000 OCEAN, along with an additional 3,000 OCEAN inaugural bonus, to foster positive value creation for the overall Ocean ecosystem.

The Ocean ecosystem becomes self-sustainable as the builders of the Web3 data economy leverage Ocean Protocol to create products, services, and resources that the community finds valuable.

Funding Category Types:

• Build / improve applications or integrations to Ocean

• Outreach / community (grants don’t need to be technical in nature)

• Unleash data

• Build / improve core Ocean software

Proposal Vote Results:

• 9 proposals submitted

• 76 Voters

• 3,257,733 $OCEAN voted

Voter Turnout:

• Voting opened on Dec 15 at 12:00 GMT

• Voting closed on Dec 21 at 23:59 GMT

Recipients

Congratulations to the top 5 vote recipients. These projects will be receiving an OceanDAO grant in the form of $OCEAN tokens.

Ocean Academy

Ocean Academy is an education platform designed to act as a catalyst for the adoption and growth of Ocean Protocol, involving NFT certificates & a new module on data DeFi, to make it more appealing to data scientists and the broader community.

The Data Whale

The Data Whale wants to create a mobile platform for “all things data economy” & encourage adoption of Ocean Market with a simple, user-friendly interface on iOS and Android devices.

Ocean Surfer

Ocean Surfer will “combine Ocean V3 and Superfluid Protocol to enable a new data token primitive that leverages money streaming agreements to allow the payment/consumption of on-demand Compute-to-Data services.”

Ocean Pool Alerts

Ocean Pool Alerts will create a solution to provide important Ocean data pool alerts while creating equal staking opportunities for the community.

Decentralized File Rating

Decentralized File Rating is “aiming towards a mechanism where users are able to give a file rating after they have bought it.”

Future Rounds

OceanDAO Funding Round 2 is just around the corner and we are excited about your participation!

• Project proposal deadline: Feb 1, 2021 at 23:59 GMT
• Voting wallet balance snapshot: Feb 1, 2021 at 23:59 GMT
• Voting closes: Feb 4, 2021 at 23:59 GMT
• Funding: funds will be disbursed within 24 hours after voting ends.

See the OceanDAO wiki for up-to-date round information and any links you will need to get involved as a project team member or a community curator.

This is just the start. Thank you!

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

OceanDAO — Round One Grant Results was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

A Reflection on Auth0’s 2020 and Hope for a Brilliant 2021

Auth0’s CEO on renewal and being prepared to take on a new year

Recapping AWS re:Invent 2020

Did you miss Auth0 at AWS re:Invent? It’s not too late.

Okta

Developer's Cheat Sheet for C# 9.0

Introduction to C# 9 (and a bit of C# 8, too)

Let’s start with a background on how C# 9 got here (implementation examples start in the next section).

Over the last few years in computer science, we’ve observed the rising popularity of the #FreeLunchOver concept. The idea is that CPU technology, based on electrical signals and Von Neumann architecture, has reached its intrinsic limits. As long as integrated circuits were slower than light, we knew we had work to do in improving the technology. But the time has come when, to go faster, we would need a faster medium and/or a different architecture. Scientists are all convinced that nothing can go faster than light. So, for the time being, engineers have explored the architecture path toward increasing speed. And they did exactly what the Industrial Revolution did to increase productivity: break the work into subunits, execute the subunits in parallel pipelines, and finally, assemble the resulting components. We began to see dual-core and quad-core processors… and we’ll probably soon see the day when exponential notation has to be used to express the number of cores in a CPU!

In the software industry this trend has produced significant effects. Programming paradigms that weren’t particularly relevant in sequential computing became more important, such as immutability, threading and asynchronous pipelines. Functional Programming (FP), previously relegated to academic and niche domains, gained more popularity in the commercial software arena. FP has characteristics that better adapt to parallel computing workflows than the Object Oriented Programming (OOP) that dominated programming language design for 20 years.

The first mainstream OOP language was C++, born as an OOP extension to the earlier procedural C. C++ has become an incredibly rich language without losing its fundamental trait, that is, being an unmanaged language, which means it is close to the machine. With time, managed languages began to appear, where some aspects of machine management, like memory allocation and especially deallocation and collection, were promoted from “developer concern” to “machine concern.” As these features are not easy to implement in hardware, language creators invented the Virtual Machine (VM) concept. The VM is itself a piece of software that effectively presents the developer with a different machine than the bare hardware. With time, VMs have become integral components of modern Operating Systems (OSs).

The most prominent example is probably Java, which conquered a vast share of the market with its free-to-use policy. There was a time when the big actors tried to ride the Java horse to their own advantage. But eventually Microsoft decided to create its own managed software framework, and .NET and C# were born.

When functional programming began to emerge, it became essential for these big actors to offer functional languages for their own VM engines. They were constrained, though, as the languages needed to be compatible with their already-grown OOP babies. So, after spending some time considering the already existing FP languages (Haskell and OCaml, to mention two of the most successful), they created new languages: Scala (Java VM) and F# (.NET VM). I am personally very passionate about functional programming, and an active F# advocate.

But progress never stops! Quantum computing seems to be the next hardware architecture frontier, which will of course invite new challenges for software engineers. These days, I am plunging into Microsoft Quantum Development Kit (QDK) and its associated language Q# - that’s another story for an upcoming post.

Back on earth, the OOP vs. FP gossip is very juicy. In the Microsoft realm, it seems to me that there’s more effort to make C# more functional, rather than F# more object oriented. There seems to be some strong resistance in the industry against FP, maybe because FP developed a reputation as more difficult and abstract. That reputation is not completely undeserved, though I’d say the reward makes the effort spent in learning it absolutely worth it.

Starting with version 7 and continuing through versions 8 and 9, C# has seen several very welcome improvements, both in syntax and features. This post aims to collect some of those advancements, including real code examples. As a matter of fact, the title “Cheat Sheet” is a little… well, cheating. But I hope you’ll enjoy it anyhow.

Requirements to Develop with C# 9

The resources used for this post are:

A computer with a .NET Core compatible Operating System (I used Windows 10)
Your favorite .NET IDE (I used Visual Studio)
.NET 5 SDK
The Visual Studio 2019 Sample Project

To follow the examples in this article, you need .NET 5 SDK. I’ll be using Visual Studio 2019 CE version 16.8.3. The code examples are in a solution in this GitHub repo.

Please create a new Solution with a .NET Core console Project. By default, the project uses .NET Core 3.1 as target framework, with C# 8 as the language version.

At the time of writing, all you need to do to enable C# 9 in your project is to select .NET 5 as the target framework, since .NET 5 uses C# 9 by default.

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net5.0</TargetFramework>
  </PropertyGroup>
</Project>

With this, we are now ready to explore the new features C#8 and C#9 have to offer.

Immutability

Immutability is a property that symbols in a program can have. We are so used to the “variable” terminology that there seems to be no distinction between a symbol we declare to identify some value and the concept of a variable. This is because OOP is centered around the concept of imperative assignment, where any symbol representing data can be mutated as we wish, hence perfectly matching the “variable” nomenclature.

FP, on the other hand, is centered on the opposite concept, that of immutability. The universally known “=” syntax has a different meaning: not assignment, but binding. So whatever is on the left is not variable anymore, but “invariable” if you like. (What a bad term!) That’s why I prefer to use the term “symbol,” in place of “variable”.

Sometimes we find opinions in the community that depict mutability as evil, advocating an almost religious superiority of FP over OOP. I don’t personally agree with this point of view. In my experience, having more tools makes things better, not worse. If nothing else, I can always decide not to use a tool I have, while I cannot use a tool I don’t have.

Also in my experience, OOP and FP are two powerful weapons in my arsenal after the necessary familiarization period. Now it’s natural to use both in designing new software, and I find they can get along pretty well (and this is one of the reasons why I like F#).

One myth I would like to dispel is that C# cannot define immutable types. It can; it’s just not quite made for this, so it requires some additional effort. The feature described in this section helps to avoid most of the boilerplate that was needed in previous versions.

Init-Only Properties (a.k.a. Immutable Reference Types)

The new keyword init can be used when declaring properties. Properties declared with it can be set only once, at initialization time.

using Xunit;

namespace Cs9CheatSheet.Immutability.InitOnlyProperties
{
    class OktaOptionsClass_Mutable
    {
        public string OktaDomain { get; set; }
        public int Retrials { get; set; }
    }

    class OktaOptionsClass_Immutable_Wrong
    {
        public string OktaDomain { get; }
        public int Retrials { get; }
    }

    class OktaOptionsClass_Immutable_Constructor
    {
        public string OktaDomain { get; }
        public int Retrials { get; }
        public OktaOptionsClass_Immutable_Constructor(string oktaDomain, int retrials)
        {
            OktaDomain = oktaDomain;
            Retrials = retrials;
        }
    }

    class OktaOptionsClass_Immutable_Ok_Init
    {
        public string OktaDomain { get; init; }
        public int Retrials { get; init; }
    }

    public class Tests
    {
        [Fact]
        public void Test()
        {
            var options_mutable = new OktaOptionsClass_Mutable { OktaDomain = @"https://dev-509249.okta.com", Retrials = 3 };
            options_mutable.Retrials = 9458257; //properties can be set at any time

            //Compiler error, cannot set properties (at all)
            //var options_immutable_wrong =
            //    new OktaOptionsClass_Immutable_Wrong {
            //        OktaDomain = @"https://dev-509249.okta.com",
            //        Retrials = 3 };

            //Ok, use constructor
            var options_immutable_ok = new OktaOptionsClass_Immutable_Constructor(@"https://dev-509249.okta.com", 3);
            //options_immutable_ok.Retrials = 9458257; //design time error: properties are readonly

            //Ok, new init-only properties
            var options_immutable_ok_init = new OktaOptionsClass_Immutable_Ok_Init { OktaDomain = @"https://dev-509249.okta.com", Retrials = 3 };
            //options_immutable_ok_init.Retrials = 9458257; //design time error: properties are readonly
        }
    }
}

There are 4 different versions of a class with different degrees of immutability:

OktaOptionsClass_Mutable; mutable, properties can be changed at any time
OktaOptionsClass_Immutable_Wrong; immutable, but also useless as there is no way to set the properties
OktaOptionsClass_Immutable_Constructor; old-way immutable, requiring you to write a “dumb” constructor
OktaOptionsClass_Immutable_Ok_Init; new-way immutable, with less boilerplate (you don’t need to write the “dumb” constructor)

Records

A new type of declaration named record makes it easier to work with complex types. For example:

Record variables are object references, like class types
Equality is by value, like struct types
Immutability features (copy constructor/cloning, deconstructor, deep equality logic) are created for us by the compiler

using Xunit;

namespace Cs9CheatSheet.Immutability.Records
{
    class OktaOptionsClass
    {
        public string OktaDomain { get; set; }
        public int Retrials { get; set; }
        public OktaOptionsClass(string oktaDomain, int retrials)
        {
            OktaDomain = oktaDomain;
            Retrials = retrials;
        }
    }

    struct OktaOptionsStruct
    {
        public string OktaDomain { get; set; }
        public int Retrials { get; set; }
        public OktaOptionsStruct(string oktaDomain, int retrials)
        {
            OktaDomain = oktaDomain;
            Retrials = retrials;
        }
    }

    record OktaOptionsNominalRecord
    {
        public string OktaDomain { get; set; }
        public int Retrials { get; set; }
    }

    public class Tests
    {
        [Fact]
        public void Test()
        {
            //class semantic: 2 objects are created and 3 references
            //The variables option_class_* represent the references, not the content
            var options_class_1 = new OktaOptionsClass(@"https://dev-509249.okta.com", 5);
            var options_class_2 = new OktaOptionsClass(@"https://dev-509249.okta.com", 5);
            //Reference copy
            var options_class_3 = options_class_1;

            //struct semantic: 3 objects are created and no references
            //The variables option_struct_* represent content
            var options_struct_1 = new OktaOptionsStruct(@"https://dev-509249.okta.com", 5);
            var options_struct_2 = new OktaOptionsStruct(@"https://dev-509249.okta.com", 5);
            //value copy
            var options_struct_3 = options_struct_1;

            //record semantic: as class semantic when instantiating
            var options_record_1 = new OktaOptionsNominalRecord { OktaDomain = @"https://dev-509249.okta.com", Retrials = 5 };
            var options_record_2 = new OktaOptionsNominalRecord { OktaDomain = @"https://dev-509249.okta.com", Retrials = 5 };
            //Reference copy
            var options_record_3 = options_record_1;

            //class semantic: despite pointing to identical contents, only variables _1 and _3 compare equal
            //this is because the compiler generates reference, not content, comparison code
            Assert.NotEqual(options_class_1, options_class_2);
            Assert.NotEqual(options_class_2, options_class_3);
            Assert.Equal(options_class_1, options_class_3);
            options_class_1.Retrials = 7;
            Assert.Equal(7, options_class_3.Retrials);
            //class semantic: only content has been changed, not references, so comparisons are unchanged
            Assert.NotEqual(options_class_1, options_class_2);
            Assert.NotEqual(options_class_2, options_class_3);
            Assert.Equal(options_class_1, options_class_3);

            //struct semantic: compiler generates value comparison (no reference is created for structs)
            Assert.Equal(options_struct_1, options_struct_2);
            Assert.Equal(options_struct_2, options_struct_3);
            Assert.Equal(options_struct_1, options_struct_3);
            options_struct_1.Retrials = 7;
            //struct semantic: the variables option_struct_* represent the content
            //so the change in value is reflected in variable comparison
            Assert.NotEqual(options_struct_1, options_struct_2);
            Assert.Equal(options_struct_2, options_struct_3);
            Assert.NotEqual(options_struct_1, options_struct_3);

            //record semantic: even though the variables represent references, the compiler generates
            //value comparison code. The behavior is like struct
            Assert.Equal(options_record_1, options_record_2);
            Assert.Equal(options_record_2, options_record_3);
            Assert.Equal(options_record_1, options_record_3);
            options_record_1.Retrials = 7;
            //record semantic: after a content change, comparisons behave as class, not struct (variables are references)
            Assert.NotEqual(options_record_1, options_record_2);
            Assert.NotEqual(options_record_2, options_record_3);
            Assert.Equal(options_record_1, options_record_3);
        }
    }
}

In this code sample, I am declaring three different data types with the same content, but different language implementation. Please refer to the comments for more detailed information.

Notice, however, that in this section I am using records with enforced mutability (where properties have set accessors), in order to compare them to the traditional mutable constructs offered by the language. Records, though, are designed to make it particularly easy to work with immutable data.

Positional Records

Records offer a particularly lean notation for creating immutable objects.

using System;
using Xunit;

namespace Cs9CheatSheet.Immutability.Positional
{
    class OktaOptionsClass
    {
        public string OktaDomain { get; init; }
        public int Retrials { get; init; }
        public OktaOptionsClass(string oktaDomain, int retrials)
        {
            OktaDomain = oktaDomain;
            Retrials = retrials;
        }
        public override bool Equals(object obj)
        {
            var other = (OktaOptionsClass)obj;
            return OktaDomain.Equals(other.OktaDomain) && Retrials.Equals(other.Retrials);
        }
        //Declaring a Deconstruct method simplifies retrieving properties
        //(automatically generated by the compiler for records)
        public void Deconstruct(out string domain, out int retrials) =>
            (domain, retrials) = (OktaDomain, Retrials);
        public override int GetHashCode()
        {
            return OktaDomain.GetHashCode() ^ Retrials.GetHashCode();
        }
        OktaOptionsClass Clone()
        {
            return new OktaOptionsClass(OktaDomain, Retrials);
        }
        public OktaOptionsClass(OktaOptionsClass oktaOptions) => oktaOptions.Clone();
    }

    public record OktaOptionsPositionalRecord(string OktaDomain, int Retrials);

    //Records can be derived as classes
    public record OktaOptionsPositionalRecordDerived(string OktaDomain, int Retrials, DateTime ExpirationDate)
        : OktaOptionsPositionalRecord(OktaDomain, Retrials)
    {
        //Compiler generates a copy constructor automatically for records
        public OktaOptionsPositionalRecordDerived Copy() => new OktaOptionsPositionalRecordDerived(this);
    }

    public class Tests
    {
        [Fact]
        public void Test()
        {
            //Compiler automatically generates an initializing constructor for records
            var options_class_1 = new OktaOptionsClass(@"https://dev-509249.okta.com", 5);
            var options_record_1 = new OktaOptionsPositionalRecord(@"https://dev-509249.okta.com", 5);

            //Compiler automatically generates init-only property accessors for records
            var domain = options_record_1.OktaDomain;
            var retrials = options_record_1.Retrials;
            //options_record_1.Retrials = 7; //Compiler error, properties are init-only

            //Traditional object deconstruction
            var (domain_class, retrials_class) = (options_class_1.OktaDomain, options_class_1.Retrials);
            //New object deconstruction based on a custom Deconstruct method
            var (_domain_class, _retrials_class) = options_class_1;
            //Compiler automatically generates a deconstructor for records
            var (domain_record, retrials_record) = options_record_1;

            var options_record_derived_1 = new OktaOptionsPositionalRecordDerived(@"https://dev-509249.okta.com", 5, DateTime.Now);
            //Copying through a method (explicit use of the automatically generated copy constructor)
            var options_record_derived_2 = options_record_derived_1.Copy();
            Assert.Equal(options_record_derived_1, options_record_derived_2);
            //Copying through a with expression (implicit use of the automatically generated copy constructor)
            var options_record_derived_3 = options_record_derived_1 with { };
            Assert.Equal(options_record_derived_1, options_record_derived_3);
            //A with expression can generate a modified copy
            var options_record_derived_4 = options_record_derived_1 with { OktaDomain = "OktaUrl2" };
            var (_, retrials_1, expiration_1) = options_record_derived_1;
            var (_, retrials_4, expiration_4) = options_record_derived_4;
            Assert.Equal((retrials_1, expiration_1), (retrials_4, expiration_4));
            //With expression modified fields can refer to the original record's fields
            var options_record_derived_5 = options_record_derived_1 with { ExpirationDate = options_record_derived_1.ExpirationDate.AddDays(1.0) };
            var (domain_1, _, _) = options_record_derived_1;
            var (domain_5, retrials_5, _) = options_record_derived_5;
            Assert.Equal((domain_1, retrials_1), (domain_5, retrials_5));
        }
    }
}

Note: OktaOptionsPositionalRecord behaves like OktaOptionsClass. In other words, C# now offers syntactic sugar to add the immutability boilerplate to class-based types automatically. You simply have to declare your type as record instead of class. A Deconstruct method can also be defined in a class type; this is a new feature created to implement records, but it can be used in non-record types.

Static Local Functions

One of the principles of functional programming, along with immutability, is “no side effects.” This means a function should produce results only as a return value, and any data it needs must be passed in as a parameter in order to produce it. This is what static local functions are all about.

We can ensure that local functions, lambdas, and anonymous methods are side-effect free. The compiler will refuse to compile them if they access data outside their context.

using System;
using Xunit;

namespace Cs9CheatSheet.StaticLocalFunctions.LocalClosureCapture
{
    public class Tests
    {
        [Fact]
        public void Local_function()
        {
            int x = 1;
            int AddWithCapture(int a, int b)
            {
                x = 5;
                return a + b;
            }
            //static: trying to use x will cause a compiler error
            static int AddWithoutCapture(int a, int b)
            {
                //x = 5; // Error
                return a + b;
            }

            //static local functions CANNOT change external in-scope locals (x here)
            Assert.Equal(2, AddWithoutCapture(x, 1));
            //No side effects
            Assert.Equal(2, x + 1);

            //Non static local functions CAN change external in-scope locals (x here)
            Assert.Equal(2, AddWithCapture(x, 1));
            //Here the side effect is to change the result of a test with the same inputs
            Assert.NotEqual(2, x + 1);
        }

        [Fact]
        public void Lambda()
        {
            int x = 1;
            Func<int, int, int> addWithCapture = (a, b) => { x = 5; return a + b; };
            Func<int, int, int> addWithoutCapture = static (a, b) => { /*x = 9; //error */ return a + b; };

            //static lambdas CANNOT change external in-scope locals (x here)
            Assert.Equal(2, addWithoutCapture(x, 1));
            //No side effects
            Assert.Equal(2, x + 1);

            //Non static lambdas CAN change external in-scope locals (x here)
            Assert.Equal(2, addWithCapture(x, 1));
            //Here the side effect is to change the result of a test with the same inputs
            Assert.NotEqual(2, x + 1); //Same with lambdas
        }
    }
}

Default Interface Methods

Default interface methods are a new feature. They are not strictly related to the FP enrichment, but they’re very nice to have, especially for library authors.

Non Breaking Interface Augmentation

Library authors now have the ability to add interface members without breaking existing applications.

namespace Cs9CheatSheet.DefaultInterfaceMethods.NonBreakingInterfaceAugmentation
{
    //Library Version 1
    namespace Version1
    {
        namespace Library
        {
            interface ILibraryInterface
            {
                int Increment(int i);
            }
        }
        namespace Application
        {
            using Library;
            class UserClass : ILibraryInterface
            {
                public int Increment(int i) => i + 1;
            }
        }
    }

    //Library Version 2: Add Decrement method to interface
    //This change is breaking, because it forces the user to modify her code,
    //and add an implementation for the added method
    namespace Version2_Ideal
    {
        namespace Library
        {
            interface ILibraryInterface
            {
                int Increment(int i);
                int Decrement(int i);
            }
        }
        namespace Application
        {
            using Library;
            //Compiler error: UserClass doesn't implement ILibraryInterface
            //class UserClass : ILibraryInterface
            //{
            //    public int Increment(int i) => i + 1;
            //}
            class UserClass : ILibraryInterface
            {
                public int Increment(int i) => i + 1;
                //User is forced to add a Decrement implementation
                public int Decrement(int i) => i - 1;
            }
        }
    }

    //Library Version 2 with additional interface
    //This change is not breaking, but it imposes the creation of a new interface,
    //with consequent naming pollution and messy architecture
    namespace Version2_Additional_Interface
    {
        namespace Library
        {
            interface ILibraryInterface
            {
                int Increment(int i);
            }
            interface ILibraryInterface2 : ILibraryInterface
            {
                int Decrement(int i);
            }
        }
        namespace Application
        {
            using Library;
            class UserClass : ILibraryInterface
            {
                public int Increment(int i) => i + 1;
            }
        }
        namespace Application_Version2
        {
            using Library;
            //To use the new library features, UserClass needs to implement 2 interfaces
            class UserClass : ILibraryInterface, ILibraryInterface2
            {
                public int Increment(int i) => i + 1;
                public int Decrement(int i) => i - 1;
            }
        }
    }

    //Library Version 2 with default implementation (C#8)
    //This change is not breaking, and avoids the issues above
    namespace Version2_Default_Implementation
    {
        namespace Library
        {
            interface ILibraryInterface
            {
                int Increment(int i);
                int Decrement(int i) => i - 1;
            }
        }
        namespace Application
        {
            using Library;
            class UserClass : ILibraryInterface
            {
                public int Increment(int i) => i + 1;
            }
        }
    }
}

Note: The application code can always override the default interface implementation for Decrement when needed.
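
To make that concrete, here is a minimal sketch of such an override (the UserClass and test below are hypothetical, reusing the ILibraryInterface shape from the sample above): a class that supplies its own Decrement implementation replaces the library default.

using Xunit;

namespace Cs9CheatSheet.DefaultInterfaceMethods.OverrideSketch
{
    interface ILibraryInterface
    {
        int Increment(int i);
        int Decrement(int i) => i - 1; //library-provided default
    }

    //Hypothetical user class: providing an implementation replaces the default
    class UserClass : ILibraryInterface
    {
        public int Increment(int i) => i + 1;
        public int Decrement(int i) => i - 2; //deliberately different from the default
    }

    public class Tests
    {
        [Fact]
        public void Override_wins_over_default()
        {
            //Default interface members are reachable through the interface type
            ILibraryInterface user = new UserClass();
            Assert.Equal(8, user.Decrement(10)); //the class implementation runs, not the default
        }
    }
}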

Pattern Matching

Pattern matching is a typical FP concept. As is often the case, explaining it with words is more difficult than showing how it works. We could say that pattern matching belongs to that category of improvements that let us write code that does traditional things with a more succinct and expressive syntax.

Switch Expression

The new switch expression and the added relational and logical operators give us a more powerful and expressive way to carry out decision-making workflows.

using System.Linq;
using Xunit;

namespace Cs9CheatSheet.PatternMatching.SwitchExpression
{
    static class ScoreEvaluator
    {
        static internal string SwitchStatement(int score)
        {
            var ret = string.Empty;
            switch (score)
            {
                case 0:
                    ret = "Unclassified";
                    break;
                case 1:
                case 2:
                case 3:
                    ret = "Bad";
                    break;
                case 4:
                case 5:
                    ret = "Ordinary";
                    break;
                case 6:
                case 7:
                    ret = "Good";
                    break;
                case 8:
                case 9:
                    ret = "Excellent";
                    break;
                case 10:
                    ret = "Outstanding";
                    break;
                default:
                    ret = "Invalid Score";
                    break;
            }
            return ret;
        }

        static internal string SwitchExpression(int score) => score switch
        {
            0 => "Unclassified",
            > 0 and <= 3 => "Bad",
            4 or 5 => "Ordinary",
            >= 6 and <= 7 => "Good",
            8 or 9 => "Excellent",
            10 => "Outstanding",
            _ => "Invalid Score"
        };
    }

    public class Tests
    {
        [Fact]
        public void Test()
        {
            Assert.All(
                Enumerable.Range(-1, 12),
                i => Assert.Equal(ScoreEvaluator.SwitchExpression(i), ScoreEvaluator.SwitchStatement(i)));
        }
    }
}

The new pattern-matching coding in SwitchExpression is evidently more terse and expressive, stripping out a great deal of boilerplate needed in the traditional SwitchStatement implementation.

Type Check Pattern

We find similar welcome improvements when the decision-making workflow involves type checking.

using System; namespace Cs9CheatSheet.PatternMatching.TypeCheckPattern { class Cube { public double Side { get; } } class Sphere { public double Radius { get; } } class Cone { public double Radius { get; } public double Height { get; } } class Volume { static double Traditional(object solid) { if (solid.GetType().Equals(typeof(Cube))) { var cube = solid as Cube; if (cube.Side >= 0.0) return Math.Pow(cube.Side, 3); } else if (solid.GetType().Equals(typeof(Sphere))) { var sphere = solid as Sphere; return 4.0 / 3.0 * Math.PI * Math.Pow(sphere.Radius, 3); } else if (solid.GetType().Equals(typeof(Cone))) { var cone = solid as Cone; if (cone.Radius >= 0.0 && cone.Height >= 0) return Math.PI * Math.Pow(cone.Radius, 2) * cone.Height / 3.0; } return double.NaN; } static double IsStatement(object solid) { if (solid is Cube cube && cube.Side >= 0.0) return Math.Pow(cube.Side, 3); else if (solid is Sphere sphere && sphere.Radius >= 0.0) return 4.0 / 3.0 * Math.PI * Math.Pow(sphere.Radius, 3); else if (solid is Cone cone && cone.Radius >= 0.0 && cone.Height >= 0) return Math.PI * Math.Pow(cone.Radius, 2) * cone.Height / 3.0; return double.NaN; } static double SwitchStatement(object solid) { switch(solid) { case Cube cube when cube.Side > 0.0: return Math.Pow(cube.Side, 3); case Sphere sphere when sphere.Radius >= 0.0: return 4.0 / 3.0 * Math.PI * Math.Pow(sphere.Radius, 3); case Cone cone when cone.Radius >= 0.0 && cone.Height >= 0.0 : return Math.PI * Math.Pow(cone.Radius, 2) * cone.Height / 3.0; default: return double.NaN; } } static double SwitchExpression(object solid) => solid switch { Cube cube when cube.Side >= 0.0 => Math.Pow(cube.Side, 3), Sphere sphere when sphere.Radius >= 0.0 => 4.0 / 3.0 * Math.PI * Math.Pow(sphere.Radius, 3), Cone cone when cone.Radius >= 0.0 && cone.Height >= 0.0 => Math.PI * Math.Pow(cone.Radius, 2) * cone.Height / 3.0, _ => double.NaN, }; static double CascadeSwitchExpression(object solid) => solid switch { Cube cube => cube.Side switch { >= 0.0 => Math.Pow(cube.Side, 3), _ => throw new ArgumentException("..."), }, Sphere sphere => sphere.Radius switch { >= 0.0 => 4.0 / 3.0 * Math.PI * Math.Pow(sphere.Radius, 3), _ => throw new ArgumentException("..."), }, Cone cone => (cone.Radius, cone.Height) switch { (>= 0.0, >= 0.0) => Math.PI * Math.Pow(cone.Radius, 2) * cone.Height / 3.0, _ => throw new ArgumentException("..."), }, _ => double.NaN, }; } }

In the above, observe how:

Readability improves significantly when moving from the “traditional” way to the new expressive pattern-matching flavors
We need to write much less when using pattern matching, which both saves time and reduces errors. (After all, there is no code more correct than the code we don’t write!)

The last example CascadeSwitchExpression is not equivalent to the example before that, as it also raises validation exceptions. Rather, it’s an example showing how to nest switch expressions.

Property Pattern

Last but not least, checking property values in decision-making workflows sees a great improvement.

namespace Cs9CheatSheet.PatternMatching.PropertyPattern
{
    class PropertyPattern
    {
        enum Companies { C1, C2, C3 };
        enum Zones { Z1, Z2, Z3 };

        class ZonalClient
        {
            public Companies Company { get; }
            public Zones Zone { get; }
            public int Purchases { get; }
        }

        class DiscountApplier
        {
            int Traditional(ZonalClient client)
            {
                if (client.Purchases >= 500)
                    return 50;
                else if (client.Purchases >= 200 && client.Purchases < 500)
                    return 30;
                else if (client.Company == Companies.C1 && (client.Zone == Zones.Z1 || client.Zone == Zones.Z3) && client.Purchases >= 150)
                    return 25;
                else if (client.Purchases >= 150 || (client.Company == Companies.C2 && client.Purchases >= 100))
                    return 20;
                else if ((client.Zone == Zones.Z2 && client.Purchases >= 50) || ((client.Company == Companies.C2 || client.Company == Companies.C3) && client.Zone == Zones.Z3))
                    return 15;
                else if (client.Purchases >= 25)
                    return 5;
                else
                    return 0;
            }

            int PropertyPattern(ZonalClient client) => client switch
            {
                { Purchases: >= 500 } => 50,
                { Purchases: >= 200 and < 500 } => 30,
                { Company: Companies.C1, Zone: Zones.Z1 or Zones.Z3, Purchases: >= 150 } => 25,
                { Purchases: >= 150 } or { Company: Companies.C2, Purchases: >= 100 } => 20,
                { Zone: Zones.Z2, Purchases: >= 50 } or { Company: Companies.C2 or Companies.C3, Zone: Zones.Z3 } => 15,
                { Purchases: >= 25 } => 5,
                _ => 0
            };
        }
    }
}

Understanding what the code does becomes much, much easier, don’t you agree?

Compactness

Many of the new features we have seen move toward a language that is not only more powerful, but that also removes boilerplate and adds expressiveness.

Indices and Ranges

Slicing an indexed collection of items (an array) can be done using the new back-indexing (^) and range (..) operators.

namespace Cs9CheatSheet.IndicesAndRanges.Slicing
{
    using System;
    using System.Linq;
    using Xunit;
    using static Collection;

    internal class Collection
    {
        static int milliseconds = DateTime.Now.Millisecond;
        public static Random Rand = new Random(milliseconds);
        public static int[] a = Enumerable.Range(0, Rand.Next(100, 1000)).Select(i => Rand.Next()).ToArray();
    }

    public class Tests
    {
        [Fact]
        public void Past_end()
        {
            Assert.Throws<IndexOutOfRangeException>(() => a[a.Length]);
            Assert.Throws<IndexOutOfRangeException>(() => a[^0]);
        }

        [Fact]
        public void First_element()
        {
            Assert.Equal(a[0], a[^a.Length]);
        }

        [Fact]
        public void Last_element()
        {
            Assert.Equal(a[a.Length - 1], a[^1]);
        }

        [Fact]
        public void First_15()
        {
            Assert.Equal(a.Take(15), a[..15]);
        }

        [Fact]
        public void Last_27()
        {
            Assert.Equal(a.Skip(a.Length - 27).Take(27), a[^27..]);
        }

        [Fact]
        public void From_11_to_43()
        {
            Assert.Equal(a.Skip(11).Take(32), a[11..43]);
        }

        [Fact]
        public void From_37_to_6_back()
        {
            Assert.Equal(a.Skip(a.Length - 37).Take(37 - 6), a[^37..^6]);
        }

        [Fact]
        public void Starting_slice()
        {
            int to = Rand.Next(a.Length);
            Assert.Equal(a.Take(to), a[..to]);
        }

        [Fact]
        public void Ending_slice()
        {
            int from = Rand.Next(a.Length);
            Assert.Equal(a.Skip(from), a[from..]);
        }

        [Fact]
        public void Any_slice()
        {
            int from = Rand.Next(a.Length / Rand.Next(2, 4));
            int size = Rand.Next(a.Length / Rand.Next(3, 5));
            int to = from + size;
            Assert.Equal(a.Skip(from).Take(size), a[from..to]);
        }

        [Fact]
        public void Any_slice_back()
        {
            int from = Rand.Next(a.Length / Rand.Next(2, 4));
            int size = Rand.Next(a.Length / Rand.Next(3, 5));
            int to = from + size;
            Assert.Equal(a.Skip(a.Length - to).Take(size), a[^to..^from]);
        }
    }
}

Working with the end of an array has become as easy as working with the front.

Null Coalescing

Two new operators make it easier to check for nulls:

??: Null coalescing operator. In place of the old a != null ? a : b or a == null ? b : a, you can now simply write a ?? b
??=: Null coalescing assignment operator. It allows writing a ??= b to assign b to a only if a is null, as in the old a = a != null ? a : b

using System.Collections.Generic;
using Xunit;

namespace Cs9CheatSheet.NullCoalescing.Operators
{
    public class Tests
    {
        [Fact]
        public void Null_coalescing_operator()
        {
            (object a, object b, object c, object d) = (new object(), null, new object(), null);

            Assert.Equal(a, a != null ? a : c); //old
            Assert.Equal(a, a ?? c); //new

            Assert.Equal(c, b != null ? b : c); //old
            Assert.Equal(c, b ?? c); //new

            Assert.Equal(c, d != null ? d : b != null ? b : c != null ? c : a); //old
            Assert.Equal(c, d ?? b ?? c ?? a); //new

            object[] array = { a, b, c, d };
            for (int i = 2; i < 4; i++)
            {
                foreach (var combination in Combinatory.Combinations(array, i))
                {
                    AssertCombination(combination);
                }
            }

            void AssertCombination(object[] combination)
            {
                switch (combination)
                {
                    case object[] array when array.Length == 2:
                        var (a, b) = (array[0], array[1]);
                        Assert.Equal(a != null ? a : b, a ?? b);
                        break;
                    case object[] array when array.Length == 3:
                        var (c, d, e) = (array[0], array[1], array[2]);
                        Assert.Equal(c != null ? c : d != null ? d : e, c ?? d ?? e);
                        break;
                    case object[] array when array.Length == 4:
                        var (f, g, h, i) = (array[0], array[1], array[2], array[3]);
                        Assert.Equal(f != null ? f : g != null ? g : h != null ? h : i, g ?? f ?? h ?? i);
                        break;
                }
            }
        }

        [Fact]
        public void Null_coalescing_assignment()
        {
            (object a1, object b1, object c1) = (new object(), null, new object());
            (object a2, object b2, object c2) = (new object(), null, new object());

            Assert.NotNull(a1);
            Assert.NotNull(a2);
            a1 = a1 != null ? a1 : c1; //old
            a2 ??= c2; //new
            Assert.NotNull(a1);
            Assert.NotNull(a2);
            Assert.NotEqual(a1, c1);
            Assert.NotEqual(a2, c2);

            Assert.Null(b1);
            Assert.Null(b2);
            b1 = b1 != null ? b1 : c1; //old
            b2 ??= c2; //new
            Assert.NotNull(b1);
            Assert.NotNull(b2);
            Assert.Equal(b1, c1);
            Assert.Equal(b2, c2);
        }
    }

    static class Combinatory
    {
        public static IEnumerable<T[]> Combinations<T>(T[] array, int size)
        {
            T[] result = new T[size];
            foreach (int[] j in Combinations(size, array.Length))
            {
                for (int i = 0; i < size; i++)
                {
                    result[i] = array[j[i]];
                }
                yield return result;
            }
        }

        private static IEnumerable<int[]> Combinations(int size, int length)
        {
            int[] result = new int[size];
            Stack<int> stack = new Stack<int>(size);
            stack.Push(0);
            while (stack.Count > 0)
            {
                int index = stack.Count - 1;
                int value = stack.Pop();
                while (value < length)
                {
                    result[index++] = value++;
                    stack.Push(value);
                    if (index != size) continue;
                    yield return (int[])result.Clone();
                    break;
                }
            }
        }
    }
}

Null coalescing operators are really helpful against null, the famous “billion dollar mistake”. In typical C#, null checks populate a great deal of our source files. The new operators don’t get rid of null, but they make it much less annoying to deal with.
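
They also combine nicely with the null-conditional operator ?. (in the language since C# 6): a whole chain of null checks collapses into one expression. Here is a minimal sketch, with hypothetical Config and Network types invented purely for illustration:

using Xunit;

namespace Cs9CheatSheet.NullCoalescing.ConditionalSketch
{
    //Hypothetical types, made up for this sketch
    class Config { public Network Network { get; set; } }
    class Network { public int? Timeout { get; set; } }

    public class Tests
    {
        [Fact]
        public void Null_conditional_plus_coalescing()
        {
            Config config = null;

            //Old way: every step of the chain needs an explicit null check
            int timeoutOld = config != null && config.Network != null && config.Network.Timeout != null
                ? config.Network.Timeout.Value
                : 30;

            //New way: ?. short-circuits to null, ?? supplies the default
            int timeoutNew = config?.Network?.Timeout ?? 30;

            Assert.Equal(timeoutOld, timeoutNew);
        }
    }
}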

Target Type Inference

Target type inference moves us further toward writing less boilerplate. The compiler’s type inference has been strengthened so that we don’t need to specify the type in a new expression when the new instance is meant to be referenced by a typed variable. In the following example you can observe three different cases:

using Xunit;

namespace Cs9CheatSheet.FitAndFinish.TargetTypeInference
{
    record MyType(int Value = 0);

    public class Tests
    {
        [Fact]
        public void Create_new_instance()
        {
            var a = new MyType(12); //old way
            MyType b = new(12); //new way
            Assert.Equal(b, a);
        }

        [Fact]
        public void Function_call()
        {
            int Double(MyType myVar) => myVar.Value * 2;
            var a = Double(new MyType(7)); //old way
            var b = Double(new(7)); //new way
            Assert.Equal(b, a);
        }

        [Fact]
        public void Property_init()
        {
            var a = new MyType() { Value = 61 }; //old way
            MyType b = new() { Value = 61 }; //new way
            Assert.Equal(b, a);
        }
    }
}

The explicit type indication replaces the var declaration, so the right side of the assignment operator doesn’t need to repeat the type
The type of the newly instantiated object is already clear from the declaration of the called function (Double)
Similar to 1, but with property initialization

In other words, when the compiler finds a new expression without a type name, it looks around to see whether the object being instantiated has a target symbol (a variable or a function parameter) where the type is specified. Hence the name “target type inference.”

Type Extensibility

These additions belong to the group of features that empower the use of third-party libraries. (We already saw one above: default interface methods, which allow library authors to ship new versions of their product, adding members to existing interfaces without forcing users to refactor their application code.) The amenities discussed in this section all help users of third-party libraries extend a library with more powerful constructs.

Covariant Returns

The most native extension pattern in OOP is inheritance. Type members can be declared virtual when defining a type, allowing the user to override them in derived types. Traditionally, though, method overrides need to maintain the same signature as the base. Covariant returns mean that we can now define overridden methods with different return types than the base ones, provided that the override return type derives from the base return type. Returning derived types from overrides was possible even before, but it required the call site to perform a cast or a type check. Well, not anymore.

using System;
using Xunit;

namespace Cs9CheatSheet.FitAndFinish.CovariantReturns
{
    namespace OldWay
    {
        class Thing { public string Name { get; init; } }
        class Place : Thing { public string Area { get; init; } }
        class Country : Place { public string Capital { get; init; } }

        class Event
        {
            public virtual Thing Get() => new Thing { Name = "Big Bang" };
        }
        class Trip : Event
        {
            public override Thing Get() => new Place { Name = "Cruise", Area = "Mediterranean Sea" };
        }
        class Holiday : Trip
        {
            public override Thing Get() => new Country { Name = "Australia", Area = "Oceania", Capital = "Canberra" };
        }

        public class Tests
        {
            [Fact]
            public void Test()
            {
                var (@event, trip, holiday) = (new Event(), new Trip(), new Holiday());

                var thing = @event.Get();
                var thingName = thing.Name;

                var place = trip.Get();
                var placeName = place.Name;
                //var placeArea = place.Area; //compiler error: Area is not a member of class Thing
                var place1 = (Place)trip.Get(); //cast required
                var placeArea = place1.Area; //ok

                var country = holiday.Get();
                var countryName = country.Name;
                //var countryArea = country.Area; //compiler error: Area is not a member of class Thing
                //var countryCapital = country.Capital; //compiler error: Capital is not a member of class Thing
                var country1 = (Place)holiday.Get(); //cast to Place
                var countryArea = country1.Area; //ok
                //var countryCapital = country1.Capital; //compiler error: Capital is not a member of class Place
                var country2 = (Country)holiday.Get(); //cast to Country
                var countryCapital = country2.Capital; //ok

                Assert.Throws<InvalidCastException>(() => (Place)@event.Get()); //Runtime error
                Assert.Throws<InvalidCastException>(() => (Country)@event.Get()); //Runtime error
                Assert.Throws<InvalidCastException>(() => (Country)trip.Get()); //Runtime error
            }
        }
    }

    namespace NewWay
    {
        class Thing { public string Name { get; init; } }
        class Place : Thing { public string Area { get; init; } }
        class Country : Place { public string Capital { get; init; } }

        class Event
        {
            public virtual Thing Get() => new Thing { Name = "Big Bang" };
        }
        class Trip : Event
        {
            public override Place Get() => new Place { Name = "Cruise", Area = "Mediterranean Sea" };
        }
        class Holiday : Trip
        {
            public override Country Get() => new Country { Name = "Australia", Area = "Oceania", Capital = "Canberra" };
        }

        public class Tests
        {
            [Fact]
            public void Test()
            {
                var (@event, trip, holiday) = (new Event(), new Trip(), new Holiday());

                var thing = @event.Get();
                var thingName = thing.Name;

                var place = trip.Get();
                var placeName = place.Name;
                var placeArea = place.Area; //ok, place already has the correct type, no cast required

                var country = holiday.Get();
                var countryName = country.Name;
                var countryArea = country.Area; //ok, country already has the correct type, no cast required
                var countryCapital = country.Capital; //ok, country already has the correct type, no cast required

                //As no cast is required, the possibility of runtime errors due to a wrong cast is eliminated
            }
        }
    }
}

In the above example, the namespace OldWay shows how derived types were returned from overrides before this feature was added, whilst NewWay shows how the new feature lets us avoid casts and also declare explicitly that we are returning a derived type.

Enumerability By Extension

Another concept borrowed from FP is “laziness.” Laziness allows us to write more modular code, that is to say, implement a better separation of concerns. We can write code that produces a result without executing it immediately, deferring the execution to a later time when the result is actually accessed. A textbook scenario is a database query. Often our programs contain iterative or event-based workflows, where we check some conditions and, based on the check, run one of multiple different queries. Without laziness, we would need to put the query code inline with the condition check, or alternatively pre-calculate all the possible results in a cached collection to be used in the iteration. With laziness, we get the best of both: we can write the query code in one place and declare it as lazy; somewhere else in our codebase, we can then write the iteration accessing the data as if it were cached, when in reality the compiler has hooked in a call to our lazy code.
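
As a minimal sketch of that deferral (the GetBigNumbers iterator below is invented for illustration): the query body runs only when the sequence is enumerated, not when the method is called.

using System.Collections.Generic;
using System.Linq;
using Xunit;

namespace Cs9CheatSheet.Laziness.Sketch
{
    public class Tests
    {
        static readonly List<string> log = new List<string>();

        //Lazy: the body executes only while the result is being enumerated
        static IEnumerable<int> GetBigNumbers(int[] source)
        {
            foreach (var n in source)
            {
                log.Add($"inspecting {n}");
                if (n > 10) yield return n;
            }
        }

        [Fact]
        public void Query_runs_on_enumeration_not_on_call()
        {
            var lazy = GetBigNumbers(new[] { 3, 42, 7, 100 });
            Assert.Empty(log); //nothing has run yet

            var materialized = lazy.ToArray(); //enumeration triggers the query body
            Assert.Equal(new[] { 42, 100 }, materialized);
            Assert.Equal(4, log.Count); //every source item was inspected during enumeration
        }
    }
}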

Laziness in constructing collections is not new in C#: the IEnumerable interface and its generic counterpart IEnumerable<T> were available well before version 8 of the language.

What is new, though, is the possibility to add this feature to existing types via type extensions. Previously, enumerable types needed to implement IEnumerable or IEnumerable<T>. Today, this is not strictly necessary. What is required is to provide a public GetEnumerator method returning an instance of a type that defines two public members: a Current property and a parameterless bool MoveNext method.

And this GetEnumerator method can be an extension method, so we can add enumerability to a library type (previously, the only way was to add an IEnumerable implementation, which required access to the source code and a recompilation of the library).

using System;
using System.Collections;
using System.Collections.Generic;
using Xunit;

namespace Cs9CheatSheet.FitAndFinish.EnumerableByExtension
{
    class EnumerableTour : IEnumerable
    {
        public string Day1 => "New York";
        public string Day2 => "Boston";
        public string Day3 => "Washington DC";

        public IEnumerator GetEnumerator()
        {
            return new Enumerator(this);
        }

        class Enumerator : IEnumerator
        {
            EnumerableTour _tour;
            public Enumerator(EnumerableTour tour) { _tour = tour; }
            int _index = -1; //start before the first element: the first MoveNext lands on Day1
            public object Current => _index switch
            {
                0 => _tour.Day1,
                1 => _tour.Day2,
                2 => _tour.Day3,
                _ => "Please buy a new Tour"
            };
            public bool MoveNext() => (_index = Math.Min(_index + 1, 3)) < 3; //advance and cap at 3
            public void Reset() => _index = -1;
        }
    }

    public class NonEnumerableTour
    {
        public string Day1 => "Chicago";
        public string Day2 => "Las Vegas";
        public string Day3 => "Miami";
    }

    public static class Extensions
    {
        public class MyEnumerator
        {
            NonEnumerableTour _tour;
            int _index = -1; //start before the first element, as above
            public MyEnumerator(NonEnumerableTour tour) { _tour = tour; }
            public string Current => _index switch
            {
                0 => _tour.Day1,
                1 => _tour.Day2,
                2 => _tour.Day3,
                _ => "Please buy a new Tour"
            };
            public bool MoveNext() => (_index = Math.Min(_index + 1, 3)) < 3;
        }

        public static MyEnumerator GetEnumerator(this NonEnumerableTour tour) => new MyEnumerator(tour);
    }

    public class Tests
    {
        [Fact]
        public void This_tour_is_for_me()
        {
            var lovedCities = new HashSet<string>(
                new[] { "New York", "Chicago", "Washington DC", "Las Vegas", "Los Angeles", "Boston", "Miami" });

            foreach (var city in new EnumerableTour())
                Assert.Contains(city, lovedCities);

            foreach (var city in new NonEnumerableTour())
                Assert.Contains(city, lovedCities);
        }
    }
}

In the previous code example, EnumerableTour implements IEnumerable but NonEnumerableTour doesn’t. However, in the test, we can see that both types let you enumerate their cities through foreach. What makes this possible is the Extensions class code, where GetEnumerator is defined as an extension method for NonEnumerableTour. If NonEnumerableTour were in a NuGet library, this extensibility wouldn’t have been possible previously: if you wanted it, you had to change the source code and transform NonEnumerableTour into EnumerableTour, and recompile the NuGet package (or create a derived type implementing IEnumerable).

Asynchronous Streams

Asynchronous streams are a powerful feature aimed at improving resource usage, especially in scenarios characterized by a particular type of data source known as a stream. With the ever-increasing pervasiveness of networking and distribution, the new generation of applications will need to deal with increasingly sparse data.

Once upon a time, computers got data through perforated cards and spit out results as line-printed paper trails. Then we passed to magnetic supports and phosphorus screens… Step by step, we came to these days, where almost everything external to the unit we are programming is identified by an internet URI (Uniform Resource Identifier). It hardly matters what is behind that funny string. What matters is that we can write something like Web.GetSomethingFrom(“some-kind-of-protocol://someone-out-there/what-we-want?color=blue&flavor=vanilla&whatnot..”) in place of the “archaic” Console.Read() or File.GetLine() or DB.Query(“SELECT…”).

Every one of these data exchanges could traverse a large number of routing points, bounce through satellites, and be converted several times to different means (electricity, light, radio waves, little messages tied to a carrier pigeon’s ankle, etc).

Dealing with sparse and remote data sources has important implications, though. The most relevant is, indeed, asynchronism. A data exchange is more like a mail exchange than a phone call. We send a request to the URI and then… wait for an answer.

Modern program flows are not like in the old days:

Read inputs from a connected device Elaborate Write outputs to a connected device

They are more like:

Register your interest in some subject Do something else When some event occurs, react to it (elaborate) and write the relative piece of output

We can no longer assume that the data we need is immediately available when we need it. Instead, we need to deal with the fact that someone else decides when to send it to us. Not only that, but we must also deal with the fact that the waiting times can be several orders of magnitude longer than those required by our CPU to elaborate. In other words, we cannot afford to simply wait; we must organize our programs to use the CPU in between successive packets of data. We need to make our programs asynchronous.

Asynchronous programming is nothing new, by the way. Starting from the lowest level, the CPU, most activity is based on interrupts. Hardware components, like network cards, interrupt the CPU when they have something to tell it. While traditionally asynchronism was under the control of the hardware or the operating system in the machine, today control is lost in the internet sea, so its management must involve the application itself, rather than the system that runs the application.

The .NET API has gone through a profound transformation over the years, making asynchronous programming progressively easier and more natural. Events were available even before .NET, and they were part of the very first C# version. In many cases, though, asynchronous programming was left to the developer code, for example running a worker thread that, in turn, cyclically polls a service checking for new data to be available.

Later we saw the callback paradigm, where asynchronous operations were based on three functions:

Begin, to start the asynchronous operation and register a completion callback
Callback, passed to the system in the Begin call; the system calls it when a relevant event occurs
End, to be called by our code (usually at the end of the callback) to inform the system that the transaction is complete

The Task Parallel Library (TPL) was also created to ease multitasking, relieving the developer from the burden of thread management, as threads are expensive and therefore a limited resource.

The latest improvement is the async/await pattern. We are now able to write asynchronous workflows almost as a regular code block, while the compiler takes charge of transforming them into separate blocks with interruptions and continuations, putting them into automatically created tasks, and scheduling them in a coordinated way using a managed thread pool.

Asynchronous streams add a new important feature to the async/await pattern. Syntactically, the feature is quite easy to use. You simply need to:

Supply a method returning the new interface type IAsyncEnumerable<> and containing the composite keyword yield return await in its code
At the call site, use the new await foreach asynchronous iteration construct

The new pattern is particularly suited for scenarios where the data emitter has the characteristics of a stream. A relevant example is an Internet of Things (IoT) environment. Imagine a pool of meteorology sensors distributed across a territory, which send updates to a cloud service, either periodically and/or when some specific condition is detected (for example, the temperature changes by at least 1 degree). This is a typical scenario for the observer/observable pattern; however, asynchronous streams help represent it in code with similar expressivity and probably less ceremony.
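
Before the full example, here is a minimal sketch of the bare pattern, with a made-up sensor standing in for a real asynchronous source (Task.Delay simulates waiting on a device):

using System.Collections.Generic;
using System.Threading.Tasks;

namespace Cs9CheatSheet.AsyncStreams.Sketch
{
    class TemperatureSensor
    {
        //Producer: an async iterator, combining yield return with await
        public async IAsyncEnumerable<double> ReadingsAsync(int count)
        {
            for (var i = 0; i < count; i++)
            {
                await Task.Delay(1000); //stand-in for awaiting a real device
                yield return 20.0 + i;  //fake reading
            }
        }
    }

    class Monitor
    {
        //Consumer: await foreach processes each item as it arrives
        public static async Task RunAsync()
        {
            var sensor = new TemperatureSensor();
            await foreach (var reading in sensor.ReadingsAsync(3))
            {
                System.Console.WriteLine($"Got {reading} degrees");
            }
        }
    }
}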

Here is the full example, followed by some explanation and a few notes.

using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Xunit;

namespace Cs9CheatSheet.AsyncStreams.WorldBank
{
    public interface IDataSource
    {
        Task<string[]> DownloadIso2CodesAsync();
        IEnumerable<Task<JsonElement>> DownloadCountries(string[] iso2Codes);
        IAsyncEnumerable<JsonElement> DownloadCountriesStream(string[] iso2Codes);
    }

    abstract class DataSource : IDataSource
    {
        static async Task<JsonElement.ArrayEnumerator> Fetch(string url)
        {
            var webClient = new WebClient();
            var str = await webClient.DownloadStringTaskAsync(url);
            var worldBankResponse = JsonDocument.Parse(str);
            var array = worldBankResponse.RootElement.EnumerateArray();
            array.MoveNext(); //header
            array.MoveNext(); //country(ies)
            return array;
        }

        public async Task<string[]> DownloadIso2CodesAsync()
        {
            var array = await Fetch(@"http://api.worldbank.org/v2/country?format=json&per_page=100");
            var countries = array.Current.EnumerateArray();
            var iso2Codes = countries.Select(country => country.GetProperty("iso2Code").GetString()).ToArray();
            return iso2Codes;
        }

        public IEnumerable<Task<JsonElement>> DownloadCountries(string[] iso2Codes)
        {
            for (int i = 0, n = StartFeed(iso2Codes); i < n; i++)
            {
                yield return DownloadCountryAsync(i, iso2Codes[i]);
            }
        }

        public async IAsyncEnumerable<JsonElement> DownloadCountriesStream(string[] iso2Codes)
        {
            for (int i = 0, n = StartFeed(iso2Codes); i < n; i++)
            {
                yield return await DownloadCountryAsync(i, iso2Codes[i]);
            }
        }

        protected virtual int StartFeed(string[] codes)
        {
            return codes.Length;
        }

        protected virtual async Task<JsonElement> DownloadCountryAsync(int i, string isoCode)
        {
            var array = await Fetch(@$"http://api.worldbank.org/v2/country/{isoCode}?format=json");
            var country = array.Current;
            return country;
        }
    }

    class CountriesDb : DataSource { }

    class CountriesIoT : DataSource
    {
        static SemaphoreSlim[] _semaphores;

        protected override async Task<JsonElement> DownloadCountryAsync(int i, string isoCode)
        {
            await _semaphores[i].WaitAsync();
            return await base.DownloadCountryAsync(i, isoCode);
        }

        protected override int StartFeed(string[] codes)
        {
            var n = codes.Length;
            async Task Feed()
            {
                for (int i = 0; i < n; i++)
                {
                    await Task.Delay(1000);
                    _semaphores[i].Release();
                }
            }
            _semaphores = Enumerable.Range(0, n).Select(_ => new SemaphoreSlim(0, 1)).ToArray();
            Task.Run(Feed);
            return n;
        }
    }

    public abstract class Tests
    {
        IDataSource DataSource { get; init; }

        protected Tests(IDataSource dataSource) => DataSource = dataSource;

        async Task<string[]> GetRandomIsoCodes()
        {
            var iso2Codes = await DataSource.DownloadIso2CodesAsync();
            var random = new Random();
            return Enumerable.Range(0, 10).Select(i => random.Next(i * 10, i * 10 + 10)).Select(i => iso2Codes[i]).ToArray();
        }

        [Fact]
        public async Task TestBlock()
        {
            var selection = await GetRandomIsoCodes();
            var countries = await Task.WhenAll(DataSource.DownloadCountries(selection).ToArray());
            foreach (var country in countries) AssertCountry(country);
        }

        [Fact]
        public async Task TestStream()
        {
            var selection = await GetRandomIsoCodes();
            var countries = DataSource.DownloadCountriesStream(selection);
            await foreach (var country in countries) AssertCountry(country);
        }

        void AssertCountry(JsonElement country) {/*...*/}
    }

    public class TestsDb : Tests
    {
        public TestsDb() : base(new CountriesDb()) { }
    }

    public class TestsIoT : Tests
    {
        public TestsIoT() : base(new CountriesIoT()) { }
    }
}

I used the WorldBank REST API for this example. It maintains a database of geographical and geopolitical information, available for free on the internet.

There is an interface called IDataSource, which abstracts our model of data emitter. On the application side, the data emitter is something that offers the following services:

DownloadIso2CodesAsync() - Download a collection of ISO2 country codes
DownloadCountries(string[] iso2Codes) - Download a collection of country records, with the codes specified in the iso2Codes input parameter
DownloadCountriesStream(string[] iso2Codes) - Same as the previous method, but using the new Asynchronous Streams feature

Note that the last two methods, besides having different names, have exactly the same input type but different return types:

DownloadCountries returns IEnumerable<Task<>>
DownloadCountriesStream returns IAsyncEnumerable<>

Both functions return lazy values, which means that the loops they contain won’t be executed when the functions return, but only when the last lines of TestBlock() and TestStream() are reached in the Tests class. These two test methods have:

A first common part, where they download a set of ISO codes (100 in the example) and select ten of them randomly

A second part where they retrieve detailed information about the selected countries, but in different ways

Let’s now concentrate on the last two lines, where the key differences are manifested:

TestBlock calls DownloadCountries and receives a collection of Tasks. It then executes them in parallel using the Task.WhenAll TPL API. When all the tasks complete, the collection of results is returned as an array, referenced by the countries symbol. Finally, the last line runs an iteration synchronously on the hydrated array and does application-specific work.

TestStream calls DownloadCountriesStream and receives an IAsyncEnumerable, referenced by the countries symbol. This is the first difference: the symbol countries does not hold a hydrated array of countries like with TestBlock, but a promise. When the last line is reached, nothing has been materialized from the data sources yet, while with TestBlock the materialization is already complete. Finally, TestStream executes an asynchronous iteration calling the new await foreach. This is the point where the internet is finally hit and the remote data are retrieved.

At this point, it’s worthwhile to highlight a qualifying point. In the TestBlock case, we have the opportunity to leverage the TPL, while in the TestStream case we don’t. This has two potential consequences:

Performance: If the backend already has all the countries in our selection available at the time the requests arrive, TestBlock will get all the countries in roughly the same time required to retrieve only one of them. TestStream, on the other hand, is constrained to a sequential workflow, so the time will necessarily be the sum of the times for each of the countries in the selection
Memory: The block version will allocate memory for all the countries, while the stream version only needs to allocate memory for one.

Therefore, if the backend has all the data, choosing between block and stream is a tradeoff between memory resources and performance. Usually, in situations like this, the best solution is the parallel one with pagination.
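
The original sample stops here, but as a rough sketch of that pagination idea for the db-like case (assuming this method is added to the DataSource class above; the pageSize default is an arbitrary choice of mine), batches can be downloaded in parallel while results stream out page by page:

public async IAsyncEnumerable<JsonElement> DownloadCountriesPaged(string[] iso2Codes, int pageSize = 10)
{
    for (int offset = 0; offset < iso2Codes.Length; offset += pageSize)
    {
        // Launch one page of downloads in parallel (note: this sketch bypasses
        // the StartFeed gating, so it fits the db-like source, not the IoT one).
        var page = iso2Codes
            .Skip(offset)
            .Take(pageSize)
            .Select((code, i) => DownloadCountryAsync(offset + i, code));

        // Wait for the page, then stream its results before starting the next page:
        // memory is bounded by pageSize while keeping parallelism within a page.
        foreach (var country in await Task.WhenAll(page))
            yield return country;
    }
}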

However, if the backend does not have all the data, like the meteorology sensors I mentioned above, the performance advantage of parallel execution is nil, since even if the tasks are running in parallel, only one of them will complete just after the launch. The remaining tasks will need to wait until new data is available. In this case, the stream solution is definitely superior, as it will give the results in the same time, but with less memory use and with only one pending task during the data retrieval period.

In the previous example, this is demonstrated by creating two implementations of IDataSource: CountriesDb and CountriesIoT. CountriesDb behaves like a database. That is, it simulates a data source that responds immediately upon request. CountriesIoT, instead, simulates an IoT device, with data items available one by one at one-second time distance. (The trigger is the task started in StartFeed where a pool of semaphores is released over time, while the overridden DownloadCountryAsync awaits one of the semaphores before doing active work.) Running the tests produces the following results:

With a db-like data source, the block workflow executes in half of the time taken by the stream workflow. With the IoT-like data source, on the other hand, times are comparable.

Recap

In this article, you have explored some of the important improvements introduced into C# versions 8 and 9. What you learned:

Immutability: Init-Only Properties, Records, Positional Records, Static Local Functions
Default Interface Methods: Non-Breaking Interface Augmentation
Pattern Matching: Switch Expression, Type Check Pattern, Property Pattern
Compactness: Indices and Ranges, Null Coalescing, Target Type Inference
Type Extensibility: Covariant Returns, Enumerability By Extensions
Asynchronous Streams

Learn More About .NET, C# 8 and C# 9

If you are interested in learning more about C# 8 and C# 9 check out these other great articles:

Okta ASP.NET Core MVC Quickstart
Secure Your ASP.NET Core App with OAuth 2.0
Build Single Sign-on for Your ASP.NET MVC App
Policy-Based Authorization in ASP.NET Core
Store ASP.NET Secrets Securely with Azure KeyVault
User Authorization in ASP.NET Core with Okta
What’s new in C# 8.0
What’s new in C# 9.0
Welcome to C# 9.0

If you like this topic, be sure to follow us on Twitter, subscribe to our YouTube Channel, and follow us on Twitch.


Trinsic (was streetcred)

Trinsic Goes Global, Exceeds 1,000 Customers


A developer and product manager from Trinsic’s first Fortune 500 customer scheduled a call with our CEO Riley Hughes shortly after Trinsic’s beta platform launched in 2019. “Thank you for building your API,” the developer explained. “We spent weeks trying to get something working with one of your competitors. All-in, we estimated that using their solution would take us four months to build our product. In just the last three weeks with Trinsic, we’ve built our solution, demoed it to stakeholders, and secured our first pilot.”


This kind of conversation still happens almost every day at Trinsic; 80% of developers who’ve used our platform recommend it. Since launching the world’s first production-ready verifiable credential platform less than a year ago, over 1,000 developers from more than 75 countries have used Trinsic for issuing and verifying verifiable credentials. We have no “sales” or “business development” staff because the best sales tool is a good product, and the best business development is hundreds of happy customers.


See for yourself how easy it is to issue credentials in the Trinsic Studio. It will take you less than five minutes.

Production deployment around the globe

Supporting teams in 75+ countries has taught us important lessons about the diversity of use cases and their requirements. And with customers in production in more than 10 of those countries, some requirements needed to be addressed immediately:


Compliance with global data protection regulation
Choice of public utilities/blockchains
Accessibility first


As the size and diversity of Trinsic’s customer base continue to grow, we know these features will be needed by all. That’s why we’re announcing our commitment to make Trinsic more accessible and functional for our customers all over the globe in the following ways.

Global compliance through data residency

Our customers must comply with local regulations, which often mandate that PII (personally identifiable information) remain within the borders of their jurisdiction. Starting now, Trinsic’s customers have the ability to choose where their data is hosted in both the Trinsic platform and the mobile app (mediator service). This helps with compliance requirements associated with data protection laws in Canada, the EU, South Africa, and other geographies. This feature is available on all paid plans.


Trinsic now supports data centers in the following locations:


United States
Canada
Netherlands
South Africa
Singapore
Australia


Each Organization (cloud agent) is hosted on its own dedicated tenant. When using Trinsic Studio, you can choose the region from the drop-down menu titled “Select Region” when creating an Organization.

If you are interacting directly with Trinsic’s API, the default region is the United States. You can specify the region in the API using the alpha-2 country code for that region. For example, a user who wants to create an Organization in Canada would add:

{
  "name": "Canadian Organization",
  "network": "sovrin-live",
  "region": "CA"
}
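
For illustration, a minimal C# call might look like the sketch below. The endpoint URL and authorization header here are placeholders I am assuming for the example, not Trinsic’s documented API surface; consult Trinsic’s API reference for the real values.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class CreateOrganizationExample
{
    static async Task Main()
    {
        using var http = new HttpClient();
        // Placeholder auth header; Trinsic's actual scheme may differ.
        http.DefaultRequestHeaders.Add("Authorization", "Bearer <your-api-key>");

        var json = "{\"name\":\"Canadian Organization\",\"network\":\"sovrin-live\",\"region\":\"CA\"}";
        using var body = new StringContent(json, Encoding.UTF8, "application/json");

        // Hypothetical endpoint path, for illustration only.
        var response = await http.PostAsync("https://api.trinsic.id/organizations", body);
        Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}");
    }
}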

Global utility networks

Experience with diverse, global use cases has shown us why choice of networks is important. Trinsic’s vision is to be ledger and protocol agnostic, so we’ll add any production-ready networks as they become available. We support the following networks out of the box:


Sovrin: Launched in 2017, Sovrin was the first production-ready distributed ledger purpose-built for self-sovereign identity and includes Sovrin MainNet, StagingNet (most common for demos and pilots), and BuilderNet. Sovrin has stellar uptime and a metrics dashboard that monitors its performance.

BCovrin: This is a clone of Sovrin for testing in the British Columbia region of Canada.

Indicio: Indicio is a concierge-class decentralized identity network that provides developers with a stable and reliable environment to build, test, and demo their SSI solutions, beginning with a test network.

In the near future, we will support geo-specific networks in Africa, Canada, Finland, Germany, and Southeast Asia.


When creating an Organization in Trinsic Studio, there is a drop-down menu titled “Select Network” which allows you to easily designate which network your Organization is provisioned on.

The Trinsic mobile app supports these networks and also allows developers to add their own networks to connect with.

Add a new network to the mobile app by clicking the “+” button to the upper-right of the network selection page in the mobile app.

Switch between networks by navigating to the network selector page in the settings menu.

We look forward to continuing to add support for new networks as they develop and mature around the world. The following requirements must be met in order for Trinsic to add a production network to its platform:


The network needs to be sufficiently decentralized, with a governance framework outlining how it intends to avoid centralization of power.
The network needs to allow Trinsic to sign transactions on behalf of our customers; this is sometimes called an ‘endorser’ or ‘trust anchor’ model.
The network needs to have some form of paid staff support so that we can quickly resolve issues if our customers face them.


If your network meets these requirements and you would like your network to be accessible to the global Trinsic developer community, please contact us.

Global accessibility

As a part of expanding into other geographies, we have begun work to translate the Trinsic Wallet application into various languages. If you would like to volunteer your time to translate the Trinsic Wallet into your language, let us know, and we can get you connected to the translation process.

Going global

Compared to the potential scale of SSI, 1,000+ customers means we’re still in our infancy. However, it also represents impressive initial growth of the SSI space. Developers are recognizing the value of SSI for themselves and their customers.


With the addition of the data residency feature, our ongoing support of new networks, and our push to translate the Trinsic Wallet application into other languages, we hope to expand the reach of the Trinsic platform to developers and organizations around the world looking to build SSI solutions. Our goal has always been to make SSI as accessible as possible, and we look forward to continuing to work at that goal as we grow.

The post Trinsic Goes Global, Exceeds 1,000 Customers appeared first on Trinsic.

Tuesday, 12. January 2021

IdRamp

IdRamp shows Iowa Senator Joni Ernst new ways to verify digital information


Ernst said she was excited to learn that information can be available digitally without having it stored on Google or with another large tech company.

The post IdRamp shows Iowa Senator Joni Ernst new ways to verify digital information first appeared on idRamp | Decentralized Identity Evolution.

Global ID

The GiD Report#142 — Your WhatsApp data belongs to FB now


Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

What we’ve got this week:

Big Tech pulls plug on Parler
Your WhatsApp data belongs to FB
New Senate majority bad for Big Tech
Paypal + Square v. Regulators
Vaccine rollout needs better messaging + identity platforms
Losing trust in the Federal Reserve
Stuff happens

1. Big Tech pulls the plug on Parler

Most of you likely already know this story. Violence in DC last week sparked a swift reaction from Tech’s heavy hitters, with Apple, Google, and Amazon all pulling the plug on Parler, an alternative social media platform preferred by conservatives.

Two big takeaways:

Moderation is a must in today’s digital reality. Really, it’s part of growing up: the internet is so intertwined with our daily lives that we can no longer expect some Wild West. Plus, we have an obligation to keep our communities safe — whether in the real world or online. Clearly, the two worlds have collided and spilled into each other.

Big Tech is very powerful — so powerful that they can swiftly shut down an incredibly popular platform literally overnight, one that boasted over ten million users. Which only raises further questions.

Both are weighty challenges to tackle.

Via Todd — Apple Has Threatened To Ban Parler From The App Store
Social media platforms muzzle Trump after Capitol melee
Amazon and Apple pull the plug on Parler
Parler is gone for now as Amazon terminates hosting
Ben Thompson — Trump and Twitter
Ben Thompson — New Defaults

2. Your WhatsApp data now belongs to Facebook.

Well, it’s not like we didn’t see it coming. After all, how else will Facebook make money on all those users?

It’s a reality that we’ve come to expect in this era of our tech evolution — the chapter where “we’re the product.” That’s why it’s “free.”

But evolution is a continuous process and there are alternatives now (such as GlobaliD).

There’s also Signal, which got the thumbs up from the new richest man on the planet on Twitter:

 — @elonmusk

This is part of how change happens — from the bottom up, from innovation, from users deciding that they want something better.

Via /ctomc — WhatsApp Soundly Beaten By Apple’s Stunning New iMessage Update
Mandatory WhatsApp Privacy Policy Update Allows User Data to be Shared With Facebook
Via /j — Elon Musk is now the richest person in the world, passing Jeff Bezos

3. The other way change happens is from the top down. You can bet that the new Democratic Senate majority is going after Big Tech.

Here’s The Information:

Democrats have discussed sweeping changes to antitrust laws to make it easier for prosecutors to win antitrust convictions against companies with dominant market power, but Republicans have stood in the way of such legislation. Federal anti-monopoly lawyers recently filed legal claims against both Google and Facebook under existing antitrust law. Incoming Democratic president Joe Biden also is expected to try to reverse corporate tax cuts and reform a law known as Section 230 that shields internet companies like Google and Facebook from lawsuits over content posted by their users. Legislation to update online privacy-related rules also would be more likely to pass. To be sure, the Senate currently has a cloture rule requiring 60 members to end debate on most topics to move them to a chamber-wide vote.
Briefing: Democrats Nearing Senate Majority, Spelling More Trouble For Big Tech
Via /j — House: Amazon, Facebook, Apple, Google have “monopoly power,” should be split

4. But the top-down method isn’t always reliable or effective. Sometimes, it misses the mark or has unintended consequences.

One example was the CFPB’s (well-intentioned) prepaid card regulations — which the regulatory body wanted to expand into Paypal’s territory. That, of course, never really made sense and would have hamstrung Paypal’s innovative potential without actually protecting consumers.

As Greg noted on Twitter:

 — @gregkidd

Paypal wins bid to strike down CFPB prepaid card regulations

Photo: JD Lasica

Another example could be FinCEN’s midnight crypto wallet regulations. Square’s response was on point:

With this rulemaking, FinCEN seeks to expand reporting and Know Your Customer (“KYC”) type obligations to parties who are not our customers. Instead of leveraging blockchain tracing with wallet addresses (which to date has proven effective in tracking the unlawful activity cited in the Proposal leading to indictments and convictions), FinCEN proposes a static requirement that would have us collect names and physical addresses from non-customers. To put it plainly — were the Proposal to be implemented as written, Square would be required to collect unreliable data about people who have not opted into our service or signed up as our customers.
This creates unnecessary friction and perverse incentives for cryptocurrency customers to avoid regulated entities for cryptocurrency transactions, driving them to use non-custodial wallets or services outside the U.S. to transfer their assets more easily (non-custodial, or “unhosted” wallets are a type of software that lets individuals store and use cryptocurrency, instead of relying on a third party). By adding hurdles that push more transactions away from regulated entities like Square into non-custodial wallets and foreign jurisdictions, FinCEN will actually have less visibility into the universe of cryptocurrency transactions than it has today.
The impact of the Proposal would not only hamstring law enforcement capabilities, but also limit American innovation by hindering our ability to create a competitive service that allows customers to seamlessly transfer and transact in cryptocurrency the way the technology was designed. The burdensome information collection and reporting requirements deprive U.S. companies like Square of the chance to compete on a level playing field to enable cryptocurrency as a tool of economic empowerment.
Via /gregkidd — Square, Inc.’s Federal Comment Letter Regarding FinCEN’s Proposed Rulemaking on Requirements for Certain Transactions Involving Convertible Virtual Currency or Digital Assets
Square says new FinCEN wallet proposal will inhibit crypto adoption and hinder law enforcement efforts

5. The COVID-19 vaccine rollout in the U.S. has been clunky at best. And at its core, it’s because we need better identity and messaging systems.

Axios:

The big picture: Historically, the federal government has established systems to help local governments deploy emergency information, like tornado and hurricane warnings that are broadcast on local television, as well as localized text alerts.
But the government hasn’t set up emergency communication systems to convey localized information about the vaccine, forcing citizens to turn to less reliable sources of information online.
“We’re going to have to think through systems that will reach people when they need information that’s highly specific — and in this case, time sensitive,” says Kathleen Hall Jamieson, director of the Annenberg Public Policy Center at the University of Pennsylvania. “We’ll need to institutionalize that structure and keep track of it.”
“The fact that we don’t already have it is a real indictment,” she told Axios. “We should’ve thought this through before.”
What to watch: A lack of coordinated messaging around the vaccine rollout has left millions of people to search for answers online and via social media, opening space for confusion and misinformation.
Experts worry that big tech platforms, already reeling from election misinformation problems, are not equipped to help vet and verify vaccine rollout information.
Of note: Almost all of the experts Axios spoke to said that the best way to tackle this problem would be through a massive, federal government-backed awareness campaign educating consumers about the importance of getting a vaccine and directing them to some sort of a federal directory with links to verified local resources.
Since the U.S. has no centralized database with citizens’ addresses and health records, that’s likely the fastest thing the federal government can do to support local governments with the rollout at this point.
As localities improvise to distribute COVID vaccines, an information vacuum emerges
Singapore says COVID tracing app data is fair game for criminal investigations — CyberScoop
Los Angeles Vaccine Recipients Can Put the Proof in Apple Wallet

6. This week in society’s trust deficit: The Federal Reserve.

Axios:

Big names in the world of finance are beginning to call out the Fed and other central banks for their role in ramping up economic inequality and manipulating financial markets — a departure from the praise they received for most of last year.
Why it matters: Wall Street was the only pillar of solid support. Most Americans say they don’t trust the Fed and politicians look to be taking aim at the central bank for overreaching with its unprecedented actions in March.
GOP efforts to curb Fed’s lending powers signal future hostility under Biden administration

Chart of the day:

Bitcoin doubles to new all-time high as bull run continues
Gamblers Could Use Bitcoin at Slot Machines With New Patent
Issuing Stablecoins on the XRP Ledger | Ripple
Ripple CEO Brad Garlinghouse responds to questions surrounding the SEC’s lawsuit
Lawyer Analyzes the SEC vs. Ripple / XRP Lawsuit and What the Likely Result Is.

7. Stuff happens:

Via /vs — Denmark introduces digital driving license | ReadID
Via /vs — Back to the Future of FinCrime — CFCS | Association of Certified Financial Crime Specialists
Via /jvs — Equifax will pay $640 million for Kount’s AI-driven identity and fraud prevention tools
Via /j — Trump bans Alipay and seven other Chinese apps
The Man Who Turned Credit-Card Points Into an Empire
Alphabet Union Sparks Interest From Workers Across Tech in Organizing

The GiD Report#142 — Your WhatsApp data belongs to FB now was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Gluu - Blog

FIDO2 plus fingerprints in a brilliant, small package


AuthenTrend, a fingerprint security key solution provider, today announced a new partnership with Gluu, a leader in the open source identity and access management ecosystem. The goal of the partnership is to collaborate on passwordless infrastructures for enterprise employees, consumers, and citizens. Gluu has tested AuthenTrend’s FIDO authentication devices, ATKey.Pro and ATKey.Card, which are compliant with the FIDO2, FIDO U2F, and W3C WebAuthn standards. These standards enable people to move away from passwords in favor of secure cryptographic credentials based on public and private key technology.

AuthenTrend’s applied fingerprint-authentication technology is trusted by the Microsoft Intelligent Security Association, and can also be used for Windows Sign-on. Unlike traditional fingerprint devices, AuthenTrend’s patented standalone enrollment technology allows users to enroll fingerprints directly on the ATKey.Card or ATKey.Pro, no app download required. The ATKey.Pro device is available as both USB-A and USB-C. The ATKey.Card device supports USB-A, NFC and Bluetooth.

FIDO leverages the previously enrolled biometric template to unlock access to a private key. While older FIDO devices required an end-user simply to press a button, biometric FIDO credentials ensure that only the owner of the key can use the credential. The private key never leaves the device, meaning a breach of the server does not enable impersonation. FIDO technology also prevents phishing, as hackers cannot trick the user into providing a one-time code or intercept it with malware.
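
As a greatly simplified model of that core property (omitting WebAuthn/CTAP message formats, attestation, origin binding, and signature counters), the server verifies a signature over a fresh random challenge using only the public key registered at enrollment, so a server breach yields nothing that allows impersonation:

using System;
using System.Security.Cryptography;

class FidoChallengeResponseSketch
{
    static void Main()
    {
        // Key pair generated on the authenticator at enrollment; the private half never leaves it.
        using var deviceKey = ECDsa.Create(ECCurve.NamedCurves.nistP256);
        byte[] registeredPublicKey = deviceKey.ExportSubjectPublicKeyInfo(); // sent to the server once

        // The server issues a fresh random challenge for each login attempt.
        byte[] challenge = new byte[32];
        RandomNumberGenerator.Fill(challenge);

        // After the fingerprint match unlocks the key, the device signs the challenge.
        byte[] assertion = deviceKey.SignData(challenge, HashAlgorithmName.SHA256);

        // The server verifies with the stored public key only.
        using var serverKey = ECDsa.Create();
        serverKey.ImportSubjectPublicKeyInfo(registeredPublicKey, out _);
        bool ok = serverKey.VerifyData(challenge, assertion, HashAlgorithmName.SHA256);
        Console.WriteLine(ok ? "authenticated" : "rejected");
    }
}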

Gluu includes extensive support for FIDO devices in its platform, including both FIDO U2F and FIDO2 authentication and registration endpoints. This design enables applications to use federated digital identity protocols like SAML or OpenID Connect to achieve single sign-on across multiple domains, and to request FIDO as the mechanism to authenticate the end user. The Gluu platform also includes a self-service web portal called Casa that enables end users to see their registered devices and to remove a device, for example if it’s lost. Like all Gluu components, the login experience and Casa can be customized to meet the exact localization, user experience, and security requirements of enterprises and governments. For more information about Gluu, you can visit their website at https://gluu.org

AuthenTrend aspires to replace all passwords with biometrics to help people take back ownership of their credentials. Learn more about AuthenTrend’s products at https://www.authentrend.com


Infocert (IT)

InfoCert digitizes informed consent collection for Covid-19 vaccinations


WITH PROXYSIGN, INFOCERT IS HELPING TO SPEED UP THE ADMINISTRATION OF COVID-19 VACCINES.

December 27, 2020 will be remembered in Italy and across Europe as Vaccine day, the day the Covid-19 vaccination campaign began. From that same date, InfoCert’s solutions have supported one of the first health authorities (ASL) in Italy to start administering vaccinations.

InfoCert’s ProxySign Interactive Trust Platform made it possible to digitize the entire procedure for collecting vaccinees’ informed consent, which is mandatory and requires a handwritten signature.

The solution deployed by InfoCert makes the procedure simpler and fully compliant with regulations:

once the vaccinee has been identified via their health card, ASL staff take their medical history and enter all the data into the terminal;
the vaccinee applies two graphometric signatures (Advanced Electronic Signatures, FEA) on the operator’s tablet;
the signatures are immediately captured and verified by the system, with no paper steps involved.

Collecting consent via graphometric signature increases the security and efficiency of the process, reduces the time users spend inside the vaccination center, eliminates the risk of procedural errors, and significantly cuts management costs for the health authorities.

Learn more

The post InfoCert digitizes informed consent collection for Covid-19 vaccinations appeared first on InfoCert.digital.


IdRamp

IdRamp and QiqoChat Announce Verifiable Credentials for Online Collaboration


IdRamp and QiqoChat have launched the world’s first implementation of verifiable personal identity credentials for virtual conferences and collaboration.

The post IdRamp and QiqoChat Announce Verifiable Credentials for Online Collaboration first appeared on idRamp | Decentralized Identity Evolution.

Coinfirm

Switzerland Crypto Regulations: KYC, Taxes & FINMA

Switzerland Crypto Regulations Key Takeaways:

Zug residents pay taxes in crypto
Strict AML, KYC & CFT requirements
Cryptocurrencies are legal tender in some cases
Closely aligned with the FATF
FINMA & SFTA oversee cryptoasset activities
Cryptocurrency banks & exchanges are legal

As a historic global financial hub, Switzerland is well-known for being a large gold...

auth0

Auth0 Customers: Your Brexit Questions Answered

Making sense of regulatory uncertainty in Brexit impact planning

Auth0 Adds Three Key SaaS Leaders to its Board of Directors

Industry veterans Sue Barsamian, Sameer Dholakia, and Simon Parmett join Board

Product Management Predictions for 2021

An Interview with Five Product Leaders

Torus (Web3 DKMS)

An Off-Chain Solution for Social Recovery


The recent article by Vitalik on social recovery summarised his views on the different options for key management where he pushed for stronger adoption of social recovery wallets.

We resonate with much of what was mentioned, especially the magnitude of key management issues and the strong need for solutions better than mnemonics. Given that we’ve been working on the same problem for a while now, we want to present our thoughts on the Smart Contract Wallets (SCW) that the social recovery Vitalik describes depends on, why we’ve gone a different direction with our suite of solutions (tKey, the Torus Network), and the tradeoffs that come with both.

Social Recovery is awesome, but SCW/multi-sigs on L2s are a tough problem

Social recovery as preferred by Vitalik manages a user’s account via a smart contract’s signing key. In the case of loss, guardians, which are added beforehand, allow changing of the signing key via social flows. While definitely a great ideal to work towards, Vitalik outlined two main shortcomings that SCWs have to overcome in the near future:

Gas costs. At ETH Gas Station’s standard gas price (250 gwei) and ETH’s price today, this amounts to ~$35 per user to set up their SCW.
Account abstraction and the reliance on relayers.

To address these issues, a migration to L2, and perhaps the inclusion of multi-sigs/SCWs at the protocol level, was proposed. By no means is this a small task; there are definitely challenges to solve. With the different L2s being dished out and the different SCWs/multi-sigs in the space, each with their own technical architectures and business/project interests to align, one might expect implementations to vary quite a bit.

Because state is separate between all of the different L1s and L2s, we face state issues. When we migrate, do we leave L1 state behind? Will we be able to have the same SCW on L1 as well as L2? Would we have to deploy our SCW on different L2s, deploying and syncing permissions on each? Combine this with ETH2 sharding, and the complexity of managing state only increases.

These challenges are already complex on a technical level; when we introduce the requirement of an onboardable UX for mainstream users, the restrictions are even tighter.

Using Threshold Cryptography (Off-chain Multi-sigs)

Because private keys are a relatively rigid crypto primitive, initiatives like SCWs/multi-sigs choose to wrap abstractions around accounts, providing more flexibility via additional smart contract state managed on-chain. However, we can also secure accounts from the other direction, by using threshold cryptography to make private keys more flexible.

Initiatives like Dark Crystal, Vault by Hashicorp, and Shamir39 by Trezor, as well as our tKey, opt to go down this direction. These solutions often use a variation of Shamir Secret Sharing to split a user’s key into a number of shares. With a threshold number of these shares (e.g., 2 out of 3), a user is able to access/reconstruct their key. Specifically, tKey focuses on personal key management and user-centric flows; interested readers can try it out for themselves here.

It is very similar to a multi-sig but with one small (but impactful) difference — it results in a public-private key pair.
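
To make the idea concrete, here is a toy 2-of-3 Shamir split and reconstruction over a prime field. This illustrates the primitive only; tKey and the tools above use hardened implementations over real key material, not this sketch:

using System;
using System.Numerics;

static class ShamirSketch
{
    // 2^127 - 1, a Mersenne prime; real systems size the field to the key material.
    static readonly BigInteger P = BigInteger.Pow(2, 127) - 1;

    // Split with polynomial f(x) = secret + a1*x (degree 1 => any 2 shares reconstruct).
    public static (BigInteger x, BigInteger y)[] Split(BigInteger secret, BigInteger a1) =>
        new[] { Share(1, secret, a1), Share(2, secret, a1), Share(3, secret, a1) };

    static (BigInteger, BigInteger) Share(int x, BigInteger secret, BigInteger a1) =>
        (x, Mod(secret + a1 * x));

    // Reconstruct f(0) from any two shares via Lagrange interpolation mod P.
    public static BigInteger Reconstruct((BigInteger x, BigInteger y) s1, (BigInteger x, BigInteger y) s2)
    {
        var l1 = Mod(s1.y * s2.x % P * Inverse(Mod(s2.x - s1.x)));
        var l2 = Mod(s2.y * s1.x % P * Inverse(Mod(s1.x - s2.x)));
        return Mod(l1 + l2);
    }

    static BigInteger Mod(BigInteger a) => ((a % P) + P) % P;
    static BigInteger Inverse(BigInteger a) => BigInteger.ModPow(a, P - 2, P); // Fermat; P is prime

    static void Main()
    {
        var secret = new BigInteger(123456789);
        var shares = Split(secret, a1: 987654321);
        // Any two of the three shares recover the secret.
        Console.WriteLine(Reconstruct(shares[0], shares[2]) == secret); // True
    }
}

A single share on its own reveals nothing about the secret, which is what makes device-plus-backup-plus-guardian style distributions possible.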

Tradeoffs between Threshold Key Management and SCWs/multi-sigs

As threshold key management results in a key pair, it’s directly and natively compatible with a multitude of other cryptographic constructs out there. When compared to SCW/multi-sigs, you have:

One account to manage — Instead of having to manage state across different L2s and L1, you only have one set of permissions/access control to worry about per account.

A key’s widespread compatibility — Cryptographic keys are the foundation of blockchains, and beyond being compatible with different chain states, you can use your key to interact with whatever protocol is out there, be it zero-knowledge proofs on Aztec or staking on ETH2. Private keys are plug-and-play.

Supports use cases beyond blockchains — An application may want to simply sign a token to represent a login, or encrypt data for a particular user. These constructions are by no means impossible for SCWs/multi-sigs, but they introduce additional requirements and complexity to a stack.

Universally accepted — Key management is a universal problem across cryptography. Using key management structures that are more universally integrated allows blockchains to be more compatible, and potentially integrable with traditional software stacks.

Lack of generic computation — A strong benefit of SCWs is being able to read Ethereum state. You can set daily spending limits, add recoverability via timeouts, or even add DAOs as guardians. While wallets that support computation are definitely very powerful, arguably some of these features can be done client-side (e.g. daily thresholds) even if there aren’t consensus guarantees. Additionally, many useful security checks that applications use today (e.g. IP whitelisting) may be impossible to replicate on-chain.

Social Recovery flows via tKey

To an extent, social recovery flows can be replicated with tKey. The goals to achieve:

There is a single “signing key” that can be used to approve transactions
There is a set of at least 3 (or a much higher number) of “guardians”, of which a majority can cooperate to change the signing key of the account.

We can create two separate sharings: one for the user to manage across their devices (like a multi-sig) for personal use, and another shared across people they know (like guardians). Hierarchies can be created for different groups through sub-sharings. Unfortunately, it is harder to replicate the timeouts or the “vault” architecture Vitalik describes through native crypto primitives, since threshold key management doesn’t rely on Ethereum state.

Threshold Key Management or SCWs? Both?

As an ecosystem, on-chain wallets with the power of generic computation have a lot to offer, but their compatibility and interoperability restrictions are definitely hurdles to overcome. As ETH2 progresses, more L2s evolve with varying requirements, and other interesting low-level cryptographic tools are built, you can be sure that they will still always use native key pairs, and we are always going to need good ways to secure them.

For us at Torus, we found the compatibility benefits of threshold key management very compelling. As an SDK it’s easier to integrate into different clients across platforms and blockchains, has no gas costs, and covers 80% of the access management flows required today. Because tKey is native, it’s also naturally compatible with SCWs, and we’re keeping an eye out for what the right mix of solutions could be here. We’d love to work with any of the current SCW operators to see what that might look like.

For readers who want to try out what a Threshold Key Management flow could look like, you can check out tKey here. For developers, you can get the same experience in your application, no matter the platform. Check out docs.tor.us

An Off-Chain Solution for Social Recovery was originally published in Torus Labs on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 11. January 2021

Forgerock Blog

Harnessing Digital Identity to Build Tomorrow’s Public Sector


As we continue to endure the prolonged effects of the pandemic, it should come as no surprise that millions of people are online more than ever. In fact, our New Normal report surveyed 5,000 global consumers and showed that this is likely to be a long-term shift - 45% of people stated that they plan on continuing to use more online services post-pandemic than they did before. 

Government services are one area that has seen online demand skyrocket. From departments who were challenged to manage an influx of benefit claimants, to websites buckling under the weight of people attempting to learn what the COVID-19 restrictions were in their area - government departments and agencies have been at the forefront of the battle to ramp up their digital transformation efforts to meet demand. 

Now with COVID-19 vaccines on the horizon, the beginning of the end of the pandemic crisis just might be in sight. So what long-term lessons can be learned about how to service a customer base which looks ready to access services increasingly online-first, while remaining agile enough to respond to any future societal shifts?

Identity First

To start, your approach should put digital identity at the heart of your operations. Many companies still consider identity a layer in their software stack rather than a core component. It’s only those that subscribe to the latter, more modern mode of thinking, who will stand to benefit from the true business value of identity. From managing and safeguarding a remote workforce to ensuring all citizens have easy access to your online services, digital identity will allow your organisation to face the challenges of today and provide secure, connected digital services at scale, simply by ensuring that systems, services and apps know who (or what) they are interacting with. 

For those of you in customer-facing operations, this matters because it allows you to deliver a personalized, and therefore better, online service that directs people to precisely what they need based on their identity. This creates an effortless user journey that also requires less resource investment. For your workforce, ForgeRock’s digital identity and access management functionality means you can now control who has access to what, and allow rapid scaling as demand grows or shrinks. 

Focusing Your Time on What Matters

Next, your organization needs to be able to react to unforeseen events and changing situations, like COVID-19, quickly and without tying up scarce resources which are in acute demand elsewhere.  

And using the Government-Cloud framework will allow you to search, compare, and procure from pre-approved suppliers and vendors without lengthy bureaucratic processes.

This speeds up an otherwise protracted approval process, making the task of finding the right digital identity solution for your department quick and seamless. It also means you can focus your efforts on the mission-critical areas that need your attention - like managing the increased user demand from the thousands of people newly accessing government services online now and into the foreseeable future.  

Pick a Cloud, Any Cloud

And the good news? As an approved Government-Cloud vendor, ForgeRock is on the list of providers you can choose from. What’s more, ForgeRock is the only vendor recognized as a leader across the top three analyst firms.

Which takes us to the third lesson: every department’s digital transformation journey won’t be identical. That’s why we’ve developed our Identity Cloud solution, a single identity and access management (IAM) platform that can manage and protect all of the identities, devices and services within your organization. 

There’s a range of deployment options available, meaning that organizations can embed our solution into existing public cloud, on-premises, or hybrid environments. It is not an all-or-nothing proposition: your department doesn’t need to ‘rip-and-replace’ its entire legacy identity systems to benefit from our identity cloud capabilities. We adapt to the unique needs and capabilities of each organization.

That means by moving to a hybrid cloud model that can co-exist and augment your existing identity infrastructure, your department can scale with ease, on your own timeline and according to your unique departmental demands and capabilities.

Keeping Your Data Safe

Final lesson: security is more important than ever. From HMRC reporting an increase in phishing scams at the start of the pandemic to local councils suffering from serious cyber-attacks, public sector data breaches are on the rise following the online migration of so many users during the pandemic. With more people signing up for government services, departments must ensure that user data is protected from cyber-attacks and malicious hackers. 

Thankfully, a modern hybrid cloud solution can defend against compromised accounts and data breaches, via ongoing contextual authentication and authorization - ensuring only the right people are accessing the right things. 

It’s a Hybrid World  

The pandemic has changed many aspects of our lives. Some of those shifts will be temporary; others permanent. But for public sector bodies concerned with equipping themselves for the long-term, the lessons are clear: a digital identity solution, available quickly, which adapts to your specific needs and protects your users’ data will put you in good standing. 

And ForgeRock’s Identity Cloud Solution, available to procure via the Government-Cloud portal, will do just that.



IBM Blockchain

Blockchain to transform insurance value chain


The insurance industry at its core is built on the legal promise to pay compensation in case of a loss. Trust is at the core of this promise. However, the level of trust consumers have in their insurance providers is only average. The IBM Institute for Business Value (IBV) surveyed 1,100 business insurance executives in 34 countries […]

The post Blockchain to transform insurance value chain appeared first on Blockchain Pulse: IBM Blockchain Blog.


Bloom

Announcing New Bloom Community Grants for 2021


Bloom is happy to announce that we'll be offering funds for research and development grants to the crypto community to collaborate on some exciting new initiatives to benefit the crypto ecosystem. With these grants we hope to help shed light on some poorly understood fundamentals of the crypto space, as well as to build some exciting new solutions for unsolved problems.

Research Grant: Tokenomics

Bloom will be accepting proposals for a grant of 5 ETH, with an opportunity for ongoing funding upon completion, for research on the economic behavior of ERC20 and other native blockchain tokens. The research should examine the economic interactions involved with different types of tokens, and how their supply, demand and price influence the behavior of smart contracts and off-chain mechanisms for different crypto projects, protocols and applications, and how various distribution models, such as token sales and airdrops, impact them. Examples of projects that may be relevant to this research include MakerDAO, Chainlink, Uniswap, Yearn, and the Graph. Grantees will be expected to produce a scientific research paper detailing their findings, as well as a summary to be published on the Bloom blog.

Build Grant: Unsecured Lending Vaults

Bloom will be accepting proposals for a grant of 10 ETH, with an opportunity for ongoing funding upon completion, for building a solution based on Bloom’s specification for unsecured data lending vaults. The solution must use either the Chainlink protocol’s query methods or direct on-chain transactions to load data on-chain describing a borrower’s KYC status and interest rate. This exciting project opens the door to unsecured/undercollateralized lending on-chain, with participation from traditional banking institutions and other entities in the crypto space. It allows competition to emerge among different models of calculating lending risk, and lets the grantee explore different models for decentralized governance of unsecured lending. Grantees will be expected to produce a working proof of concept, and if the grant is extended beyond the initial terms, grantees will have the opportunity to develop their solution into a full commercial application.

Unsecured lending vaults are Yearn-style vaults but instead of implementing yield farming strategies, they implement unsecured lending strategies based on the vault creator's proprietary risk pricing models. The community can provide liquidity to a lending vault to gain exposure to the lender's borrowers.

Token governance may be introduced to nominate & select lending strategies for the community to adopt. Stakers would receive rewards based on the returns from the adopted strategies. In addition, some strategies may wish to delegate lending terms like interest rates to liquidity providers, enabling the community to optimize the returns for a given cohort of borrowers.

Build Grant: Utility Experiments

Bloom will be accepting proposals for a grant of 5 ETH, with an opportunity for ongoing funding upon completion, for building a proof-of-concept solution for automatically providing BLT tokens to application users in exchange for Verifiable Credentials of their personal information via the Bloom App. Examples may include providing data to assist in building credit models, direct payment to users for personalized advertisement, or providing data to assist in market research, surveys, or similar use cases. In every case, the mechanism would be similar to scanning a QR code and responding with Bloom verifiable credential data as well as an ETH address at which to receive payments. Grantees will be expected to produce a working proof of concept, and if the grant is extended beyond the initial terms, will have the opportunity to develop their solution into a full commercial application.

How to apply

Applications due by Friday, January 29, 2021

If you are interested in applying for any of the above grants, send us an application at grants@bloom.co. Please include information about other projects you have completed in the space, why you are a good candidate for the grant, and how you would approach the chosen topic, including a time estimate for completing the first submission. Once accepted, we will invite you to our Discord where we can kick off the project!

Joining the new community discord

Soon after the onboarding of our grant recipients to the Discord, we'll invite more community members in to discuss the projects as well as future research, and grant opportunities.

We look forward to receiving the community's proposals and announcing our grantees!


Trinsic (was streetcred)

Trinsic and Indicio Partner to Empower Enterprise Teams


Trinsic and Indicio.tech (Indicio), a professional services provider for decentralized identity, are partnering to ensure that the Indicio Network is accessible to all developers of self-sovereign identity (SSI) solutions. As a result of this partnership, the Trinsic platform now supports the Indicio Network in both Trinsic Studio and the Trinsic Wallet. In addition, Indicio is now providing implementation services and expert training to its clients that wish to leverage the advanced capabilities of the Trinsic platform.


The hundreds of developers using Trinsic to issue verifiable credentials now have additional flexibility in choosing the network they want to base their solutions on. In addition, issuers and verifiers already using the Indicio Network may now recommend the popular Trinsic Wallet to their credential holders.


“Trinsic and Indicio are natural partners,” said Riley Hughes, CEO of Trinsic. “The Indicio team provides world-class professional services, and we provide the world’s most advanced SSI platform. Enterprise teams that combine both expert services and a powerful platform will be well-suited to build the SSI solutions they need to succeed in the market.”

The Indicio Network

The Indicio Network is a concierge-class decentralized identity network that supports the exchange of verifiable credentials. Developed and maintained by some of the most experienced architects in the SSI space, the Indicio Network provides developers with a stable and reliable environment to build, test, and demo their SSI solutions.


“We’re pleased to expand accessibility of the Indicio Network to the large community of developers who use the Trinsic platform,” said Heather Dahl, CEO of Indicio. “We’ve always focused on doing the best thing for our clients, whether they want to build their own solution, use open source code, or leverage an existing platform like Trinsic. This partnership solidifies that commitment and our ability to reimagine identity for our clients.”


Similar to other leading SSI networks, the Indicio Network is based on the open source SSI codebase of Hyperledger Indy. The network is based on the most recent SSI standards, which enables interoperability of SSI products across different networks.

Using the Indicio Network

You can issue credentials on the Indicio Network in less than 5 minutes for free by following these three simple steps:


1. Create a Trinsic Studio account.
2. Click the “+ Organization” button to create a cloud agent capable of issuing and verifying credentials.
3. Select “Indicio Test Network” from the dropdown and click “Confirm”.


Watch the video below to see just how easy it is to get started with the Indicio TestNet using Trinsic.

Other resources to help you get started

Trinsic has a myriad of resources to assist developers in building their solutions.


Build a web portal for issuing credentials with the Trinsic Issuer Reference App.
Create hundreds of issuer agents on the Indicio TestNet using the Trinsic Provider API. See the Provider Reference App for an example of how you can get started.
Want to integrate workflows between the Indicio Network and 2,000+ other SaaS apps? See our native Zapier integration and build integrations with Slack, Office 365, GSuite, and other applications you use every day.

The role of decentralized networks in SSI

In verifiable credential exchange, a decentralized network (also referred to as a ledger or a blockchain) primarily acts as a decentralized public key infrastructure (DPKI) for identifying who issued a particular credential. For example, if an issuer is provisioned on the Indicio Network, the authenticity of credentials created by that issuer can be verified by looking up the issuer’s DID on the Indicio Network. For more information on issuer DIDs, see our documentation.
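
In code terms, one way to picture that flow is the sketch below. The types and ledger interface are hypothetical placeholders (this is not the Trinsic or Indicio SDK), but the shape of it, resolving the issuer’s DID, loading the published verification key, and checking the signature, is the essence of DPKI:

using System.Security.Cryptography;

// Hypothetical placeholder types, for illustration only.
public record DidDocument(byte[] VerificationKey);
public interface ILedgerClient { DidDocument ResolveDid(string did); }
public record Credential(string IssuerDid, byte[] SignedPayload, byte[] Signature);

public static class CredentialVerifier
{
    public static bool Verify(Credential credential, ILedgerClient ledger)
    {
        // 1. DPKI lookup: fetch the issuer's DID document from the decentralized network.
        DidDocument doc = ledger.ResolveDid(credential.IssuerDid);

        // 2. Load the issuer's published verification key.
        using var key = ECDsa.Create();
        key.ImportSubjectPublicKeyInfo(doc.VerificationKey, out _);

        // 3. The credential is authentic only if its signature checks out against that key.
        return key.VerifyData(credential.SignedPayload, credential.Signature, HashAlgorithmName.SHA256);
    }
}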


In addition to the Indicio Network, the Trinsic platform currently supports organizations being provisioned on the Sovrin Network, Sovrin’s test networks, and the BCovrin Test Network. We plan on adding support for additional networks in the near future and will continue to do so as more decentralized identity networks emerge.

Partnering with Trinsic

If you are a consulting firm and would like to partner with Trinsic, we’d love to talk with you. Likewise, if you are building an SSI solution and would like additional support from expert consultants or outsourced developers, click here to signal your interest, and we’ll put you in touch with a team that can help.

###

About Indicio.tech

Indicio.tech is a professional services firm specializing in decentralized identity architecture, engineering, and consultancy. Indicio provides expert guidance on the use of verifiable credentials to build digital identity solutions across a global community of clients. The decentralized networks and tools created by Indicio make verifiable credentials easy to adopt, simple to deploy, and reliable to use. As a Public Benefit Corporation, Indicio is committed to advancing decentralized identity as a public good that enables people to control their identities online and share their data by consent. Indicio believes in privacy and security by design, interoperability, and supports the open source goals of the decentralized identity community.

About Trinsic

Trinsic is the world’s leading self-sovereign identity (SSI) platform for developers. Launched in 2019, Trinsic allows people and organizations to prove things about themselves through portable, digital credentials based on open standards. Organizations use Trinsic Studio to issue ID cards, health certificates, business licenses, and more. Users store these credentials securely in the Trinsic Wallet on their device. Based on privacy-first, standards-compliant, open-source tech, Trinsic has become the leader in decentralized identity with more than 1000 unique customers all over the world. For more information, visit www.trinsic.id.

The post Trinsic and Indicio Partner to Empower Enterprise Teams appeared first on Trinsic.


Coinfirm

Crypto Regulations in the UK: FCA, KYC & Bans

Crypto Regulations in the UK Key Takeaways:

Cryptocurrencies not classed as legal tender
VASPs apply to FCA for licence (e-money as the exception)
Taxes based on activities, entities & tokens
Ban on derivatives offering
FCA, Treasury & BoE make up Cryptoassets Taskforce
8 Cryptoasset Market ‘Actors’
FCA responsible for AML/CFT of cryptoassets

UK Bitcoin Exchange...

Indicio

Because decentralized identity can make life better: Why we converted to a public benefit corporation


In December, Indicio.tech reincorporated as a public benefit corporation, joining a worldwide movement committed to aligning profit with a positive material impact on society. For Indicio, it has always been clear that decentralized identity benefits the public—that is what brought us, the founders, together. It solves a massive structural flaw in the architecture of life online: the lack of an effective way to encode uniqueness and thereby verify individual identity. And it does so in a way that removes the need for third parties to control and store personally identifying information.

Decentralized identity allows people to give meaningful consent to sharing their data in a maximally private and secure way. It answers the deep disquiet over the misappropriation of personal data that has been given a voice in data privacy regulation—and it makes compliance with such laws easy.

All of these are public “goods.” Now, add in decentralized identity’s capacity to help those who have no formal, legal identity, those who are stateless, those who are refugees—a number estimated at over a billion people—to prove that they exist, secure access to health and financial services, and establish rights over property.

To dream this big we have to formulate achievable, incremental steps to get there. We have to create the technology and infrastructure that can realize these public goods; we have to make the tech interoperable and, wherever possible, open source. We have to make it as easy as possible to understand, use, and adopt. We have to build use cases and help others build use cases to reveal its value.

As Indicio grew, and as we saw decentralized identity as an ecosystem that needed to be seeded and cultivated, the public benefit corporate model became more and more compelling as a way of ensuring that our beliefs and values were baked into this mission. But we also saw the benefit corporation as a way of encoding a positive and inclusive culture inside our company. If each team member is genuinely valued for the work they do, they will give their best to our clients and become the most effective advocates for our mission.

A brief overview of the benefit corporation movement
The idea of a benefit corporation begins with long-simmering dissatisfaction with the argument that the only responsibility or duty a company has is to increase its profits, a claim forcefully made by University of Chicago economist Milton Friedman in The New York Times Magazine in 1970. Arguing that only an individual could have responsibilities, and that a corporation is not a person, Friedman defined a new era of shareholder supremacy in business.

In practical terms, the easiest way to see whether a business was acting responsibly was to see if its share value was increasing, a simple metric that had profound consequences for the way a business or corporation was run. The CEO’s job became defined by what he or she did to increase the company’s share price. Shareholders didn’t need to buy into the reasons why the business was founded, or the vision of its founders, or even the value the company provided its customers and society: share price higher, company good. There was no obligation to think strategically beyond the short term, or to consider the welfare of the community, the environment, or the company’s employees.

Dissatisfaction with the inflexibility of this model from the business side and growing public interest in economic and environmental sustainability and social responsibility helped to open up a legal middle way between for-profit and nonprofit corporations. The “benefit” corporation was the result and the first benefit corporation legislation was introduced in Maryland in 2010. Simply put, profit and public benefit can be combined in a way that allows company directors to balance shareholder and stakeholder interests in the pursuit of that public benefit. Many states now offer similar legislation. In Delaware, where Indicio is incorporated, such corporations are called public benefit corporations.

The case for benefit corporations has been most forcefully put by one of the best-known B-Corps, Patagonia. In registering as the first California benefit corporation in 2012, founder Yvon Chouinard said, “Benefit corporation legislation creates the legal framework to enable mission-driven companies like Patagonia to stay mission-driven through succession, capital raises, and even changes in ownership, by institutionalizing the values, culture, processes, and high standards put in place by founding entrepreneurs.”

The social impact of technology
It’s not surprising that environmental impact has been central to defining the B-Corp movement and the companies that have embraced it.[1] But decentralized identity offers a similar opportunity for tech companies to think about the social impact of technology.

We need to set standards for what the public should expect from technology companies and from decentralized identity. We need independent third parties, like B-Lab, which was instrumental in creating the B-Corp model, to help codify and provide independent certification that we—and other tech companies—are walking the walk on digital identity, data privacy, and security when we build and govern decentralized identity infrastructure.

At a time when “Big Tech” is looking more 19th century than 21st century in the way it acts—“Big Tech faces its Standard Oil moment” was an end-of-2020 headline in the Financial Times—a transformational technology like decentralized identity gives us an organic opportunity for a reset.

We have the means to give people control of their identities, the right to share their data, and to give the identity-less legal agency in the world. We believe this will trigger a new wave of innovation that will benefit business and organizations too. But we believe, most of all, that it’s the right thing to do. A public benefit corporation is not just the way to do this, it’s the way to create a meaningful conversation in business about the role of technology in people’s lives—and to hold us accountable for all this talk.

[1] The use of a distributed ledger in decentralized identity does not involve “proof of work” or mining, both of which entail substantial energy costs. Instead, with the optimal network size being 25 or fewer nodes, writing credentials to the ledger consumes energy comparable to logging into a website or submitting a form. Much of the activity in a decentralized identity ecosystem takes place off ledger. All of this makes decentralized identity a low-energy-consumption practice.



One World Identity

Digital Identity In 2021: OWI’s Top Predictions

There’s not much that hasn’t already been said about 2020. Last year brought the world a historic level of uncertainty due to the pandemic. Companies worldwide suddenly had to adjust to employees working from home out of necessity while also responding to consumers and customers who were similarly homebound. Despite this uncertainty, the Digital Identity (DI) space saw a tremendous amount of forward momentum, as enterprises were forced to utilize DI technologies on a grander scale simply to conduct business. This transition came with some downsides as well, most notably a lack of innovation. Still, the issues that arose from this response led to some exciting shifts in the DI space, including more investment and awareness.

Now that we’ve officially put 2020 behind us, it’s time to look forward. With the new administration in the US, the rollout of vaccines, and the ongoing maturation of DI, there’s plenty more opportunity and enthusiasm ahead.

Surely there will be some unexpected hiccups and macroeconomic issues that we can’t predict in 2021 (let’s just hope not on the scale of another global pandemic), but we do see some intriguing trends in digital identity that we believe will shape the space this year. While plenty of other possibilities exist, as others have highlighted, these five trends stand out to us due to the momentum we’ve seen build over the past several years. And if you have some thoughts or predictions of your own, drop us an email to let us know.

OWI’s 2021 Predictions

Incumbent fraud prevention platforms go on a buying spree

Coming off of a record year for eCommerce and eCommerce fraud, most companies across the fraud prevention and risk management market saw revenues and profits rise in 2020. Based on the advantages of large-scale networks for delivering superior fraud detection signals, top platforms separated themselves from the field, opening up opportunities for industry consolidation of lagging players. Companies that did well outperformed their targets by 30% to 50%, while those left behind exceeded their targets by only 10% to 20%. The outperformers gained an edge from basic advantages, like network data size, that yield more accurate models and offerings. Since these advantages will continue to widen, the companies that only had a good year will need to raise significant capital or be acquired in order to keep up with those that had a great year.

This will push the industry towards more comprehensive, integrated digital identity platform solutions, covering everything from identity resolution and customer onboarding to ongoing account management.

Identity confirmation becomes common at checkout and when watching mature content online

We expect large platforms, such as YouTube and Netflix, to comply with regulations that seek to protect children and consumers, such as the EU’s Audiovisual Media Services Directive. This will force these companies to know more about their users in order to mitigate regulatory risk, leaning towards convenient, frictionless identification whenever possible.

Digital identity wallet adoption grows worldwide as contact tracing becomes mandatory for everyday activities

Companies with digital identity wallet capabilities and offerings have waited for a major catalyzing event to take a good idea and make it applicable for a massive population. COVID-19 contact tracing applications will normalize proving your identity through your device, confirming any close contacts, and potentially providing your vaccination status. It’s similar to the impact that Apple’s release of Face ID had on normalizing facial recognition.

Privacy concerns remain the major barrier to adoption for these tools. Since the COVID response will play out on a worldwide scale, it offers a catalyzing event that will improve initial adoption rates this year and increase digital identity confirmation uses moving forward. If the wallets can ease privacy concerns with the technology, particularly since consumers will often be utilizing the tool for the first time, then that worldwide adoption can lead to ongoing use far beyond tracking COVID.

Digital identity initiatives get the Biden Bump

In 2012, the Obama administration unveiled the Consumer Privacy Bill of Rights as part of a “comprehensive blueprint” to improve consumers’ privacy protections and ensure that the Internet remains an engine for innovation and economic growth. The Better Identity Coalition, led by former Obama administration executives, recently unveiled its own blueprint and helped introduce the bipartisan Improving Digital Identity Act of 2020 in Congress.

With the Biden administration tapping many of the same people that guided Obama, expect these issues to have a place at the table. This effort becomes even more viable since the Biden administration will be working with a supportive Congress.

The passwordless revolution finally arrives

Last year saw behavioral biometrics company BioCatch and multi-factor authentication company Trusona raise $168 million and $19.4 million, respectively. The funds came from serious investors, like Amex Ventures, Bain Capital, Citi Ventures, and Kleiner Perkins. This marks the moment of institutional buy-in to kill the password once and for all.

We expect to see that shift begin this year, with more investment and growing awareness of DI tactics that go beyond the password reaching enterprises and consumers.

What do you think? Agree with these predictions? Have some other thoughts for the New Year? Email us to let us know what you see for digital identity in 2021.



Torus (Web3 DKMS)

Divi Crypto Podcast: Key Management Interview with Zhen

Zhen joins Steve to chat about the decentralization of key management in cryptocurrency.

During the podcast, Zhen discusses how tKey works on the backend. To explain it briefly: tKey leverages Shamir’s Secret Sharing to split a user’s key into three shares. One share remains on the Torus network (a decentralized network run by large ecosystem stakeholders like Binance, ENS, Etherscan, Ontology, Zilliqa, Tendermint, Skale and Matic), while the other two are available to the user: one is stored on their device, and the other sits behind a backup password/seed phrase.

The multi-factor authentication is built so that even if the user’s social account is hacked or compromised, the hacker would still need to find the recovery passphrase or gain access to the user’s device to obtain the private key. This maintains a high level of security for the user whilst preserving the convenient onboarding flow that mainstream users are familiar with.
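To make the threshold idea concrete, here is a minimal 2-of-3 Shamir sketch in Java: three points on a random line over a prime field, any two of which recover the secret. This is an illustration of the general technique under our own naming, not tKey’s actual code (which adds verifiable sharing, share refresh, and more):

import java.math.BigInteger;
import java.security.SecureRandom;

// Illustrative only: a toy 2-of-3 Shamir split over the prime field GF(2^127 - 1).
public class ShamirTwoOfThreeSketch {

    static final BigInteger P = BigInteger.TWO.pow(127).subtract(BigInteger.ONE); // a Mersenne prime
    static final SecureRandom RNG = new SecureRandom();

    // Split 'secret' into 3 shares: points (x, f(x)) on a random line f(x) = secret + a1*x (mod P).
    // Any 2 points determine the line and therefore f(0) = secret; a single share reveals nothing.
    static BigInteger[][] split(BigInteger secret) {
        BigInteger a1 = new BigInteger(P.bitLength() - 1, RNG); // random slope
        BigInteger[][] shares = new BigInteger[3][];
        for (int x = 1; x <= 3; x++) {
            BigInteger fx = secret.add(a1.multiply(BigInteger.valueOf(x))).mod(P);
            shares[x - 1] = new BigInteger[] { BigInteger.valueOf(x), fx };
        }
        return shares;
    }

    // Recover f(0) from any two shares via Lagrange interpolation at x = 0.
    static BigInteger combine(BigInteger[] a, BigInteger[] b) {
        BigInteger x1 = a[0], y1 = a[1], x2 = b[0], y2 = b[1];
        BigInteger l1 = x2.multiply(x2.subtract(x1).modInverse(P)); // x2 / (x2 - x1)
        BigInteger l2 = x1.multiply(x1.subtract(x2).modInverse(P)); // x1 / (x1 - x2)
        return y1.multiply(l1).add(y2.multiply(l2)).mod(P);
    }

    public static void main(String[] args) {
        BigInteger key = new BigInteger(126, RNG); // stand-in for a user's private key
        BigInteger[][] shares = split(key);
        System.out.println(key.equals(combine(shares[0], shares[2]))); // prints: true
    }
}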

Listen to the podcast on Spotify:



PingTalk

Top 6 Identity Industry Predictions for 2021

As a digital society, we are facing a privacy reckoning and a crisis of confidence. The level of data collection by tech companies has reached a new peak, and consumers are losing faith in service providers’ ability to manage their data respectfully. Combine this situation with the sweeping global changes the pandemic wrought in 2020—including the massive shift to work from home and explosive ecommerce growth—and it’s clear we’re in for an unprecedented era of identity and access management.

How will the IAM industry evolve to meet growing identity needs? To gain insight into this question, I and other identity leaders at Ping are sharing our thoughts here on what we think will be the most notable trends in the upcoming year. Some of us focus on data privacy issues, others on identity security, and still others on the user experience, but we all agree that big changes are on the horizon.


Torus (Web3 DKMS)

GirlGoneCrypto Interview: Web 3.0 Applications and Private Key Management

GirlGoneCrypto Video Interview: Web 3.0 Applications and Private Key Management

Zhen joined Lea Thompson for a video interview to chat about how Torus is making Web 3.0 applications as smooth as Web 2.0. Watch the interview for a deep dive into the technical details of building key management for mainstream users.



Epicenter Podcast: The Decentralized Key Management and Login System for Web3

Leonard and Zhen sat down with Sébastien Couture from Epicenter to talk about the origins of Torus Labs, the products we recently released, and our aims for the future.

Listen at https://epicenter.tv/episodes/b003

Topics discussed in the episode:

Zhen and Leonard’s backgrounds and how they got into crypto
How and why Torus was formed
The path Torus has followed over the past year
Unwrapping the base layer, the Torus Network
How Shamir’s Secret Sharing works and why Torus chose to build on it
The Torus Wallet and a step-by-step of how it works
Torus and account portability
Use cases and applications built on Torus
The second layer of security, tKey
Their plans for introducing Touch ID
How Torus educates its users on how to stay secure
The long-term goal for Torus and how users are protected in the future



Smarter with Gartner - IT

How to Respond to a Supply Chain Attack

In late December, software company SolarWinds became aware of a supply chain attack on one of its software systems. The attackers added malware to signed versions of the supplier’s software, which was then used to infiltrate 18,000 government and private organizations. The malware became active once deployed in the target environment.

Peter Firstbrook, Gartner VP Analyst, and Jeremy D'Hoinne, Gartner VP Analyst, say that although these types of attacks are a reality, organizations are often unprepared to respond to cybersecurity attacks. We spoke with the analysts to discuss the nature of supply chain attacks and how security and risk teams can prepare for them.

What is a supply chain attack?

A supply chain attack is when goods, services or technology supplied by a vendor to a customer have been breached and compromised, which introduces a risk to the customer base. The risk to an organization will vary.

Read more: 6 Ways to Defend Against a Ransomware Attack

There are various examples of supply chain attacks, such as using a compromised email account from a supplier for social engineering or to increase the probability of a malware infection by sending it from a supplier’s email address. More elaborate attacks can compromise a supplier’s network and use its privileged access to infiltrate the target network. The most sophisticated attacks, including the SolarWinds attack, involve modifying trusted software tools. 

How do you detect this type of attack and the extent of the damage? 

It’s a short question with a very long answer. The biggest challenge is that supply chain attacks are utilized by advanced adversaries, often using new techniques and tools that are difficult to detect. In addition, anomaly detection is an imprecise art and can trigger too many alerts for security teams to address. Scaling the security operation team to respond to alerts and thus reduce detection time remains a challenge.

Supply chain attacks expand the scope further. In addition to what’s under the direct control of the organizations, security teams must:

Inventory and monitor the third-party tools the organization uses, and learn about vulnerabilities and disclosed breaches.
Monitor remote access granted to suppliers; restrict it and strengthen it with additional layers such as multifactor authentication.
Monitor third-party providers that have access to corporate resources.

Supply chain attacks might leverage multiple attack techniques. Specialized anomaly detection technologies, including endpoint detection and response (EDR), network detection and response (NDR) and user behavior analytics (UBA) can complement the broader scope covered by security analytics on centralized log management/SIEM tools. 

Read more: Gartner Top 10 Security Projects for 2020-2021

The primary target of advanced adversaries is authenticated access, which enables them to blend into normal activities. This means identity infrastructure hygiene, multifactor authentication and continuous monitoring are key defenses. Additionally, network segmentation can limit the damage of undetected attacks by making it harder to get to higher-value corporate resources.  

How should organizations respond to a supply chain attack?

Incident response playbooks for supply chain attacks are similar to any incident response, but with different time horizons to consider. The first step is the incident response workflow. This includes tracking down the extent of the compromise with a forensic analysis and restoring normal operations. For this, access to the relevant information is critical.

In the absence of internal investigation resources, or when anticipating a critical breach, organizations should engage incident response services.

Longer term, acknowledging that anyone can be breached and that there is no inside vs outside of the network (i.e., zero trust), security teams should adapt their security and risk management roadmaps to better reflect supply chain attack exposure.  

Organizations that have determined they are not impacted by a high-profile supply chain attack should take the opportunity to test “what if” scenarios: assuming they were impacted, which mitigations or security defenses would have provided effective containment, and which would not. This type of analysis may change thinking about security priorities and procedures.

Keep in mind that this particular attack was discovered by an alert security operator wondering why an employee wanted a second phone registered for multifactor authentication. This implies that the attacker was aiming to leverage identity, and specifically MFA, as an attack vector. As such, successful security organizations must scrutinize identity onboarding procedures, including the security procedures for registering new devices for MFA.

Emerging breach and attack simulation tools can be used to continuously explore inside the network attack scenarios and implications, as well as test security defenses.  

What specific concerns have security professionals been raising since the attack?  

We received a broad set of inquiries: traditional post-breach questions, typically aimed at understanding the scope and direct consequences of a specific attack, along with questions about the impact of using the compromised supplier’s product and how to adapt to this situation.

This seems like a good time for organizations to review security and risk plans. Where should they focus? 

It is important to avoid ad hoc responses that might be too specific and not the best move for improving security overall. A full review helps put recent events in a broader, more balanced context. The review should not be limited to preventative controls; it should also include anomaly detection and incident response. Frameworks such as breach and attack simulation (BAS) can help formalize the initial review and tooling, and also aid in automating the assessment.



Okta

Build a Secure GraphQL API with MicroProfile

MicroProfile is an open-source community project with the goal of encouraging the development of Java microservice solutions. It was created in 2016 in response to the changing needs of modern web development. In particular, it seeks to foster the development of smaller, less monolithic services (microservices) that can run on faster release cycles than the typical, old-school Enterprise Java application. Shortly after its creation, it joined the Eclipse Foundation.

MicroProfile, in essence, is a set of specifications and standards agreed upon by a community of developers that allows for “write once, run anywhere” in the Java microservice ecosystem. There are currently around nine compliant runtimes for MicroProfile, including Apache TomEE, Quarkus, and Open Liberty. A program written for one can be seamlessly run on another. The community also serves as an incubator for new ideas within Enterprise Java and microservice architectures.

In this tutorial, the runtime you are going to use is Open Liberty. Open Liberty is an implementation of the MicroProfile specification. Open Liberty bills itself as “a lightweight open framework for building fast and efficient cloud-native Java microservices.” It’s easily customizable, fast to start, and has a low footprint.

Table of Contents

Why Use GraphQL Instead of REST?
Create a Java Project Using the MicroProfile Starter
Configure MicroProfile for GraphQL
Implement GraphQL Hello World
Generate Surf Forecasts Using GraphQL
Create an OpenID Connect Application
Add Groups and UPN Claims to Default Authorization Server
Generate a JSON Web Token
Configure MicroProfile for JWT Authentication
All’s Well That Ends Authenticated

Why Use GraphQL Instead of REST?

The application you’re going to write is a surf report generator. Why? Because people seem to like weather apps for their example apps, and I thought a surf report generator was more fun. The surf reports, unfortunately, will just be randomly generated. However, you’ll use GraphQL to query and return data instead of a traditional REST API.

With a REST API, a typical transaction cycle might include numerous API requests. The client app may have to, at a minimum, make a different call for each type of resource required, and often, lots of data is returned that is not used by the client. This results in a lot of inefficient communication. Facebook created GraphQL to address this.

GraphQL allows a client to ask the server for exactly the data it needs in a single request. The client can group multiple request types in a single request and specify exactly which properties on the requested data structures it wants to be included in the response. This results in much more efficient communications between client and server, allowing, for example, mobile apps to work well even on slow connections.

The app will be secured using OAuth 2.0 and OpenID Connect using Okta as the provider. MicroProfile provides a JSON Web Token (JWT) authentication and authorization specification that Open Liberty implements. You’ll use this to add role-based authorization to your surf report.

Prerequisites:

Before you get started, you’ll need to have a few things installed.

Java 11: This tutorial uses Java 11. SDKMAN is an excellent option for installing and managing Java versions.
Maven: The MicroProfile starter uses Maven, a dependency management utility. It can be installed according to the instructions on their website. You could also use SDKMAN or Homebrew.
Okta Developer Account: You’ll be using Okta as an OAuth/OIDC provider to add JWT authentication and authorization to the application. You can go to our developer site and sign up for a free developer account.
HTTPie: This is a powerful command-line HTTP request utility that you’ll use to test your MicroProfile app. Install it according to the docs on their site.

Create a Java Project Using the MicroProfile Starter

You can bootstrap a project using the MicroProfile starter. This will quickly configure your MicroProfile project using OpenLiberty as the MicroProfile implementation.

Open the starter website, https://start.microprofile.io/, and make the following selections.

MicroProfile Version: 3.3
Java SE Version: Java 11
MicroProfile Runtime: OpenLiberty

Leave the groupId as com.example and the artifactId as demo. Don’t check any of the Examples for specifications. These are great for exploring and understanding the various features of MicroProfile, but they’ll just complicate things for this tutorial.

Click DOWNLOAD.

Copy the demo.zip file somewhere appropriate as a project root directory and extract the files.

You have a fully functioning application. Open a shell and navigate to the project root directory. Run the starter application using the following command.

mvn liberty:run

You’ll see a lot of output that ends with the following.

[INFO] [AUDIT ] CWWKT0016I: Web application available (default_host): http://localhost:9080/openapi/ui/
[INFO] [AUDIT ] CWWKT0016I: Web application available (default_host): http://localhost:9080/metrics/
[INFO] [AUDIT ] CWWKT0016I: Web application available (default_host): http://localhost:9080/ibm/api/
[INFO] [AUDIT ] CWWKT0016I: Web application available (default_host): http://localhost:9080/health/
[INFO] [AUDIT ] CWWKT0016I: Web application available (default_host): http://localhost:9080/jwt/
[INFO] [AUDIT ] CWWKT0016I: Web application available (default_host): http://localhost:9080/openapi/
[INFO] [AUDIT ] CWWKT0016I: Web application available (default_host): http://localhost:9080/
[INFO] [AUDIT ] CWWKZ0001I: Application demo started in 1.574 seconds.
[INFO] [AUDIT ] CWWKF0012I: The server installed the following features: [appSecurity-2.0, cdi-2.0, concurrent-1.0, distributedMap-1.0, jaxrs-2.1, jaxrsClient-2.1, jndi-1.0, json-1.0, jsonb-1.0, jsonp-1.1, jwt-1.0, microProfile-3.3, monitor-1.0, mpConfig-1.4, mpFaultTolerance-2.1, mpHealth-2.2, mpJwt-1.1, mpMetrics-2.3, mpOpenAPI-1.1, mpOpenTracing-1.3, mpRestClient-1.4, opentracing-1.3, servlet-4.0, ssl-1.0].
[INFO] [AUDIT ] CWWKF0011I: The demo server is ready to run a smarter planet. The demo server started in 4.429 seconds.

Test the basic server using HTTPie to run a GET request on the auto-generated endpoint.

http :9080/data/hello

HTTP/1.1 200 OK
...
Hello World

Configure MicroProfile for GraphQL

Now you’re going to create a simple GraphQL “hello world.” To do this, you need to add some dependencies to the pom.xml file and add the GraphQL feature to the server.xml file. The pom.xml file is Maven’s dependency management file. The server.xml file (src/main/liberty/config/server.xml) is the MicroProfile configuration file.

Add two dependencies to the pom.xml file.

<dependency>
    <groupId>org.eclipse.microprofile.graphql</groupId>
    <artifactId>microprofile-graphql-api</artifactId>
    <version>1.0.3</version>
</dependency>
<dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
    <version>1.18.16</version>
    <scope>provided</scope>
</dependency>

The first dependency is the MicroProfile GraphQL dependency. The second adds Lombok to the project. Lombok is a collection of helper annotations that will save you (or me, really, since you’d just be copying and pasting them) from the tedious task of writing a bunch of getters, setters, and constructors.

You also need to make two additions to the src/main/liberty/config/server.xml file.

First, add the GraphQL feature to the <featureManager> section.

<featureManager>
    ...
    <feature>mpGraphQL-1.0</feature> <!-- ADD ME -->
</featureManager>

Second, add the following line under the top-level <server> element (this just allows us to log from the app and have it shown on the console).

<server>
    ...
    <logging consoleLogLevel="INFO" />
</server>

Implement GraphQL Hello World

Create a new Java file, HelloGraphQl.java. This is a simple GraphQL resource controller.

src/main/java/com/example/demo/HelloGraphQl.java

package com.example.demo;

import org.eclipse.microprofile.graphql.GraphQLApi;
import org.eclipse.microprofile.graphql.Query;

@GraphQLApi
public class HelloGraphQl {

    @Query("hello")
    public String sayHello() {
        return "Hello world!";
    }
}

MicroProfile and OpenLiberty have a feature that detects changes in code and attempts to reload the application quickly. However, this feature did not always work for me and often resulted in errors on subsequent GraphQL queries. I found it better to just stop and restart after changes.

Stop your server, if you haven’t already, using Control-c. Start it again.

mvn liberty:run

Wait for it to finish loading. Test your new GraphQL endpoint.

http POST :9080/graphql query='{ hello }'

HTTP/1.1 200 OK
Content-Language: en-US
Content-Length: 33
Content-Type: application/json;charset=UTF-8
Date: Wed, 23 Dec 2020 16:57:01 GMT
X-Powered-By: Servlet/4.0

{
    "data": {
        "hello": "Hello world!"
    }
}

Notice that the REST endpoint was at /data/hello, and this endpoint is at /graphql. Notice also that you’re using the GraphQL query language, not making a REST request. You’re also making a POST request instead of a GET.

Explaining GraphQL query language is beyond the scope of this tutorial. If you’d like to learn more about it, take a look at the documentation on the project website.

Of course, this request is trivial. To do something more interesting, you’re going to create a GraphQL endpoint that returns surf forecasts. The forecasts are randomly generated, but the controller will accept a parameter and allow you to use the GraphQL query language to choose the data you’d like returned.

Generate Surf Forecasts Using GraphQL

First, create the SurfController class that will expose the surfReport query.

src/main/java/com/example/demo/SurfController.java

package com.example.demo;

import org.eclipse.microprofile.graphql.GraphQLApi;
import org.eclipse.microprofile.graphql.Name;
import org.eclipse.microprofile.graphql.Query;

@GraphQLApi
public class SurfController {

    @Query("surfReport")
    public SurfConditions getSurfReport(@Name("location") String location) {
        return SurfConditions.getRandom(location);
    }
}

Second, create the SurfConditions data model.

src/main/java/com/example/demo/SurfConditions.java

package com.example.demo;

import lombok.Data;
import lombok.ToString;

import java.util.Arrays;
import java.util.List;
import java.util.Random;
import java.util.logging.Logger;

@Data
@ToString
public class SurfConditions {

    private static final Logger LOG = Logger.getLogger("SurfConditions");

    private final String location;
    private final int chanceOfRainPercent;
    private final double windKnots;
    private final String windDirection;
    private final int swellHeight;
    private final int swellPeriodSeconds;

    static SurfConditions getRandom(String location) {
        List<String> windDirections = Arrays.asList("S", "SW", "W", "NW", "N", "NE", "E", "SE");
        Random rand = new Random();
        String windDirection = windDirections.get(rand.nextInt(windDirections.size()));
        int chanceOfRain = (int)(Math.random() * 100);
        double windKnots = (Math.random() * (40 - 5)) + 5;
        int swellHeight = (int)((Math.random() * (25 - 2)) + 2);
        int swellPeriodSeconds = (int)((Math.random() * (15 - 8)) + 8);
        SurfConditions report = new SurfConditions(
            location,           // just the input location
            chanceOfRain,       // random int between 0-99
            windKnots,          // random double between 5-40
            windDirection,      // random direction
            swellHeight,        // random int between 2-24
            swellPeriodSeconds  // random int between 8-15
        );
        LOG.info(report.toString());
        return report;
    }
}

Much of this class’s complexity is in the getRandom() method that returns a random surf forecast, which is just a convenience method for this tutorial. In a real situation, you’d likely be querying a database instead of generating random forecasts.

If you strip that out (see below), you’ll see that your model class is really just a Java class with some properties defined. Two Lombok annotations (@Data @ToString) are used to generate all of the necessary getters, setters, constructors, and the toString() method.

package com.example.demo;

import lombok.Data;
import lombok.ToString;

@Data
@ToString
public class SurfConditions {
    private final String location;
    private final int chanceOfRainPercent;
    private final double windKnots;
    private final String windDirection;
    private final int swellHeight;
    private final int swellPeriodSeconds;
}
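As noted above, a real app would likely query a database rather than call getRandom(). As a rough sketch of what that swap could look like (SurfConditionsRepository and findByLocation are hypothetical names, not part of this tutorial's code):

package com.example.demo;

import javax.inject.Inject;

import org.eclipse.microprofile.graphql.GraphQLApi;
import org.eclipse.microprofile.graphql.Name;
import org.eclipse.microprofile.graphql.Query;

@GraphQLApi
public class SurfController {

    // Hypothetical repository bean; it could wrap JPA, JDBC, or a REST client.
    @Inject
    SurfConditionsRepository repository;

    @Query("surfReport")
    public SurfConditions getSurfReport(@Name("location") String location) {
        // Look the forecast up instead of generating it randomly.
        return repository.findByLocation(location);
    }
}

// The matching interface, also hypothetical:
interface SurfConditionsRepository {
    SurfConditions findByLocation(String location);
}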

Stop the server with Control-c and re-run.

mvn liberty:run

Try a couple of queries. Play around. Notice that you can specify a different location parameter in the search query. You can also specify which return values you need. In more complex situations, this is a great way to reduce network traffic.

http POST :9080/graphql query='{ surfReport(location:"Texas") {location,chanceOfRainPercent,windKnots,windDirection,swellHeight,swellPeriodSeconds} }'

HTTP/1.1 200 OK
...

{
    "data": {
        "surfReport": {
            "chanceOfRainPercent": 9,
            "location": "Texas",
            "swellHeight": 22,
            "swellPeriodSeconds": 8,
            "windDirection": "E",
            "windKnots": 14.561441929794396
        }
    }
}

http POST :9080/graphql query='{ surfReport(location:"Oregon") {windKnots,swellHeight,swellPeriodSeconds} }'

HTTP/1.1 200 OK
...

{
    "data": {
        "surfReport": {
            "swellHeight": 14,
            "swellPeriodSeconds": 13,
            "windKnots": 32.28906559297788
        }
    }
}

You’ve now got a working MicroProfile GraphQL app. The next step is to use Okta to add JWT authentication to your application.

Create an OpenID Connect Application

You should have already signed up for a free developer account with Okta. If you haven’t, please do so now by going to their website: https://developer.okta.com/signup. You’re going to use Okta to add JSON Web Token (JWT) authentication to your app.

TIP: You can also install the Okta CLI and run okta register to create an account.

Okta implements two standards that allow you to do this. First, Okta is an OAuth 2.0 provider. OAuth 2.0 is an authorization standard, meaning that it enables your application to verify user permissions. OpenID Connect (OIDC) is an authentication standard, meaning that it allows your application to verify the identity of the user or client service. Together they provide a full specification for authentication and authorization–everything you need for web security. As you will see, Okta implements this standard in a way that allows you to quickly and easily integrate these technologies into your application.

In this particular example, you’re implementing a web service that uses a JSON Web Token to verify the identity of web clients making requests. Each request is required to contain a valid token in an Authorization header. The token will be issued by Okta (you’ll do this manually via the OIDC Debugger) and will be validated against Okta using the MicroProfile JWT feature.

If you’re using the Okta CLI, run okta apps create. Then select 1 for Web, 2 for Other, and enter https://oidcdebugger.com/debug as the redirect URI. If you prefer to add a new application in your browser, continue following the steps below.

Open the Okta developer dashboard and log in.

Once you’ve logged in, you may need to click on the Admin button to get to the developer dashboard.

From the top menu, click on the Applications item, and then click the green Add Application button.

Select Web as the platform and click Next.

Give the app a name. I named mine “MicroProfile”, but you can call yours whatever you like.

Under Login redirect URIs, add a new URI: https://oidcdebugger.com/debug.

Click Done.

You configured Okta as an OAuth 2.0 OIDC provider. Take note of the Client ID because you’ll need it in a moment.

Add Groups and UPN Claims to Default Authorization Server

To enable role-based authorization, as well as to meet MicroProfile’s requirements for the JWT, you need to add two claims mappings to your default Okta authorization server: a groups claim and a upn claim. The groups claim mapping is what maps Okta’s groups to the role-based authorization in MicroProfile. MicroProfile requires the upn claim, and you’ll get an invalid token error without it. This is the “user principal name” as defined in the documentation.

From the Okta developer dashboard’s top menu, go to API and select Authorization Servers.

Click on the default server.

Select the Claims tab.

Click Add Claim.

Name: groups
Include in token type: Access Token, Always
Value type: Groups
Filter: Matches regex .*

Click Create.

Next, add a upn claim.

Click Add Claim.

Name: upn
Include in token type: Access Token, Always
Value type: Expression
Value: user.email

Click Create.

Generate a JSON Web Token

Every request to the secured API will require a valid JWT. Typically, the JWT is generated when a user signs in via a client application. In this case, there is no client application. Instead, you are going to use the OpenID Connect Debugger to generate a token. This web application allows you to perform a request for a JWT against Okta servers and inspect the results.

Open https://oidcdebugger.com.

Your Authorization URI is based on your developer account URI and will look like https://dev-123456.okta.com/oauth2/default/v1/authorize. Replace dev-123456 with your own Okta domain (just look at your dashboard URL).

The Redirect URI stays the same. This is the URI you entered into the Okta OIDC application settings.

Copy and paste the Client ID from the Okta OIDC application into the Client ID field.

The State parameter can be any value but cannot be empty. In production, it helps protect against cross-site request forgery.

Response type should be code.

Scroll down and click Send Request.

You should see a success screen with an authorization code.

You can use HTTPie to exchange this authorization code for an actual token. Fill in the values in brackets with your values: the authorization code, your Okta domain, your OIDC app client ID, and your OIDC app client secret.

http -f https://{yourOktaDomain}/oauth2/default/v1/token \
  grant_type=authorization_code \
  code={yourAuthCode} \
  client_id={clientId} \
  client_secret={clientSecret} \
  redirect_uri=https://oidcdebugger.com/debug

You should get a lengthy response that looks like the following (with sections omitted for brevity and clarity).

HTTP/1.1 200 OK
Connection: keep-alive
Content-Type: application/json;charset=UTF-8
...

{
    "access_token": "eyJraWQiOiJBX05XeGVXcVdrNG5pUjBFWlJnbWg5X3JJa...",
    "expires_in": 3600,
    "id_token": "eyJraWQiOiJBX05XeGVXcVdrNG5pUjBFWlJnbWg5X3JJa1Q3...",
    "scope": "openid",
    "token_type": "Bearer"
}

To inspect the JWT, copy the access token value and go to token.dev. Paste the value in the JWT String field.

You’ll see something like the following for the JWT Payload.

{ "ver": 1, "jti": "AT.H_Mdj-ejg9RZycCk1Dury7Cuh34gQBsIUrsxBqPmfV4", "iss": "https://dev-447850.okta.com/oauth2/default", "aud": "api://default", "iat": 1610043537, "exp": 1610047137, "cid": "0oa1mpynrqj5XGGgS4x7", "uid": "00u133jktIYEgC5ZR4x6", "scp": [ "openid" ], "sub": "andrew.hughes@email.com", "upn": "andrew.hughes@email.com", "groups": [ "Everyone" ] }

Two things to notice: first, the upn claim is one of the claims you added. The MicroProfile JWT specification requires this. If it’s not there, go back and add the claim again. It won’t work without it. Second, notice the groups claim. This will contain an entry for every group (or role) of which the user is a member.

Groups and roles are not really the same thing (in short: groups are groups of users, and roles are collections of permissions). I mention this because, in a moment, you’ll use an annotation called @RolesAllowed to specify Everyone as the allowed role. Everyone is Okta’s default group, of which all users are a member. For the purposes of this tutorial, we don’t care if it’s a group or a role, just that it’s a string passed from Okta through the groups claim that specifies an authorization. It needs to be there for the rest of the tutorial to work. If it’s not, go back and check the groups claim in your default Okta auth server.
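If you want to inspect these claims from inside the app, MicroProfile JWT can inject the verified token as a JsonWebToken. Here's a minimal sketch (the WhoAmIController and its whoami query are hypothetical additions, not part of this tutorial's code):

package com.example.demo;

import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;

import org.eclipse.microprofile.graphql.GraphQLApi;
import org.eclipse.microprofile.graphql.Query;
import org.eclipse.microprofile.jwt.JsonWebToken;

@GraphQLApi
@RequestScoped
public class WhoAmIController {

    // The caller's verified JWT, injected by the MicroProfile JWT feature.
    @Inject
    JsonWebToken jwt;

    @Query("whoami")
    public String whoami() {
        // getName() resolves to the upn claim; getGroups() returns the groups claim.
        return jwt.getName() + " in groups " + jwt.getGroups();
    }
}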

Save the access token to a shell variable.

TOKEN=eyJraWQiOiJBX05XeGVXcVdrNG5pUjBFWlJnbWg5X3JJ...

Leave this shell open. You’ll use it in a moment to make an authenticated request. First, you need to configure MicroProfile to use JWT authentication.

Configure MicroProfile for JWT Authentication

In src/main/liberty/config/server.xml, you need to replace these lines:

<!-- This is the keystore that will be used by SSL and by JWT. -->
<keyStore id="defaultKeyStore" location="public.jks" type="JKS" password="atbash" />

<!-- The MP JWT configuration that injects the caller's JWT into a ResourceScoped bean for inspection. -->
<mpJwt id="jwtUserConsumer" keyName="theKeyId" audiences="targetService" issuer="${jwt.issuer}"/>

With these lines:

<!-- Import default Java trust store for root certs -->
<ssl id="defaultSSLConfig" keyStoreRef="defaultKeyStore" trustStoreRef="defaultTrustStore" />
<keyStore id="defaultTrustStore" location="${javaKeystoreLocation}" type="JKS" password="changeit" />

<!-- Configure MicroProfile JWT Auth -->
<mpJwt id="myMpJwt"
       jwksUri="https://${oktaDomain}/oauth2/default/v1/keys"
       issuer="https://${oktaDomain}/oauth2/default"
       audiences="api://default" />

Add two properties to your pom.xml. You will need to add your own Okta domain for {yourOktaDomain}. For example, dev-6974382.okta.com.

<properties>
    ...
    <liberty.var.javaKeystoreLocation>${env.JAVA_HOME}/lib/security/cacerts</liberty.var.javaKeystoreLocation>
    <liberty.var.oktaDomain>{yourOktaDomain}</liberty.var.oktaDomain>
</properties>

In the server.xml file, you did two important things. First, you configured MicroProfile to use the default Java trust store and key store. You have to do this because when the app verifies the JWT, it will try to connect to the default Okta authorization server via TLS/SSL. For the TLS handshake to work, MicroProfile needs a trust store containing root certificates, and the default Java trust store (cacerts) ships with them already installed. Another option would be to use your own key store and import Okta’s certificates into the trust store.

The second thing happening in the server.xml changes is that MicroProfile JWT (<mpJwt></mpJwt>) is configured to use Okta to verify the JWTs.

Two values are used in server.xml: the Okta domain and the location of the Java key store. These values are set in pom.xml and passed to server.xml as system properties.

To add security to your GraphQL controller, add the @RolesAllowed annotation to the getSurfReport() method in the SurfController class.

src/main/java/com/example/demo/SurfController.java

import javax.annotation.security.RolesAllowed;

@GraphQLApi
public class SurfController {

    @RolesAllowed("Everyone") // <-- ADD ME
    @Query("surfReport")
    public SurfConditions getSurfReport(@Name("location") String location) {
        return SurfConditions.getRandom(location);
    }
}

With that annotation, only users or clients that are members of the Everyone group will be able to access the query.

Stop the server with Control-c and re-run.

mvn liberty:run

Go back to the shell where you saved your token. First, try a request without using the token.

http POST :9080/graphql query='{ surfReport(location:"Texas") {windKnots,swellHeight,swellPeriodSeconds} }'

HTTP/1.1 401 Unauthorized
...

Now, use your token.

http POST :9080/graphql query='{ surfReport(location:"Texas") {location,chanceOfRainPercent,windKnots,windDirection,swellHeight,swellPeriodSeconds} }' \
  "Authorization: Bearer $TOKEN"

You should be able to query the GraphQL server successfully.

All’s Well That Ends Authenticated

In this tutorial, you created a simple sample web application using MicroProfile and OpenLiberty. You first saw how to easily bootstrap a project using the MicroProfile starter before configuring the project to use GraphQL. Next, you secured the application using JWT authentication.

You can find the source for this example on GitHub, in the oktadeveloper/okta-microprofile-graphql-example repository.

For more interesting tutorials related to MicroProfile and GraphQL, see the following posts:

Build a REST API Using Java, MicroProfile, and JWT Authentication
Build a Java REST API with Java EE and OIDC
How to GraphQL in Java
Build a Secure API with Spring Boot and GraphQL
Build a Health Tracking App with React, GraphQL, and User Authentication

If you have any questions about this post, please add a comment below. For more awesome content, follow @oktadev on Twitter, like us on Facebook, or subscribe to our YouTube channel.


MyKey

MYKEY Weekly Report 33(January 4th~January 10th)

Today is Monday, January 11, 2021. The following is the 33rd issue of the MYKEY Weekly Report. Last week (January 4th to January 10th), there were 2 main updates:

1. MYKEY officially supported the DOT token

MYKEY officially supported the DOT token through third-party custody, with the custody service provided by Hashkey Hub. Click to read: https://bit.ly/3oo3BIh

2. The twenty-third MYKEY Crypto Stablecoin Report was published

We release the MYKEY Crypto Stablecoin Report every week, sharing our interpretation of the development status of stablecoins and analysis of their development trends, to help participants in the crypto market stay updated on stablecoins. The twenty-third Crypto Stablecoin Report was published on January 7th. Click to read: https://bit.ly/3s27Knr

!!! If you encounter any abnormal situation while using MYKEY, do not uninstall the MYKEY app; please contact the MYKEY Assistant, @mykeytothemoon on Telegram.

!!! Remember to store your 12-word recovery phrase safely, which you can export from [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY, even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory


Sunday, 10. January 2021

Identosphere Identity Highlights

Identosphere Weekly #14 - VCs for QiqoChat • SSI Architecture Stack • Peak Paradox

Thanks for joining us for another edition of Identosphere’s Weekly Update! A special thanks to our Patrons, who keep the wheels turning!


Feel free to share this publication with your friends and colleagues, who can sign up and read previous editions at newsletter.identosphere.net.

Support this weekly newsletter with a monthly contribution of your choice by visiting https://patreon.com/identosphere.

Upcoming Events

PIDapalooza 2021

The idea behind PIDapalooza is that our organizations (CDL, Crossref, DataCite, NISO & ORCID) have a shared love of persistent identifiers and the metadata that connects them. That's what connects us!

An open festival of persistent identifiers, January 27-28, 2021.

Markus Sabadello attended in 2019 and wrote DIDs are PIDs: Report from PIDapalooza

three of us DID people (Pam Dingle, Kaliya Young, and myself) independently submitted proposals to present on DIDs at the PIDapalooza conference in Dublin. Kaliya’s talk titled “DIDs are PIDs” was chosen by the organizers, but for logistical reasons I ended up going there instead. And I’m happy I had this opportunity!

KCLive: Unlocking Decentralized Identity

Feb 3, 2021 - 4:00 pm CET, 10:00 am EST, 7:00 am PST

Join prominent speakers across domains for a deep dive into the latest trends in the decentralized identity space and get insights into exciting concepts such as reusable identity and a credential-exchange economy

Decentralised Identity: What’s at Stake?

The discussion will focus on the INATBA Position Paper on Decentralised Identity […] contribute to identifying next steps and understanding political/regulatory buy-in to the idea and potential barriers

January 21st • Registration Link

Thoughtful Biometrics Workshop

1st week of February: Monday, Wednesday, and Friday, 9am-1pm PST / noon-5pm EST.

creating a space to dialogue about critical emerging issues surrounding biometric and digital identity technologies.

Register on EventBrite!

News

IdRamp and QiqoChat Announce Verifiable Credentials for Online Collaboration

QiqoChat has really stepped up in this time of need to provide an incredible online event user-experience, enabling a re-creation of the IIW experience throughout our Covid travel restrictions. This week they announced the launch of a Verifiable Credentials integration with the QiqoChat platform.

The community of professionals working on data privacy & consumer protection has been an early adopter of QiqoChat. During regional and global conferences, they have used the platform to share ideas and deliberate about the future of user-centric identity. Through these conferences, we’ve learned how solutions like IdRamp can be tremendously empowering for Internet users.

Spruce Developer Update #5

It is exciting to see what Wayne and his team are building.

At Spruce, we’re building a product suite to manage all aspects of the data supply chain.

Tezos DID Method - Specifies VC compatible DID creation and management

DIDKit - cross-platform toolkit for working with DIDs and VCs.

Credible - Spruce’s credential wallet.

Intake - onboarding tool for secure document collection and processing.

IBM Digital Health Pass

IBM is pitching the Digital Health Pass to support re-opening. 

the digital wallet can allow individuals to maintain control of their personal health information and share it in a way that is secured, verifiable, and trusted.  Individuals can share their health pass to return to the activities and things they love, without requiring exposure of the underlying personal data used to generate the credential.

TNO shares a verify-the-verifier use case in COVID-19 urgency: coercion of health credentials is bad for society

In practice, the issue of countermeasures against coercion has become more prominent and urgent in the context of the COVID-19 crisis. Here the 800-pound gorillas may be employers demanding health information that they are not entitled to, or even shops and restaurants, if the sharing of health data has become low friction thanks to verifiable credentials.

The article proposes including coercion countermeasures in governance frameworks:

Require authoritative verifier.

Require evidence collection. 

Require enabling anonymous complaints.

Require remote/proxy verification.

Require complying holder agent.

IATA unveils key design elements of travel pass

The IATA Travel Pass has three critical design elements:

The IATA Travel Pass stores encrypted data including verified test or vaccination results on the mobile device of the traveler. The traveler controls what information is shared from their phone with airlines and authorities. No central database or data repository is storing the information. By keeping travelers 100% in control of their information, the highest standards for data privacy are ensured. IATA Travel Pass is also built on the highest standards of data protection laws, including General Data Protection Regulation (EU GDPR).

Global standards recognized by governments to ensure verified identity and test/vaccine information.

Convenience and biosafety will be enhanced with integration into contactless travel processes. The ICAO CART recommendations for biosafety include the use of contactless travel processes to reduce the risk of virus transmission when documents need to be exchanged in the travel process.

InfoCert adheres to the GLEIF International Foundation's program for promoting vLEI

The vLEI is a cryptographically verifiable credential, per W3C standards, containing the LEI (Legal Entity Identifier), the identification code for legal entities made mandatory by MiFID II in order to operate on the financial markets. InfoCert, an LOU (Local Operating Unit) authorized by GLEIF, will adopt the vLEI as an identification standard within its DIZME ecosystem, the blockchain-based decentralized digital identity platform.

Ayanworks open sources Aries Mobile Agent SDK for Google Flutter

Exactly a year ago, in Jan 2020, we announced ARNIMA — the first ever Aries React Native Mobile Agent SDK that we made open source for the Self-Sovereign Identity ecosystem.

[...] We are very excited to announce one more small open-source contribution from AyanWorks to the Aries community.

Distributed Open Identity: Self-Sovereign OpenID: A Status Report

A follow-up to the Identiverse 2019 session “SSO: Self-sovereign OpenID Connect – a ToDo list”. (Decentralized Identity, Mobile, Verified Claims & Credentials, Standards, Preeti Rastogi, Nat Sakimura)

Indian Data Legislation Revisiting the non-personal data governance framework

In July 2020, an expert committee established by the Ministry of Electronics and Information Technology (MEITY) released a report on the Non-Personal Data (NPD) governance framework for India. The document is well-intentioned in that it recognises the public value of data, and the need to democratise its use.

Potential Impacts of Draft India Personal Data Protection Bill (PDPB) (Deloitte)

Capitalizing on Self-Sovereign Identity for Machines [Part One]

By providing a means to globally define an indisputable link between a machine and its machine identity across different sites, networks and businesses, we can secure IoT like never before.

The filancore integration for Verifiable Credentials is available now. You can learn more from the Venafi Marketplace.

Blogs

How Exactly Are Verifiable Credentials Making the World Better?

This post shares six stories of how verifiable credentials can improve the lives of everyday people:

Ajay is an Uber driver in San Francisco. He wants to try various temporary jobs while he’s studying but joining Lyft, Postmates and other platforms requires going through a long and tedious background verification and car certification process over and over again. 

Peak Paradox

Tony Fish follows up on an article he wrote with Kaliya about reconciling the differences between the rules and our principles.

This article explores how to look for a paradox within our known cognitive bias and how to identify and manage differences.

Ben Werdmuller blogs, and he encourages us to blog too:

I've lost any fear of looking stupid, mostly through enough repetitive practice of absolutely being stupid online.

After you’ve gotten your blog up, be sure to submit its RSS feed to our blogroll.

Building decentralized social media

People, in general, want convenience from their technology, not morality. So instead of building a more ethical version of the past, we need to build a more suitable version of the future.

Whatever we're building, we never absolve ourselves from the need to understand our users as people and meet their needs.

The new age of privacy

Privacy is a human right. Surveillance has a chilling effect on free speech and freedom of association, which we consider to be fundamental tenets of democracy. Sure, you can make a bunch of money by learning everything you can about an individual and selling access to their attention. But not everything that is profitable should be permissible.

Podcasts

Kaliya on Ubisecure

The Domains of Identity and SSI with “Identity Woman”, Kaliya Young 

Kaliya and Oscar discuss the long-running Internet Identity Workshop (IIW) that she co-founded, the effects of moving to virtual identity conferences in 2020, insights from Kaliya’s books – ‘The Domains of Identity’, newly published in 2020, and ‘A Comprehensive Guide to Self Sovereign Identity’ – plus some great tips for all business leaders on how to view the role of identity in their organisation.

Hygiene for a computing pandemic 

This episode of FOSS and Crafts features Christopher Lemmer Webber discussing the object capability security approach. It’s a generalization not specific to VCs, continuing the conversation on the CCG mailing list, Hygiene for a computing pandemic: separation of VCs and ocaps/zcaps, which we shared last month.

The podcast show-notes include an epic list of references supporting the discussion.

GlobalID starts a podcast! EPISODE 01—The SEC’s crypto turf war and why XRP isn’t a security

We’re thrilled to start the new year by sharing the (impromptu) premiere episode of The GlobaliD Podcast with Greg Kidd. We’ll have new episodes every two weeks.

Organization News

2021 OpenID Foundation Individual Community Board Members Election

The OpenID Foundation plays an important role in the interoperability of Internet identity. This is to announce the OpenID Foundation individual community board members 2021 election schedule. Those elected will help determine the role the Foundation plays in facilitating the creation and adoption of open identity standards.

The Value of the Pan-Canadian Trust Framework

Kaliya wrote a great article with Joni Brennan of DIACC about how to re-open our economy while protecting privacy.

Without transparent operational guidance, people’s privacy and personal freedoms may be compromised. By having a set of operational rules, decision makers will have the capacity to make better decisions that will enable the public to trust that the tools being implemented have been designed to respect their best interests.

Lissi shares:

#SSI Architecture Stack & Communication efforts - by Rouven Heck - presented at #IIW30 @idworkshop and updated by #DIF @DecentralizedID

https://github.com/decentralized-identity/decentralized-identity.github.io/blob/master/assets/ssi-architectural-stack--and--community-efforts-overview.pdf

Not SSI

Torus: DKMS and Login for Web3

Leonard Tan & Yong Zhen Yu recently appeared on Epicenter Podcast to discuss their decentralized key management system, leveraging OAuth2 and WebAuthN to bring improved log-in and recovery for blockchain\DeFi applications. 

Did you know that California has a Data Broker Registry?

California law requires a data broker, as that term is defined in California Civil Code § 1798.99.80, to register with the Attorney General on its internet website that is accessible to the public, on or before January 31 following each year in which a business meets the definition of a data broker.

Thanks again, for stopping by!

Support this weekly newsletter with a monthly contribution of your choice by visiting https://patreon.com/identosphere.

Saturday, 09. January 2021

Coinfirm

OCC: Banks’ Stablecoin Payments & Running Nodes

Banks in the jurisdiction of the US can now perform more activities around stablecoins and nodes. A notice by the Office of the Comptroller of the Currency (OCC) issued last week further paves the way for the mass adoption of the crypto industry. Entitled OCC Chief Counsel’s Interpretation on National Bank and Federal Savings Association...

Friday, 08. January 2021

Coinfirm

FinCEN’s Self-Hosted Wallet KYC Regulation Proposal

The Financial Crime Enforcement Network (FinCEN) – the US’ Financial Investigative Unit that combats terrorism financing and money laundering domestically and internationally – put out a proposed ruling on self-hosted wallets and their KYC requirements for transactions with VASPs in December 2020, entitled Requirements for Certain Transactions Involving Convertible Virtual Currency or Digital Assets. Here...

1Kosmos BlockID

Social Logins: The Other Side of The Coin

We’ve all been exposed to social logins. Social logins allow users like you and me to access websites or create accounts on websites by using existing social account credentials that we’ve already created with Google, Facebook, LinkedIn and many others. The idea behind the use of social logins is to simplify the sign-in and registration processes to provide seamlessness and convenience compared to having to create a brand-new, stand-alone account to register with a specific website.


www.bloki-chain.com

Position Paper: Blockchain for SMEs (Position – Paper Blockchain per le PMI)

Confindustria Digitale’s 2020 position paper on blockchain. The article Position – Paper Blockchain per le PMI originally appeared on www.bloki-chain.com.

Okta

Easily Consume a GraphQL API from React with Apollo

GraphQL is an incredibly powerful query language for APIs that helps improve performance and extensibility in your APIs. The query language is designed to allow developers to query exactly the data they need. As your API grows in size and scope, current consumers are unaffected by changes since their queries should return the same data.

Apollo Client is a state management library for JavaScript. It fits seamlessly into React applications and can handle fetching, caching, and modifying application data.

In this tutorial, you will create a small React application that uses Apollo Client to query a GraphQL API containing data on SpaceX’s launches. You will display an overview of launch histories to the user and allow the user to drill down into a specific launch. To secure the application, you will use Okta’s okta-react library, which makes setting up your authentication easy.

Create your Okta Application

The first thing you will need is a free developer account from Okta. If you don’t have one, you can sign up for one here. Once you have completed that, navigate to your developer console, click on Applications, and then Add Application. Select Single Page App and click Next.

Give your application a meaningful name. You will also need to change your URIs to http://localhost:3000, as this is the default in React.

Click on Done. On the application management page, you will be presented with your Client ID. Make note of this, as you will need it in your application.

Create Your Apollo and React Application

Next, you will create your web application. Navigate to the parent directory where your application will be stored and use the command npx create-react-app apollo-demo. After your application is finished scaffolding, you will need to install the required dependencies. First, you will use bootstrap and react-bootstrap, because they make creating web pages simple.

npm i bootstrap@4.5.3
npm i react-bootstrap@1.4.0

Next, you will need to get Okta’s core JavaScript and React packages. With these packages, setting up your authentication with Okta only takes a few short minutes.

npm i @okta/okta-auth-js@4.4.0
npm i @okta/okta-react@4.1.0

You will need the graphql and @apollo/client packages to connect to the GraphQL API that you will use for your data.

npm i @apollo/client@3.3.4
npm i graphql@15.4.0

Finally, you will use dotenv to store your sensitive data during development.

npm i dotenv@8.2.0

With dotenv installed, you can create a new file in the root of your application called .env and add the following code to it.

REACT_APP_OKTA_CLIENTID={yourClientId}
REACT_APP_OKTA_URL_BASE={yourOktaDomain}
REACT_APP_OKTA_APP_BASE_URL=http://localhost:3000

Now you want to start building your application by editing the App.js file. Open App.js and replace the code with the following.

import React from 'react';
import { BrowserRouter as Router } from 'react-router-dom';
import AppWithRouterAccess from './AppWithRouterAccess';
import 'bootstrap/dist/css/bootstrap.min.css';
import { ApolloClient, InMemoryCache } from '@apollo/client';
import { ApolloProvider } from '@apollo/client';

const apolloClient = new ApolloClient({
  uri: 'https://api.spacex.land/graphql/',
  cache: new InMemoryCache()
});

const App = () => {
  return (
    <Router>
      <ApolloProvider client={apolloClient}>
        <AppWithRouterAccess />
      </ApolloProvider>
    </Router>
  );
}

export default App;

Most of this should look fairly familiar. You are importing the bootstrap CSS files at this level. You also need to instantiate your apolloClient and pass it into the ApolloProvider component that wraps AppWithRouterAccess. Per Apollo’s documentation, you should put the ApolloProvider as high up in your application as necessary so that all the children components can access it. The AppWithRouterAccess contains the logic for implementing Okta and securing routes.

Create a new file in your src directory called AppWithRouterAccess.jsx and add the following code.

import React from 'react';
import { Route } from 'react-router-dom';
import { Security, SecureRoute, LoginCallback } from '@okta/okta-react';
import { OktaAuth } from '@okta/okta-auth-js';
import Home from './Pages/Home';
import Blastoff from './Pages/Blastoff';

const AppWithRouterAccess = () => {
  const issuer = process.env.REACT_APP_OKTA_URL_BASE + '/oauth2/default';
  const clientId = process.env.REACT_APP_OKTA_CLIENTID;
  const redirect = process.env.REACT_APP_OKTA_APP_BASE_URL + '/callback';

  const oktaAuth = new OktaAuth({
    issuer: issuer,
    clientId: clientId,
    redirectUri: redirect
  });

  return (
    <Security oktaAuth={oktaAuth}>
      <Route path='/' exact={true} component={Home} />
      <SecureRoute path='/Blastoff' component={Blastoff} />
      <Route path='/callback' component={LoginCallback} />
    </Security>
  );
};

export default AppWithRouterAccess;

Here you are creating a new instance of OktaAuth and passing it into your Security component that is provided by Okta. The SecureRoute component is used to check if the user is logged in. If the user is not, they will be redirected to the Okta login page.

Now you can focus on creating your pages and components. First, add two new folders to your src directory: Pages and Components. In Pages, add a new file called Home.jsx and add the following code to it.

import React from 'react';
import { Link, Redirect } from 'react-router-dom';
import Header from '../Components/Header';
import { Container, Row, Col, Card } from 'react-bootstrap';
import { useOktaAuth } from '@okta/okta-react';

const Home = () => {
  const { authState } = useOktaAuth();

  return (authState.isAuthenticated ?
    <Redirect to={{ pathname: '/Blastoff' }} /> :
    <Container>
      <Header></Header>
      <Row>
        <Col sm={12} className="text-center">
          <h3>BlastOff!</h3>
          <h4>A Look at SpaceX Launch History</h4>
          <br></br>
          <h5>A React Demo Using <a target="_blank" rel="noreferrer" href="https://www.apollographql.com/docs/react/">Apollo Client</a><br />Secured With <a target="_blank" rel="noreferrer" href="https://www.okta.com/">Okta</a></h5>
          <h5><a href="https://api.spacex.land/graphql/" target="_blank" rel="noreferrer">GraphQL Data Available here</a></h5>
        </Col>
      </Row>
      <br></br>
      <Row>
        <Col sm={12} className="text-center">
          <Card style={{ width: '21.5em', margin: '0 auto' }}>
            <Card.Header>Already have an Okta Account?</Card.Header>
            <Card.Body>
              <Link to='/Blastoff'>Login Here</Link>
            </Card.Body>
          </Card>
        </Col>
      </Row>
      <footer className="text-muted text-center">
        <div className="container">
          <p>A Small demo using <a target="_blank" rel="noreferrer" href="https://www.apollographql.com/docs/react/">Apollo Client</a> Secured by <a target="_blank" rel="noreferrer" href="https://www.okta.com/">Okta</a></p>
          <p>By <a href="https://profile.fishbowlllc.com">Nik Fisher</a></p>
        </div>
      </footer>
    </Container>
  );
};

export default Home;

This page serves two major needs. First, it provides a landing page for your users and gives some further reading about the technologies used on the site. Second, it contains a redirect to the Blastoff page if the user is already logged in. This second part isn’t strictly necessary but you can change the workflow if you feel so inclined.

Next, you can implement the code for the Blastoff page. Add a new file to your Pages directory called Blastoff.jsx and add the following code.

import React, { Component } from 'react';
import Header from '../Components/Header';
import { Container } from 'react-bootstrap';
import Histories from '../Components/Histories';
import History from '../Components/History';

class Blastoff extends Component {
  constructor(props, context) {
    super(props, context);
    this.state = {
      loading: false,
      showHistory: false,
      shownHistoryId: -1
    };
    this.onHistorySelected = this.onHistorySelected.bind(this);
    this.onReturnToHistories = this.onReturnToHistories.bind(this);
  }

  onHistorySelected(id) {
    this.setState({ showHistory: true, shownHistoryId: id });
  }

  onReturnToHistories() {
    this.setState({ showHistory: false, shownHistoryId: -1 });
  }

  render() {
    if (this.state.loading) {
      return (
        <Container>
          <Header></Header>
          <h4>Loading, please wait.</h4>
        </Container>
      );
    }

    if (this.state.showHistory) {
      return (
        <Container>
          <Header></Header>
          <History id={this.state.shownHistoryId} onReturnToHistories={this.onReturnToHistories}></History>
        </Container>
      );
    }

    return (
      <Container>
        <Header></Header>
        <Histories onHistorySelected={this.onHistorySelected}></Histories>
      </Container>
    );
  }
}

export default Blastoff;

This page is the showcase of your application. Users will land on this page and be able to see all the detail your application offers. It has two modes, one to show all the histories available and one to show the details on a specific history. When the user first lands on this page, you will display the Histories component. If the user clicks on the title of a launch they will be presented with the details of that launch on a different component.

Next, add a new file called Histories.jsx to your Components directory. The code for that component is as follows.

import { gql, useQuery } from '@apollo/client';
import { Container } from 'react-bootstrap';

function Histories({ onHistorySelected }) {
  const HistoriesQuery = gql`{
    histories {
      id
      details
      links {
        article
      }
      flight {
        id
        mission_name
      }
    }
  }`;

  const { loading, error, data } = useQuery(HistoriesQuery);

  if (loading) {
    return (
      <Container>
        <img src="https://media1.tenor.com/images/3f1d85ab9951d0db65e797c7f40e89cc/tenor.gif" alt="Loading"></img>
      </Container>
    );
  } else {
    return (
      <Container>
        <table className="table table-striped">
          <thead>
            <tr>
              <th>Mission Name</th>
              <th>Details</th>
              <th>Article</th>
            </tr>
          </thead>
          <tbody>
            {data.histories.map((history, i) => {
              return <tr key={i}>
                <td><a href="#" onClick={() => onHistorySelected(history.id)}>{history.flight == null ? "Unnamed" : history.flight.mission_name}</a></td>
                <td>{history.details}</td>
                <td><a href={history.links.article} target="_blank" rel="noreferrer">Read Article</a></td>
              </tr>;
            })}
          </tbody>
        </table>
      </Container>
    );
  }
}

export default Histories;

This component does the heavy lifting for displaying the histories to the user. Since you are using the useQuery hook, you will need to use it in a React function component. From this hook, you can get the loading, error, and data values. These allow you to build the component itself. If the data is still loading, you can present the user with a waiting gif of a classic space race moment. If there is an error you can display a brief error message. And once the data is returned, you can build a table with the data you wish to display to the user. Note here that the first column contains the mission_name and an onClick handler to tell the Blastoff page to load the detail.

For a detailed history, add a new file called History.jsx to the Components folder and add the following code.

import { gql, useQuery } from '@apollo/client';
import { Container, Row } from 'react-bootstrap';

function History({ id, onReturnToHistories }) {
  const HistoryQuery = gql`
    query history($historyId: ID!) {
      history(id: $historyId) {
        details
        event_date_unix
        flight {
          rocket {
            rocket_name
          }
          launch_date_utc
          launch_site {
            site_name
          }
          launch_success
        }
        event_date_utc
      }
    }
  `;

  const { loading, error, data } = useQuery(HistoryQuery, {
    variables: { historyId: id },
  });

  if (loading) {
    return (
      <Container>
        <img src="https://media1.tenor.com/images/3f1d85ab9951d0db65e797c7f40e89cc/tenor.gif" alt="Loading"></img>
      </Container>
    );
  }

  if (error) {
    console.log(JSON.stringify(error, null, 2));
    return <div>error</div>;
  } else {
    let successLabel;
    if (data.history.flight.launch_success) {
      successLabel = <span className="text-success">Success!</span>;
    } else {
      successLabel = <span className="text-danger"> Failed ):</span>;
    }

    return (
      <Container>
        <Row>
          <div className="col-lg-3">
            <button className="btn btn-primary" onClick={() => onReturnToHistories()}>Return</button>
          </div>
        </Row>
        <Row>
          <div className="col-lg-3">Launch Time UTC:</div>
          <div className="col-lg-3">{data.history.event_date_utc}</div>
          <div className="col-lg-3 text-right">Success?</div>
          <div className="col-lg-3">{successLabel}</div>
        </Row>
        <Row>
          <div className="col-lg-3">Launch Site:</div>
          <div className="col-lg-3">{data.history.flight.launch_site.site_name}</div>
        </Row>
        <Row>
          <div className="col-lg-3">Rocket Name:</div>
          <div className="col-lg-3">{data.history.flight.rocket.rocket_name}</div>
        </Row>
        <Row>
          <div className="col-lg-3">Details:</div>
        </Row>
        <Row>
          <div className="col-lg-12">{data.history.details}</div>
        </Row>
      </Container>
    );
  }
}

export default History;

Again, you obtain the loading, error, and data values from the useQuery hook, but this time you pass a variable into the gql query, because you only want the data for one history. The query itself is also a little different, as it fetches more detail than the overview. This is one of the powerful features of GraphQL: you only request the data you need, which reduces overhead.

Finally, in your components folder, add a file called Header.jsx. This file will provide links for your users as well as the logic for a login/logout button.

import React from 'react';
import { useOktaAuth } from '@okta/okta-react';
import { Navbar, Nav, Form, Button } from 'react-bootstrap';

const Header = () => {
  const { oktaAuth, authState } = useOktaAuth();

  if (authState.isPending) {
    return <div>Loading...</div>;
  }

  const button = authState.isAuthenticated ?
    <Button variant="secondary" onClick={() => { oktaAuth.signOut("/") }}>Logout</Button> :
    <Button variant="secondary" onClick={() => { oktaAuth.signInWithRedirect("/Blastoff") }}>Login</Button>;

  return (
    <Navbar bg="light" expand="lg">
      <Navbar.Brand href="/">BlastOff!</Navbar.Brand>
      <Navbar.Toggle aria-controls="basic-navbar-nav" />
      <Navbar.Collapse id="basic-navbar-nav">
        <Nav className="mr-auto">
          <Nav.Link href="/">Home</Nav.Link>
          <Nav.Link href="/Blastoff">History</Nav.Link>
        </Nav>
        <Form inline>
          {button}
        </Form>
      </Navbar.Collapse>
    </Navbar>
  );
};

export default Header;

Test Your Apollo and React Application

Your code is now complete. Use the command npm run start to start your application. First, you will see the Home page. Click on either Login or Blastoff!. You will be presented with the login screen hosted by Okta. Log in with your Okta account and you will be directed to the Blastoff page to review the launches from SpaceX.

Learn More About React and GraphQL

Using GraphQL you can create data-rich applications with minimal overhead. Apollo offers many products relating to GraphQL and I recommend you check them out. There are many great, open data sources using GraphQL. Try to build an application using one of them yourself.

A Quick Guide to Integrating React and GraphQL
Quickly Consume a GraphQL API from React
Build a Simple React Application Using Hooks

Make sure you follow us on Twitter and subscribe to our YouTube channel. If you have any questions, or you want to share what tutorial you’d like to see next, please comment below.

Thursday, 07. January 2021

OWI - State of Identity

Global Data Consortium: Going Beyond KYC

Global Data Consortium Co-founder Charles Gaddy joins host Cameron D'Ambrosi to discuss the biggest highlights of the year for identity verification as a compliance solution. Plus: What will 2021 bring to the identity industry? Find out in this episode!


MyKey

Tips on Network Fees for Ethereum Accounts

Hi everyone,

The high ETH Gas increases the cost of recovering accounts for Ethereum users. It is necessary to remind you to reduce unnecessary costs and better to use MYKEY. The following are tips for you:

A network fee is required to recover an account using the recovery phrase.

Recovering accounts with the Ethereum feature enabled is costly at current Ethereum network fees.

It is recommended to use an idle phone to back up your accounts to prevent losing them. If you don’t have an extra phone, you can use a camera to save photos of the synchronized QR code.

How to Back Up the Account: https://bit.ly/38lrhXY

MYKEY Lab will keep improving to provide you with a better product experience.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory



Uncertainty remains under the continuous growth of Stablecoins, Regulatory restrictions and the…

Uncertainty remains under the continuous growth of Stablecoins, Regulatory restrictions and the open letter of Circle

Original link: https://bihu.com/article/1999493307

Original publish time: January 6, 2021

Original author: Xiang Yao, researcher of MYKEY Lab

We released the MYKEY Crypto Stablecoin Report to share our interpretation of the development status of stablecoins and analysis of their development trends, helping participants in the crypto market stay up to date. We look forward to maintaining communication with the industry and exploring the development prospects of stablecoins together; please feel free to leave suggestions.

Quick Preview

The market capitalization of major stablecoins increased from $24.66 billion to $28.25 billion, a monthly average increase of 11.5%.

Last month, the circulation of USDT, USDC, DAI, and BUSD increased by 1.850 billion, 940 million, 110 million, and 293 million respectively; the circulation of the remaining stablecoins decreased slightly.

The circulation of USDT exceeded $20 billion, the circulation of USDC reached $4 billion, and the circulation of BUSD approached $1 billion.

Members of Congress proposed the STABLE Act, which is intended to regulate stablecoin-related behaviors and has caused strong controversy.

1. Overview of Stablecoin Data

First, let’s review the changes in the basic information of the various stablecoins in the past month (December 1, 2020 ~ December 31, 2020, same below).

Market Circulation

Source: MYKEY, CoinMarketCap, Coin Metrics

At present, the market capitalization of major stablecoins has increased by $2.99 billion to $28.25 billion. The average daily market capitalization in December was $26.63 billion, an increase of $2.758 billion from the past month. The figure below shows the monthly average change of the circulation of the stablecoin, that is, the daily average circulation in December minus the daily average in November.

Source: MYKEY, Coin Metrics

Among them, the main growth comes from Tether, which issued an additional 1.850 billion USDT: 1.0 billion on Ethereum and 850 million on TRON. The circulation of USDC increased by 940 million, reaching $4 billion on December 31, 2020 (per the Circle official website), of which 3.913 billion was on Ethereum and the rest on Solana and Algorand. It should be emphasized that most media, including Forbes, only count the data on Ethereum, which is not accurate. The market capitalization of the remaining stablecoins has not changed much.

The Number of Active Addresses

Source: MYKEY, Coin Metrics

Last month, the average daily active addresses of major stablecoins were 227,000, an increase of 0.12% from the past month.

The Number of 24-hour Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Compared with the previous month, the number of daily transactions of major stablecoins increased by an average of 2.50%. Among them, the growth rate of DAI is remarkable, from an average of 14,000 daily transactions to 48,300, an increase of 45.2%.

The 24-hour Volume of Transactions on the Public Blockchains

Source: MYKEY, Coin Metrics

Source: MYKEY, Coin Metrics

The daily volume of transactions of major stablecoins increased by $903 million, an increase of 18.32%.

2. The boundaries of stablecoins continue to expand

The stablecoin ecosystem is still booming. The amount of funds has steadily increased: the monthly supply of major stablecoins grew by $3 billion, of which USDC increased by $1 billion in one month, reaching a total supply of $4 billion, while the total supply of BUSD approached $1 billion. Issuers and platforms are also becoming more diverse: the euro stablecoin EURB launched on Stellar, and HUSD announced that it would be deployed on Nervos.

The supply of major stablecoins increased by $3 billion monthly

In December 2020, the supply of major stablecoins increased by $3 billion, with the main contributions coming from USDT and USDC ($1.85 billion and $940 million, respectively). The total supply of stablecoins reached $28.2 billion, doubling in 5 months.

The Euro stablecoin EURB is launched on Stellar

On December 9, 2020, the Stellar official website announced that the euro stablecoin EURB had launched. This is the first stablecoin directly issued by a banking institution on Stellar, and the first issuance of this type of stablecoin in the market. The blockchain technology service provider Bitbond handles the issuance and redemption of EURB on Stellar, while the euro reserves are held in custody by Bank von der Heydt (BVDH). Philipp Doppelhammer, a member of BVDH, said that EURB would be integrated into DTransfer, a Stellar-based payment service provided by SatoshiPay.

Bitbond issued a Security Token approved by BaFin in 2019.

HUSD will be deployed on Nervos

On December 8, 2020, according to the official blog of Nervos, the compliant stablecoin HUSD will be deployed on Nervos, becoming the first fiat-collateralized stablecoin on Nervos; Nervos will also become the first permissionless blockchain outside of Ethereum on which HUSD is deployed.

HUSD Token is a compliant stablecoin issued by the financial technology startup Stable Universal in cooperation with Paxos Trust. It is anchored 1:1 to the U.S. dollar; the U.S. dollar assets are held in custody by Paxos Trust and audited and certified by a top American audit firm.

3. The stablecoin bill caused controversy

Regulatory issues for stablecoins continue to heat up. On December 4, members of the U.S. Congress proposed the ‘STABLE Act’ to regulate the issuance of stablecoins, ‘protect consumers from risks brought by emerging digital payment tools’, and prevent the emergence and growth of abusive, opaque, stablecoin-based shadow banking systems. The bill is pending approval, and there is no doubt that this proposal has caused strong controversy in the cryptocurrency community.

The STABLE Act attempted to restrict stablecoins

According to documents on the official website of the US Congress and reports from Forbes, the ‘STABLE Act’ contains the following requirements. First, the issuers of stablecoins need to comply with state laws or currency transfer laws; second, they must fully comply with existing banking regulations; in addition, issuers may not issue stablecoins until six months after obtaining approval from the Federal Reserve and the FDIC; finally, issuers need to obtain FDIC insurance or directly deposit reserves with the Federal Reserve. The approach of the current issuers is to cooperate with FDIC-covered financial institutions.

The proposal is pending approval, and another Forbes report explains it. The article argues that the current stablecoin ecosystem has two major risks. One is KYC/AML: users holding stablecoins cannot belong to a forbidden set, and holdings should be associated with their identities for reporting to the tax system. The other is the equivalent exchange relationship between stablecoins and the US dollar, which needs to be achieved through full reserves. This may be the motivation for congressmen to sponsor the bill.

Controversy continues in the cryptocurrency community

Although the bill has not yet been approved, it has undoubtedly caused controversy in the cryptocurrency community, especially among issuers of stablecoins. Circle CEO Jeremy Allaire said on his personal Twitter that the bill is a huge step backward for digital currency innovation in the United States and restricts the development of the blockchain and financial technology industries. He believes that the cryptocurrency industry is providing solutions that fundamentally improve the speed, availability, cost, and efficiency of U.S. and global payment and banking services, and that Congress should support open innovation rather than forcing non-bank fintech companies to bear the huge regulatory burden of the Federal Reserve and FDIC. The technology and governance standards of stablecoins should be negotiated, and new forms of regulation should be considered. Circle looks forward to constructive cooperation with federal agencies, and the private sector will continue to innovate in this area. Jeremy Allaire also sent an open letter to the U.S. Department of the Treasury, stating the development status of USDC and offering a response to FinCEN’s proposed non-custodial wallet transaction reporting rules.

According to CoinTelegraph, other opinion leaders also expressed opposition. Meltem Demirors, chief strategy officer of CoinShares, said that ‘cryptographic currency reduces the cost of providing services to people who have historically been excluded from the banking sector. If the bill is introduced, service costs will increase, compliance will become more difficult, and it will, in turn, put the cart before the horse’. Tyler Lindholm, a member of the Wyoming House of Representatives, believes that the bill violates the basic spirit of a decentralized system: the cryptocurrency industry has provided financial services for people without bank accounts, and centralized power is not suitable for a decentralized world. ShapeShift CEO Erik Voorhees also believes that the bill is doomed to fail and that the crypto industry cannot be forced to operate as banks.

Tips

That’s all for this MYKEY Crypto Stablecoin Report; stay tuned for follow-up reports. We will provide more analysis of the development status, trends, and international impact of stablecoins to help readers stay up to date.

PS: MYKEY Lab has the final right to interpret the content of this article; please indicate the source when quoting. Welcome to follow our official account — MYKEY Lab: MYKEY Smart Wallet.

Past Review

Crypto Stablecoin Report 18: The market capitalization of stablecoins increased to $18.53 billion, The rise of CBDC

Crypto Stablecoin Report 19: The market capitalization of stablecoins reached $19.86 billion, The latest development of several decentralized stablecoin protocols

Crypto Stablecoin Report 20: The market capitalization of stablecoins reached $20.62 billion, Stablecoin regulatory policy of the United States

Crypto Stablecoin Report 21: Stablecoin stepped out of Ethereum, Global stablecoins triggered the regulatory awakening

Crypto Stablecoin Report 22: The Supply of DAI Exceeded 1 billion, USDC was Used to Aid Venezuela


Wednesday, 06. January 2021

Spruce Systems

Spruce Developer Update #5

At Spruce, we’re building a product suite to manage all aspects of the data supply chain. Here’s the latest from our development efforts:

Tezos DID Method

The Tezos DID Method specifies how Tezos can be used for DID creation and management, compatible with the issuance, storage, and verification of Verifiable Credentials.

We are still collecting feedback on the specifications of the DID method as well as the on-chain DID manager.

DIDKit

DIDKit is a cross-platform toolkit for working with W3C Decentralized Identifiers (DIDs) and Verifiable Credentials (VCs).

Packaged DIDKit into an npm package so it can run on the backend for a Node.js application.

Started building DIDKit for WebAssembly, for use in web browsers.

Implemented did-tezos resolution layer 1 to support tz1/tz2/tz3 without the DID manager smart contract or off-chain updates.

Implemented JSON-LD to RDF serialization to support arbitrary Linked Data verifiable credentials, with compile-time context loading.

Tested our implementation of JSON-LD to RDF serialization and JSON-LD Dataset normalization with W3C test suites, discovering and fixing bugs along the way.

We are increasing cryptographic agility so that the user can choose their own cryptographic function implementations.

Credible

Credible is Spruce’s native credential wallet for the consumption, storage, and presentation of Verifiable Credentials on Android and iOS.

Our efforts on Credible are currently focused on making sure did-tezos resolution layer 1 is the default, working DID method in Credible.

Intake

Intake is a smarter onboarding tool for businesses via secure document collection and processing. These artifacts can then be used as evidence to generate and issue credentials to the counterparty that originally uploaded them.

We’ve completed our initial sprint on Intake, creating an app capable of basic onboarding via documents and fields through form creation.

We are designing the models for forms and submissions, using JSON Schema, with the possibility of enriching them using semantics and linked data.

We are designing the implementation of workflow state machines, aiming for a robust but expressive system, studying general-purpose workflow engines and models such as Petri nets (a toy sketch of the state-machine pattern follows below).
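
To make that pattern concrete, here is a minimal, self-contained sketch of a workflow state machine in Kotlin. This is purely an editor's illustration of the general technique (fixed states plus whitelisted transitions), not Spruce's implementation; every name in it is invented.

// A toy onboarding workflow: submissions move through fixed states,
// and only whitelisted transitions are allowed.
enum class SubmissionState { DRAFT, SUBMITTED, IN_REVIEW, APPROVED, REJECTED }

object OnboardingWorkflow {
    // Each state maps to the set of states it may legally transition to.
    private val transitions = mapOf(
        SubmissionState.DRAFT to setOf(SubmissionState.SUBMITTED),
        SubmissionState.SUBMITTED to setOf(SubmissionState.IN_REVIEW),
        SubmissionState.IN_REVIEW to setOf(SubmissionState.APPROVED, SubmissionState.REJECTED),
        SubmissionState.REJECTED to setOf(SubmissionState.DRAFT) // allow resubmission
    )

    fun advance(from: SubmissionState, to: SubmissionState): SubmissionState {
        require(to in transitions[from].orEmpty()) { "Illegal transition: $from -> $to" }
        return to
    }
}

fun main() {
    var state = SubmissionState.DRAFT
    state = OnboardingWorkflow.advance(state, SubmissionState.SUBMITTED)
    state = OnboardingWorkflow.advance(state, SubmissionState.IN_REVIEW)
    println(OnboardingWorkflow.advance(state, SubmissionState.APPROVED)) // APPROVED
}

A Petri net generalizes this by allowing multiple tokens and concurrent transitions, which is one reason it suits document-collection workflows where several artifacts progress independently.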

If you would like to discuss how we would deploy the architecture described above for a specific use case, please take 30 seconds to leave us a message, and we will respond within 24 hours.

Follow us on Twitter

Follow us on LinkedIn


Forgerock Blog

Behind-the-Scenes of Virtual Banking

In today’s digital world, we no longer have to go to a branch office or ATM machine to do the majority of our banking. For the most part, consumers can manage all bank transactions from the comfort of their home on mobile devices, especially during the COVID-19 pandemic. However, virtual banking has the potential to be so much more than just cashing checks. Many organizations around the world offe

In today’s digital world, we no longer have to go to a branch office or ATM to do the majority of our banking. For the most part, consumers can manage all bank transactions from the comfort of their home on mobile devices, especially during the COVID-19 pandemic. However, virtual banking has the potential to be so much more than just cashing checks. Many organizations around the world offer financially empowering services such as budgeting and saving plans to meet specific goals for their customers, like buying a house or car. However, quite a bit of work goes on behind the scenes to make such ventures succeed, particularly from an identity standpoint.

Virtual Banking Industry Growth

The global digital banking platform market is on track to reach $9 billion by 2026. The concept of the virtual bank is largely credited with beginning in Hong Kong, rapidly expanding to the rest of Asia and then the rest of the world. Every institution, from established global banks like HSBC to startups like Mox, is getting into the game, and rightfully so. Virtual banking is quickly growing globally due to the profitability of the business model: zero infrastructure is needed to build or maintain branch offices, as every service and transaction is conducted via devices like phones and tablets.

This model has indeed been quite profitable for many organizations around the globe, such as PT Bank Tabungan Pensiunan Nasional Tbk (BTPN), one of Indonesia’s largest banks. BTPN was tasked with the challenge of reaching a population of over 250 million scattered across 17,000 islands. Virtually, of course. Creating branch offices on various islands was not financially feasible, so the organization launched its Jenius application to serve its disparate consumer base virtually. The bank reached two million subscribers much faster than it had anticipated and, as a result, has become much more profitable. Its flexible new system also allows BTPN to introduce new services such as its recently released Wow application, which caters to small business owners.

Identity is Key in Digital Banking Experience 

Some of the challenges during the shift to virtual banking include the need for a technology solution that allows secure and quick onboarding of new customers and businesses while delivering an exceptional digital experience throughout the customer lifecycle. Due to these factors, digital identity plays a critical role in the success of virtual banking. Banks need to understand who their users are, which devices they are using, and what their preferences are in order to offer consumer-friendly, personalized services. Plus, all of this needs to be accomplished in a secure and frictionless manner, minimizing the number of clicks and logins, while also connecting to downstream apps and services like credit cards and loan offerings.

The Future of Digital Banking

Mobile technology has opened the door for innovation in banking and personal finance management. For example, with AI-enabled apps, banks can offer consumer analytics, reminders and personalized advice. It is important to understand though, that none of this would be possible without the backend identity management solutions to support these mobile banking apps and services. With new passwordless authentication advancements such as biometric authentication, consumers can now access their banking information from anywhere with just the tap of their finger, while multifactor authentication (MFA) also remains a key element for logging into important financial apps and accounts. Looking ahead, brick-and-mortar banks are not going away anytime soon, but it’s clear that banks must adapt to consumer expectations quickly to keep from being outpaced in our rapidly digitizing world. 


IBM Blockchain

Blockchain and sustainability through responsible sourcing

Mining for cobalt, an essential raw material for lithium-ion batteries, carries a high cost in human suffering. More than 60 percent of the world’s supply comes from the Democratic Republic of Congo (DRC), with about 45 percent coming from large-scale mining operations. The remaining 15 percent comes from small-scale mines in the DRC, where children […]

The post Blockchain and sustainability through responsible sourcing appeared first on Blockchain Pulse: IBM Blockchain Blog.


Coinfirm

Ransomware Bitcoin Demands and How Coinfirm’s Investigations Help

In 2021, the damage from cybercrime is predicted to hit $6 trillion (that’s India’s nominal GDP, twice).  Late last month it was revealed SolarWinds, a company supplying tech infrastructure to 400 out of the Fortune 500, suffered a prolonged malware attack that affected at least 250 US government agencies – including the Nuclear Weapons Agency...

IBM Blockchain

Serving up new business practices for restaurants with blockchain

Has the pandemic changed the way we dine, forever? Restaurants from formal dining to fast casual, sole proprietors to national chains, are making the adjustments for survival and growth. Diners, employees, and executives are all working through the pandemic to deliver great tasting food, provide more transparency to the menu, and deliver a work environment […]

The post Serving up new business practices for restaurants with blockchain appeared first on Blockchain Pulse: IBM Blockchain Blog.


Blockchain in 2021: Accessibility, authenticity and AI

As of December 2019, experts forecast blockchain spending would surpass USD 16 billion by 2023, according to an IBM Institute for Business Value (IBV) report. However, over the course of the year, COVID-19 disrupted the global economy, slashed technology budgets and refocused corporate efforts to stay afloat. Despite these challenges, the pandemic has also accelerated […]

The post Blockchain in 2021: Accessibility, authenticity and AI appeared first on Blockchain Pulse: IBM Blockchain Blog.


Elliptic

Crypto Regulatory Affairs: OCC Paves the Way for Bank Adoption of Stablecoins

If you thought 2020 was a busy year for crypto regulatory activity, 2021 is already shaping up to have last year beat. Elliptic brings you this crypto regulatory newsflash to give you the scoop on a few regulatory developments that have kicked 2021 into high gear.


UbiSecure

The Domains of Identity and SSI with “Identity Woman”, Kaliya Young – Podcast Episode 36

Let’s Talk About Digital Identity with Kaliya Young – consultant, conference organiser, author, activist.

In episode 36, Kaliya and Oscar discuss the long-running Internet Identity Workshop (IIW) that she co-founded, the effects of moving to virtual identity conferences in 2020, insights from Kaliya’s books – ‘The Domains of Identity’, newly published in 2020, and ‘A Comprehensive Guide to Self Sovereign Identity’ – plus some great tips for all business leaders on how to view the role of identity in their organisation.

“I think we may be selling self-sovereign identity all wrong. It should be infinitely scalable, low-cost federation. That’s really powerful!”

Kaliya Young is the author of two books “The Domains of Identity” and “A Comprehensive Guide to Self Sovereign Identity”.

For the past 15 years, she has been working to catalyse the creation of a layer of identity for people based on open standards. She co-founded the Internet Identity Workshop (IIW) in 2005 to bring together technologists who want to see decentralised identity come into being. In the fifteen years their community has been meeting, they have created standards being used all over the internet, like OpenID Connect and OAuth. In 2012 she was recognised as a Young Global Leader by the World Economic Forum.

The next IIW is in April. Sign up on Eventbrite.

Kaliya is widely recognised for her community leadership. She travels to Africa and Asia at least once a year to ensure the development of person-centric identity is truly global and inclusive. Most recently, she co-founded HumanFirst.Tech with Shireen Mitchell, a project focused on creating space for diverse voices and building a more inclusive industry.

In 2009, she was named one of Fast Company’s Most Influential Women in Technology.

Find Kaliya on Twitter @IdentityWoman and LinkedIn.

Check out Kaliya’s website at identitywoman.net and her podcast with Seth Goldstein, PSA Today (Privacy, Surveillance, Anonymity).

Regular listeners of Let’s Talk About Digital Identity will know that Oscar asks every guest for their top tips on how to protect our digital identities. For 2021, Oscar has a new burning question for all LTADI guests – “for all business leaders listening to us now, what is the one actionable idea that they should write on their agendas today?”

We’ll be continuing this conversation on LinkedIn and Twitter using #LTADI – join us @ubisecure!


The post The Domains of Identity and SSI with “Identity Woman”, Kaliya Young – Podcast Episode 36 appeared first on Ubisecure Customer Identity Management.


Okta

Android Login Made Easy with OIDC

Having a dedicated part of a mobile app for authorized users is a must for a modern-day app. Users want to have a personalized experience with the apps they love. They expect to seamlessly use services on different devices and platforms. And, most of all, they want to be sure that their personal data is secure.

Implementing a secure login process on Android can be challenging to achieve since many different moving parts need to be working in sync. On the Android side, you need to create a friendly login UI, communicate with a back-end service, securely persist the user data on the device, and maintain that data since the user expects to go through the login process only once. On top of that, you need to have a back-end app that supports the login features in the most secure way possible.

Here is where the Okta OIDC SDK comes to the rescue! It’s a mobile SDK developed by industry-leading security experts, designed to provide a simple way for your users to log in to your app. This is done using OAuth 2.0 and OpenID Connect (OIDC), the security industry’s standards for authorization and authentication. A fun guide to these technologies can be found in An Illustrated Guide to OAuth and OpenID Connect.

This post will show you how easy it is to set up the Okta OIDC SDK on Android and leave you with a working app that allows users to log in securely!

Table of Contents

Create an Android Login Application
Add the Okta Android OIDC SDK
Install the Okta CLI
Register Your Android Application
Create an Okta OIDC App
Create an Android Application Class
Manage Authentication with a Manager Class
Add an Android Splash Screen
Build an Android Login Screen
Create an Android Home Screen
Run Your Android Application
Learn More About Android and OIDC

Create an Android Login Application

To begin, create a new Android app. It will consist of three screens:

Splash screen where you figure out if the user is already logged in or not.

Login screen.

Home screen, which is only accessible to the logged-in users.

First, you’ll need to download and install the latest version of Android Studio (v4.1.1 at the time of this writing).

Next, launch Android Studio and navigate to File → New → New Project…. Then, create an "Empty Activity" for "Phone and Tablet." You should now see a screen similar to this:

Change your application name and add the right package name, set the minimum SDK to API 23, and click Finish to create the project.

Add the Okta Android OIDC SDK

Android apps use Gradle as their build tool. To add the Okta OIDC SDK as a dependency to your project, you will need to modify the app module’s build.gradle file:

You need to add the right auth redirect scheme for your Okta app and add the Okta library as a dependency. Make sure the applicationId matches what you used when creating your app.

plugins {
    id 'com.android.application'
    id 'kotlin-android'
}

android {
    compileSdkVersion 29
    buildToolsVersion "30.0.2"

    defaultConfig {
        applicationId "dev.dbikic.oktaloginexample"
        minSdkVersion 23
        targetSdkVersion 29
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
        manifestPlaceholders = [
            "appAuthRedirectScheme": "com.okta.dev-123456" // (1)
        ]
    }

    buildFeatures { // (2)
        viewBinding true
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }

    compileOptions { // (3)
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }

    kotlinOptions {
        jvmTarget = '1.8'
    }
}

dependencies {
    implementation "org.jetbrains.kotlin:kotlin-stdlib:$kotlin_version"
    implementation 'androidx.core:core-ktx:1.3.2'
    implementation 'androidx.appcompat:appcompat:1.2.0'
    implementation 'com.google.android.material:material:1.2.1'
    implementation "androidx.constraintlayout:constraintlayout:2.0.4"
    implementation 'com.okta.android:oidc-androidx:1.0.17' // (4)
    testImplementation 'junit:junit:4.+'
    androidTestImplementation 'androidx.test.ext:junit:1.1.2'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0'
}

(1) The redirect URI for the application you created in your Okta Developer Console.
(2) We are using the view binding feature to interact with our views. More information can be found here.
(3) Okta OIDC libraries require Java 1.8 compatibility.
(4) Add the dependency required for the Okta OIDC library.

Sync the project with Gradle files by clicking File → Sync Project with Gradle Files so that the Okta dependency gets downloaded.

Install the Okta CLI

Getting started with Okta is made quite straightforward with the help of the Okta CLI. It’s a tool that makes creating an Okta account and an Okta application a breeze!

The Okta CLI is available for macOS, Linux, and Windows.

macOS (via Homebrew):

brew install --cask oktadeveloper/tap/okta

Homebrew is a package manager for macOS.

Linux (via Flatpak):

flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
flatpak install com.okta.developer.CLI
alias okta="flatpak run com.okta.developer.CLI"

Flatpak is a utility for software deployment and package management for Linux.

Windows (via Chocolatey):

choco install okta --version=0.7.1

Chocolatey is a package manager for Windows.

If you don’t want to use a package manager, you can install the Okta CLI with curl and bash.

curl https://raw.githubusercontent.com/oktadeveloper/okta-cli/master/cli/src/main/scripts/install.sh | bash

Register Your Android Application

To create an Okta developer account, simply run the following command:

okta register

You will be prompted with a few questions:

➜ okta register
First name: Dino
Last name: Bikic
Email address: ******@*****.***
Company: Okta
Creating new Okta Organization, this may take a minute:
OrgUrl: https://dev-123456.okta.com (1)
An email has been sent to you with a verification code. Check your email
Verification code: 123456 (2)
Your Okta Domain: https://dev-123456.okta.com
To set your password open this link:
https://dev-123456.okta.com/welcome/drp7UBGB_GVjeHp_5Jbs (3)

1. This is your Okta domain; you will need it later.
2. Enter the code you receive in your email.
3. Click on this link to change your password. You will need these credentials for this example.

Create an Okta OIDC App

To create a new Okta OIDC app, run the following command:

okta apps create

You will be prompted with a few questions:

➜ okta apps create
Application name [dbikic]: Android Login (1)
Type of Application
(The Okta CLI only supports a subset of application types and properties):
> 1: Web
> 2: Single Page App
> 3: Native App (mobile) (2)
> 4: Service (Machine-to-Machine)
Enter your choice [Web]: 3
Redirect URI
Common defaults:
Reverse Domain name - com.okta.dev-123456:/callback
Enter your Redirect URI [com.okta.dev-123456:/callback]: (3)
Enter your Post Logout Redirect URI [com.okta.dev-123456:/]: (4)
Configuring a new OIDC Application, almost done:
Created OIDC application, client-id: ********************
Okta application configuration:
okta.oauth2.issuer: https://dev-123456.okta.com/oauth2/default
okta.oauth2.client-id: ******************** (5)

1. Name your Okta app.
2. Native App (mobile) is the option to select for Android applications.
3. Press enter for the default value. You will need this value for the Okta library setup.
4. Press enter for the default value. You will need this value for the Okta library setup.
5. You will need this ID for the initialization of the Okta library in your application.

Create an Android Application Class

In the root folder of your app's package (in the provided example, that's the folder app/src/main/java/dev/dbikic/oktaloginexample), create a Kotlin application class named OktaLoginApplication.

The Application class is the entry point of your app and is used to maintain the global state of the application. The most common use for it is to initialize the third-party libraries in its onCreate() method. More info about it can be found in Android’s official documentation.

For now, just create the class and make it extend the Application class from the Android framework.

package dev.dbikic.oktaloginexample (1)

import android.app.Application

class OktaLoginApplication : Application()

1. Update the package to match the one you set when creating the project.

After creating the application class, you need to reference it in your app/src/main/AndroidManifest.xml file:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="dev.dbikic.oktaloginexample">

    <uses-permission android:name="android.permission.INTERNET" /> (1)

    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:name=".OktaLoginApplication" (2)
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/Theme.OktaLoginExample" />

</manifest>

1. The Okta OIDC SDK needs internet permission to communicate with the back end.
2. Reference the application class you created in the previous step.

AndroidManifest.xml is an essential file for an app that contains basic info about the app's name, the package name, permissions, activities, and many other things. More information about it can be found in the official documentation.

Manage Authentication with a Manager Class

When adding third-party libraries to your codebase, it’s usually a good idea to create a wrapper class that will hide the actual usage. Reasons for this include:

Reusing common interactions with the library.

You can define all the library interactions in an interface and provide the actual implementation with dependency injection throughout your app.

Everything related to that library is in one place. Replacing the library with a different one is easy as you only need to change the wrapper class.

Because of the above, create a class called OktaManager in the root package. This class will be used in all the screens you create:

package dev.dbikic.oktaloginexample

import android.app.Activity
import android.content.Context
import com.okta.oidc.*
import com.okta.oidc.clients.sessions.SessionClient
import com.okta.oidc.clients.web.WebAuthClient
import com.okta.oidc.net.response.UserInfo
import com.okta.oidc.storage.security.DefaultEncryptionManager
import com.okta.oidc.util.AuthorizationException

class OktaManager(applicationContext: Context) {

    /**
     * Authorization client using a Chrome custom tab as a user agent.
     */
    private var webAuth: WebAuthClient (1)

    /**
     * The authorized client to interact with Okta's endpoints.
     */
    private var sessionClient: SessionClient (2)

    init {
        val config = OIDCConfig.Builder()
            .clientId("********************") (3)
            .discoveryUri("https://dev-123456.okta.com") (4)
            .redirectUri("com.okta.dev-123456:/callback") (5)
            .endSessionRedirectUri("com.okta.dev-123456:/") (6)
            .scopes("openid", "profile", "offline_access")
            .create()

        webAuth = Okta.WebAuthBuilder()
            .withConfig(config)
            .withContext(applicationContext)
            .withCallbackExecutor(null)
            .withEncryptionManager(DefaultEncryptionManager(applicationContext))
            .setRequireHardwareBackedKeyStore(true) (7)
            .create()

        sessionClient = webAuth.sessionClient
    }

    fun isAuthenticated(): Boolean {
        return sessionClient.isAuthenticated
    }

    fun registerWebAuthCallback(callback: ResultCallback<AuthorizationStatus, AuthorizationException>, activity: Activity) {
        webAuth.registerCallback(callback, activity)
    }

    fun registerUserProfileCallback(callback: RequestCallback<UserInfo, AuthorizationException>) {
        sessionClient.getUserProfile(callback)
    }

    fun signIn(activity: Activity, payload: AuthenticationPayload) {
        webAuth.signIn(activity, payload)
    }

    fun signOut(activity: Activity, callback: RequestCallback<Int, AuthorizationException>) {
        webAuth.signOut(activity, callback)
    }

    fun clearUserData() {
        sessionClient.clear()
    }
}

1. webAuth is a reference to the web client you will invoke to log in.
2. sessionClient refers to the session you can use for multiple operations after logging in, such as getting the user's profile, revoking the authentication token, refreshing the authentication token, etc.
3. Replace with your client ID.
4. Replace with your discovery URL.
5. Replace with your redirect URL.
6. Replace with your end session redirect URL.
7. setRequireHardwareBackedKeyStore(true) forces the app to require a device with encryption capabilities. This is the default configuration for Okta OIDC and is considered best practice. If you want to run this code in an emulator, though, you can temporarily set it to false.

Make sure to use the values you received when completing the Create an Okta OIDC App step.

The last step of the setup stage will be to initialize the OktaManager. Remember the empty OktaLoginApplication class? Now you need to modify it to initialize the manager when the app is created.

package dev.dbikic.oktaloginexample

import android.app.Application

class OktaLoginApplication : Application() {

    lateinit var oktaManager: OktaManager

    override fun onCreate() {
        super.onCreate()
        oktaManager = OktaManager(this)
    }
}

That’s it! Now, let’s create the screens.

Add an Android Splash Screen

The purpose of a splash screen is to initialize all of the application's dependencies and prepare the app for usage. You'll use it to figure out whether the user is authenticated and to decide which screen to show next: the login screen or the home screen.

Create a SplashActivity class in the root package.

package dev.dbikic.oktaloginexample

import android.content.Intent
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import dev.dbikic.oktaloginexample.ui.LoginActivity

class SplashActivity : AppCompatActivity() { (1)

    private val oktaManager: OktaManager by lazy { (application as OktaLoginApplication).oktaManager }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        if (oktaManager.isAuthenticated()) {
            navigateToHome()
        } else {
            navigateToLogin()
        }
    }

    private fun navigateToHome() {
        // todo implement
    }

    private fun navigateToLogin() {
        startActivity(Intent(this, LoginActivity::class.java)) (2)
        finish()
    }
}

1. For simplicity, the instance of the OktaManager class is kept in the application class so that it can be easily accessed from all the activities. A real-world solution would be to use dependency injection and inject the instance.
2. Ignore the unresolved reference error for now; we will add the missing class in the next step.

Register the activity in the AndroidManifest.xml file:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="dev.dbikic.oktaloginexample">

    <uses-permission android:name="android.permission.INTERNET" />

    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:name=".OktaLoginApplication"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/Theme.OktaLoginExample">

        <activity android:name=".SplashActivity">
            <intent-filter> (1)
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

    </application>

</manifest>

1. This intent filter specifies that SplashActivity is the first activity shown when the app is launched.

This class won’t compile just yet. You’ll need to create HomeActivity and LoginActivity classes before it does.

Build an Android Login Screen

Now, let’s do the LoginActivity! First, create a simple layout with a button in app/src/main/res/layout/activity_login.xml:

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:background="@color/white"
    android:orientation="vertical"
    tools:context=".LoginActivity">

    <Button
        android:id="@+id/signInButton"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_margin="40dp"
        android:text="Sign in"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintBottom_toBottomOf="parent" />

</androidx.constraintlayout.widget.ConstraintLayout>

Then, create the LoginActivity class in a new ui package:

package dev.dbikic.oktaloginexample.ui

import android.content.Intent
import android.os.Bundle
import android.util.Log
import androidx.appcompat.app.AppCompatActivity
import com.okta.oidc.*
import com.okta.oidc.AuthorizationStatus.*
import com.okta.oidc.util.AuthorizationException
import dev.dbikic.oktaloginexample.OktaLoginApplication
import dev.dbikic.oktaloginexample.OktaManager
import dev.dbikic.oktaloginexample.databinding.ActivityLoginBinding

class LoginActivity : AppCompatActivity() {

    private val oktaManager: OktaManager by lazy { (application as OktaLoginApplication).oktaManager }

    private lateinit var binding: ActivityLoginBinding

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        binding = ActivityLoginBinding.inflate(layoutInflater)
        setContentView(binding.root)
        setupOktaCallback()
        setupViews()
    }

    private fun setupOktaCallback() {
        oktaManager.registerWebAuthCallback(getAuthCallback(), this) (1)
    }

    private fun setupViews() {
        binding.signInButton.setOnClickListener {
            val payload = AuthenticationPayload.Builder().build()
            oktaManager.signIn(this, payload) (2)
        }
    }

    private fun getAuthCallback(): ResultCallback<AuthorizationStatus, AuthorizationException> {
        return object : ResultCallback<AuthorizationStatus, AuthorizationException> {
            override fun onSuccess(result: AuthorizationStatus) { (3)
                when (result) {
                    AUTHORIZED -> navigateToHome()
                    SIGNED_OUT -> Log.d("LoginActivity", "Signed out")
                    CANCELED -> Log.d("LoginActivity", "Canceled")
                    ERROR -> Log.d("LoginActivity", "Error")
                    EMAIL_VERIFICATION_AUTHENTICATED -> Log.d("LoginActivity", "Email verification authenticated")
                    EMAIL_VERIFICATION_UNAUTHENTICATED -> Log.d("LoginActivity", "Email verification unauthenticated")
                }
            }

            override fun onCancel() {
                Log.d("LoginActivity", "Canceled")
            }

            override fun onError(msg: String?, exception: AuthorizationException?) {
                Log.d("LoginActivity", "Error: $msg")
            }
        }
    }

    private fun navigateToHome() {
        // todo implement
    }
}

1. Register the auth callback with the OktaManager.
2. Call the sign-in method when the button is clicked.
3. The result is an AuthorizationStatus object. With a simple when expression we can quickly figure out the status type and access its members if needed.

And register it in the AndroidManifest.xml:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="dev.dbikic.oktaloginexample">
    ...
    <application ... >
        ...
        <activity
            android:name=".ui.LoginActivity"
            android:theme="@style/Theme.MaterialComponents.Light.NoActionBar" />
    </application>
</manifest>

The purpose of the LoginActivity is to try to authenticate the user with Okta when the login button is pressed. To achieve that, you need to register the web auth callback with the Okta OIDC SDK, and call the signIn() method.

This is enough for the SDK to open a custom Chrome tab with the login screen of the Okta application. Users input their credentials into the form, and when the process is finished, the appropriate method of your auth callback will be called. This allows you to gracefully handle the possible errors or handle the login success, which is, in this case, navigating to the HomeActivity.

Create an Android Home Screen

HomeActivity is the part of your app which can be accessed only by authorized users. In this example, you can fetch the user details, display the user name on the UI, and sign the user out of the app. First, create the layout file in res/layout/activity_home.xml:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context=".HomeActivity">

    <TextView
        android:id="@+id/userLabel"
        android:layout_width="match_parent"
        android:layout_height="0dp"
        android:layout_weight="1"
        android:gravity="center"
        android:textSize="22sp"
        tools:ignore="HardcodedText"
        tools:text="Hello, user!" />

    <Button
        android:id="@+id/signOutButton"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_margin="40dp"
        android:text="Log out"
        tools:ignore="HardcodedText" />

</LinearLayout>

Then, create the HomeActivity:

package dev.dbikic.oktaloginexample.ui

import android.content.Intent
import android.os.Bundle
import android.util.Log
import androidx.appcompat.app.AppCompatActivity
import com.okta.oidc.RequestCallback
import com.okta.oidc.net.response.UserInfo
import com.okta.oidc.util.AuthorizationException
import dev.dbikic.oktaloginexample.OktaLoginApplication
import dev.dbikic.oktaloginexample.OktaManager
import dev.dbikic.oktaloginexample.databinding.ActivityHomeBinding

class HomeActivity : AppCompatActivity() {

    private val oktaManager: OktaManager by lazy { (application as OktaLoginApplication).oktaManager }

    private lateinit var binding: ActivityHomeBinding

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        binding = ActivityHomeBinding.inflate(layoutInflater)
        setContentView(binding.root)
        oktaManager.registerUserProfileCallback(getUserProfileCallback()) (1)
        binding.signOutButton.setOnClickListener {
            oktaManager.signOut(this, getSignOutCallback()) (2)
        }
    }

    private fun getSignOutCallback(): RequestCallback<Int, AuthorizationException> {
        return object : RequestCallback<Int, AuthorizationException> {
            override fun onSuccess(result: Int) {
                oktaManager.clearUserData() (3)
                val intent = Intent(this@HomeActivity, LoginActivity::class.java) (4)
                intent.flags = Intent.FLAG_ACTIVITY_CLEAR_TOP (5)
                startActivity(intent)
                finish()
            }

            override fun onError(msg: String?, exception: AuthorizationException?) {
                Log.d("HomeActivity", "Error: $msg")
            }
        }
    }

    private fun getUserProfileCallback(): RequestCallback<UserInfo, AuthorizationException> {
        return object : RequestCallback<UserInfo, AuthorizationException> {
            override fun onSuccess(result: UserInfo) {
                binding.userLabel.text = "Hello, ${result["preferred_username"]}!" (6)
            }

            override fun onError(msg: String?, exception: AuthorizationException?) {
                Log.d("HomeActivity", "Error: $msg")
            }
        }
    }
}

1. Register the user profile callback with the OktaManager.
2. Sign out from the app when the sign out button is clicked.
3. After the user is successfully logged out from Okta, clear the user's data.
4. Navigate the user back to the LoginActivity after they sign out.
5. This flag makes sure that all the back stack activities are cleared and that the LoginActivity will be the only activity in memory.
6. You have fetched the user info successfully! You can check which fields you received here.

And register it in the AndroidManifest.xml:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="dev.dbikic.oktaloginexample">
    ...
    <application ... >
        ...
        <activity
            android:name=".ui.HomeActivity"
            android:theme="@style/Theme.MaterialComponents.Light.NoActionBar" />
    </application>
</manifest>

You can now implement the empty method navigateToHome() in both SplashActivity and LoginActivity.

import dev.dbikic.oktaloginexample.ui.HomeActivity

...

private fun navigateToHome() {
    startActivity(Intent(this, HomeActivity::class.java))
    finish()
}

Run Your Android Application

Now it's time to run the application on an emulator or on a physical device by pressing the play icon in the top-right part of Android Studio, and try out the login flow.

What's cool about the Okta OIDC SDK is that it also securely stores the user session in the app's local storage and maintains its state for you. Instead of creating a custom user management system and handling the many edge cases that can happen in the real world, you can concentrate on building app features for your users.

You also implemented the logout flow, which is triggered by the user clicking the Log Out button.

Learn More About Android and OIDC

This post showcased how easy it is to set up and use the Okta OIDC SDK in an Android app. The functionality the SDK brings, such as OAuth 2.0 authorization and OpenID Connect authentication, is essential for any modern app facing security and data privacy challenges.

Creating a custom solution for security and privacy is challenging and time-consuming: the code on the mobile side is not enough, and you also need a back-end app that supports those features. Maintaining two applications adds a lot of long-term work.

You can find the source code for this example on GitHub, in the oktadeveloper/okta-android-login-example repository.

Although the example you created here does enough to satisfy the needs of most apps, the Okta OIDC SDK doesn't stop there. The Okta OIDC Android repository contains a variety of ideas and suggestions to improve the user experience, such as:

Using your own OkHttp client.

Using a custom UI to log in.

Adding social login with Google, Apple, Facebook, and LinkedIn accounts.

Biometric login, with iris authentication, fingerprint authentication, PIN authentication, pattern authentication, and more.

Having fine-grained control over session tokens' expiration and refresh.

Settings to control which browser client is used for the authentication process.

This post has provided you with the foundations to set up a successful OIDC client. If you want to deepen your knowledge of modern authentication systems, check out these additional resources on Android, OAuth 2.0, and OpenID Connect:

An Illustrated Guide to OAuth and OpenID Connect

OAuth 2.0 Overview in Okta documentation

Nobody Cares About OAuth or OpenID Connect

Create a React Native App with Login in 10 Minutes

OAuth 2.0 for Native and Mobile Apps

If you enjoyed this blog post and want to see more like it, follow @oktadev on Twitter, subscribe to our YouTube channel, or follow us on LinkedIn.

Tuesday, 05. January 2021

Authenteq

Emanuel Goldberg, Raymond Kurzweil and OCR

The post Emanuel Goldberg, Raymond Kurzweil and OCR appeared first on Authenteq.

Global ID

The GiD Report#141 — Everything you need to know about the SEC v. Ripple

The GiD Report#141 — Everything you need to know about the SEC v. Ripple

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

Happy new year!

The SEC v. Ripple (podcast edition)
Quick overview of lawsuit
Stuff happens
ICYMI

1. First off — here’s everything you need to know about the SEC v. Ripple.

Courtesy of the “impromptu” premiere of our new podcast with Greg Kidd. (New episodes every two weeks.)

A morsel for you. Here’s Greg:

The ironic thing here is if you read what the mission is of the SEC, it’s to protect investors, to maintain fair, orderly and efficient markets, and to facilitate capital formation. If you go through each of those five things — is this protecting investors? Is this fair? Is this orderly? Is this efficient? Does this promote capital formation? And in every one of those checkboxes, this action achieved exactly the opposite outcome.

This first episode covers:

Our hot take on the SEC lawsuit (and what their intentions were — hint: it’s a turf war.)
How the lawsuit comes off like a Tony Soprano shakedown — without actually bringing regulatory clarity. (And certainly not in step with the spiritual mission of the SEC.)
Whether or not XRP is a security. (It’s not.)
And what this means for the industry at large. (And where do we go from here?)

Check it out!

EPISODE 01 — The SEC’s crypto turf war and why XRP isn’t a security

Please share with your network or on Twitter!

We’ll be publishing highlights from the podcast throughout the week.

2. In case you haven’t been keeping up with the lawsuit, here’s a quick update:

Crypto Troll? Former SEC chief Jay Clayton drops Ripple complaint days before Christmas and his own resignation

The SEC lodged a complaint against Ripple for the sale of $1.3B in unregistered securities (i.e. XRP):

SEC Sues Ripple Over XRP Cryptocurrency
Cryptocurrency company Ripple says SEC lawsuit is imminent
Ripple says it’s about to be sued by the SEC, in what the company calls a parting shot at the crypto industry
Ripple to Face SEC Suit Over XRP Cryptocurrency

Ripple’s response:

Brad Garlinghouse — The SEC’s Attack on Crypto in the United States | Ripple
Asheesh Birla — The Biden administration can change the world with new crypto regulations — TechCrunch

Bitstamp and Coinbase (among many other exchanges) have announced that they will halt XRP trading (for U.S. customers in Bitstamp’s case):

XRP trading and deposits to be halted for US customers — Bitstamp
Coinbase will suspend trading in XRP on January 19

Coinbase sued for selling XRP:

Coinbase Sued Over Crypto XRP Commissions After SEC Pursues Ripple

Next steps:

SEC vs. Ripple case: Initial pretrial conference set for February 22

3. Stuff happens:

The pandemic has changed America’s startup landscape
Congress Poised to Apply Banking Regulations to Antiquities Market
1 big thing: 2021 will demand new kinds of video conferencing
Via /ctomc — I spoke to 1,700+ people about remote work in 2020 — A few predictions of what will happen in 2021
Via /jvs — Indicio becomes a public benefit corporation | Indicio Tech
Laid-Off Workers, Foiled Crypto Plans: Updates on Some of Our Biggest Stories

4. ICYMI last week:

Via /j — In Spain a “register” for those who refuse the vaccine: “It will be shared with EU countries” — L’Unione Sarda.it
China Envisions Its Digital-Currency Future, With Lotteries and a Year’s Worth of Laundry
With Alibaba Investigation, China Gets Tougher on Tech
Apple Should Win Its Intensifying Battle With Facebook
Via /j — Coronavirus vaccination proof will play key role in global economy
Buoyed by Video Success, Zoom Explores Email, Calendar Services
How Amazon Wins: By Steamrolling Rivals and Partners
Trump Will Face Different Twitter Rules When He Leaves Office
U.S. vs. Facebook: Inside the tech giant’s behind-the-scenes campaign to battle back antitrust lawsuits
The ‘app store’ before there was an App Store wants to liberate your iPhone … again
Apple’s my-way-or-the-highway rules are under threat — at the worst possible time

The GiD Report#141 — Everything you need to know about the SEC v. Ripple was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ockto

Ockto expands retrievable data with bank transactions via PSD2

In addition to information from government sources, bank transactions can now also be supplied

Ockto enables consumers to easily and securely retrieve personal data from various government sources and share it with financial service providers. The Ockto dataset can now be extended with information from bank transactions, allowing acceptance processes for financial services to be digitized even further. This expansion also makes working with source data interesting for even more sectors.

Since DNB granted Ockto its PSD2 license last summer, a lot of work has been done. Ockto's clients can now also ask consumers to supply the required information from bank transactions. These are categorized by type of income and expense, such as a household's fixed monthly costs and income: rent paid, childcare, any alimony, and so on. This makes it possible for a consumer to share not all bank transactions, but only the information necessary for providing the service in question.

Working with source data now interesting for more sectors

With the addition of information from bank transactions, digitizing processes by working with source data becomes interesting for even more sectors. Auke Dirkmaat, Sales Manager at Ockto: “We see that information from bank transactions is an important game changer for providers of consumer credit and the private lease sector, among others. But also think of government agencies. With Ockto's help they can digitize their services, so citizens no longer have to supply many personal details separately, for example in debt assistance and in awarding schemes and subsidies.”

Regulators

Regulators are also moving along with the trend of digitizing processes using source data. For example, as of April 1 the VFN lending standard will be amended to permit the use of source data for creditworthiness assessments.

Client projects

Several client projects are already underway in which information from bank transactions is used, for example for special asset management in the mortgage sector. Gert Vasse, Account Director at Ockto: “Parties logically start with this functionality in departments where a consumer already has to supply bank transaction information. For both the consumer and the case manager, bank transactions can thus be retrieved, categorized, and interpreted more safely and efficiently.”

Supplying data quickly, easily, and securely via an app

With Ockto, financial service providers can ask consumers to supply data digitally via the Ockto app, for example for a mortgage, financial advice, a rental home, or opening a savings account. Fast, easy, and secure. This makes supplying loose documents a thing of the past. Ockto ensures that only the data needed for the process is supplied, thereby fully complying with the GDPR (AVG).

Want to know more about Ockto and the possibilities of information from bank transactions? Contact Auke Dirkmaat or fill in the contact form.

The post Ockto expands retrievable data with bank transactions via PSD2 appeared first on Ockto.


MyKey

MYKEY officially Supported the DOT Token


Hi everyone,

MYKEY, the multi-chain smart wallet, now officially supports the DOT token through third-party custody; the custody service is provided by HashKey Hub. To ensure the security of users' assets, KYC authentication is required (PS: users who have already enabled the BTC feature do not need to do it again).

DOT is the token of Polkadot. Polkadot is an open-source sharding multi-chain protocol that can facilitate the cross-chain transmission of any type of data or asset (not only tokens) so that various blockchains can interoperate.

The differences between DOT in MYKEY and DOT in other wallets

1. Implementation method: MYKEY uses a decentralized account to authorize centralized custody, and all operations can only be performed with the user's signature, ensuring the security of the user's DOT assets to the greatest extent possible.

2. Cost of transfers: DOT transfers between MYKEY users are fast and free of the Network Fee.

3. Asset recovery: based on the recovery mechanism of the KEY ID protocol, users can retrieve their DOT assets even if the MYKEY account or device is lost.

4. A compliant custodian: HashKey Hub is a product of HashKey Group, committed to providing users with one-stop digital asset management services in a simple, easy-to-use, safe, and compliant manner. The app offers safe storage, earning, and trading of digital assets, free transfers among HashKey platform users, and steady income from regular financial products in mainstream tokens.

How to add the DOT token

Open MYKEY and update to the newest version, 3.4.0. Tap [Assets] > [Token] > [+] > [DOT] to add it.

Note:

1. A single transfer to a MYKEY DOT account must be at least 1.5 DOT. If a single transfer is less than 1.5 DOT, the tokens will be lost.

2. Because the custodian needs to take responsibility for users as much as possible and ensure the security of their assets, KYC authentication is required to create an account. Please upload your ID card and a photo of yourself holding a note showing “Hashkey”, the application date, and your ID. Users who have already enabled the BTC feature do not need to do this again.

FAQ

1. How long does the KYC audit take? What if the submission fails?

Generally, the audit takes 1–2 working days. If the application fails to be submitted, please contact MYKEY customer service: @mykeytothemoon on Telegram.

2. How much is the Network Fee for a transfer?

There is no Network Fee for internal transfers, and they arrive quickly. A $0.2 Network Fee per transaction is required for external transfers.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY officially Supported the DOT Token was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 04. January 2021

Global ID

EPISODE 01—The SEC’s crypto turf war and why XRP isn’t a security


We’re thrilled to start the new year by sharing the (impromptu) premier episode of The GlobaliD Podcast with Greg Kidd. We’ll have new episodes every two weeks.

The SEC v. Ripple

In today’s episode, we cover the SEC lawsuit against Ripple. What were the agency’s intentions? Is XRP a security? And what does this mean for the industry at large?

Plus, stay until the end for Greg’s vision of a self-sovereign digital citizen.

Have a question for Greg? A topic you’d like covered? A guest you’d like to see? Hit us up on Twitter:

Greg on Twitter
GlobaliD on Twitter

EPISODE 01—The SEC’s crypto turf war and why XRP isn’t a security was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Ontology 2021 Roadmap

Can you “sense” the next year yet?

With 2020 in the rearview mirror, it is time to take a look at all the things Ontology has lined up for 2021. While 2020 was dominated primarily by the rise of DeFi, which Ontology took part in through the successful launch of Wing.Finance, we hope that 2021 will take the momentum DeFi has seen and spread it across other industries.

Our focus for the upcoming year revolves around optimizing deeper product integrations across different blockchains and bringing more users onboard the Ontology experience with secure layers of protection for their Decentralized Identity and Data. Given the various core projects Ontology is building, we thought it’d be best to share a roadmap of how each of these projects will collaborate and grow.

Just as human beings have five core senses ‒ sight, hearing, touch, smell, and taste ‒ which work in tandem to create the human experience, Ontology also has 5 core “senses” that collaborate intimately to create a better Ontology experience in 2021. In no particular order, they are ONT ID (“Identity”), OScore (“Credit”), ONTO (“Wallet”), Wing (“Lending”), and SAGA (“Data”).

Below is our 2021 roadmap for how we plan on driving further innovation and bring more users to our platforms.

ONT ID & UID

2021 Roadmap:

Integrate our decentralized identity features on more public chains in addition to Ethereum, Polkadot, and Binance Smart Chain
Implement additional security and privacy controls for users

At the core of what we do is a laser focus on providing the utmost security and privacy controls for users with regard to their own identity and the data they produce. This is perhaps most commonly seen in our ONT ID, a decentralized form of identification transcribed on the blockchain, making it immutable and completely safe from hacking. Through a revamped DDXF, which aims to support more decentralized data application scenarios, ONT ID protocols can be exported to an array of different public chains in 2021 in order to become the default market leader for decentralized identities and data protocols across chains. This vision has already been set in motion in 2020, as cross-chain Universal Identity (UID) is now supported on Ethereum, Polkadot, Binance Smart Chain, and more. By locking down industry leaders, we are confident in adding additional chains to the DeID movement in 2021.

OScore

2021 Roadmap:

Integrate OScore into more DeFi platforms
Tackle other use cases for OScore

With the inaugural launch of OScore on Wing’s Inclusive Pool, we have seen market adoption from an extended network of users. Through this data and analysis, we strongly believe that 2021 will see significant growth in other platforms and dApps adopting our credit-scoring system. Many of the immediate possibilities revolve around the DeFi space, such as credit-based lending and borrowing, which is still drastically underdeveloped. Compared to traditional finance, we know that DeFi can only truly take off with the addition of a trustworthy and transparent credit system like OScore. Not only will OScore be used in DeFi, but we envision a world where decentralized credit scoring could evolve into a self-sovereign reputation score controlled by the end-user.

ONTO

2021 Roadmap:

Add support for more dApps on ONTO
Build deeper native integrations with DEXs and dApps within ONTO

As cross-chain usability becomes a default requirement in wallets, ONTO is focused on bridging multiple chains together within a single user experience. As seen with its recent additions of Binance Smart Chain and Ethereum 2.0, the team behind ONTO will primarily focus on making sure ONTO is the only wallet you need, especially as we continue to invest in adding more decentralized applications and DEXes. We also view ONTO as a leading candidate to give you accessibility to the DeFi space by making it easier for you to stake in different liquidity pools or swap assets across the same, or different chains. As OScore continues to evolve, ONTO and OScore will become more synergistic, ultimately creating an in-depth snapshot of your crypto transactions and dApp activities under your control. We also envision ONTO will help the unbanked around the world by allowing for micropayments, and ultimately salary to be paid directly to a person’s ONTO wallet address.

Wing

2021 Roadmap:

Launch new product pools to support more asset types beyond fungible tokens
Launch the Wing platform on other blockchains, including Ethereum
Drive forward total value locked and usage from existing and new communities

2020 was marked by the sudden astronomical rise of DeFi. As a natural extension to Ontology’s decentralized identity capabilities, the Ontology team invested heavy resources in building out Wing’s technical infrastructure, resulting in successful product launches like the Flash Pool and more recently, the Inclusive Pool. Moving into 2021, Ontology will continue working with Wing to develop further product lines that expand the types of assets which can be lent, borrowed, and collateralized. These include but are not limited to NFTs, art, real estate, derivatives, synthetic assets, and securities. Using OScore, Wing will also lean into deeper elements of undercollateralized lending to expand accessibility for more users to participate in the lending ecosystem. This will also allow Wing to explore new areas such as microlending and credit delegation, all while retaining user privacy and data protection. On top of that, 2021 will see Wing being built on other blockchains as well. The primary focus will be to build it on Ethereum so that Wing can support more ERC-20 assets, making it much more convenient to use for the existing Ethereum DeFi community. At the same time, Wing will work to bring in a generation of new users from emerging markets who stand to benefit from the disruptive potential of decentralized finance.

SAGA

2021 Roadmap:

Add new ecosystem partners across various industries to the decentralized data platform
Invest in how research can be applied to more real use cases

Moving into 2021, SAGA will invest heavily in research, and how theoretical research applications can be integrated into real use cases that strengthen SAGA’s data interaction protocols. Investments will also be funneled into creating more prototypes that can use the vast amount of data already available on the SAGA platform. Throughout the year, the SAGA team aims to deepen its ties with existing ecosystem partners while improving the product framework and seeking further partners to grow a truly decentralized data marketplace.

Conclusion

As we grow, our senses continue to adapt and change. Sometimes a scent like vanilla will be enticing, whereas other times it will be bothersome. With this in mind, we treat the five core elements of Ontology the same way, rapidly testing and iterating based on the feedback obtained from external stimuli. We look forward to growing out each vertical and have full confidence in our team to execute on the improvements needed along the way.

Despite the rapid changes being made in these verticals, the core aspect of Ontology, which is our public blockchain, will not undergo much change. Rather, there will be a reinforced focus on improving the security of our decentralized network through onboarding additional enterprise and community-run consensus nodes, as well as candidate nodes. Our new governance model lowers the threshold for users to run a node or stake into a node to participate in the governance of our public blockchain. This improves upon its already established high performance while providing security and stability. In addition, more flexible staking mechanisms and support for different types of smart contracts will also be added in 2021. Our core tech team will look to create even more innovative products at the cross-section of DeFi and DeID in 2021 to complement the existing Ontology ecosystem.

In terms of community, our goal is to continue tackling real-world use cases made possible by using the Ontology blockchain, ONT ID / DDXF, Ontology’s DeFi products such as Wing, and our ONTO Wallet, as well as with the additional support from our innovative community members and all of Ontology’s community-run projects.

To conclude this roadmap, we want to thank our entire community for your support over the last 3 years, and we look forward to recapping 2021 at the end of next year the same way we recapped 2020 this year — with pride and success!

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology 2021 Roadmap was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

This Is Why We Can’t Have Nice Things

by Alexei Balaganski I had no intention to write any blog posts during the holidays or, God forbid, do any predictions for the next year (look how relevant last year’s predictions turned out to be). However, an interesting story involving Ticketmaster, a large American ticket sales company, has caught my eye and made me think once again about my career in cybersecurity. The whole story goes all

by Alexei Balaganski

I had no intention to write any blog posts during the holidays or, God forbid, do any predictions for the next year (look how relevant last year’s predictions turned out to be). However, an interesting story involving Ticketmaster, a large American ticket sales company, has caught my eye and made me think once again about my career in cybersecurity.

The whole story goes all the way back to 2013, but the details have only recently been unsealed after the company has entered into a plea agreement and agreed to pay a $10 million fine for illegal access to a competitor’s computer system. Apparently, a former employee of the said (undisclosed) competitor switched to Ticketmaster in 2013, taking with him usernames and passwords for his former employer’s computer systems. On multiple occasions between 2013 and 2015, he and his new Ticketmaster colleagues illegally accessed that company’s computers to assess their competitor’s products and to “steal back” their customers. According to the article, Ticketmaster used this data to gain a competitive advantage while being aware of the illegal tactics and even promoted the employee.

Can't see the forest for the trees

Now, I realize that this is old news, and it will certainly be overshadowed by other, hotter stories like the recent SolarWinds scandal. However, it is nevertheless worth looking at with more attention since, in a way, it much better represents the current state of cybersecurity as a whole. Pardon me for using a silly analogy, but cybersecurity experts sometimes remind me of the fans of zombie movies. I'm pretty sure everyone watching such a film imagines themselves as the main protagonist defending against a horde of monsters using a truck-mounted machine gun. However, in the very unlikely event of a real zombie apocalypse, most of us would surely end up as an insignificant part of that very horde…

In a similar way, discussing how we should defend against the latest highly advanced attack of another state-sponsored group has very little relevance for the risks an average company is facing daily, and designing your security architecture around the most current buzzword is not a wise strategy, perhaps even less wise than ignoring security risks completely. I’m not going to guess which of the extremes was the culprit of the Ticketmaster scandal, but you don’t have to be an expert to see that the victim company cannot be rated high on the cybersecurity maturity scale.

Not de-provisioning users that have left the company, using passwords without any multifactor authentication, and apparently not having any monitoring tools in place to detect illegal access – these are not some complicated technologies, but the very basic security hygiene rules. Why bother inventing new ways to fight sophisticated Russian or North Korean hackers if any malicious insider can easily exploit your systems for years without arousing any suspicion? Maybe cybersecurity vendors, as well as the industry press and analysts, should leave their ivory towers for a moment and focus their efforts on the lowest common denominator among their customers?

A dangerous precedent

There is another, even more worrying aspect in this case, however. Ticketmaster, a large company and a subsidiary of an even larger Live Nation conglomerate with billions in yearly revenue, was basically slapped on the wrist with a mere $10 million fine. Doesn’t it set a precedent that shows that corporate espionage and theft of intellectual property using the tried and trusted malicious insider approach can be considered a valid business practice, maybe unethical but still undeniably profitable? It took years to create reasonably comprehensive and harsh regulations for protecting personal information (with GDPR and such), but what about other types of sensitive data?

I have written about the “cargo cult of cybersecurity” earlier, but what we have here is even worse – that’s cybersecurity’s Stone Age! I really hope that problems like this will be the focus of more discussions in 2021 and not just the trendiest buzzwords like ransomware or supply chain security. Because if they won’t, the worst of cybersecurity predictions for the next year will turn out to be true: nothing will ever change.


PingTalk

Remote Work in 2021: Security Leaders Reveal Their Top App Integration Strategies for Strengthening IAM


Work from home is here to stay. With the WFH workforce expected to double in 2021, executives are increasing their efforts to build secure, productive remote employee experiences. Optimizing this growing model begins with ensuring strict identity verification and centralized authentication, made possible by tightening integration strategies around work-from-home applications to create frictionless and secure user experiences.


To provide insight into what major organizations are planning, we interviewed Chief Information Security Officers from six large enterprise companies across several industries on their top app integration plans for 2021. Here's a summary of their insights, along with a look at key security initiatives they are focusing on for the new year.



Okta

Offline JWT Validation with Go


Modern authentication systems use and generate JSON Web Tokens (JWT). There are many different ways that JWTs are used but, in this post, we will concentrate on JWTs that are used as OIDC access tokens. When a user successfully logs in to an application using a service like Okta, an OIDC access token is generated in the form of a JWT. That token can be passed in requests to the backend. The backend can then validate that token and reject all requests with invalid or missing tokens.
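For readers new to JWTs, a token is just three base64url-encoded segments, the header, payload, and signature, joined by dots. The following minimal sketch (placeholder token, decoding only, no signature verification) shows what is inside the payload:

package main

import (
	"encoding/base64"
	"fmt"
	"strings"
)

func main() {
	// A placeholder token with the shape header.payload.signature.
	token := "eyJhbGciOiJIUzI1NiJ9.eyJzdWIiOiJkZW1vIn0.c2ln"

	parts := strings.Split(token, ".")
	// Decode the middle segment (the claims); this does NOT verify anything.
	payload, err := base64.RawURLEncoding.DecodeString(parts[1])
	if err != nil {
		panic(err)
	}
	fmt.Println(string(payload)) // {"sub":"demo"}
}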

A common way to validate OIDC access tokens is to simply make an API request to the issuer with the access token. While this is the simplest method to use, it is far faster to validate tokens “offline”.
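To make the trade-off concrete, here is a minimal sketch of the "online" approach using OAuth 2.0 token introspection (RFC 7662). The endpoint path follows Okta's default authorization server layout; the domain, client ID, and token values are placeholders you would substitute:

package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
)

func main() {
	// Ask the issuer whether the token is still active
	// (placeholder domain, client ID, and token).
	resp, err := http.PostForm(
		"https://${yourOktaDomain}/oauth2/default/v1/introspect",
		url.Values{
			"token":           {"<access token>"},
			"token_type_hint": {"access_token"},
			"client_id":       {"${yourClientId}"},
		},
	)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Per RFC 7662, the response contains an "active" boolean.
	var result struct {
		Active bool `json:"active"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
		panic(err)
	}
	fmt.Println("token active:", result.Active)
}

Every request checked this way costs a network round trip to the issuer, which is exactly the overhead offline validation avoids.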

Today, we are going to build a simple web application that uses the Okta authentication widget to log users in. An access token will be generated and sent to an API written in Go which will validate the token. Let’s get started!

PS: The code for this project can be found on GitHub.

Prerequisites to Building a Go Application

First things first, if you haven't already got Go installed on your computer, you will need to download and install the latest version from the Go Programming Language website.

Next, create a directory where all of your future code will live.

mkdir jwt-go
cd jwt-go

Next, we will make our directory a Go module and install the Gin web server package (along with its static file serving middleware, which our server will use) and the JWT verifier package.

go mod init jwt-go
go get github.com/gin-gonic/gin
go get github.com/gin-contrib/static
go get github.com/dgrijalva/jwt-go

A file called go.mod containing these dependencies will be created by the go get command.

To use Okta authentication, you need to have a free Okta Developer account (https://developer.okta.com). Once you've done this, sign in to the developer console and select Applications > Add Application. Then select Single-Page App and hit Next. The next page is filled in with default values, most of which are sufficient for this application and don't need to be changed. Important: Add the URL http://localhost:8080 to the allowed redirect URLs. Hit Done.

There are two pieces of information that you need to obtain from the Okta Developer Console. These are your Okta domain name (e.g. dev-12345.okta.com) and your client id (e.g. 0ab1c2defg3AB4Chi567).

How to Build a Web Client in Go with Okta Authentication

First, we need to create a directory for the static web pages.

mkdir client

Next, create a file called client/index.html which loads the Okta JavaScript, and has a <div> with an id of widget-container for use by the Okta authentication widget, and a form to send messages to a backend API.

<html>
<head>
    <meta charset="UTF-8" />
    <title>Offline JWT Validation with Go</title>
    <script src="https://global.oktacdn.com/okta-signin-widget/4.3.2/js/okta-sign-in.min.js" type="text/javascript"></script>
    <link href="https://global.oktacdn.com/okta-signin-widget/4.3.2/css/okta-sign-in.min.css" type="text/css" rel="stylesheet"/>
    <link href="style.css" rel="stylesheet" type="text/css" />
    <script src="control.js" defer></script>
</head>
<body>
    <h1>Offline JWT Validation with Go</h1>
    <div id="widget-container"></div>
    <div class="centred">
        <form id="messageForm">
            Message: <input id="message" name="message" type="message"/>
            <input type="button" value="Send" onclick="onmessage()"/>
        </form>
        <textarea id="messages" name="messages" rows="10" cols="50">Messages</textarea><br/>
    </div>
</body>
</html>

Next, create a file called client/control.js containing the following JavaScript code:

var accessToken = null;

var signIn = new OktaSignIn({
    baseUrl: 'https://${yourOktaDomain}',
    clientId: '${yourClientId}',
    redirectUri: window.location.origin,
    authParams: {
        issuer: 'https://${yourOktaDomain}/oauth2/default',
        responseType: ['token', 'id_token']
    }
});

signIn.renderEl({
    el: '#widget-container'
}, function success(res) {
    if (res.status === 'SUCCESS') {
        accessToken = res.tokens.accessToken.accessToken;
        signIn.hide();
    } else {
        alert('fail');
    }
}, function(error) {
    alert('error ' + error);
});

function onmessage() {
    const url = "/api/messages";
    var headers = {}
    if (accessToken != null) {
        headers = { 'Authorization': 'Bearer ' + accessToken }
    }
    fetch(url, {
        method: "POST",
        mode: 'cors',
        headers: headers,
        body: new URLSearchParams(new FormData(document.getElementById("messageForm"))),
    })
    .then((response) => {
        if (!response.ok) {
            throw new Error(response.error)
        }
        return response.text();
    })
    .then(data => {
        var msgs = JSON.parse(data)
        document.getElementById('messages').value = msgs.messages.join('\n');
    })
    .catch(function(error) {
        document.getElementById('messages').value = "Permission denied";
    });
}

So, what is this code doing? First, a variable named accessToken is created to store the JWT access token.

Next, we have created an OktaSignIn object named signIn. Note: you need to replace both occurrences of ${yourOktaDomain} with the Okta domain name from the console. Also, replace ${yourClientId} with the client ID of the application we previously created in the console.

The renderEl() function displays the authentication UI and performs the authentication. If successful, a JWT access token is returned and saved. The UI is then hidden.

The onmessage() function is called when the "Send" button is clicked to submit the message form. This function makes a POST request to an /api/messages endpoint that we will be creating later on, passing in the message text. The request also includes the access token in an HTTP Authorization header; since it is a bearer token, the header specifies its type as Bearer. When the response comes back, all the messages are displayed in the text area.
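On the server side, the raw token first has to be recovered from that header before it can be validated. Here is a minimal sketch of a helper you could drop into the Gin server built in the next section; the extractBearerToken name is a hypothetical addition, not part of this article's code, and it needs the errors, strings, and gin imports:

// extractBearerToken pulls the raw JWT out of an
// "Authorization: Bearer <token>" header (hypothetical helper).
func extractBearerToken(c *gin.Context) (string, error) {
	auth := c.GetHeader("Authorization")
	if !strings.HasPrefix(auth, "Bearer ") {
		return "", errors.New("missing or malformed Authorization header")
	}
	return strings.TrimPrefix(auth, "Bearer "), nil
}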

How to Build a Simple Go Web Server

We are going to implement a web server using the Go Gin library. Create a file called main.go containing the following Go code in the jwt-go directory that you created at the beginning of this post:

package main

import (
    "net/http"

    "github.com/gin-contrib/static"
    "github.com/gin-gonic/gin"
)

var messages []string

func Messages(c *gin.Context) {
    message := c.PostForm("message")
    messages = append(messages, message)
    c.JSON(http.StatusOK, gin.H{"messages": messages})
}

func main() {
    r := gin.Default()
    r.Use(static.Serve("/", static.LocalFile("./client", false)))
    r.POST("/api/messages", Messages)
    r.Run()
}

So, what does this code do? The messages are stored in a slice of strings called messages. The main() function creates a default instance of a Gin HTTP web framework engine. It serves the static content, which we have already created, from the client directory. It calls the function Messages() on receipt of POST requests to /api/messages.

The Messages() function extracts the message from the POST form data and appends it to the list of messages. It then returns the list of messages back to the requester as a JSON object.

Now, we can test the application by running the server and pointing a web browser at http://localhost:8080.

go run main.go

Introduction to Complex Data Structures in Go

For those new to Go, some of the more complex code we will see later will make more sense once you understand how Go handles lists and maps (also known as dictionaries or hash tables). Here is a short introduction to lists and maps in Go:

Consider an application which reads a JSON object from a file or a network resource. For example:

[{ "name":"John", "age":30, "car":null }, { "name":"Jane", "age":27, "car":"Mini" }]

We want to use the Go JSON decoder to turn the JSON string into a list of maps.
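
As a minimal, self-contained sketch (separate from the application code in this post), that decoding step might look like this:

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    data := `[{"name":"John","age":30,"car":null},{"name":"Jane","age":27,"car":"Mini"}]`

    // Decode into a slice of empty interfaces; at runtime each element
    // is a map[string]interface{} and each JSON number is a float64.
    var persons []interface{}
    if err := json.Unmarshal([]byte(data), &persons); err != nil {
        fmt.Println("decode error:", err)
        return
    }
    fmt.Println(persons)
}

Running it prints the decoded slice; note that JSON numbers decode to float64, not int.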

In the decoded JSON, map keys are always strings, and map and list values can be nil, numbers, strings, lists, or maps. Go is a compiled, statically typed language. Because the values of any map or list can be a mixture of types, they can’t be given a single explicit type in the code.

For example, in the code above, we declared a slice of strings with this code:

var messages []string

To overcome this typing issue, Go allows maps and lists to hold values of any type by declaring the value type as the empty interface, interface{}. For example:

var persons []interface{}
var person map[string]interface{}

This leads to another issue. When writing the code, it is impossible to know for certain what the actual type of the value is. For example, the JSON structure could change. The type can only be determined at runtime. This makes it important to know what the data structure actually is. If you know the type, you can use a type assertion that tells the compiler what the actual type is. In our example above, persons is a list of maps and person is a map containing a number of attributes.

The type assertions become:

person, ok := persons[0].(map[string]interface{})
name, ok := person["name"].(string)

If the type assertion agrees with the actual type, ok will be true. You can omit the ok return value, but then a failed assertion causes a runtime panic. Multiple type assertions can be used in the same expression:

name := persons[0].(map[string]interface{})["name"].(string)

Finally, if you don’t know the actual type of an interface value, you can use the reflect package to find out:

fmt.Println(reflect.TypeOf(persons[0]))

How to Validate a JWT Token in Go

Now that we have the application working, it is time to validate the access token which is the focus of this article.

First of all, what is a JWT? It consists of three base64url-encoded components separated by . characters.

The first component is the header. This is a map. The most important fields of the map are alg which specifies the cryptographic algorithm used, and the kid or key identifier which identifies which public key to use to verify the JWT.

The second component is the payload, which is a map of claims. Important claims are aud, the audience, and iss, the issuer of the token.

The final component is the signature, which is a digital signature of the header and payload.
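
To make that structure concrete, here is a small illustrative sketch (not part of the application) that splits a token and decodes its header to reveal the alg and kid fields:

package main

import (
    "encoding/base64"
    "encoding/json"
    "fmt"
    "strings"
)

func inspectHeader(tokenString string) {
    parts := strings.Split(tokenString, ".")
    if len(parts) != 3 {
        fmt.Println("not a JWT")
        return
    }
    // The header is the first base64url-encoded component.
    headerJSON, err := base64.RawURLEncoding.DecodeString(parts[0])
    if err != nil {
        fmt.Println("bad header:", err)
        return
    }
    var header map[string]interface{}
    json.Unmarshal(headerJSON, &header)
    fmt.Println("alg:", header["alg"], "kid:", header["kid"])
}

func main() {
    // Hypothetical token: only the header is real; the payload and
    // signature parts are placeholders for the demo.
    header := base64.RawURLEncoding.EncodeToString([]byte(`{"alg":"RS256","kid":"abc"}`))
    inspectHeader(header + ".payload.signature")
}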

In order to validate the token, we first need the public key so that we can validate the signature. Start by modifying the import section of main.go to look like this:

import ( "net/http" "crypto/rsa" "encoding/json" "encoding/base64" "math/big" "os" "github.com/gin-contrib/static" "github.com/gin-gonic/gin" )

Next, add the following Go code to main.go:

var rsakeys map[string]*rsa.PublicKey

func GetPublicKeys() {
    rsakeys = make(map[string]*rsa.PublicKey)
    var body map[string]interface{}
    uri := "https://" + os.Getenv("OKTA_DOMAIN") + "/oauth2/default/v1/keys"
    resp, _ := http.Get(uri) // errors ignored for brevity
    json.NewDecoder(resp.Body).Decode(&body)
    for _, bodykey := range body["keys"].([]interface{}) {
        key := bodykey.(map[string]interface{})
        kid := key["kid"].(string)
        rsakey := new(rsa.PublicKey)
        // The modulus is base64url encoded in the JWK's "n" field.
        number, _ := base64.RawURLEncoding.DecodeString(key["n"].(string))
        rsakey.N = new(big.Int).SetBytes(number)
        rsakey.E = 65537 // the exponent is almost always 65537, so it is hard-coded
        rsakeys[kid] = rsakey
    }
}

Then add a call to the GetPublicKeys() function in main(), like this:

func main() {
    GetPublicKeys()
    r := gin.Default()
    r.Use(static.Serve("/", static.LocalFile("./client", false)))
    r.POST("/api/messages", Messages)
    r.Run()
}

What do these changes do? First of all, in GetPublicKeys() a GET request is made to the Okta API to get the signing public keys for the Okta domain stored in the $OKTA_DOMAIN environment variable. The response is a JSON object containing a list of keys.

Next, we parse the JSON object to extract one or more of the RSA public keys that we will use to verify access tokens from Okta. Each key has two components: a modulus (the JWK’s n field) and a public exponent (e). The modulus is base64url encoded; the exponent is almost always 65537, so we simply hard-code it.
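
For reference, the JSON returned by the keys endpoint has roughly the following shape (the values here are illustrative, not real Okta output):

{
  "keys": [
    {
      "kty": "RSA",
      "alg": "RS256",
      "kid": "abc123",
      "use": "sig",
      "e": "AQAB",
      "n": "base64url-encoded-modulus..."
    }
  ]
}

The e value AQAB is the base64url encoding of 65537, which is why hard-coding the exponent is almost always safe.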

Now we need to add some code to do the actual verification. Update your import section so it looks like this:

import ( "crypto/rsa" "encoding/base64" "encoding/json" "math/big" "os" "strings" "net/http" "github.com/dgrijalva/jwt-go" "github.com/gin-contrib/static" "github.com/gin-gonic/gin" )

Then, add the following code to main.go.

func Verify(c *gin.Context) bool {
    isValid := false
    errorMessage := ""
    tokenString := c.Request.Header.Get("Authorization")
    if strings.HasPrefix(tokenString, "Bearer ") {
        tokenString = strings.TrimPrefix(tokenString, "Bearer ")
        // The key function looks up the public key by the token's kid header.
        token, err := jwt.Parse(tokenString, func(token *jwt.Token) (interface{}, error) {
            return rsakeys[token.Header["kid"].(string)], nil
        })
        if err != nil {
            errorMessage = err.Error()
        } else if !token.Valid {
            errorMessage = "Invalid token"
        } else if token.Header["alg"] == nil {
            errorMessage = "alg must be defined"
        } else if token.Claims.(jwt.MapClaims)["aud"] != "api://default" {
            errorMessage = "Invalid aud"
        } else if !strings.Contains(token.Claims.(jwt.MapClaims)["iss"].(string), os.Getenv("OKTA_DOMAIN")) {
            errorMessage = "Invalid iss"
        } else {
            isValid = true
        }
        if !isValid {
            c.String(http.StatusForbidden, errorMessage)
        }
    } else {
        c.String(http.StatusUnauthorized, "Unauthorized")
    }
    return isValid
}

The function extracts an Authorization request header and looks for a Bearer token. If it does not exist, then a 401 Unauthorized response is sent.

The token string is then passed to the jwt.Parse() function; the second parameter is a key function that returns the public key matching the token’s kid header.

Now, we can validate the token. It must be rejected if any of the following are true:

- The parse function returns an error, meaning the token can’t be decoded or, more likely, the public key can’t verify the signature.
- The token is invalid.
- The cryptographic algorithm (alg) is not defined; it must be RS256.
- The audience does not match the expected audience.
- The issuer is not the expected issuer.
- The token has no key identifier.

Finally, modify the API functions to call Verify().

func Messages(c *gin.Context) {
    if Verify(c) {
        message := c.PostForm("message")
        messages = append(messages, message)
        c.JSON(http.StatusOK, gin.H{"messages": messages})
    }
}

Testing it all out

We can now test end to end. Start by running the server:

go run main.go

Next, point a web browser at http://localhost:8080.

Enter a message and click the “Send” button. You should see a permission denied error.

Now, login and try sending another message. This should send a token that gets validated correctly. Your message should be displayed.
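
If you would rather exercise the API outside the browser, a minimal Go client sketch such as the following should also work; the token value is a placeholder you would paste from a real sign-in:

package main

import (
    "fmt"
    "io"
    "net/http"
    "net/url"
    "strings"
)

func main() {
    // Placeholder: paste an access token obtained from the sign-in widget.
    token := "eyJ..."

    form := url.Values{"message": {"hello"}}
    req, _ := http.NewRequest("POST", "http://localhost:8080/api/messages",
        strings.NewReader(form.Encode()))
    req.Header.Set("Authorization", "Bearer "+token)
    req.Header.Set("Content-Type", "application/x-www-form-urlencoded")

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    body, _ := io.ReadAll(resp.Body)
    fmt.Println(resp.Status, string(body))
}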

Conclusion

In this post, we have learned how to authenticate with Okta to get a JWT, how to use that JWT in the Authorization header of an HTTP POST request, and how to perform “offline” validation of that JWT in Go.

If you like this topic, be sure to follow us on Twitter, subscribe to our YouTube Channel, and follow us on Twitch.


MyKey

MYKEY Weekly Report 32 (December 28th~January 3rd)


Today is Monday, January 4 2021. The following is the 32nd issue of MYKEY Weekly Report. In the work of last week (December 28th to January 3rd), there is one main update:

1. We are carrying out a topic activity with rewards, ‘新年要有新气象’ (roughly, “a new year calls for a fresh start”), on http://bihu.com until January 6

For details, click to view: https://bit.ly/3hA2DpK.

!!! If you encounter any abnormal situation while using MYKEY, remember not to uninstall MYKEY APP, please contact MYKEY Assistant: @mykeytothemoon in Telegram.

!!! Remember to keep the 12-word recovery phrase, exported from [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY, stored safely, even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY Weekly Report 32 (December 28th~January 3rd) was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 03. January 2021

Identosphere Identity Highlights

Identosphere Weekly #13 • Identity Podcasts • Generativity of SSI • Company News

⑬ Getting the new year off to a good start with another issue of Identosphere Updates. Regardless of the news cycle, there's always more to learn. ⑬
Happy New Year!!!

Thanks for reading Identosphere’s Weekly. The news doesn’t move too fast from the solstice through the new year, but that sure hasn’t stopped you from reading, and hasn’t stopped us from filling each newsletter with useful info.

We’re looking forward to unleashing the potential in this coming year, and bringing valuable new info-products to you.

You can now find Identosphere Weekly at newsletter.identosphere.net and e-mail us at newsletter@identosphere.net with any corrections or content suggestions!!

Thanks to our Patrons!

This is our 13th issue and we now have 13 patrons contributing a total of $125 a month!!!

As time goes on, we’ll continue expanding our coverage of the industry, and develop an increased capacity to deliver valuable news and insight to your inbox.

Every contribution is greatly appreciated and brings us another step closer to a sustainable weekly publication. Our patrons also fuel the continued development and refinement of our curation infrastructure.

Upcoming Events

DIALOGUES ON DATA & POLICY: the 4.0, the identity and the privacy issue

In this workshop we want to bring together policy and industry experts with the emerging NGI ecosystem as it grows with the innovators organized by DAPSI, the Data Portability & Services Incubator.

Thoughtful Biometrics Workshop

The Thoughtful Biometrics Workshop is creating a space to dialogue about critical emerging issues surrounding biometric and digital identity technologies. It’s happening the 1st week of February: Monday, Wednesday, and Friday, 9am to 1pm PST / noon to 5pm EST.

Register on EventBrite!

Other Conferences and Events

There are many other upcoming events you may be interested in that can be found at IdentityReview’s conferences and events page.

Podcasts

Listening to identity podcasts is our 2021 new year’s resolution

A gem we must have missed last month: Ubisecure lists the identity podcasts that should be on our radar.

Let’s Talk About Digital Identity (LTADI) – Ubisecure

Definitely Identity – Tim Bouma

PSA Today – Privacy, Surveillance and Anonymity by Kaliya Identity Woman and Seth Goldstein

ID Talk – FindBiometrics and, as you’d expect, focused on the biometrics space

State of Identity – OWI

Podcasts that didn’t make their list.

Identity, Unlocked – Auth0 (really great!)

Identity North Podcast – Identity North

If you can think of any identity podcasts that should be on this list, especially ones focused on SSI, it would be much appreciated if you leave a comment on this post at newsletter.identosphere.net or send a message to newsletter@identosphere.net.

Blog Posts

The Generative Self-Sovereign Internet

Phil Windley published a fantastic post articulating the generativity of self-sovereign identity. Kaliya agrees!

Generative systems use a few basic rules, structures, or features to yield behaviors that can be extremely varied and unpredictable. 

Generativity is a function of a technology’s capacity for leverage across a range of tasks, adaptability to a range of different tasks, ease of mastery, and accessibility.

Jumpstart Global Travel Industry Using Self-Sovereign Identity for COVID-19 Immunity Credentials

Tata Consultancy Services shares a vision for how SSI can be used to re-open global travel amid the reality of COVID-19.

SSI still requires market validation, and support for its implementation is currently limited to a relatively small group of technologists and enthusiasts. However, the implementation of SSI in the travel industry at a future point in time, especially once the standards and protocols are production ready and existing user experience challenges have been resolved, is something that all travel industry stakeholders should be watching, waiting and ready for.

SSI as articulated by Energy Web Foundation

Digging Deeper into Self-sovereign Identity and Access Management

What we’d like to highlight in this simplified process is the fact that it is the user who stores the claim and anchors it on chain. Also, because it is a private claim, the contents are provable but not disclosed. The user can therefore prove that they have been granted a certain privilege, but unless they elect to disclose this information, it is impossible for a third party to find out.

And as the users must anchor the claim on-chain themselves, there is no opportunity for an observer to make a list of the addresses added to a particular smart contract or any other indirect way to observe the activities of the IAM. The only information an observer can garner is that the user has added a claim to their DID document but not the content, origin, or nature of the claim.

Towards Self-Sovereign Identity with Tykn Co-Founders, Khalid Maliki and Jimmy J.P. Snoek 

Ubisecure LTADI Episode 35

The conversation details the 'three pillars of SSI' (verifiable credentials, decentralised identifiers and blockchain), how SSI fits with existing processes, what it should appear as to end users (and what level of education they need around the technology), the importance of accessibility for inclusivity, and what's next for Tykn. "In 5 years, people should take [SSI] for granted" Khalid Maliki

From the mailing lists

Presentation Exchange v1.0 Call for Review

Daniel Buchner shared on public-credentials@w3.org:

spreading the word that the Presentation Exchange v1.0 specification is now a Working Group Draft.

...codifies the Presentation Definition data format Verifiers can use to articulate proof requirements, as well as the Presentation Submission data format Holders can use to submit proofs in accordance with them. The specification is designed to be both claim format and transport envelope agnostic, meaning an implementer can use JSON Web Tokens (JWTs), Verifiable Credentials (VCs), JWT-VCs, or any other claim format, and convey them via Open ID Connect, DIDComm, Credential Handler API, or any other transport envelope.

Comments on the Draft are welcome through 03:59 UTC/GMT on 2021-01-22

Company Updates

Trinsic Year in Review 2020

Lots of good things happened!

Helped start the COVID-19 Credentials Initiative and has since worked with dozens of developers and organizations on COVID-19-related SSI solutions:

MedCreds: Reducing the Risk of Returning to Work

Decreased Unemployment Among African Youth Using Verifiable Credentials

Verifiable Credentials and Smart Contracts for COVID-19 Data Management

Raised pre-seed funding and rebranded from Streetcred ID to Trinsic, becoming the first investment of Kickstart Seed Fund’s $110 million fund.

Partnered with Zapier to Bring SSI to 2000+ Applications

Joined Trust over IP Foundation as Founding Member

Indicio Tech

Indicio.Tech Incorporates as a Public Benefit Corporation

Indicio joins companies such as Patagonia and Kickstarter in embracing a corporate model that aligns shareholders and stakeholders around a shared mission to deliver a material benefit to society, not only through products and services and how they are made and delivered, but through prioritizing the welfare of employees, diversity and inclusion, and environmental impact.

Indicio.Tech releases Aries Mediator Agent

The Indicio Mediator Agent is the company’s latest contribution to Aries Cloud Agent Python (ACA-Py) and the Aries Toolbox. Following RFC 0211: Mediator Coordination, Indicio built on the work of the open-source community to make mediation interoperable and vendor agnostic. This expands the opportunities for mobile wallet implementations.

[...]

Indicio.tech is committed to becoming a resource-hub for decentralized identity, providing enterprise-grade open source tools to its clients and to the community. This includes the Private Networks build service, the Indicio TestNet, and a variety of customizable training programs.

IDRamp partners with Indicio.Tech 

This is an interesting announcement: two companies partnering to create new SSI services for the companies they work with.

Work on Aries

Becoming a Hyperledger Aries Developer

Course from the Linux Foundation

Learn how to develop blockchain-based production-ready identity applications with Hyperledger Aries in this free course.

A $60k contract to work on SSI with the BC Gov!

Extend the Hyperledger Aries Cloud Agent Python protocols to support ZKP W3C Standard Verifiable Credentials based on BBS+ Signatures

This opportunity is for developers familiar with Hyperledger Aries, Aries Protocols, Python and JSON-LD processing to add support in ACA-Py for several important VC formats used by a number of other organizations in the VC community.

Identity but not SSI

Near-Final Second W3C WebAuthn and FIDO2 CTAP Specifications

The W3C WebAuthn and FIDO2 working groups have been busy this year preparing to finish second versions of the W3C Web Authentication (WebAuthn) and FIDO2 Client to Authenticator Protocol (CTAP) specifications.

See you Next Week

Feel free to share this content and subscribe for updates! Support its creation with a monthly contribution of your choice via Patreon.com

Ayan Works

Announcing ARNIMA FL — Open Source Aries Flutter Mobile Agent SDK

Open Sourcing ARNIMA FL — Aries Mobile Agent SDK for Google Flutter
Exactly a year ago, in January 2020, we announced ARNIMA — the first ever Aries React Native mobile agent SDK, which we made open source for the self-sovereign identity ecosystem. (See here.)

Today, while we wish you a very happy and healthy 2021, we thought to add to that happiness beyond just wishing. We are very excited to announce one more small open-source contribution from AyanWorks to the Aries community.

ARNIMA FL — Flutter based Aries Mobile Agent SDK

During IIW #31 (Internet Identity Workshop), which took place in October 2020, we announced our intention to bring an open-source Flutter-based Aries mobile agent SDK. Keeping that promise, and in line with our commitment to contributing to open source, we are glad to announce ARNIMA FL — Aries Flutter Mobile Agent SDK, the first ever open-source, cross-platform Aries mobile agent SDK based on Google Flutter.

While this is the first version, we have implemented an initial set of Aries RFCs for the developer community to start using.

Features available in V1 of ARNIMA Flutter SDK

This being the first version, the following basic features are available:

- Create a wallet
- Connect with a mediator agent
- Connect with an Aries cloud agent using an invitation URL

We believe these initial features will let SSI developers get first-hand experience with this version. We will push the remaining features/RFCs in subsequent releases.

GitHub Repo

Following is the GitHub repository you can get the SDK source from. We have provided a sample mobile app for your ready reference as part of the repo.

We intend to move this repository to the Hyperledger GitHub org soon as a Hyperledger project, so that the bigger Hyperledger Aries community can contribute and benefit.

https://github.com/ayanworks/ARNIMA-flutter-sdk

We look forward to your feedback and contributions, as always.

Happy collaborating!!

AyanWorks Team

#SelfSovereignIdentity #SSI #HyperledgerAries #MobileAgent #Flutter #opensource #HyperledgerIndy #announcement

Friday, 01. January 2021

IdRamp

IdRamp Offers Market-Ready Decentralized Identity Platform on the Indicio Network


Decentralized identity provider announces best-in-class dependability on the newest distributed network purpose-built for decentralized identity and the exchange of verifiable claims.

The post IdRamp Offers Market-Ready Decentralized Identity Platform on the Indicio Network first appeared on idRamp | Decentralized Identity Evolution.

Ontology

Ontology Monthly Report — December 2020


As we reach the end of 2020, it has been another exciting month for Ontology. We released ONTO v3.6.5, and launched the Wing Inclusive Pool for credit-based lending. We are also happy to share that we are integrating the Ontology decentralized identity (DeID) solution into the blockchain-based e-voting system launched by Waves Enterprise. Our DeID can ensure that the identities and data of people who vote are not only protected, but verified. We also announced a partnership with Litentry, aiming to onboard 10,000 users to the DeID and OScore solutions.

Some of the highlights from December 2020 include:

- The signing of our MOU with Waves and Litentry
- The launch of Wing’s Inclusive Pool
- The launch of ONTO v3.6.5
- ONTO’s Special NFT Medal Campaign
- The “Gas Fee Airdrop” campaign with Binance Smart Chain

You can find more detailed updates below.

中文

繁體中文

한국어

日本語

Española

Slovák

Tiếng Việt

русский

Tagalog

සිංහල

हिंदी

বাংলা

MainNet Optimization

- Completed the update of governance contract port

- Optimized JSON-RPC performance in high concurrency scenarios

- Completed 60% of Wasm-NeoVm cross-protocol debugging tool update

Product Development

ONTO v3.6.4 released

- Added support for Binance Chain

- Added support for Binance Smart Chain dApps

- Optimized integrated cross-chain aggregated transaction feature

- Supported ETH ID generation and display

- Supported Npay Phase II project

- Added an announcement page on social media

- Optimized OScore functions

ONTO v3.6.5 released

- Added multiple Binance Smart Chain dApps

- Supported the integrated cross-chain trading feature for Binance Smart Chain

- Added a staking column on the market page, optimized the CeFi column

- Optimized OScore functions

- Added a Chinese information function

- Optimized the ONT node staking function

- Added support for newly released Binance Smart Chain dApps including Pancake, dForce, and Forge Chain

- Launched the Wing Inclusive Pool for credit-based lending

Campaign

- ONTO and AlpacaCity joined forces to carry out a joint campaign whereby users were given the opportunity to earn 5,000 Alpaca NFTs by downloading ONTO. A limited number of 20 Generation-0 Alpacas were available only for ONTO users. Approximately 2,000 people participated in the campaign.

- We carried out the “Gas Fee Airdrop” campaign with Binance Smart Chain. Users who transferred BEP-20 BNB in their ONTO wallets were given the opportunity to earn BNB from a pool with the highest equivalent value of $50,000. As of December 20th, more than 7,000 ONTO users had participated in the campaign.

dApp

- 108 dApps launched in total on MainNet

- 6,240,890 dApp transactions completed in total on MainNet

Community Growth

- We have onboarded 4,295 new members across Ontology’s global communities, with significant growth in our Vietnamese, Arabic, and Turkish communities.

Bounty Program

- 529 applications, 5 new additions to existing bounties.

- 38 tasks, 50 teams in total: 31 teams have finished their tasks and have received rewards in return, while 19 teams are still working on tasks.

Latest Release

- At the Polkadot parachain test network Rococo V1 press conference, it was announced that the testnet would be launched on Christmas Eve, December 24th. Ontology will integrate DeID onto Polkadot to build up Polkadot’s parachains and participate in the parachain slot auctions in the future. Aside from this, Ontology will gradually adopt its credit scoring system OScore, DDXF, and other technologies into the Polkadot ecosystem.

- On December 15th, Ontology entered into a technological partnership with Waves Enterprise, an enterprise-grade blockchain platform combining both public and private networks as a hybrid solution. The two companies will collaborate to integrate Ontology’s advanced DeID technologies into the blockchain-based e-voting system launched by Waves Enterprise so that corporate users can seamlessly benefit from a fully decentralized approach.

- On December 17th, we partnered with Litentry, a blockchain identity management layer based on the Polkadot network. Ontology and Litentry are aiming to onboard 10,000 users to our DeID and OScore solutions, enriching aggregated identities and targeting potential customers based in Europe, the USA, Latin America, and South-East Asia.

- On December 1st, ONTO wallet added support for the management, deposit, and withdrawal of a variety of digital assets on Binance Chain and Binance Smart Chain.

- On December 5th, the Wing Inclusive Pool launched its test network. The Inclusive Pool will continue Wing’s “supply, borrow, and insurance” mechanisms and set up a Lend pool, Loan pool, and Security Deposit pool.

- On December 9th, Wing Finance, a cross-chain DeFi platform based on Ontology, launched the Inclusive Pool, the world’s first credit-based product for users to lend, borrow, and insure assets. As a digital asset flow pool, the Inclusive Pool increases transparency in DeFi. It does this by integrating the users’ self-sovereign scoring system OScore, built on Ontology, and utilizing these scores in the decision making framework of the Inclusive Pool’s lending and borrowing mechanisms.

Events

- On December 3rd, Li Jun was invited to attend the “Polkadot Decoded” online conference and to deliver a keynote speech on topics such as DeID, DeFi, and credit scoring. In his speech, Li Jun focused on the applications of ONT ID, the on-chain credit score OScore, the “Welcome Home” blockchain use case for the automotive industry, and Wing, the first credit-integrated DeFi project.

- On December 8th, Li Jun, founder of Ontology, was invited by Upbit, the largest digital asset trading platform in South Korea, to deliver a keynote speech at the UDC 2020 Meet-up Conference. The focus of this was to discuss Ontology’s digital identity and data management solutions.

- On December 3rd, Ning Hu, a senior Ontology architecture expert, was invited to participate in the first session of a series of talks held by Zhongguancun Blockchain Industry Alliance. This talk was titled “Linking the future: a dialogue between Chinese and foreign countries in Blockchain”.

- On December 10th, our Chief of Global Ecosystem Partnerships, Gloria Wu, spoke at the 2020 Paris Blockchain Week Summit. In Gloria’s speech, she introduced Ontology’s various decentralized digital identity and credit technologies, with a focus on our self-sovereign in-car personalization and management solution “Welcome Home”, which was a joint collaboration between Daimler Mobility and Ontology.

- On December 11th, our Chief of Global Ecosystem Partnerships, Gloria Wu, and our Chief of European Ecosystem Partnerships, John Izaguirre, attended the 2020 MyData online conference, run by mydata.org.

- On December 10th, Nick Zhang, Ontology Ecosystem Growth Manager, was invited to share the latest developments and progress of Wing in the fifth online resource sharing event by Blocklike and DoraHacks, a decentralized global developer community. This speech focused on introducing Wing’s newly launched Inclusive Pool.

- On December 11th, we closed the essay writing competition for the celebration of Ontology’s third anniversary. We received a total of 146 entries, of which 54 were awarded prizes.

- On December 10th, the ONTO December NFT Medal Campaign was launched. The last of the Ontology monthly NFT series includes two types of medals: the Christmas Reindeer and the Christmas Tree. Both of these medals are OEP-5 NFTs with a supply of 1,000 each. During this campaign, anyone who uses any of the Binance Smart Chain dApps (AlpacaCity / Pancake / DoDo) on the Discovery or Market page of ONTO for more than 1 minute will earn a medal.

Recruitments

- Solution Architect

- Business Sales

- Global Development Director

- Global Development Manager

- Global Marketing Manager

- Social Media Associate

- Product Manager

- Senior JAVA Engineer

New Team Members

- 1 Senior operation associate

- 1 Community co-builder

Contact Us

- Contact us: contact@ont.io

- Apply now: careers@ont.io

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Monthly Report — December 2020 was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 31. December 2020

Indicio

Indicio.tech advances decentralized identity with release of critical open source technology


Indicio.tech finishes a working “mediator agent” and contributes its code to Hyperledger Aries and ACA-Py, making it possible to communicate verifiable credentials to mobile devices via open source standards. Indicio is hosting a freely available test mediator agent for developers.

Indicio.tech, a professional services firm specializing in decentralized identity, today announced the launch—and availability for open testing—of the Indicio Mediator Agent. The company is also donating the underlying code to the open source Hyperledger Aries project.

“It’s simple,” says Indicio CEO Heather Dahl. “If verifiable digital credentials are to be exchanged and identity verified, people need to be able to manage them on their mobile devices through an app. The mediator agent can be thought of as the pipe that enables this to happen.”

“The mediator is the tool that provides a cloud-based connection point to send and receive requests between mobile holders and issuers and verifiers,” says Indicio CTO, Ken Ebert. “We built it to make our clients’ projects work, and we contributed it to the open source community because a working mediator agent is critical to widespread adoption of decentralized identity.”

The Indicio Mediator Agent is the company’s latest contribution to Aries Cloud Agent Python (ACA-Py) and the Aries Toolbox. Following RFC 0211: Mediator Coordination, Indicio built on the work of the open-source community to make mediation interoperable and vendor agnostic. This expands the opportunities for mobile wallet implementations.

As part of these contributions, mediation can be managed and debugged through ACA-Py’s built-in Admin REST API or the Aries Toolbox.

“We believe in an open-source foundation for decentralized identity,” says Dahl. “But as a professional services company, we couldn’t wait for critical technology to spontaneously mature—and, more importantly, our clients couldn’t wait either. So we’re filling the technical gaps, and making the code available for everyone to get building.”

For the Mediator Agent tools and repositories at Hyperledger Aries, access is available here.

Indicio Offers Mediator Agent Workshops

To bring developers up to date on this latest contribution to the open source community, Indicio has prepared an instructor-led workshop focusing on the use and application of the Mediator Agent. Developers will learn how to access the repository, how to use an agent, and then discuss possible use cases.

“One thing we’ve found is that the community is hungry for professional development opportunities,” says Ebert. “By providing private training, we have motivated engineers to learn more about decentralized identity, develop use cases, and shorten the time between development, demonstration and even launch. We tailor our training to those points in the decentralized identity development journey that give the most trouble to those who want to pick up this technology and get to work building products and services.”

To book a private, instructor-led Mediator Agent training workshop, contact us.

Indicio-hosted Mediator Agent ready for developer testing

Those who want to build solutions using a mediator agent now can use Indicio’s free, hosted version of the mediator for testing, development, and demonstrations. This will shorten the time to develop proof-of-concept and pilot solutions, and help with ensuring interoperability.

Anyone in the decentralized identity community will have access to the mediator agent code via the open source contribution in the ACA-Py library.

You can access the Indicio-hosted Public Test Mediator Agent here.

A portfolio of decentralized identity services

Indicio.tech is committed to becoming a resource-hub for decentralized identity, providing enterprise-grade open source tools to its clients and to the community. This includes the Private Networks build service, the Indicio TestNet, and a variety of customizable training programs. Companies and organizations from diverse industries around the world rely on Indicio.tech for expertise in building their decentralized identity solutions.

“We are focused on taking decentralized identity from a great idea to a great solution,” says Dahl. “For that to happen, companies and organizations need to focus on the products they want to build and the services they want to offer. They need all the technological elements of decentralized identity finished, working, and available. This is a major step toward that goal.”


The post Indicio.tech advances decentralized identity with release of critical open source technology appeared first on Indicio Tech.


Ocean Protocol

2020 — A Year to Remember

Looking Back on Ocean’s Progress and Growth

2020 has been a long, albeit memorable, year for the Ocean team.

Crypto is intense in normal times, but when you layer on a global health crisis that leads to unprecedented disruption of our day-to-day activities, the rollercoaster flipped and flopped us around like never before. It’s hard to see how any other year can compare.

To a large extent, crypto has been insulated from the economic upheaval. Crypto might even have benefited. The government response of quantitative easing and stimulus programs raised fears of currency debasement, which brought new converts and established crypto as a reliable asset class and a store of value.

Teams turned inwards to build out technology and focus on user adoption, rather than travel around to conferences. Projects cut out expenses to extend runways and ready themselves for an extended crypto-winter. And as the year progressed, people were finally able to play with working production code, and integrate with other projects to allow the entire space to mature together at a rapid pace.

At Ocean, we started 2020 with an ambitious plan to fulfill all of the covenants made since our project inception in 2017. Having released Ocean V1, which included the Pacific Network and our service execution agreements to govern access control, at the end of 2019, in 2020 we wanted to:

- Deliver Ocean V2 “Compute-to-Data”, our most requested feature
- Re-architect and build out Ocean V3, with datatokens using Ethereum infrastructure for the smart contracts and the ERC20 datatokens
- Build Ocean Market as a platform for data providers to meet with consumers and help launch Initial Data Offerings
- Have Ocean running in a fully decentralized manner on unstoppable infrastructure
- Deploy a DAO and set the stage for community-based funding of Ocean ecosystem projects

These were ambitious goals. We were going to give it our best shot to achieve them.

I’m pleased to declare that all of the promises made in 2017 were fully discharged. Everything that was promised was delivered.

Even better, the nascent community around Ocean grew into a sizable and vocal “Navy”. The many ecosystem programs that had germinated in 2019 were fully deployed in 2020. The Ocean Ambassador, Shipyard and Data Economy Challenge programs were expanded and are reaching critical mass. The Telegram community became more diverse, with multiple languages being supported by community members. We have a lively discussion on Discord. And the number of new contributors to our Github repository is growing rapidly.

We struck new partnerships with other projects and companies like Daimler, dexFreight, Filecoin, GAIA-X, Secret Network, the United Nations ITU and many more.

The $OCEAN token has increased in value 30x from the lows in Q3/2019, the number of holders has increased from 5,000 to over 20,000, and it is now listed on more than 35 exchanges with traded volumes ranging from $10–25 million per day. $OCEAN tokens have come so far from the days when there would be only $20–50k in daily traded volume.

Looking solely from Ocean’s perspective, 2020 was a breakout year in all ways. We are truly blessed as a global, borderless community, fighting the good fight to build an open Data Economy that is accessible to everyone.

2021 promises to be a year of growth — see you there!

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

2020 — A Year to Remember was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Coinfirm

2020: A Year in Review

It’s been a very, very busy year for the crypto markets and the Coinfirm team. Here is our annual Year in Review for 2020.

Elliptic

Treasury’s Unhosted Wallet Proposal: Unnecessary, Ineffective and Counterproductive


Elliptic does not believe that the US Treasury’s proposed rules on unhosted wallet transactions are fit for purpose. The data shows that the risk has been overstated and the rules could in fact make it more challenging for law enforcement to pursue financial crime in cryptoassets.

We have called on FinCEN to extend the shortened comment period and reconsider their proposal.

Wednesday, 30. December 2020

Trinsic (was streetcred)

Trinsic in 2020: A Year in Review


The year 2020 was quite the year for everyone. Despite its challenges, 2020 was a significant year of growth for the Trinsic platform, team, and brand. Below, we highlight the moments of growth in 2020 that we will never forget.

The world's first SSI platform is born

At the beginning of 2020, we launched the world’s first self-sovereign identity (SSI) platform. This included our powerful multi-tenant API platform, the Developer Portal (now called Trinsic Studio) and a new and improved version of the Streetcred Wallet (now called Trinsic Wallet). Trinsic Studio, built on our powerful API platform, made it easy for developers to start issuing credentials in minutes and manage their credential exchange.


It was the first production-grade SSI platform to market (prior to this there were only betas, alphas, early-access, pre-release, and other non-production solutions). And it was the first time anyone, not just software engineers, could get started with SSI.


Fast forward to today, with hundreds of developers actively using Trinsic, we decided it was time to release the next generation of our Studio. The release of Trinsic Studio 2.0 included a simplified pricing model, an improved design, and better performance. These changes to Trinsic Studio have made SSI more accessible than ever. See for yourself at https://trinsic.studio.

Trinsic receives pre-seed funding & rebrands

One of the biggest moments in 2020 was when we raised pre-seed funding from institutional investors and rebranded from Streetcred ID to Trinsic. Kickstart Seed Fund led the oversubscribed round as the first investment of its recently-closed $110 million fund.


Of this investment, Dalton Wright, Partner at Kickstart Seed Fund, said, “Decentralized identity enables us to finally introduce human trust to the internet, something that has been sorely needed for decades. We’ve been anxiously waiting for an opportunity to invest in this space for years, and when we came across Trinsic, we knew we wanted in. Trinsic is doing for decentralized identity what Stripe did for online payments.”


At the same time as our funding announcement, we introduced our new company name and brand—Trinsic. This rebrand included the launch of our new website and design system. It has been great to see how our new name has been received with open arms from our developer community and the SSI community in general.

Product and feature improvements

One of the main questions at Trinsic that drives our decision-making is: What will make SSI more accessible? With the excellent feedback we receive from our developer community and the skills of our world-class engineering team, we consistently work to improve our platform to answer that question. In addition to the ongoing addition of new Aries RFCs and extensions to the platform, here are a few noteworthy product and feature updates:


- Provider API: We added a third API to our platform called the Provider API—the first product built specifically for SSI providers. The Provider API allows you to programmatically create and manage credential issuers/verifiers.
- Portable digital wallets: Trinsic, Lissi, and esatus AG were the first SSI vendors to achieve wallet portability. This means that for the first time a person could stop using one type of digital wallet and start using a new one without losing their credentials.
- New & improved documentation site: We released a new documentation site which includes a knowledge base for those that want to learn SSI concepts, our API reference, and a public developer forum.
- Interactive connections: Interactive connections enable people to have richer interactions with others when using the Trinsic Wallet. This includes the following features: proactive credential sharing, interoperable chat messaging, an activity log, and connection invitations.

Integrate SSI directly with 2,000+ other applications

This year, Trinsic partnered with workflow automation platform Zapier to bring verifiable credentials to over 2,000 other SaaS apps you use every day, including Office 365, GSuite, Adobe, Atlassian, Slack, Salesforce, and many more. After we released this, the Internet Identity Workshop (IIW) used it to integrate Eventbrite (registration & ticketing) with Trinsic (verifiable credentials) in order to issue IIW tickets as verifiable credentials. See our tutorial to learn how to get started fast with an example.

COVID-19 and verifiable credentials

At the beginning of the global pandemic, Trinsic along with many others in the SSI space, saw the potential that verifiable credentials could have in mitigating the effects of COVID-19. We responded by helping start the COVID-19 Credentials Initiative and called upon the world to explore how verifiable credentials could be used to respond to the virus. Since our initial call to action, dozens of developers and organizations have used Trinsic to build SSI solutions related to COVID-19. The solutions include receiving COVID-19 testing and vaccination credentials. Here are a few examples:


- MedCreds: Reducing the Risk of Returning to Work During COVID-19
- Decreasing Unemployment Among African Youth Using Verifiable Credentials
- Combining Verifiable Credentials and Smart Contracts for COVID-19 Data Management

Webinars & trainings

In 2020 we started a webinar series focused on Trinsic’s platform and the latest trends in the SSI community. Webinar attendance and feedback have been fantastic, and we look forward to continuing our monthly cadence in 2021. The webinars and trainings we have done to date are:

Intro to SSI for Developers: Architecting Software Using Verifiable Credentials

Passwordless Login Using Verifiable Credentials

Making Money with SSI

How to Integrate Verifiable Credentials into Any Application

Trinsic's blog

This year we launched a robust blog that focuses on platform updates, announcements, thought leadership, basic SSI concepts, and tutorials. Here are some hand-picked selections:

Announcements
- Trinsic & Zapier Partner to Bring SSI to 2000+ Applications
- Trinsic Cements its Commitment to Interoperability Ahead of Internet Identity Workshop XXXI
- Trinsic Joins Trust over IP Foundation as Founding Member

Thought leadership
- What Is Self-Sovereign Identity?
- Building SSI Digital Wallets: The 3 Major Decisions
- SSI and the Cloud

Trinsic Basics
- What Are Verifiable Credentials?
- What Are SSI Digital Wallets?
- What Are SSI Standards?

Studio tutorials
- How to Issue Credentials in Trinsic Studio
- How to Create Connections in Trinsic Studio
- How to Verify Credentials in Trinsic Studio

Looking forward to 2021

The year 2020 has brought its unique challenges for everyone, but we have gratefully experienced tremendous moments of success and growth as a company. In 2021, we look forward to continuing to build our enterprise-grade SSI platform and make SSI as accessible as possible to everyone. Big things are coming for the Trinsic platform and the SSI industry, and we look forward to making the global adoption of SSI a reality.

The post Trinsic in 2020: A Year in Review appeared first on Trinsic.


Indicio

Indicio becomes a public benefit corporation


New structure supports the company’s mission, values, and its belief that identity technology should serve the public interest.

Decentralized identity is a transformational technology that can protect an individual’s privacy, enable their consent in data sharing, and provide a pathway to formal identity for hundreds of millions of people currently without any legal means of proving who they are. Indicio.tech was founded to advance decentralized identity by providing the kind of professional services and critical infrastructure that can catalyze adoption of this technology. Today, in recognition of the role it can play in building and shaping a technology for the greater good, Indicio announces that it has reincorporated as a public benefit corporation (PBC).

Indicio joins companies such as Patagonia and Kickstarter in embracing a corporate model that aligns shareholders and stakeholders around a shared mission to deliver a material benefit to society, not only through products and services and how they are made and delivered, but through prioritizing the welfare of employees, diversity and inclusion, and environmental impact.

“When it comes to our digital lives, it is hard to think of a technological advance more beneficial to the public than decentralized identity,” says Heather Dahl, CEO of Indicio. “It will transform people’s digital lives by giving them control over who they are online and who they share information with. It will create identity for the hundreds of millions of people who currently lack formal, legal identity, which means giving them a way to access financial and health services. The advances in identity technology help us recover some of the lost, early idealism of the internet as a benefit to everyone. And while we know one company can’t save the world, we can take a stand about how the world can be a better place. Decentralized identity is our stand.”

As a Delaware PBC, the company will operate under the same management structure and corporate and tax laws it does today and with the same commitment to strong growth and profitability.

“Decentralized identity needs a variety of business models to rapidly scale,” says Dahl. “And we think for Indicio, the PBC model combines the best attributes of the traditional for-profit corporation with the public mission orientation of a nonprofit. We need to be agile. We need to be sustainable. We need to be innovative. And we need all of these qualities to be directed, without compromise, toward advancing decentralized identity.”

“For Indicio, becoming a PBC means honoring the idealism of the open source community that brought decentralized identity technology into existence,” says CTO Ken Ebert. “This means open sourcing the infrastructure that we build, and making interoperability the compass point that directs how we build for others. Indicio has already begun doing this by open-sourcing its monitoring tools package, and the company is about to release more tools and services that will make it easier for companies to develop and use decentralized identity solutions.”

As a PBC, Indicio will continue to pioneer architectural solutions and deliver superlative development and engineering support to its list of global clients, and it will do so by cultivating a company culture where employees and interns can get the professional development and mentoring they need in order to consistently deliver their best.

“When we reflect on the values that inspired our launch, propelled our growth, and delivered for our clients, we want to bake them into our company,” says Dahl. “We want to hold ourselves accountable to those values, and we want to be held publicly accountable for them. That’s a powerful feature of the PBC model. And just as it has enabled credible, third-party assessment of whether a company is delivering on its environmental commitments, we see it as providing a path for identity technology to be assessed in a similar way. There’s a long way to go, but at a time when technology is under increasing criticism, we have a chance to build better and audit better from the beginning.”

Indicio joins a growing number of companies worldwide embracing the public benefit corporate model, recognizing that businesses can build greater long-term value by committing to stakeholders, employees, and communities. So far, 35 states and the District of Columbia have passed legislation enabling public benefit corporations (sometimes called benefit corporations), and many countries have followed with similar legislation.

Indicio’s PBC status will position the company as a leader among trusted identity platform builders as it advances the technology, serves its industries, and connects the growing field of decentralized identity vendors. Indicio will set out its public benefit goals in the coming weeks.

 

###

About Indicio 

Indicio.tech is a professional services firm specializing in decentralized identity architecture, engineering, and consultancy. Indicio provides expert guidance to a global community of clients on the use of verifiable credentials to build digital identity solutions. The decentralized networks and tools created by Indicio make verifiable credentials easy to adopt, simple to deploy, and reliable to use. As a Public Benefit Corporation, Indicio is committed to advancing decentralized identity as a public good that enables people to control their identities online and share their data by consent. Indicio believes in privacy and security by design, interoperability, and supports the open source goals of the decentralized identity community.


The post Indicio becomes a public benefit corporation appeared first on Indicio Tech.


KYC Chain

2020 Travel Rule Development Review

Arguably one of the biggest drivers of innovation in the regtech industry has been the movement to accelerate adoption of the FATF’s Recommendation 16, commonly referred to as the Travel Rule. That movement has sparked a wide range of new developments, initiatives, and solutions, which we cover in this article.

The post 2020 Travel Rule Development Review appeared first on KYC-Chain.


Okta

How to Write Secure SQL Common Table Expressions

Common table expressions are a powerful feature of Microsoft SQL Server. They allow you to store a temporary result and execute a statement afterward using that result set. These can be helpful when trying to accomplish a complicated process that SQL Server isn’t well suited to handle. CTEs allow you to perform difficult operations in two distinct steps that make the challenge easier to solve.


In this article, you will learn how to write common table expressions using Microsoft SQL Server. You will then learn how to use that statement in a .NET Core MVC web application that is secured using Okta. Okta is a powerful yet easy-to-use single sign-on provider. By making use of Okta’s Okta.AspNetCore package from NuGet, you will learn how to properly secure your application and any data from your CTEs.

Secure Your SQL CTE with an Okta Application

The first thing you want to do is set up your Okta application to handle your authentication. If you haven’t done so yet, you can sign up for a free developer account here.

Log in to your Okta Developer console and click on Add Application. Select Web and click Next. On the next page, give your Okta application a meaningful name. You will also want to change the ports in your URIs from 8080 to 3000. Finally, set your Logout redirect URIs to http://localhost:3000/signout/callback.

Click on Done, and you will be taken to your application home screen. Make note of your Client ID, Client secret, and Okta domain, as you will need these in your web application.

Prepare the SQL Database to Use with Your CTE

To work on your database, you will need a database first. Microsoft provides several samples via GitHub. For this project, I used the Wide World Importers sample database v1.0. To use it, you will need at least SQL Server 2016 installed. Microsoft provides .bak and .bacpac files for you to use.

Common Table Expressions (CTEs)

Common table expressions are a temporary result set created by a simple query. Once this result set is obtained, you can perform SELECT, INSERT, DELETE, or MERGE operations on it. CTEs can be used in place of a complicated query - one with difficult joins and logic in it. By operating on a temporary result set, you can simplify the process, making it more readable, easier to optimize, and easier to debug. Let’s take a look at how CTEs work, and how they can make your life easier.
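Before working through the Wide World Importers examples, here is a minimal sketch of the general shape of a CTE (the table and column names below are illustrative, not part of the sample database):

WITH recent_orders (OrderID, CustomerID) AS
(
    -- The CTE: a simple query that defines the temporary result set
    SELECT OrderID, CustomerID
    FROM Orders
    WHERE OrderDate >= '2020-01-01'
)
-- The statement that operates on the temporary result set
SELECT CustomerID, COUNT(*) AS OrderCount
FROM recent_orders
GROUP BY CustomerID

The WITH clause names the result set and its columns; the statement immediately following it is the one statement that can reference that name.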

The first thing you notice is that the tax rates are all wrong. You aren’t supposed to charge tax unless the DeliveryPostalCode is 90490. So, you’ll need to update each line item in the InvoiceLines table. But, to get the DeliveryPostalCode, you need a join from the InvoiceLines table to the Invoices table, then a join to the Customers table. You can gather the DeliveryPostalCode and associate it with the InvoiceID using SELECT in the common table expression below. Next, you can run one statement using the temporary result set tax_update. After creating your result set of InvoiceID and DeliveryPostalCode, you can update the InvoiceLines table with the new TaxRate, and then you can update the TaxAmount and ExtendedPrice with ease.

--****************************
--BEGIN CTE
--****************************
WITH tax_update (InvoiceID, DeliveryPostalCode) AS
(
    SELECT [WideWorldImporters].[Sales].[Invoices].InvoiceID, DeliveryPostalCode
    FROM [WideWorldImporters].[Sales].[Invoices]
    INNER JOIN [WideWorldImporters].[Sales].[InvoiceLines]
        ON [Invoices].[InvoiceID] = [InvoiceLines].[InvoiceID]
    INNER JOIN [Sales].[Customers]
        ON [WideWorldImporters].[Sales].[Invoices].CustomerID = [Sales].[Customers].CustomerID
)
UPDATE [WideWorldImporters].[Sales].[InvoiceLines]
SET TaxRate = CASE WHEN DeliveryPostalCode = '90490' THEN 6.00 ELSE 0 END
FROM tax_update
WHERE [WideWorldImporters].[Sales].[InvoiceLines].InvoiceID = tax_update.InvoiceId
--****************************
--END CTE
--****************************
UPDATE [WideWorldImporters].[Sales].[InvoiceLines]
SET TaxAmount = TaxRate * Quantity * UnitPrice

UPDATE [WideWorldImporters].[Sales].[InvoiceLines]
SET ExtendedPrice = TaxAmount + (Quantity * UnitPrice)

Next, you can write a select using a CTE. The example below is a bit simple for a CTE (you can accomplish this just with a join) but it serves the purpose of showing you how a CTE is written for selects.

Here, you are selecting the CustomerID to use to obtain the CustomerName from the Customers table, along with the StockItemId to obtain the StockItemName in your application.

WITH customer_items (CustomerID, StockItemId, Quantity, LineProfit) AS
(
    SELECT CustomerID, StockItemId,
        SUM(Quantity) AS Quantity,
        SUM(LineProfit) AS LineProfit
    FROM [WideWorldImporters].[Sales].[Invoices]
    INNER JOIN [WideWorldImporters].[Sales].[InvoiceLines]
        ON [Invoices].[InvoiceID] = [InvoiceLines].[InvoiceID]
    GROUP BY CustomerID, StockItemID
)
SELECT CustomerName, customer_items.Quantity, customer_items.LineProfit, [Warehouse].[StockItems].[StockItemName]
FROM customer_items
INNER JOIN [Sales].[Customers]
    ON [Sales].[Customers].[CustomerId] = customer_items.CustomerID
INNER JOIN [Warehouse].[StockItems]
    ON customer_items.StockItemId = [Warehouse].[StockItems].[StockItemID]

Create Your ASP.NET Core Web Application

Your SQL database is now set up, and it’s time to begin working on your web application. Open Visual Studio 2019 and Create a new project. Select ASP.NET Core Web Application and press Next. Select Web Application (Model-View-Controller). Ensure your framework is set to .NET Core 3.1 and uncheck Configure for HTTPS. Press Create and wait for your application to scaffold.

Once your application is created, open the project properties window and change your App URL to http://localhost:3000 to match your Okta settings. Next, import the Okta.AspNetCore package from NuGet.

Install-Package Okta.AspNetCore -Version 3.5.0

Once that is completed, you can begin to add your code. First, take a look at your appsettings.json file. This is where you will add application-specific variables such as your Okta information or any connection strings. Replace the code in this file with the following. You will need to replace the WideWorldImporters.ConnectionString information with your connection string.

{ "Logging": { "LogLevel": { "Default": "Information", "Microsoft": "Warning", "Microsoft.Hosting.Lifetime": "Information" } }, "Okta": { "OktaDomain": "", "ClientId": "{yourClientId}", "ClientSecret": "{yourClientSecret}" }, "WideWorldImporters": { "ConnectionString": "{yourConnectionString}" } }

Next, add a file to hold the SQL Settings for WideWorldImporters. Add a new folder called Settings, and add a class file called SqlSettings.cs to it. Add the following code:

public class SqlSettings
{
    public string ConnectionString { get; set; }
}

This very simple class will be populated at start-up and injected into your controllers as needed. You can see that process by opening your Startup.cs file and replacing the code with the following:

public class Startup
{
    public Startup(IConfiguration configuration)
    {
        Configuration = configuration;
    }

    public IConfiguration Configuration { get; }

    // This method gets called by the runtime. Use this method to add services to the container.
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddAuthentication(options =>
        {
            options.DefaultScheme = CookieAuthenticationDefaults.AuthenticationScheme;
            options.DefaultChallengeScheme = OpenIdConnectDefaults.AuthenticationScheme;
        })
        .AddCookie()
        .AddOktaMvc(new OktaMvcOptions
        {
            OktaDomain = Configuration.GetValue<string>("Okta:OktaDomain"),
            ClientId = Configuration.GetValue<string>("Okta:ClientId"),
            ClientSecret = Configuration.GetValue<string>("Okta:ClientSecret"),
            Scope = new List<string> { "openid", "profile", "email" },
        });

        services.Configure<Settings.SqlSettings>(Configuration.GetSection("WideWorldImporters"));

        services.AddControllersWithViews();
    }

    // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        if (env.IsDevelopment())
        {
            app.UseDeveloperExceptionPage();
        }
        else
        {
            app.UseExceptionHandler("/Home/Error");
            // The default HSTS value is 30 days. You may want to change this for production scenarios, see https://aka.ms/aspnetcore-hsts.
            app.UseHsts();
        }
        app.UseHttpsRedirection();
        app.UseStaticFiles();

        app.UseRouting();

        app.UseAuthentication();
        app.UseAuthorization();

        app.UseEndpoints(endpoints =>
        {
            endpoints.MapControllerRoute(
                name: "default",
                pattern: "{controller=Home}/{action=Index}/{id?}");
        });
    }
}

Most of this code is boilerplate, but there are a few things you should note. First, the Configure method doesn’t pre-populate with app.UseAuthentication(), so you will need to add it here.

Next, you set up your Okta middleware in the ConfigureServices method. You also register your WideWorldImporters SQL configuration in this method.

To consume the WideWorldImporters in your controller, you will need to let .NET Core inject it, and then use it. You can see how this is done in the DashboardController. Add a file to your Controllers folder called DashboardController.cs, and replace the code with the following:

public class DashboardController : Controller
{
    IOptions<Settings.SqlSettings> _sqlSettings;

    public DashboardController(IOptions<Settings.SqlSettings> sqlSettings)
    {
        _sqlSettings = sqlSettings;
    }

    [Authorize]
    public IActionResult Index()
    {
        return View(getDashboardIndexModel());
    }

    protected Models.DashboardIndexViewModel getDashboardIndexModel()
    {
        DataTable dt = new DataTable();

        string cmdText = @"WITH customer_items (CustomerID, StockItemId, Quantity, LineProfit) AS
            (
                SELECT CustomerID, StockItemId,
                    SUM(Quantity) AS Quantity,
                    SUM(LineProfit) AS LineProfit
                FROM [WideWorldImporters].[Sales].[Invoices]
                INNER JOIN [WideWorldImporters].[Sales].[InvoiceLines]
                    ON [Invoices].[InvoiceID] = [InvoiceLines].[InvoiceID]
                GROUP BY CustomerID, StockItemID
            )
            SELECT CustomerName, customer_items.Quantity, customer_items.LineProfit, [Warehouse].[StockItems].[StockItemName]
            FROM customer_items
            INNER JOIN [Sales].[Customers]
                ON [Sales].[Customers].[CustomerId] = customer_items.CustomerID
            INNER JOIN [Warehouse].[StockItems]
                ON customer_items.StockItemId = [Warehouse].[StockItems].[StockItemID]";

        SqlDataAdapter da = new SqlDataAdapter(cmdText, new SqlConnection(_sqlSettings.Value.ConnectionString));
        da.Fill(dt);

        return new Models.DashboardIndexViewModel(dt);
    }
}

The application injects the IOptions<Settings.SqlSettings> object into this controller. You can reference it later to obtain the connection string for your database. Speaking of your database, this controller also contains the logic for building the model for your view. You will add the model momentarily, but for now, notice that you are using ADO.NET to call the CTE you wrote earlier. This works just as well with Dapper or Entity Framework; ADO.NET was chosen here because it’s the simplest to set up.
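One security note on the data-access code: the CTE above is a fixed string, but if you ever build one that filters on user input, pass that input as a parameter rather than concatenating it into the SQL text. Below is a minimal sketch, assuming a hypothetical postal-code filter and helper method (neither is part of the tutorial code above):

// Hypothetical helper: filter a CTE on user input via a parameter,
// so the input can never be interpreted as SQL.
// Requires: using System.Data.SqlClient;
protected int CountInvoicesForPostalCode(string userSuppliedCode)
{
    string cmdText = @"WITH invoices_by_code (InvoiceID) AS
        (
            SELECT InvoiceID
            FROM [WideWorldImporters].[Sales].[Invoices]
            WHERE DeliveryPostalCode = @postalCode
        )
        SELECT COUNT(*) FROM invoices_by_code";

    using (var conn = new SqlConnection(_sqlSettings.Value.ConnectionString))
    using (var cmd = new SqlCommand(cmdText, conn))
    {
        cmd.Parameters.AddWithValue("@postalCode", userSuppliedCode);
        conn.Open();
        return (int)cmd.ExecuteScalar();
    }
}

Parameters work inside CTEs exactly as they do in plain queries, so there is no need to trade readability for safety.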

Add a new class to the Models folder called DashboardIndexViewModel and add the following code to it:

public class DashboardIndexViewModel
{
    public List<DashboardLineItem> Items { get; set; }

    public DashboardIndexViewModel(System.Data.DataTable items)
    {
        Items = new List<DashboardLineItem>();

        foreach (System.Data.DataRow row in items.Rows)
        {
            Items.Add(
                new DashboardLineItem(
                    row["CustomerName"].ToString(),
                    Convert.ToInt32(row["Quantity"]),
                    Convert.ToDecimal(row["LineProfit"]),
                    row["StockItemName"].ToString()
                )
            );
        }
    }
}

public class DashboardLineItem
{
    public string CustomerName { get; set; }
    public string StockItemName { get; set; }
    public int Quantity { get; set; }
    public decimal LineProfit { get; set; }

    public DashboardLineItem(string customerName, int quantity, decimal lineProfit, string stockItemName)
    {
        CustomerName = customerName;
        Quantity = quantity;
        LineProfit = lineProfit;
        StockItemName = stockItemName;
    }
}

This model is just taking the items from the query you ran earlier and injecting them into a nice view-model for your view.

Next, add a controller to your Controllers folder called AccountController if one doesn’t already exist. Replace the code with the following.

public class AccountController : Controller
{
    public IActionResult SignIn()
    {
        if (!HttpContext.User.Identity.IsAuthenticated)
        {
            return Challenge(OktaDefaults.MvcAuthenticationScheme);
        }
        return RedirectToAction("Index", "Dashboard");
    }

    [HttpPost]
    public IActionResult SignOut()
    {
        return new SignOutResult(
            new[]
            {
                OktaDefaults.MvcAuthenticationScheme,
                CookieAuthenticationDefaults.AuthenticationScheme,
            },
            new AuthenticationProperties { RedirectUri = "/Home/" }
        );
    }
}

This code will set your application to use Okta authentication. In the SignIn method, you look to see if the user is already logged in. If they aren’t, you return a challenge which will redirect them to Okta for authentication. Once the user is logged in, they will be directed to the Dashboard/Index page. The signout method will redirect users back to the home page.

Finally, you need to add your views. First, open your Shared/_Layout.cshtml file and replace the code with the following.

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>@ViewData["Title"] - CTEsDemo</title>
    <link rel="stylesheet" href="~/lib/bootstrap/dist/css/bootstrap.min.css" />
    <link rel="stylesheet" href="~/css/site.css" />
</head>
<body>
    <header>
        <nav class="navbar navbar-expand-sm navbar-toggleable-sm navbar-light bg-white border-bottom box-shadow mb-3">
            <div class="container">
                <a class="navbar-brand" asp-area="" asp-controller="Home" asp-action="Index">CTEsDemo</a>
                <button class="navbar-toggler" type="button" data-toggle="collapse" data-target=".navbar-collapse" aria-controls="navbarSupportedContent" aria-expanded="false" aria-label="Toggle navigation">
                    <span class="navbar-toggler-icon"></span>
                </button>
                <div class="navbar-collapse collapse d-sm-inline-flex flex-sm-row-reverse">
                    <ul class="navbar-nav flex-grow-1">
                        <li class="nav-item">
                            <a class="nav-link text-dark" asp-area="" asp-controller="Dashboard" asp-action="Index">Dashboard</a>
                        </li>
                    </ul>
                </div>
                @if (User.Identity.IsAuthenticated)
                {
                    <ul class="nav navbar-nav navbar-right">
                        <li>
                            <form class="form-inline" asp-controller="Account" asp-action="SignOut" method="post">
                                <button type="submit" class="nav-link btn btn-link text-dark" id="logout-button">Sign Out</button>
                            </form>
                        </li>
                    </ul>
                }
                else
                {
                    <ul class="nav navbar-nav navbar-right">
                        <li><a asp-controller="Account" asp-action="SignIn" id="login-button">Sign In</a></li>
                    </ul>
                }
            </div>
        </nav>
    </header>
    <div class="container">
        <main role="main" class="pb-3">
            @RenderBody()
        </main>
    </div>
    <footer class="border-top footer text-muted">
        <div class="container">
            &copy; 2020 - CTEsDemo - <a asp-area="" asp-controller="Home" asp-action="Privacy">Privacy</a>
            A small demo app by <a href="https://profile.fishbowlllc.com">Nik Fisher</a>
        </div>
    </footer>
    <script src="~/lib/jquery/dist/jquery.min.js"></script>
    <script src="~/lib/bootstrap/dist/js/bootstrap.bundle.min.js"></script>
    <script src="~/js/site.js" asp-append-version="true"></script>
    @RenderSection("Scripts", required: false)
</body>
</html>

This view has some logic that detects if the user is logged in or not. If the user isn’t logged in, you will display a Login button to them; if they are already logged in, you’ll display a Logout button.

Next, open your home page and add the following code to it:

@{ ViewData["Title"] = "Home Page"; } <div class="jumbotron"> <div class="container"> <h1 class="display-3">Common Table Expressions</h1> <p>A small demonstration application for writing Common Table Expressions in <a href="https://www.microsoft.com/en-us/sql-server/sql-server-downloads" target="_blank" rel="noreferrer">Microsoft SQL Server</a> and securing them with <a href="https://www.okta.com/" target="_blank" rel="noreferrer">Okta</a> on .NET Core 3.1.</p> </div> </div> <div class="container"> <!-- Example row of columns --> <div class="row"> <div class="col-md-4"> <h2>SQL Server</h2> <p><a class="btn btn-secondary" href="https://www.microsoft.com/en-us/sql-server/sql-server-downloads" role="button">View details &raquo;</a></p> </div> <div class="col-md-4"> <h2>.NET Core</h2> <p>.NET Core is a free, cross-platform, open-source developer platform for building many different types of applications.</p> <p><a class="btn btn-secondary" href="https://dotnet.microsoft.com/download/dotnet-core" role="button">View details &raquo;</a></p> </div> <div class="col-md-4"> <h2>Okta</h2> <p>The Okta Identity Cloud gives you one trusted platform to secure every identity in your organization and connect with all your customers.</p> <p><a class="btn btn-secondary" href="https://www.okta.com/" role="button">View details &raquo;</a></p> </div> </div> <hr> </div> <!-- /container -->

There is nothing critical to the application here; it simply provides some extra links for you to read for further learning.

Finally, add or update your Dashboard/Index.cshtml view with the following code:

@{ ViewData["Title"] = "Index"; } @model CTEsDemo.Models.DashboardIndexViewModel <table class="table table-striped"> <thead> <tr> <th>Customer Name</th> <th>Quantity</th> <th>Line Profit</th> <th>Stock Item Name</th> </tr> </thead> <tbody> @foreach(var item in Model.Items) { <tr> <td>@item.CustomerName</td> <td>@item.Quantity</td> <td>@item.LineProfit.ToString("C")</td> <td>@item.StockItemName</td> </tr> } </tbody> </table>

This view displays the data in a nice table for your users to see.

Test Your Application

Your application is now ready to start. Press F5 to begin debugging. You should be presented with the home page. From there, you can click on Login or Dashboard. Either should bring you to the Okta login screen. Log in with your Okta account, and you will be presented with the Dashboard.

Check out this project’s repo on GitHub.

Learn More About .NET & Okta

If you are interested in learning more about security and .NET, check out these other great articles:

ASP.NET Core 3.0 MVC Secure Authentication
Migrate Your ASP.NET Framework to ASP.NET Core with Okta
Build an Incredibly Fast Website with Dapper + C#

Make sure to follow us on Twitter, subscribe to our YouTube Channel and check out our Twitch channel so that you never miss any awesome content!

Tuesday, 29. December 2020

KEYLESS

Keyless update: 2020 in review

2020 has been a year like no other. Yet despite the many challenges we’ve faced — we’ve managed to scale our company, and launch a full suite of passwordless authentication products and services for the enterprise.

The year that pushed the world to passwordless

Like all businesses, new and old, we’ve had our company direction rerouted as a result of unprecedented travel restrictions, lockdowns, a global recession, and the tumultuous political and social movements that captured and held the attention of people all around the world.

The events of this unforgettable year forced businesses to adapt how they operate — and one positive outcome we noticed is that this has catapulted enterprises into an era of accelerated digital transformation. We’ve seen unprecedented recognition of the need to protect critical systems and data with zero-trust security solutions — and as a result we’re seeing an increasing appetite for passwordless authentication solutions like ours, powered by privacy-first technology.

In what was already a turning-point year for passwordless authentication, the urgency for privacy-first, zero-trust solutions has been continually fueled by high-profile hacks and breaches such as the Russian-led SolarWinds attack on US government agencies, the Twitter hack that targeted the followers of high-profile users, and the ongoing Spotify hacks… to name just a few.

Read more on what our CTO and co-founder has to say on COVID-19 driving adoption of passwordless solutions…

Will COVID-19 be the Catalyst to Finally Replace Passwords With Biometrics?

How far we’ve come in just one year

At this time last year, we’d only just finished building out our platform’s core technology — the Keyless Protocol. The Keyless Protocol combines privacy-enhancing technologies with modern biometrics to enable real-time biometric authentication powered by the Keyless Network. The Keyless Network utilizes a distributed cloud architecture to process and store encrypted authentication data.

Breakthroughs in biometric authentication

To date, we have successfully applied for five patents for our novel technology — one non-provisional and four provisional. Our breakthroughs in privacy-first, distributed biometric authentication are enabling a shift away from device-dependent authentication, towards a more unified and user-friendly authentication experience.

The breakthrough shift from device-dependent biometric authentication to device-independent biometric authentication means we can use a single, universal set of biometric templates to authenticate and identify users across multiple devices, in a way that enhances privacy compliance rather than impeding it.

As a result we’re able to provide secure, private and user-friendly solutions that can help solve some of the larger issues around how we manage identities across the internet.

Ensuring Privacy and Security Compliance

Since biometric data is considered sensitive, personal information, the use of biometrics is regulated in accordance with the General Data Protection Regulation (GDPR) in Europe, and other similar regulations around the world such as the California Consumer Privacy Act (CCPA) in the United States.

We’ve made significant headway validating our compliance stance with these regulations, as well as with directives such as the Revised Payments Services Directive (PSD2), which mandates what strong-customer authentication solutions should provide.

In total we’ve completed five compliance assessments this year, and have been actively engaging industry alliances, such as the FIDO Alliance, to educate certification bodies around the benefits of applying privacy-enhancing technologies and distributed cloud systems to biometric-enabled authentication.

Rapid growth, and US $6.2 million in backing

This year we scaled our team from 15 to 32 employees in under twelve months, prompting a move into brand-new offices in the centers of Rome and London. We also brought our funding to US $6.2 million — a tremendous sign of confidence in uncertain times.

Putting Keyless on the passwordless radar

In just one year, we’ve managed to build, test and launch a total of five products that serve the full spectrum of enterprise use cases for workforce and consumer passwordless authentication.

Our workforce authentication products suite includes passwordless authentication solutions for single sign-on (SSO) with providers such as Okta and Microsoft, VPN, Virtual Desktop Infrastructure (VDI) and Windows Workstation Login — all powered by the Keyless Authenticator.

Check out our Workforce Authentication solutions here…

Keyless | Zero-Trust Passwordless Authentication

The Keyless Authenticator is an app that users or employees can download on any of their devices to enable seamless, privacy-enhancing access to all of these services.

For consumer authentication, we offer the Keyless SDK for both Android and iOS platforms. The Keyless SDK can be integrated into existing applications in a matter of hours. It offers fast, secure and frictionless, consumer multi-factor authentication (MFA) powered by the Keyless technology.

Check out our consumer authentication solutions here…

Keyless | Zero-Trust Passwordless Authentication

On top of launching these products, we’ve partnered with the leading IAM and cloud providers so that customers can integrate and deploy our passwordless solutions quickly and easily.

We’ve so far partnered with Microsoft, Okta, and ForgeRock to help expand their identity management solutions with privacy-first passwordless authentication. We’ve also partnered with IBM and Cisco.

Partnering with Cisco to help universities adapt quickly to remote-learning

One of the achievements our team is most proud of is the first commercial deployment of our passwordless authentication technology in the education sector.

This year, we partnered with Cisco to help Luiss Guido Carli University, one of Rome’s leading higher education institutions, facilitate secure, remote authentication for students and staff during online classes and exams.

Unlike other passwordless solutions, Keyless offers Unique Student (or user) Identification. We can offer this feature because, with Keyless, users have one universal set of biometric templates that are used across all their devices and accounts. This means that it’s impossible for users to log in to their accounts on another device using a different set of biometrics, essentially giving our clients much higher assurance that a user is who they claim to be online.

While Luiss University was a huge success for us, it wasn’t the only deployment we’ve been working on this year.

10 customers, 5 sectors, 3 continents

Luiss University was our first live commercial deployment; however, we currently have a number of customers in the banking, fintech, aerospace, and defense industries, all of whom are testing Keyless at the pilot stage within their organizations.

Between these customers, we’re currently processing 5,000 biometric authentication requests per day from approximately 15,000 users.

$600k secured in awards and recognized by Gartner

Bans on international travel this year didn’t stop us from being invited to participate in a number of technology and cybersecurity competitions around the world. Some of these competitions awarded us cash prizes, which are going directly into improving our product and services. A special thanks to B2B Startup in South Europe, Startup in Italy, BNP Paribas, Startup in Slingshot 2020 Global and Banking Tech Awards.

We were also identified by Gartner as a leading biometric vendor in this year’s Hype Cycle for Identity and Access Management Technologies. It’s particularly encouraging to be recognized by Gartner as one of the leading biometric solutions that’s helping companies tackle the evolving threat landscape head on.

This year, we’ve not only witnessed the fast-tracked adoption of cloud-first platforms, zero-trust technology and passwordless authentication, we’ve seen what impacts a lack of preparedness and unrealized digital strategy can have on organizations who are faced with unforeseen challenges. By merging modern biometrics with zero-trust, privacy-enhancing technologies, we’re able to provide a breakthrough scalable solution that can readily help companies on their journey towards a more secure and private future.

We’re humbled and excited by the industry recognition we’ve received for our technology so far — and believe this proves that the demand for solutions that place equal importance on user privacy, security and usability is limitless.

What does the next year hold?

Out of darkness often comes light, and so we are hoping that as we head into a fresh year, despite the urge to return to normal, we are able to maintain the momentum gained during the last twelve months.

With the shift to remote and flexible work and the subsequent increased importance on being able to verify that someone is who they claim to be online, organizations around the globe are starting to recognise why we must act fast to converge authentication with identity proofing in the online realms. It’s not simply enough for a user to prove that they know or have something, they must also prove that they are who they claim to be — something that can only be achieved using advanced biometrics combined with machine learning.

We are confident that organizations will sustain their interest in adopting solutions that leverage emerging technologies to enable more secure, reliable and privacy-focused authentication and identity management.

For us as a company, we plan to continue helping enterprises on their path towards a zero-trust, passwordless future by building out capabilities for our workforce and consumer authentication solutions, while accelerating our go-to-market efforts in Europe.

Some of our best content in 2020

A beginner’s guide to Shamir’s Secret Sharing

Take a deep-dive into one of the core building blocks of Keyless. Shamir’s Secret Sharing enables the secure transfer of encrypted biometric templates to and from our network.
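For readers new to the scheme, the core idea of the textbook (k, n) threshold construction is simple (this is a general sketch, not a description of Keyless's specific implementation). The secret s is hidden as the constant term of a random polynomial over a finite field:

f(x) = s + a_1 x + a_2 x^2 + \dots + a_{k-1} x^{k-1} \pmod{p}

Each of the n parties receives one share (x_i, f(x_i)). Any k shares determine f by Lagrange interpolation and so recover s = f(0), while k - 1 or fewer shares reveal nothing about s.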

A beginner’s guide to Shamir’s Secret Sharing

What could have prevented 2020's massive Twitter Hack

One of the biggest breaches to capture our attention this year was the Twitter attack that targeted the likes of Jeff Bezos and other high profile users… read our blog to learn how Keyless could have stopped this breach.

If Twitter did this one simple thing, last week’s attack would never have happened

Local Authentication vs Keyless pt. 1

In this piece we take a look at the usability and security limitations faced by some of the most well known authentication solutions on the market

Local Authentication vs Keyless pt. 1: FaceID and YubiKey

Local Authentication vs Keyless pt. 2

Keyless offers better security, privacy and usability than other biometric authentication solutions — in this piece we explain how.

Local Authentication vs Keyless pt. 2: How Keyless compares

Request a personalized demo of Keyless

Keyless™ authentication can help deliver secure and seamless digital experiences for your end-users and for your increasingly remote workforce.

Head to our website to learn more about our biometric authentication and identity management solutions.

Alternatively, you can email us directly at info@keyless.io

Keyless update: 2020 in review was originally published in KeylessTech on Medium.


auth0

Making a CRUD API using Azure Functions and Azure Cosmos DB

Learn how to make a wishlist API using Azure Functions and Azure Cosmos DB.

Monday, 28. December 2020

Okta

How to Docker with Spring Boot


Those of you reading this have certainly heard of Docker. After years of hype, it has become a de facto standard technology for everyday DevOps operations. It greatly helped to simplify deployments and testing by creating efficient, immutable images of applications, each working in its own silo. More efficient placement of applications has made this technology central to cloud applications, which is why it has gotten so much attention in recent years.

Docker has enabled a new, unified way of application deployment. The basic idea is simple: instead of preparing a target environment on each machine, bring it as a part of your application in the form of a container. This means no conflicting library versions or overlapping network ports. Built images are immutable - your application works the same way locally, on your teammate’s computer, or in the cloud. Also, it’s possible to run multiple instances of the container on the same machine, and that helps to increase the density of deployment, bringing down costs.
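For example, because built images are immutable and self-contained, nothing stops you from running two copies of the same image side by side on one host, each mapped to a different host port (the image name here is a placeholder):

docker run -d -p 8080:8080 my-web-app
docker run -d -p 8081:8080 my-web-app

Both containers listen on port 8080 internally, while the host exposes them on 8080 and 8081, which is the density gain described above.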

In this tutorial, you will build and run a simple web application into the Docker-compatible image using Cloud Native Buildpacks support, introduced in Spring Boot 2.3.0.

Prerequisites:

Java 11+
Unix-like shell
Docker installed
Okta CLI installed

Table of Contents

Bootstrap a Secure Spring Boot Application
Run Your Spring Boot Application
Build a Spring Boot Docker Image
Secure Your Spring Boot Application in Docker
Configure Spring Security to Lock Down Access
Start Spring Boot Application in Docker
Bonus - Use a dotenv File
Deploy Spring Boot + Docker to Heroku
Learn More About Docker, Spring Boot, and Buildpacks

Bootstrap a Secure Spring Boot Application

Start by creating a Spring Boot application using Spring Boot Initializr. This can be done via the web interface or using a handy curl command:

curl https://start.spring.io/starter.tgz -d dependencies=web,okta \
  -d bootVersion=2.4.1 \
  -d groupId=com.okta \
  -d artifactId=demospringboot \
  -d type=gradle-project \
  -d language=kotlin \
  -d baseDir=springboot-docker-demo | tar -xzvf -

This command requests that Spring Boot Initializr generate an application that uses the Gradle build system and Kotlin programming language. It also configures dependencies on Spring Web and Okta. The created project is automatically unpacked to the springboot-docker-demo directory.

Update your main application class to allow unauthenticated access in WebSecurityConfigurerAdapter. While in there, add a controller that welcomes the user. It’s safe to put everything in a single file src/main/kotlin/com/okta/demospringboot/DemoApplication.kt:

package com.okta.demospringboot

import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication
import org.springframework.security.config.annotation.web.builders.HttpSecurity
import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter
import org.springframework.context.annotation.Configuration
import org.springframework.web.bind.annotation.RequestMapping
import org.springframework.web.bind.annotation.RestController
import java.security.Principal

@SpringBootApplication
class DemoApplication

fun main(args: Array<String>) {
    runApplication<DemoApplication>(*args)
}

@Configuration
class OktaOAuth2WebSecurityConfigurerAdapter: WebSecurityConfigurerAdapter() {
    override fun configure(http: HttpSecurity) {
        http.authorizeRequests().anyRequest().permitAll()
    }
}

@RestController
class WebController {
    @RequestMapping("/")
    fun home(user: Principal?) = "Welcome, ${user?.name ?: "guest"}!"
}

Run Your Spring Boot Application

Start your application in the project folder via the command line:

./gradlew bootRun

Then, open a browser at http://localhost:8080. Your web application greets the guest user with a welcome message.

Build a Spring Boot Docker Image

Since version 2.3.0, Spring Boot has supported Cloud Native Buildpacks. It has become straightforward to deploy a web service to the popular clouds using Buildpacks due to the mass adoption of this technology.

Build your application and send the image to the local Docker daemon:

./gradlew bootBuildImage --imageName=springbootdemo

Next, start your containerized web application with Docker:

docker run -it -p8080:8080 springbootdemo

As expected, your web application will be available on http://localhost:8080.

Secure Your Spring Boot Application in Docker

User management is never an easy task and, most certainly, is not the main objective of your application. Okta is an identity provider that helps you to take care of routine work such as implementing OAuth 2.0, social login, and SSO (Single Sign-On). It’s very developer-friendly, and it has excellent integration with different frameworks, including Spring Boot.

Start by installing the Okta CLI tool - it’s a real time saver for developers.

Create your free developer account with Okta, no credit card required:

okta register
...(provide registration details)...
...
To set your password open this link:
https://dev-xxxxxxxxxxxx.okta.com/welcome/ABCDEFG

Don’t forget to set your password using the link above!

Create an Okta application. In your project directory, execute:

okta apps create --redirect-uri http://localhost:8080/login/oauth2/code/okta

The Okta CLI will prompt you for an application name—choosing the default is OK. Next, select 1 (Web), then 1 (Okta Spring Boot Starter). Accept defaults, the Okta CLI will configure the default login redirect URI (http://localhost:8080/login/oauth2/code/okta) and logout redirect URI (http://localhost:8080). Don’t worry, you can change these values later if something needs to be adjusted.

When the application is successfully created, its configuration is saved to a .okta.env file in the current folder.

cat .okta.env
export OKTA_OAUTH2_ISSUER="https://dev-xxxxxx.okta.com/oauth2/default"
export OKTA_OAUTH2_CLIENT_SECRET="yyy"
export OKTA_OAUTH2_CLIENT_ID="zzz"

You’ll need to run source .okta.env to set these values as environment variables. If you’re on Windows, rename the file to .okta.bat and change export to set. These parameters need to be injected into the application to enable OAuth flows for authentication and authorization.

⚠️ NOTE: make sure you never check in credentials to your source control system.

Configure Spring Security to Lock Down Access

Previously, your webpage was accessible for everyone. To allow access for authorized users only, update the Spring Security configuration in src/main/kotlin/com/okta/demospringboot/DemoApplication.kt:

@Configuration
class OktaOAuth2WebSecurityConfigurerAdapter: WebSecurityConfigurerAdapter() {
    override fun configure(http: HttpSecurity) {
        http.authorizeRequests().anyRequest().authenticated()
    }
}

That’s it. The Okta Spring Boot Starter takes care of the rest!

Rebuild the application again:

./gradlew bootBuildImage --imageName=springbootdemo

The --imageName parameter allows specifying an image name. Without it, the name would be something like appName:0.0.1-SNAPSHOT.

Start Spring Boot Application in Docker

When your application starts, the Okta module reads environment variables to configure security in your application. Start your application with your values set:

docker run -it -p8080:8080 \
  -e OKTA_OAUTH2_ISSUER="https://dev-xxxxxx.okta.com/oauth2/default" \
  -e OKTA_OAUTH2_CLIENT_SECRET="yyyyyyyyyyyyyyyyyyy" \
  -e OKTA_OAUTH2_CLIENT_ID="zzzzzzzzzzzzzzzz" \
  springbootdemo

The -e argument sets an environment variable for the application running inside your container, and -p maps the container’s port to localhost.

Now, head over to http://localhost:8080, and you’ll be asked to log in using Okta’s standard form. Enter your login credentials and, upon successful sign-in, your web browser will be redirected to the main page, displaying a welcoming message.

Congratulations, you have created a simple Spring Boot application containerized with Docker and secured with Spring Security + Okta.

Bonus - Use a dotenv File

While providing a few environment variables in the command-line might be acceptable, it’s not very convenient and can leave unwanted traces of the secrets in your terminal history. Docker supports dotenv file format, which makes it easier to set multiple environment parameters.

Create a .env file in the root of the project and set the desired environment variables:

OKTA_OAUTH2_ISSUER=https://{yourOktaDomain}/oauth2/default
OKTA_OAUTH2_CLIENT_SECRET={yourClientSecret}
OKTA_OAUTH2_CLIENT_ID={yourClientId}

Always be extra careful with credentials - avoid leaking them to the version control system even for a pet project. Add .env to .gitignore:

echo ".env" >> .gitignore

Run Docker, providing your .env file via the --env-file argument:

docker run -it -p8080:8080 --env-file .env springbootdemo

Looks much cleaner, doesn’t it?

Deploy Spring Boot + Docker to Heroku

If you’d like to deploy your dockerized Spring Boot app to Heroku, you’ll need to use Heroku Buildpacks. This is because the Paketo buildpacks refuse to allocate heap on containers smaller than 1GB of RAM. A free Heroku dyno has 512MB.

First, you’ll need to add the following to src/main/resources/application.properties so Spring Boot uses Heroku’s PORT environment variable.

server.port=${PORT:8080}

Then, build your image with --builder heroku/spring-boot-buildpacks:

./gradlew bootBuildImage --imageName=springbootdemo --builder heroku/spring-boot-buildpacks

Create an app on Heroku:

heroku create

Log in to Heroku’s container registry and push your app:

heroku container:login
docker tag springbootdemo registry.heroku.com/<your-app-name>/web
docker push registry.heroku.com/<your-app-name>/web

Set your Okta app settings as environment variables:

heroku config:set \
  OKTA_OAUTH2_ISSUER="/oauth2/default" \
  OKTA_OAUTH2_CLIENT_ID="{clientId}" \
  OKTA_OAUTH2_CLIENT_SECRET="{clientSecret}"

Next, release your container and tail the logs.

heroku container:release web
heroku logs --tail

You’ll need to update your Okta OIDC app to have your Heroku app’s redirect URIs as well.

Login redirect URI: https://<your-app-name>.herokuapp.com/login/oauth2/code/okta
Logout redirect URI: https://<your-app-name>.herokuapp.com

Run heroku open to open your app and sign in.

Learn More About Docker, Spring Boot, and Buildpacks

In this brief tutorial, you created a secure Spring Boot application and packaged it with Docker. You configured Okta as an OAuth 2.0 provider, built an image into your local Docker daemon, and learned how to run your app in Docker. This bootstrap project is a great starting point for your next cloud-native project.

You can find the source code for this example on GitHub.

See other relevant tutorials:

Deploy a Secure Spring Boot App to Heroku
OAuth 2.0 Java Guide: Secure Your App in 5 Minutes
Angular + Docker with a Big Hug from Spring Boot
A Quick Guide to OAuth 2.0 with Spring Security

Follow us for more great content and updates from our team! You can find us on Twitter, Facebook, subscribe to our YouTube Channel or start the conversation below!

Changelog:

Dec 31, 2020: Updated post to add Heroku instructions, since it requires another buildpack. Thanks for the idea, Maurizio! See the code changes in the example on GitHub. Changes to this post can be viewed in oktadeveloper/okta-blog#514.

MyKey

MYKEY Weekly Report 31 (December 21st~December 27th)


Today is Monday, December 28, 2020. The following is the 31st issue of MYKEY Weekly Report. In the work of last week (December 21st to December 27th), there are mainly 2 updates:

1. We are carrying out a topic activity with rewards, ‘区块链要出圈’ (roughly, “blockchain should break out of its own circle”), on http://bihu.com until December 29

For details, click to view: https://bit.ly/3rp6EBR.

2. HashKey Hub & MYKEY launched a new round of BTC financial products

MYKEY and third-party partner HashKey Hub launched a new round of 30-day regular BTC financial products at 5% on December 22, 2020. Both parties will further deepen cooperation and jointly explore the development of digital currency financial products.

!!! If you encounter any abnormal situation while using MYKEY, remember not to uninstall the MYKEY app; instead, please contact MYKEY Assistant: @mykeytothemoon on Telegram.

!!! Remember to keep the 12-word recovery phrase safe, via [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY, even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY Weekly Report 31 (December 21st~December 27th) was originally published in MYKEY Lab on Medium.


Jelurida Swiss SA

How Triffic is Gamifying Advertising for Local Businesses


Blockchain.news: Triffic has to be able to monetize the platform itself to fund future development and ongoing operation. Therefore, the app makes revenues from its own GPS tokens. These come from three channels – from a share of advertising revenues, from in-app subscriptions for account upgrades, and from a share of revenues from Partner Beacons.

Triffic is developed on the Ignis blockchain, the main child chain of the Ardor blockchain. Ignis offers out-of-the-box features that allowed the team behind Triffic to get up and running with a blockchain-based application without having to develop their own platform. Triffic plans to migrate to its own dedicated Ardor child chain in 2021 as part of its expansion plans, which include extending GPS rewards to ridesharing and food delivery apps.

However, it’s currently one of three promising projects developed on Ignis. Treecoin is another. The third is Bridge Champ, an online gaming platform for players of the popular strategy card game, contract bridge. All three projects are supported by Jelurida, the firm that operates the Ardor ecosystem. Jelurida has a long pedigree in the blockchain development sector, having been part of the team that developed Nxt in 2013, the first pure proof-of-stake blockchain.

December 28, 2020

Sunday, 27. December 2020

Identosphere Identity Highlights

Identosphere Weekly #12 • Learn Concepts with MATTR • BCDevEx Opportunity

Identosphere wishes you a Happy New Year – full of verifiable credentials and key event logs. (December 19-27)

Being the holidays, and nearing the end of the year, this wasn’t the busiest week in SSI News, but we still found some gems. It feels good to browse the feeds and keep on the pulse.

Thanks to our patrons for supporting this publication!

If you haven’t already, you can contribute via patreon.com/identosphere.

Reads from this week

Is your company in the W3C?

If the answer is yes - there is an election going on right now for the Technical Architecture Group, ending Tuesday, Jan 5th. Please check with your rep and ask them to vote for Amy Guy (Christopher Lemmer Webber recommends) and for Wayne Chang of Spruce Systems (Kaliya recommends).

Thoughtful Biometrics Workshop

The Thoughtful Biometrics Workshop is creating a space to dialogue about critical emerging issues surrounding biometric and digital identity technologies. It’s happening the 1st week of February: Monday, Wednesday, and Friday, 9am to 1pm PST / noon to 5pm EST.

Register on EventBrite!

Money for Developers!!! @bcdevexbot shares

Extend the Hyperledger Aries Cloud Agent Python protocols to support ZKP W3C Standard Verifiable Credentials based on BBS+ Signatures

This opportunity is for developers familiar with Hyperledger Aries, Aries Protocols, Python and JSON-LD processing to add support in ACA-Py for several important VC formats used by a number of other organizations in the VC community.

Value $60,000

Proposal Deadline: Jan 8, 2021

Adrian Doerk - SSI Ambassador shares:

The trust infrastructure of self-sovereign identity ecosystems

The trust infrastructure is concerned with the question of how and why the presented information can be trusted. It defines the rules for all stakeholders and enables legally binding relationships with the combination of governance frameworks, which are built on top of trust frameworks.

The post includes a section on the core components of identity architecture that includes a graphic based on a post by Phil Windley that infominer appreciated.

SSI in Education 

WellThatsInteresting.tech has a great post about SSI and Education. It highlights the recent announcement from Digitary about having issued over four million digitally verified documents from 100+ institutions for millions of learners in 135 countries.

Evernym’s December Roundup

Here are a few from Evernym’s picks:

Different approaches to Interoperability by Daniel Hardman of Evernym

Several VC ecosystems have grown up around the VC spec. Each touts standards compliance and interoperability, yet they do not currently interoperate with one another. Let’s have a look at their differences and commonalities, and then explore a simple proposal that might make which language your VCs “speak” as transparent as which language you choose when you watch a movie.

Sovrin’s 12 Principles of Self-Sovereign Identity

Representation • Interoperability • Decentralization • Control & Agency • Participation • Equity and Inclusion • Usability, Accessibility, and Consistency • Portability • Security • Verifiability and Authenticity • Privacy and Minimal Disclosure • Transparency

What Does Trust Over IP Mean for Governments and Their Citizens?

credentials can help reopen travel, reduce the costs and improve access to healthcare, streamline KYC and financial transactions, and help connect students with employers needing their skills—all while saving governments billions in the costs of connecting and protecting their digital infrastructure.

[...]

featured speakers from Evernym, Mastercard, LG CNS, Accenture, GLEIF, and other[s]

CCG Mailing List Manifesto: Rules for standards-makers

I've used all kinds of formats and protocols in a long career as a software developer, even created a few. My new manifesto summarizes what I've learned about what works and what doesn't.

Heather Vescent shared this post by Dave Winer about lessons learned over his years in software development, discussion followed.

Video

Building interoperable self-sovereign identity for Europe

Oskar van Deventer, a rockstar from TNO, presents:

ways to build an SSI ecosystem and architecture together that is interoperable and technologically mature
fit for society
and funding opportunities for SSI projects through grants

Podcasts

Using Self-Sovereign Identity as the Foundation for Secure, Trusted Digital Relationships with Kaliya Young

Kaliya was on the Human-Centered Security Podcast. You can find it on the web, Spotify, or Apple Podcasts.

SSI Fundamentals

Mattr Learn Concepts

MATTR rocked out some AMAZING posts on fundamental concepts surrounding SSI.

We highly recommend them!!

Web of Trust 101

The emerging “Web of Trust” is an idea that has been around since the dawn of the internet. To explain what motivated its creation, let’s take a look at how trust on the internet functions today.

Digital Wallets

The reframing of the user as a first-class citizen and their empowerment as ‘holder’ represents a shift towards a new paradigm. Such a paradigm offers users greater sovereignty of their own information and empowerment to manage their digital identity. Users are able to exercise their new role in this ecosystem by utilizing a new class of software known as digital wallets.

Verifiable Data

refers to the authenticity and integrity of the actual data elements being shared. 

Also covers Verifiable Relationships, Verifiable Processes, Verifiable Credentials, along with Semantics and Schemas.

Semantic Web

The semantic web is a set of technologies whose goal is to make all data on the web machine-readable. Its usage allows for a shared understanding around data that enables a variety of real-world applications and use cases.

Selective Disclosure

An important principle that we want to achieve when designing any system that involves handling Personally Identifiable Information (PII) is to minimize the data disclosed in a given interaction. When users share information, they should be able to choose what and how much they share on a case-by-case basis, while the relying parties receiving the information must be able to maintain assurances about the presented information’s origin and integrity. 

Trust Frameworks

Trust frameworks are a foundational component of the web of trust. A trust framework is a common set of best practice standards-based rules that ensure minimum requirements are met for security, privacy, identification management and interoperability through accreditation and governance. These operating rules provide a common framework for ecosystem participants, increasing trust between them.

VC Model from Trinsic

The Verifiable Credential’s Model

At the core of every self-sovereign identity (SSI) use case is what we call the verifiable credentials model. This simple yet effective model helps conceptualize how verifiable credentials are exchanged between people and organizations.

Non-identity news…

Financial struggles in the US

The latest data from the Census Bureau’s Household Pulse Survey, taken between November 25 and December 7, found that 35.3 percent of U.S. adults are “living in households not current on rent or mortgage where eviction or foreclosure in the next two months is either very likely or somewhat likely.”

Identity but Not SSI

The Nuts and Bolts of OAuth 2.0

Aaron Parecki - Mr. OAuth has a new course out on Udemy

3.5 hours of video content, quizzes, as well as interactive exercises with a guided learning tool to get you quickly up to speed on OAuth, OpenID Connect, PKCE, best practices, and tips for protecting APIs with OAuth.

Beginners Guide to JWTs

A JWT is a structured security token format used to encode JSON data. The main reason to use JWT is to exchange JSON data in a way that can be cryptographically verified. There are two types of JWTs:

JSON Web Signature (JWS)

JSON Web Encryption (JWE)

The data in a JWS is public—meaning anyone with the token can read the data—whereas a JWE is encrypted and private. To read data contained within a JWE, you need both the token and a secret key.
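As a rough illustration (all values below are invented), a JWS in compact form is three base64url-encoded segments joined by dots, base64url(header).base64url(payload).base64url(signature), where the first two segments decode to plain JSON:

header:  { "alg": "RS256", "typ": "JWT" }
payload: { "sub": "user123", "exp": 1609459200 }

Anyone holding the token can decode and read these two segments; the signature only proves that they have not been altered.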

Thanks for Reading

This newsletter is 100% reader-supported.

All contributions are greatly appreciated.

https://patreon.com/identosphere/

Thursday, 24. December 2020

Elliptic

Opinion: The Far-Reaching Effects of FinCEN's Rule on Unhosted Wallets

FinCEN's announcement of the Notice of Proposed Rulemaking (NPRM) on unhosted wallets last Friday has certainly made the last few days challenging for those of us who work in crypto. It has also shown us what is best about this industry. Through determined and coordinated action, the industry is loudly and clearly calling into question the effectiveness of the proposed rule as a meaningful anti-financial crime measure.

Wednesday, 23. December 2020

MATTR

Intro to MATTR Learn Concepts


In the world of decentralized identity and digital trust, there are a variety of new concepts and topics that are frequently referenced, written, and talked about, but rarely is there a chance to introduce these concepts formally to audiences who aren’t already familiar with them.

For this reason, we have created a new “Learn Concepts” series to outline the fundamental building blocks needed to understand this new technology paradigm and explore the ways that MATTR thinks about and understands the critical issues in the space.

Over on our MATTR Learn site, we have been building out a variety of resources to assist developers and architects with understanding the MATTR universe of tools and products. We are happy to announce we have updated the site to include this new educational content series alongside our existing resources.

Our Learn Concepts series covers the following topics:

- Web of Trust 101
- Digital Wallets
- Verifiable Data
- Semantic Web
- Selective Disclosure
- Trust Frameworks

To facilitate context sharing, each of these Learn Concepts has a distinct Medium post with a permanent URL in addition to being published on our MATTR Learn site. We will keep these resources up to date to make sure they remain evergreen and relevant to newcomers in the space.

We are excited to share what we’ve learned on our journey, and we look forward to adapting and expanding this knowledge base as standards progress and technologies mature.

Intro to MATTR Learn Concepts was originally published in MATTR on Medium, where people are continuing the conversation by highlighting and responding to this story.


Learn Concepts: Trust Frameworks

Trust frameworks are a foundational component of the web of trust. A trust framework is a common set of best practice standards-based rules that ensure minimum requirements are met for security, privacy, identification management and interoperability through accreditation and governance. These operating rules provide a common framework for ecosystem participants, increasing trust between them.

As digital service delivery models mature, it is essential that information is protected as it travels across jurisdictional and organizational boundaries. Trust frameworks define and bring together the otherwise disparate set of best practice principles, processes, and standards that apply when it comes to collecting and sharing information on the web. As individuals and entities increasingly share their information cross-contextually and across industry boundaries, trust frameworks provide the common set of rules that apply regardless of such differences. For example, service providers ranging from government agencies, banks and telecommunication companies, to health care providers could all follow the same set of data sharing practices under one trust framework. This macro application serves to reduce the need for bilateral agreements and fragmentation across industry. Ultimately, trust frameworks serve to increase trust, improve efficiencies, and deliver significant economic and social benefits.

Some use-cases will require more detailed rules to be established than those set out in a trust framework with broad scope. Where this is the case, more detailed rules around specific hierarchies and roles can be established within the context of the higher order trust framework. The goal is always for the components of the framework to be transparent, and adherence to those components to be public. This enables entities to rely on the business or technical processes carried out by others with trust and confidence. If done correctly, a trust framework is invisible to those who rely on it every day. It allows individuals and entities to conduct digital transactions knowing that the trust framework underpins, creates accountability for, and supports the decisions they’re making.

Use Cases for Trust Frameworks

Historically speaking, trust frameworks have been extraordinarily complex and only worth the investment for high-value, high-volume transactions, such as the ones established by credit card companies. Now, with the introduction of decentralized technologies, there is a need to create digital trust frameworks that work for a much broader variety of transactions. Realizing the scope of this work comes with the recognition that there will be many different trust frameworks, both small and large in scope, for different federations across the web. Given that context, it is important to preserve end-user agency as much as possible as trust frameworks are developed and adoption and mutual recognition increases.

Looking at the ecosystem today, we can broadly group trust frameworks into three categories:

Domain-specific Trust Frameworks

- These are typically developed to serve a specific use-case, for example within a particular industry
- Often driven by industry and/or NGOs
- These have been able to develop faster than national trust frameworks (which are based in legislation), and as such may inform the development of national trust frameworks

National Trust Frameworks

- Typically broad in application and built to facilitate a policy objective (for example, increased trust in data sharing)
- Driven by individual governments to address the needs of their citizens and residents
- Based in legislation, with more enforcement powers than either Domain-specific Trust Frameworks or International Trust Frameworks
- Likely to be informed by both Domain-specific Trust Frameworks and International Trust Frameworks

International Trust Frameworks

- These are typically broad in nature and developed to serve many countries, much like a model law
- Typically driven by governments, industry, or NGOs but geographically agnostic
- Likely to inform National Trust Frameworks

Accreditation and Assurance

An important part of satisfying the operational components of a trust framework is the ability to accredit ecosystem participants against the trust framework. This is a logical extension of the rules, requirements, and regulations trust frameworks set out. Trust frameworks typically include an accreditation scheme and associated ongoing compliance testing.

One aspect of accreditation in the identity context is compliance with standards. In the context of identity related trust frameworks, there are several kinds of assurance that relying parties will typically seek. These can include binding, information, authentication, and federation and identity assurance. Each standard may define its own distinct levels of assurance. The NIST Digital Identity Requirements and the New Zealand Identification Management Standards are good examples of how this works in practice.

The process of accreditation and a successful certification is a core part of trust frameworks as it proves to the wider ecosystem (including auditors) that the entity, solution, or piece of software meets the business and technical requirements defined. Digital identity systems are increasingly modular, and one solution might involve a variety of different components, roles and providers. These should be developed and defined as part of the process of standing up a trust framework, testing its capabilities and defining processes around accreditation.

Technical Interoperability

Trust frameworks help to improve interoperability between entities by defining a common set of operating rules. In addition to setting out business and legal rules, it is important that high level technical rules are specified as well. Trust frameworks must clearly define expectations around the technical standards to be used, as well as what aspects of these standards are normatively required, optional, or somewhere in between. When it comes to digital identity trust frameworks, this may mean building on open-source code or evaluating against open test suites.

Test suites allow for normative testing around standards requirements and offer a way for parties to audit and ensure the processes being used throughout the identity lifecycle. They can be incredibly useful not only for entities using the trust framework, but for mutually recognized trust frameworks to understand and interpret the requirements coming from a particular set of rules.

Ongoing development of several digital identity trust frameworks based on the emerging decentralized web of trust can be found at industry organizations such as the Kantara Initiative and Trust Over IP Foundation as well as government-driven initiatives such as the Pan-Canadian Trust Framework.

Learn Concepts: Trust Frameworks was originally published in MATTR on Medium, where people are continuing the conversation by highlighting and responding to this story.


Learn Concepts: Selective Disclosure

An important principle that we want to achieve when designing any system that involves handling Personally Identifiable Information (PII) is to minimize the data disclosed in a given interaction. When users share information, they should be able to choose what and how much they share on a case-by-case basis, while the relying parties receiving the information must be able to maintain assurances about the presented information’s origin and integrity. This process is often referred to as selective disclosure of data. As technologists, by having solutions that easily achieve selective disclosure, we can drive a culture based on the minimum information exchange required to enhance user privacy.

Privacy and Correlation

Selective disclosure of information is particularly relevant when evaluating approaches to using verifiable credentials (VCs). Because authorities are able to issue credentials to a subject’s digital wallet, the subject is able to manage which data they disclose to relying parties as well as how that disclosure is performed. This presents an opportunity for those designing digital wallets to consider the user experience of data disclosure, particularly as it relates to the underlying technology and cryptography being used for data sharing.

The problem of user privacy as it relates to digital identity is a deep and complicated one; however, the basic approach has been to allow users to share only the information which is strictly necessary in a particular context. The VC Data Model spec provides some guidance on how to do so, but stops short of offering a solution to the issue of managing user privacy and preventing correlation of their activities across different interactions:

Organizations providing software to holders should strive to identify fields in verifiable credentials containing information that could be used to correlate individuals and warn holders when this information is shared.

A number of different solutions have been deployed to address the underlying concerns around selective disclosure. Each solution makes a different set of assumptions and offers different tradeoffs when it comes to usability and convenience.

Approaches to Selective Disclosure

When it comes to solutions for selective disclosure of verifiable credentials, there are many different ways to tackle this problem, but three of the most common are:

- Just in time issuance — contact the issuer at request time either directly or indirectly for a tailored assertion
- Trusted witness — use a trusted witness between the provider and the relying party to mediate the information disclosure
- Cryptographic solutions — use a cryptographic technique to disclose a subset of information from a larger assertion

Just in time issuance

Just in time issuance, a model made popular by OpenID Connect, assumes the issuer is highly available, which imposes an infrastructure burden on the issuer that is proportional to the number of subjects they have information for and where those subjects use their information. Furthermore, in most instances of this model, the issuer learns where a subject is using their identity information, which can be a serious privacy problem.

Trusted witness

Trusted witness shifts this problem to be more of a presentation concern, where a witness de-anonymizes the subject presenting the information and presents an assertion with only the information required by the relying party. Again, this model requires a highly available party other than the holder and relying party present when a subject wants to present information, one that must be highly trusted and one that bears witness to a lot of PII on the subject, leading to privacy concerns.

Cryptographic solutions

Cryptographic solutions offer an alternative to these approaches by solving the selective disclosure problem directly at the core data model layer of the VC, providing a simpler and more flexible method of preserving user privacy.

There are a variety of ways that cryptography can be used to achieve selective disclosure or data minimization, but perhaps the most popular approach is using a branch of cryptography often known as Zero-Knowledge Proofs, or ZKPs. The emergent feature of this technology is that a prover can prove knowledge of some data without exposing any additional data. Zero-knowledge proofs can be achieved in a flexible manner with verifiable credentials using multi-message digital signatures such as BBS+.

Traditional Digital Signatures

Traditional digital signatures look a bit like this. You have a message (virtually any kind of data for which you want to establish integrity) and a keypair (private and public key) which you use to produce a digital signature on the data. By having the message, the public key, and the signature, verifiers are able to evaluate whether the signature is valid or not, thereby establishing the integrity of the message and the authenticity of the entity that signed the message. In the context of verifiable credentials, the entity doing the signing is the issuer of the credential, while the entity doing the verification is the verifier. The keypair in question belongs to the issuer of the credential, which allows verifiers to establish the authority on that credential in a verifiable manner.
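
A bare-bones sketch of this single-message flow, assuming the Python cryptography package (the credential payload is hypothetical):

```python
# Traditional single-signature flow with the 'cryptography' package
# (pip install cryptography); the credential payload is hypothetical.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

issuer_key = Ed25519PrivateKey.generate()
message = b'{"name": "Alice", "degree": "BSc"}'  # the entire credential

# The issuer signs the whole message at once.
signature = issuer_key.sign(message)

# The verifier checks integrity and authenticity against the issuer's
# public key; any change to the message makes verify() raise an error.
# Note the whole message must be disclosed -- no per-claim granularity.
issuer_key.public_key().verify(signature, message)
print("signature valid")
```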

Multi-message Digital Signatures

Multi-message digital signature schemes (like BBS+), on the other hand, are able to sign an array of messages, rather than a single message over which the entire digital signature is applied. The same mechanism is used wherein a private key produces a digital signature over the messages you wish to sign, but now you have the flexibility of being able to break a message up into its fundamental attributes. In the context of verifiable credentials, each message corresponds to a claim in the credential. This presents an opportunity for selective disclosure due to the ability to derive and verify a proof of the digital signature over a subset of messages or credential attributes.

In addition to the simple ability to sign and verify a set of messages, multi-message digital signatures have the added capability of being able to derive a proof of the digital signature. In the context of verifiable credentials, the entity deriving the proof is the credential subject or holder. This process allows you to select which messages you wish to disclose in the proof and which messages you want to keep hidden. The derived proof indicates to the verifier that you know all of the messages that have been signed, but that you are only electing to disclose a subset of these messages.

The verifier, or the entity with which you’re sharing the data, is only able to see the messages or credential claims which you have selectively disclosed to them. They are still able to verify the integrity of the messages being signed, as well as establish the authenticity of the issuer that originally signed the messages. This provides a number of privacy guarantees to the data subject because relying parties are only evaluating the proof of the signature rather than the signature itself.
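
To illustrate the derive-proof/verify-proof flow, here is a toy, hash-based stand-in. It is not BBS+ and not zero-knowledge (a real multi-message scheme hides even the digests of undisclosed claims), but it shows the shape of the interaction: the issuer signs salted per-claim digests, and the holder reveals only chosen claims plus the digests needed to check the signature.

```python
# Toy hash-based selective disclosure; NOT BBS+ and NOT zero-knowledge,
# just an executable sketch of the derive-proof/verify-proof flow.
import hashlib, json, os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def claim_digest(name, value, salt):
    return hashlib.sha256(salt + json.dumps([name, value]).encode()).hexdigest()

claims = {"name": "Alice", "dob": "1990-01-01", "licence": "full"}
salts = {k: os.urandom(16) for k in claims}

# Issue: sign the list of salted claim digests rather than the claims.
digests = {k: claim_digest(k, v, salts[k]) for k, v in claims.items()}
issuer_key = Ed25519PrivateKey.generate()
signed_payload = json.dumps(digests, sort_keys=True).encode()
signature = issuer_key.sign(signed_payload)

# Derive proof: the holder discloses only the 'licence' claim and its salt.
disclosed = {"licence": (claims["licence"], salts["licence"])}

# Verify proof: recompute digests for disclosed claims, then check the
# issuer's signature over the full digest list; hidden claims stay hidden.
for name, (value, salt) in disclosed.items():
    assert claim_digest(name, value, salt) == digests[name]
issuer_key.public_key().verify(signature, signed_payload)
print("verified:", {k: v for k, (v, _) in disclosed.items()})
```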

Learn Concepts: Selective Disclosure was originally published in MATTR on Medium, where people are continuing the conversation by highlighting and responding to this story.


Learn Concepts: Semantic Web

With so much data being created and shared on the internet, one of the oldest challenges in building digital infrastructure has been how to consistently establish meaning and context to this data. The semantic web is a set of technologies whose goal is to make all data on the web machine-readable. Its usage allows for a shared understanding around data that enables a variety of real-world applications and use cases.

The challenges to address with the semantic web include:

- vastness — the internet contains billions of pages, and existing technology has not yet been able to eliminate all semantically duplicated terms
- vagueness — imprecise concepts like ‘young’ or ‘tall’ make it challenging to combine different knowledge bases with overlapping but subtly different concepts
- uncertainty — precise concepts with uncertain values can be hard to reason about; this mirrors the ambiguity and probabilistic nature of everyday life
- inconsistency — logical contradictions create situations where reasoning breaks down
- deceit — intentionally misleading information spread by bad actors; this can be mitigated with cryptography to establish information integrity

Linked Data

Linked data is the theory behind much of the semantic web effort. It describes a general mechanism for publishing structured data on the internet using vocabularies like schema.org that can be connected together and interpreted by machines. Using linked data, statements encoded in triples (subject → predicate → object) can be spread across different websites in a standard way. These statements form the substrate of knowledge that spans across the entire internet. The reality is that the bulk of useful information on the internet today is unstructured data, or data that is not organized in a way which makes it useful to anyone beyond the creators of that data. This is fine for the cases where data remains in a single context throughout its lifecycle, but it becomes problematic when trying to share data across contexts while retaining its semantic meaning. The vision for linked data is for the internet to become a kind of global database where all data can be represented and understood in a similar way.

One of the biggest challenges to realizing the vision of the internet as a global database is enabling a common set of underlying semantics shared by all of this data. A proliferation of data becomes much less useful if the data is redundant, unorganized, or otherwise messy and complicated. Ultimately, we need to double down on the usage of common data vocabularies and common data schemas. Common data schemas combined with the security features of verifiable data will make fraud more difficult, making it easier to transmit and consume data so that trust-based decisions can be made. Moreover, the proliferation of common data vocabularies will help make data portability a reality, allowing data to be moved across contexts while retaining the semantics of its original context.

Semantic Web Technologies

The work around developing semantic web technology has been happening for a very long time. The vision for the semantic web has been remarkably consistent throughout its evolution, although the specifics around how to accomplish this, and at what layer, have developed over the years. W3C’s semantic web stack offers an overview of these foundational technologies and the function of each component in the stack.

The ultimate goal of the semantic web of data is to enable computers to do more useful work and to develop systems that can support trusted interactions over the network. The shared architecture as defined by the W3C supports the ability for the internet to become a global database based on linked data. Semantic Web technologies enable people to create data stores on the web, build vocabularies, and write rules for handling data. Linked data are empowered by technologies such as RDF, SPARQL, OWL, and SKOS.

RDF provides the foundation for publishing and linking your data. It’s a standard data model for representing information resources on the internet and describing the relationships between data and other pieces of information in a graph format. OWL is a language which is used to build data vocabularies, or “ontologies”, that represent rich knowledge or logic. SKOS is a standard way to represent knowledge organization systems such as classification systems in RDF. SPARQL is the query language for the Semantic Web; it is able to retrieve and manipulate data stored in an RDF graph. Query languages go hand-in-hand with databases. If the Semantic Web is viewed as a global database, then it is easy to understand why one would need a query language for that data.
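
A small sketch of these pieces working together, assuming the Python rdflib package (the resource IRIs and data are hypothetical): RDF triples go into a graph, and SPARQL queries it like a database.

```python
# Build an RDF graph and query it with SPARQL using rdflib
# (pip install rdflib); resource IRIs and data are hypothetical.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import FOAF, RDF

g = Graph()
alice = URIRef("https://example.org/alice")

# Two triples, each of the form (subject, predicate, object).
g.add((alice, RDF.type, FOAF.Person))
g.add((alice, FOAF.name, Literal("Alice")))

# SPARQL treats the graph like a database: find every person's name.
query = "SELECT ?name WHERE { ?person a foaf:Person ; foaf:name ?name . }"
for row in g.query(query, initNs={"foaf": FOAF}):
    print(row.name)  # -> Alice
```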

By enriching data with additional context and meaning, more people (and machines) can understand and use that data to greater effect.

JSON-LD

JSON-LD is a serialization format that extends JSON to support linked data, enabling the sharing and discovery of data in web-based environments. Its purpose is to be isomorphic to RDF, which has broad usability across the web and supports additional technologies for querying and language classification. RDF has been used to manage industry ontologies for the last couple of decades, so creating a representation in JSON is incredibly useful in certain applications such as those found in the context of Verifiable Credentials (VCs).
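
For instance, here is a sketch of JSON-LD expansion using the PyLD library (an assumption; the document itself is hypothetical), showing how a context maps a short term onto a globally unambiguous IRI:

```python
# Expand a JSON-LD document with PyLD (pip install PyLD); the inline
# @context means no network lookup is needed.
from pyld import jsonld

doc = {
    "@context": {"name": "http://schema.org/name"},
    "@id": "https://example.org/alice",
    "name": "Alice",
}

# Expansion rewrites context-dependent terms as full IRIs, so every
# consumer reads "name" as the same schema.org property.
print(jsonld.expand(doc))
# [{'@id': 'https://example.org/alice',
#   'http://schema.org/name': [{'@value': 'Alice'}]}]
```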

The Linked Data Proofs representation of Verifiable Credentials makes use of a simple security protocol which is native to JSON-LD. The primary benefit of the JSON-LD format used by LD-Proofs is that it builds on a common set of semantics that allow for broader ecosystem interoperability of issued credentials. It provides a standard vocabulary that makes data in a credential more portable as well as easy to consume and understand across different contexts. In order to create a crawl-able web of verifiable data, it’s important that we prioritize strong reuse of data schemas as a key driver of interoperability efforts. Without it, we risk building a system where many different data schemas are used to represent the same exact information, creating the kinds of data silos that we see on the majority of the internet today. JSON-LD makes semantics a first-class principle and is therefore a solid basis for constructing VC implementations.

JSON-LD is also widely adopted on the web today, with W3C reporting it is used by 30% of the web and Google making it the de facto technology for search engine optimization. When it comes to Verifiable Credentials, it’s advantageous to extend and integrate the work around VCs with the existing burgeoning ecosystem of linked data.

Learn Concepts: Semantic Web was originally published in MATTR on Medium, where people are continuing the conversation by highlighting and responding to this story.


Learn Concepts: Verifiable Data

The ability to prove the integrity and authenticity of shared data is a key component to establishing trust online. Given that we produce so much data and are constantly sharing and moving that data around, it is a complex task to identify a solution that will work for the vast majority of internet users across a variety of different contexts.

The fundamental problem to address is how to establish authority on a piece of data, and how to enable mechanisms to trust those authorities in a broad set of contexts. Solving this problem on a basic level allows entities to have greater trust in the data they’re sharing, and for relying parties to understand the integrity and authenticity of the data being shared.

We use the overarching term verifiable data to refer to this problem domain. Verifiable data can be further expanded into three key pillars:

- Verifiable data
- Verifiable relationships
- Verifiable processes

Verifiable data

This refers to the authenticity and integrity of the actual data elements being shared.

Verifiable relationships

This refers to the ability to audit and understand the connections between various entities as well as how each of these entities are represented in data.

Verifiable processes

This describes the ability to verify any digital process, such as onboarding a user or managing a bank account (particularly with respect to how data enables the process to be managed and maintained).

These closely-related, interdependent concepts rely on verifiable data technology becoming a reality.

Verifiable Credentials

The basic data model of W3C Verifiable Credentials may be familiar to developers and architects that are used to working with attribute-based credentials and data technologies. The issuer, or the authority on some information about a subject (e.g. a person), issues a credential containing this information in the form of claims to a holder. The holder is responsible for storing and managing that credential, and in most instances uses a piece of software that acts on their behalf, such as a digital wallet. When a verifier (sometimes referred to as a relying party) needs to validate some information, they can request from the holder some data to meet their verification requirements. The holder unilaterally determines if they wish to act upon the request and is free to present the claims contained in their verifiable credentials using any number of techniques to preserve their privacy.

Verifiable Credentials form the foundation for verifiable data in the emerging web of trust. They can be thought of as a container for many different types of information as well as different types of credentials. Because it is an open standard at the W3C, verifiable credentials are able to be widely implemented by many different software providers, institutions, governments, and businesses. Due to the wide applicability of these standards, similar content integrity protections and guarantees are provided regardless of the implementation.
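
The shape of such a credential, sketched as a Python dict after the examples in the W3C VC Data Model (all identifiers and the proof value are illustrative placeholders):

```python
# Minimal Verifiable Credential shaped after the W3C VC Data Model
# examples; every identifier and the proof value are placeholders.
import json

credential = {
    "@context": [
        "https://www.w3.org/2018/credentials/v1",
        "https://www.w3.org/2018/credentials/examples/v1",
    ],
    "id": "http://example.edu/credentials/1872",
    "type": ["VerifiableCredential", "AlumniCredential"],
    "issuer": "https://example.edu/issuers/565049",
    "issuanceDate": "2020-12-23T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:ebfeb1f712ebc6f1c276e12ec21",
        "alumniOf": "Example University",
    },
    # Added by the issuer's signing step (e.g. a Linked Data Proof).
    "proof": {
        "type": "Ed25519Signature2018",
        "verificationMethod": "https://example.edu/issuers/565049#key-1",
        "proofValue": "placeholder",
    },
}
print(json.dumps(credential, indent=2))
```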

Semantics and Schemas

The authenticity and integrity-providing mechanisms presented by Verifiable Credentials provide additional benefits beyond the evaluation of verifiable data. They also provide a number of extensibility mechanisms that allow data to be linked to other kinds of data in order to be more easily understood in the context of relationships and processes.

One concrete example of this is the application of data schemas or data vocabularies. Schemas are a set of types and properties that are used to describe data. In the context of data sharing, schemas are an incredibly useful and necessary tool in order to represent data accurately from the point of creation to sharing and verification. In essence, data schemas in the Verifiable Credential ecosystem are only useful if they are strongly reused by many different parties. If each implementer of Verifiable Credentials chooses to describe and represent data in a slightly different way, it creates incoherence and inconsistency in data and threatens to diminish the potential of ubiquitous adoption of open standards and schemas.

Verifiable Credentials make use of JSON-LD to extend the data model to support dynamic data vocabularies and schemas. This allows us to not only use existing JSON-LD schemas, but to utilize the mechanism defined by JSON-LD to create and share new schemas as well. To a large extent this is what JSON-LD was designed for: the adoption and reuse of common data vocabularies.

This type of Verifiable Credential is best characterized as a kind of Linked Data Proof. It allows issuers to make statements that can be shared without loss of trust because their authorship can be verified by a third party. Linked Data Proofs define the capability for verifying the authenticity and integrity of Linked Data documents with mathematical proofs and asymmetric cryptography. It provides a simple security protocol which is native to JSON-LD. Due to the nature of linked data, they are built to compactly represent proof chains and allow a Verifiable Credential to be easily protected on a more granular basis; on a per-attribute basis rather than a per-credential basis.

This mechanism becomes particularly useful when evaluating a chain of trusted credentials belonging to organizations and individuals. A proof chain is used when the same data needs to be signed by multiple entities and the order in which the proofs were generated matters, for example in the case of a notary counter-signing a proof that had been created on a document. Where order needs to be preserved, a proof chain is represented by including an ordered list of proofs with a “proof chain” key in a Verifiable Credential. This kind of embedded proof can be used to establish the integrity of verifiable data chains.
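
A sketch of what such an ordered proof chain could look like on a credential; the key and field values follow the "proof chain" idea described above and are illustrative, not normative:

```python
# Illustrative ordered proof chain: the notary's proof is generated over
# the document after the issuer's proof exists, so list order matters.
proof_chain_credential = {
    "@context": "https://www.w3.org/2018/credentials/v1",
    "type": ["VerifiableCredential"],
    "credentialSubject": {"id": "did:example:subject"},
    "proofChain": [
        {"type": "Ed25519Signature2018",
         "verificationMethod": "did:example:issuer#key-1",
         "proofValue": "placeholder-issuer"},
        {"type": "Ed25519Signature2018",
         "verificationMethod": "did:example:notary#key-1",
         "proofValue": "placeholder-notary"},
    ],
}
```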

Overall, the ability for data to be shared across contexts whilst retaining its integrity and semantics is a critical building block of the emerging web of trust.

Learn Concepts: Verifiable Data was originally published in MATTR on Medium, where people are continuing the conversation by highlighting and responding to this story.


Learn Concepts: Digital Wallets

In order to coordinate the authentication needs of apps and services on the web, many of today’s users will leverage services such as password managers. These tools help users keep track of how they’ve identified themselves in different contexts and simplify the login process for different services. In many ways, the need to overlay such services in order to preserve non-negotiable security properties reflects the broken state of identity on the internet today. Users of these apps (i.e. the data subjects) are often an afterthought when a trust relationship is established between data authorities and apps or services consuming and relying on user data.

Asymmetry in the nature of the relationships between participants largely prevents users from asserting their data rights as subjects of the data. Users are left to deal with the problems inherent in such a model, foisting upon them the responsibility of implementing appropriate solutions to patch over the shortcomings of identity management under this legacy model.

The emerging web of trust based upon self-certifying identifiers and user-centric cryptography is shifting this fundamental relationship by refashioning the role of the user. This role (known in the VC data model as a “holder”) is made central to the ecosystem and, importantly, on equal footing with the issuers of identity-related information and the relying parties who require that data to support their applications and services.

The reframing of the user as a first-class citizen and their empowerment as ‘holder’ represents a shift towards a new paradigm. Such a paradigm offers users greater sovereignty of their own information and empowerment to manage their digital identity. Users are able to exercise their new role in this ecosystem by utilizing a new class of software known as digital wallets.

Digital wallets are applications that allow an end user to manage their digital credentials and associated cryptographic keys. They allow users to prove identity-related information about themselves and, where it’s supported, choose to selectively disclose particular attributes of their credentials in a privacy-preserving manner.

Wallets and Agents

When working with technology standards that are inherently decentralized, it’s important to establish a common context and consensus in our choice of terminology and language. Convergence on key terms that are being used to describe concepts within the emerging decentralized identity and self-sovereign identity technologies allows participants to reach a shared understanding. Consequently, participating vendors are able to understand how they fit into the puzzle and interoperability between vendor implementations is made possible.

Through dedicated research and careful coordination with the broader technical community, the Glossary Project at DIF offers a useful definition for both wallets and agents.

Wallets
Provide storage of keys, credentials, and secrets, often facilitated or controlled by an agent.
Agents
An agent is a software representative of a subject (most often a person) that controls access to a wallet and other storage, can live in different locations on a network (cloud vs. local), and can facilitate or perform messaging or interactions with other subjects.

The two concepts are closely related, and are often used interchangeably. In short, the Glossary Project found that an agent is most commonly a piece of software that lets you work with and connect to wallets. Wallets can be simple, while agents tend to be more complex. Agents often need access to a wallet in order to retrieve credentials, keys, and/or messages that are stored there.

At MATTR, we tend to use the terms ‘digital wallet’ or simply ‘wallet’ to holistically describe the software that is utilized by end-users from within their mobile devices, web browsers, or other such user-controlled devices or environments. A digital wallet can be thought of as a kind of agent, though we try to make the distinction between the software that sits on a user’s device and the data managed and logic facilitated by a cloud-based platform in support of the wallet’s capabilities. We like the term ‘wallet’ because it is analogous to real-world concepts that by and large parallel the primary function of a wallet; to store and retrieve identity-related information.

User-centric Design

As end users have often found themselves the casualty of the information systems used by the modern web, there has been little opportunity to allow users to directly manage their data and negotiate what data they wish to withhold or disclose to certain parties. Under the new web of trust paradigm, the rights of the data subject are codified in standards, processes, and protocols guaranteeing the user the power to exercise agency. The interjection of the wallet to support end-users as data subjects on equal footing with issuers of identity information and relying parties provides an indispensable conduit and control point for this information that enables new opportunities for user-centric design.

The innovation in this area is only just beginning and there is no limit to the kinds of new experiences application developers can design and deliver to users. Some examples include:

- Allowing users to synchronize their data across multiple applications
- Allowing users to self-attest to a piece of data or attest to data self-asserted by peers
- Allowing a user to explicitly give consent around how their data may be used
- Allowing users to revoke their consent for access to the continued use of and/or persistence of a particular piece of data
- Allowing users to opt-in to be discoverable to other verified users, provided they can mutually verify particular claims and attributes about themselves
- Allowing users to opt-in to be discoverable to certain service providers and relying parties, provided they can mutually verify particular claims and attributes about themselves

These are just a handful of the potential ways that developers can innovate to implement user-centric experiences. MATTR offers the tools necessary to create new kinds of wallet and authentication experiences for users and we’re excited to see what developers come up with when given the opportunity to create applications and services inspired by these new standards and technologies.

Learn Concepts: Digital Wallets was originally published in MATTR on Medium, where people are continuing the conversation by highlighting and responding to this story.


Learn Concepts: Web of Trust 101

The original vision for the World Wide Web was an open platform on which everyone could freely communicate and access information. It was built on the decentralized architecture of the internet, used open standards, and functioned as an accessible platform that would inherit and amplify the fundamentally decentralized nature of the network that underpinned it.

However, the reality today has fallen far short of its founding vision. The modern internet is largely centralized and siloed. The vast majority of web traffic belongs to a few powerful corporations that control the distribution of data through platforms designed to selectively serve up information based on in-depth analysis of their users’ data. The lack of an identity system native to the internet over time has created an imbalance of power that erodes users’ digital rights.

Several decades after the web was introduced, most of us are now accustomed to widespread spam, fraud, abuse, and misinformation. We don’t have any real agency over how our data is used, and the corporations controlling our data have shown their inability to properly shoulder the responsibility that comes with it. We’re locked into this system, with no reasonable ability to opt out.

As a result, the modern internet has made it incredibly difficult to establish trust with others online, creating many barriers to participation that often leave everyday users out of the value chain. Information and data, and the value they create, are no longer freely accessible by the users creating it — most of whom are utterly unaware of the limited agency they have in accessing it. To fix this fundamental problem of digital trust, we need to begin by building a system that allows users to control their identities and to move their personal data freely from one online platform to another without fear of vendor lock-in.

Evolution of Digital Trust

The emerging “Web of Trust” is an idea that has been around since the dawn of the internet. To explain what motivated its creation, let’s take a look at how trust on the internet functions today.

Though we may not always be aware, we rely on a basic form of security practically every day we use the internet. HTTPS, the secure browsing protocol for the World Wide Web, uses a common infrastructure based on digital signatures to allow users to authenticate and access websites, and protect the privacy and integrity of the data exchanged while in transit. It is used to establish trust on all types of websites, to secure accounts, and to keep user communications, identity, and web browsing private.

Centralized PKI System

This is all based on the usage of cryptographic keys, instead of passwords, to perform security and encryption. Public key cryptography is a cryptographic technique that enables entities to securely communicate on an insecure public network (the internet), and reliably verify the identity of users via digital signatures. It is required for activities where simple passwords are an inadequate authentication method and more rigorous proof is required to confirm the identity of the parties involved in the communication and to validate the information being transferred.

The type of Public Key Infrastructure (PKI) currently used by the internet primarily relies on a hierarchical system of certificate authorities (CAs), which are effectively third-parties that have been designated to manage identifiers and public keys. Virtually all internet software now relies on these authorities. Certificate authorities are responsible for verifying the authenticity and integrity of public keys that belong to a given user, all the way up to a ‘self-signed’ root certificate. Root certificates are typically distributed with applications such as browsers and email clients. Applications commonly include over one hundred root certificates from dozens of PKIs, thereby bestowing trust throughout the hierarchy of certificates which lead back to them. The concept is that if you can trust the chain of keys, you can effectively establish secure communication with another entity with a reasonable level of assurance that you’re talking to the right person.

However, the reliance on certificate authorities creates a centralized dependency for practically all transactions on the internet that require trust. This primarily has to do with the fact that current PKI systems tightly control who gets to manage and control the cryptographic keys associated with certificates. This constraint means that modern cryptography is largely unusable for the average user, forcing us to borrow or ‘rent’ identifiers such as our email addresses, usernames, and website domains through systems like DNS, X.509, and social networks. And because we need these identities to communicate and transact online, we’re effectively beholden to these systems which are outside of our control. In addition, the usability challenges associated with current PKI systems mean that much of Web traffic today is unsigned and unencrypted, such as on major social networks. In other words, cryptographic trust is the backbone of all internet communications, but that trust rarely trickles down to the user level.

A fully realized web of trust instead relies on self-signed certificates and third party attestations, forming the basis for what’s known as a Decentralized Public Key Infrastructure (DPKI). DPKI returns control of online identities to the entities they belong to, bringing the power of cryptography to everyday users (we call this user-centric cryptography) by delegating the responsibility of public key management to secure decentralized datastores, so anyone and anything can start building trust on the web.

A Trust Layer for the Internet

The foundational technology for a new DPKI is a system of decentralized identifiers (DIDs) for people, organizations, and things. Decentralized identifiers are self-certifying identifiers that allow for distributed discovery of public keys. DIDs can be stored on a variety of different data registries, such as blockchains and public databases, and users can always be sure that they’re talking to the right person or entity because an identifier’s lookup value is linked to the most current public keys for that identifier. This creates a kind of even playing field where the standards and requirements for key management are uniform across different users in an ecosystem, from everyday users to large corporations and everything in between.

Decentralized PKI System

This will, in the first place, give users far greater control over the manner in which their personal data is being used by businesses, allowing them to tweak their own experience with services to arrive at that specific trade-off between convenience and data protection that best suits their individual requirements. But more importantly, it will allow users to continue to federate data storage across multiple services while still delivering the benefits that come from cross-platform data exchange. In other words, it gives them the ability to manage all their data in the same way while being able to deal with data differently depending on the context they are in. This also allows them to move their personal data freely from one online platform to another without losing access to the services they need, and without fear of vendor lock-in.

Eventually, this will allow for portability not only of data but of the trust and reputation associated with the subjects of that data. For instance, a user might be able to transfer their reputation score from one ride-sharing service to another, or perhaps use the trust they’ve established in one context in another context entirely.

This emerging decentralized web of trust is being forged by a global community of developers, architects, engineers, organizations, hackers, lawyers, activists, and more working to push forward and develop web standards for things like credential exchange, secure messaging, secure storage, and trust frameworks to support this new paradigm. The work is happening in places like the World Wide Web Foundation, W3C Credentials Community Group, Decentralized Identity Foundation, Trust Over IP Foundation, Linux Foundation’s Hyperledger project, and Internet Engineering Task Force, to name a few.

Learn Concepts: Web of Trust 101 was originally published in MATTR on Medium, where people are continuing the conversation by highlighting and responding to this story.


Trinsic (was streetcred)

Trinsic Basics: The Verifiable Credentials Model

At the core of every self-sovereign identity (SSI) use case is what we call the verifiable credentials model. This simple yet effective model helps conceptualize how verifiable credentials are exchanged between people and organizations. In the SSI community, you will often hear the verifiable credentials model referred to as the “Trust Triangle”.

Often called the "Trust Triangle," this classic diagram helps conceptualize the verifiable credential model.

Roles in SSI

In the verifiable credentials model, there are three roles: issuer, holder, and verifier.

- Issuer: The issuer is the person or organization that creates the verifiable credential and gives it (i.e., issues it) to another person, organization, or thing. An example of an issuer would be a DMV, which issues driver’s licenses.
- Holder: The holder is the person or organization to whom the verifiable credential was issued. The holder keeps the verifiable credential in their digital wallet. An example of a holder would be the citizen that received his or her driver’s license from the DMV.
- Verifier: The verifier is the person or organization that gets information shared with them. They can authenticate the information they receive instantaneously. An example of a verifier would be the security agent at the airport that asks to see a person’s driver’s license to verify their identity.

In this example, the DMV issues the holder a driver's license as a verifiable credential, which the holder then presents to the security agent at the airport (the verifier). The security agent does not need to contact the DMV to verify that the driver's license is a valid credential.

The power of verifiable credentials

Verifiable credentials differ from other digital documents in that they allow the verifier to know the following four things without having to interact with the issuer at all:

- The original issuing entity (the source of the data)
- It was issued to the entity presenting it (the subject of the data)
- It hasn’t been tampered with (the veracity of the data)
- Whether the issuer revoked the credential as of a particular point in time (the status of the data)

The standards and protocols associated with verifiable credentials and supporting technology make it possible for verifiers to make more informed “trust decisions” when accepting information from holders online. Based on the four factors above, verifiers can better determine whether to trust the information in the credential.
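
Putting the three roles and the four checks together, a compact sketch (assuming the Python cryptography package; the DIDs and claims are hypothetical) shows why the verifier never needs to call the issuer:

```python
# Issuer -> holder -> verifier in miniature; the verifier only needs the
# issuer's public key (e.g. resolved from a DID document), never a call
# back to the issuer. All identifiers here are hypothetical.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer (the DMV) signs the claims and hands the credential to the holder.
dmv_key = Ed25519PrivateKey.generate()
claims = json.dumps({"holder": "did:example:alice", "class": "car"}).encode()
credential = {"claims": claims, "signature": dmv_key.sign(claims)}

# Holder stores it in a digital wallet, then presents it at the airport.
presentation = credential

# Verifier (airport security) checks source, subject, and veracity
# offline; verify() raises an error if the claims were tampered with.
dmv_key.public_key().verify(presentation["signature"], presentation["claims"])
print("credential accepted")
```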

Use case diversity

The verifiable credentials model is quite flexible in the number and types of use cases it supports. In the example above, we highlighted the use case of digital driver’s licenses, but this same model could apply to countless use cases where sharing identifying information is important. Here are a few examples:

- Medical insurance credential: In this example, an insurance organization would be the issuer, a person would be the holder, and a doctor’s office could be the verifier.
- Login credential: When a person is creating an account with a company online, the company would issue the person his or her login credential. From that moment on, the person can use the login credential to access their account, eliminating the need for usernames and passwords.¹ Check out how Trinsic issues login credentials to its customers.
- University diploma credential: As an issuer, the university would send its graduates their digital diplomas which they could then use to apply for jobs online. Trinsic has written a tutorial on how to set up this specific use case in Trinsic Studio.

The SSI use cases above are pretty straightforward, but the verifiable credentials model allows for more complex situations as well. For example, SSI enables the concept of guardianship if a person does not have the ability to control their own wallet (e.g., the person is a child, has dementia, is a refugee, etc.).

Further, in the verifiable credentials model the holder does not necessarily need to be a person. Verifiable credentials can be issued to organizations or things. This includes licensing and accreditations for businesses as well as supply chain-related documentation for products. SSI and verifiable credentials enable digital, trusted interactions which were not possible online before.

Build your own use case

No matter your specific use case, Trinsic’s APIs make it easy for developers to integrate SSI.

- Credentials API: Our core API enables developers to have a turnkey way to issue, verify, and manage verifiable credentials on any Hyperledger Indy network. Check out our documentation or one of our reference applications to get started.
- Wallet API: An API for creating and managing cloud wallets on behalf of credential holders. It’s the backend of our Mobile SDK, which you can read more about in our recent post about building your own SSI wallets. Get started with the API by checking out the documentation.
- Provider API: Our newest API enables developers to programmatically provision issuer and verifier cloud agents. Learn more about the provider API in the recent launch announcement.

Trinsic Studio is an easy-to-use web interface for managing credentials, API keys, verification policies, and more. Sign up for a free Trinsic Studio account and start building your SSI solution today.

Notes

1. In this case, the issuer and verifier could be the same organization. Or they could be different organizations if the example is a case of federated identity (e.g., Login with Facebook, Login with Google).

The post Trinsic Basics: The Verifiable Credentials Model appeared first on Trinsic.


KuppingerCole

The Non-Zero Elements of Zero Trust

by John Tolbert

The ongoing SolarWinds incident illustrates that the much-lauded Zero Trust security paradigm is, in fact, based on trust. Zero Trust is about authenticating and authorizing every action within a computing environment. It is putting the principle of least privilege into action. In an ideal implementation of Zero Trust, users authenticate with the proper identity and authentication assurance levels to get access to local devices, on-premises applications and data, and cloud-hosted resources. Access requests are evaluated against access control policies at runtime.

For Zero Trust to work, though, certain elements must themselves be trusted, and these are used to evaluate other entities for trustworthiness:

- IT products – operating systems, Line of Business applications, office automation products, IaaS environments, mobile devices, IoT devices, etc.
- IT services – SaaS apps, IDaaS, managed hosting, Managed Detection & Response (MDR), full scale Managed Security Service Providers (MSSPs), etc.
- Processes – auto-updates of software, communication between vendor products and their cloud services, identity vetting, federation, authentication, and authorization
- Suppliers – IT vendors, IT service providers, IT security vendors, identity providers, and members of product and service supply chains

A breakdown in any of these foundational components can become an attack vector and can put organizations at increased risk. As 2020 draws to a close, cybersecurity teams are searching for signs that their organizations have been affected by the Sunburst/SuperNova/Solorigate malware and are beginning to remediate. New information about this set of events is arriving daily, and the big picture will likely take some time to materialize.

As we deal with this on the tactical level, we can also think about how this should impact longer-term security strategies. Zero Trust is a good strategy, but we must consider, expose, and evaluate the processes, products, services, and suppliers that make up Zero Trust infrastructure. Zero Trust cannot be effective if on so many levels it’s still based on blind trust. As users and implementers of technology, we will always be dependent to a degree on those technology vendors to demonstrate that their products and services are indeed trustworthy. However, we must also consider how each organization can increase the level of its own monitoring, detection, and response capabilities to guard against similar attacks in the future.

Supply chain security and risk management, particularly in the IT product and service arenas, will likely be top concerns for organizations of all sizes and types for 2021. KuppingerCole will continue to research, discuss, and publish on the products and services that constitute Zero Trust architectures.


Ontology

Ontology Weekly Report (December 15–21)

As we come to the end of December, it’s time to take a look back at what the Ontology team has been up to in the last week.

We are closing out the year on a high note with two new partnerships announced last week, including the addition of our Decentralized Identity (DeID) solution to the blockchain-based e-voting system launched by Waves Enterprise; our DeID ensures that the identities and data of people who vote are protected and verified. This week we also announced our collaboration with Litentry, and together we aim to onboard 10,000 users to our DeID and OScore solutions! On the technical side, Ontology has also added a pre-execution feature for batch transactions.

Back-end

- Added Ontology pre-execution feature for batch transactions

Product Development

ONTO

- Launched ONTO v3.6.5

- Carried out the “Gas Fee Airdrop” campaign with Binance Smart Chain. Users who transferred BEP-20 BNB in their ONTO wallets got the chance to earn BNB from a pool with a maximum equivalent value of $50,000. As of December 20th, more than 7,000 ONTO users had participated.

dApp

- 108 dApps running on Ontology

- 6,240,890 dApp-related transactions since the Ontology genesis block

- 6,587 dApp-related transactions in the past week

Bounty Program

- 1 new application for SDKs

Community Growth

- 146 new members onboarded across Ontology’s Vietnamese, Korean, Tagalog, and Indian communities.

Newly Released

- At the Polkadot parachain test network Rococo V1 press conference, it was announced that the TestNet will be launched on Christmas Eve, December 24th. Ontology will integrate DeID onto Polkadot to help build out Polkadot’s parachains and will participate in the parachain slot auctions in the future. Besides this, Ontology will also gradually adapt its credit scoring system OScore, DDXF, and other technologies to the Polkadot ecosystem.

- On December 15th, Ontology entered into a technological partnership with Waves Enterprise, an enterprise-grade blockchain platform combining both public and private networks as a hybrid solution. The two companies will collaborate to integrate Ontology’s advanced DeID technologies into the blockchain-based e-voting system launched by Waves Enterprise so that corporate users can seamlessly benefit from a fully decentralized approach.

- On December 17th, we partnered with Litentry, a blockchain identity management layer based on the Polkadot network. Ontology and Litentry are aiming to onboard 10,000 users to our DeID and OScore solutions, enriching aggregated identities and targeting potential customers based in Europe, the USA, Latin America, and South-East Asia.

Find Ontology elsewhere

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (December 15–21) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Infocert (IT)

InfoCert joins the GLEIF International Foundation program to promote the vLEI (verifiable Legal Entity Identifiers)

The Tinexta Group company will adopt the vLEI as the standard for verifying legal-entity identity between counterparties within its DIZME Self-Sovereign Identity ecosystem.

Rome, December 23, 2020 – InfoCert (Tinexta Group), the largest Certification Authority in Europe, announces that it has joined the cross-industry development program promoted by the Global Legal Entity Identifier Foundation (GLEIF) to support the worldwide adoption of the vLEI.

The vLEI is a cryptographically verifiable credential, compliant with W3C standards, that contains the LEI (Legal Entity Identifier), the identification code for legal entities made mandatory by MiFID II for operating on financial markets. InfoCert, already a GLEIF-authorized LOU (Local Operating Unit), will adopt the vLEI as the identification standard within its DIZME ecosystem, the blockchain-based decentralized digital identity platform.

The disruptive novelty of the vLEI is that it will enable the verification of legal-entity identity between counterparties, unambiguously and in a fully automated way, even outside the financial markets. In this way the vLEI – which can in each case be associated with specific sets of credentials, for example tailored to particular market sectors – will become an enabler of a growing number of digital business activities, including the approval of commercial transactions and contracts, the onboarding of new customers, the management of information flows within import-export and supply chain corporate networks, and the submission of regulatory documents and reports.

In addition, the verification of the legal identity of entities holding a vLEI will automatically extend to the natural persons who hold relevant roles within those entities.

“We are convinced that the vLEI represents an incredible growth opportunity for the entire B2B sector worldwide, which is why we responded enthusiastically to the initiative promoted by GLEIF,” says Daniele Citterio, CTO of InfoCert, Tinexta Group. “Verifiable credentials are already at the heart of the DIZME model, and the vLEI is a natural extension of them, one that will add a further level of trust to our platform as a whole.”

By implementing the vLEI in DIZME, InfoCert will be able to instantly verify the legal identity both of issuers – the partners who contribute to the growth of the network with their own credential sets – and of verifiers – the third parties who use those credentials within their own processes.

And that is not all. Working in synergy with Innolva, another Tinexta Group company and a leader for over 30 years in Credit & Information Management services for the corporate and financial sectors, InfoCert will integrate into “DIZME vLEIs” a range of detailed information – for example on credit, real estate, adverse records, or signing powers – which in turn can give rise to new business models. The Innolva database boasts billions of data points from countless official and open data sources. Every data item goes through a process of classification, rationalization, normalization, and re-processing designed to guarantee the quality and consistency of the information over time.

“By providing access to its value-added information assets, Innolva supports the digitalization of businesses,” comments Matteo Gatti, Business Information Director at Innolva, “and the synergy with InfoCert makes the Tinexta Group a cutting-edge player in this sector.”

InfoCert SpA

InfoCert, Tinexta Group, is the largest European Certification Authority, active in over twenty countries. The company provides digitalization, eDelivery, digital signature, and digital document preservation services, and is an AgID-accredited manager of digital identity within SPID (the Italian Public Digital Identity System). InfoCert invests significantly in research and development and in quality: it holds a significant number of patents, and its ISO 9001, 27001, and 20000 quality certifications attest to its commitment to the highest levels of service delivery and security management. The InfoCert Information Security Management System is ISO/IEC 27001:2013 certified for EA:33-35 activities. InfoCert is a European leader in Digital Trust services fully compliant with the requirements of the eIDAS Regulation (EU Regulation 910/2014) and the ETSI EN 319 401 standards, and aims to keep growing internationally, including through acquisitions: it holds 51% of Camerfirma, one of the leading Spanish certification authorities; 50% of LuxTrust, the Digital Trust leader in Luxembourg; and 16.7% of Authada, a cutting-edge German identity provider. Finally, InfoCert owns 80% of the shares of Sixtema SpA, the technology partner of the CNA network, which provides technology solutions and consulting services to SMEs, trade associations, financial intermediaries, professional firms, and other organizations.

Tinexta Group

Tinexta, listed on the STAR segment of the Milan Stock Exchange, reported the following consolidated results as of December 31, 2019: revenues of €258.7 million, EBITDA of €71.3 million, and net profit of €28.8 million. Tinexta Group is among the leading operators in Italy in its three business areas: Digital Trust, Credit Information & Management, and Innovation & Marketing Services. The Digital Trust Business Unit provides, through the companies InfoCert, Visura, Sixtema, and the Spanish company Camerfirma, products and solutions for digitalization: digital signature, digital identity, customer onboarding, electronic invoicing, and certified email (PEC) for large companies, banks, insurance and financial companies, SMEs, associations, and professionals. InfoCert is the largest Certification Authority in Europe; in 2018 it acquired a 50% stake in LuxTrust and in September 2020 a 16.7% stake in Authada, a digital identity provider with cutting-edge technology based in Darmstadt, Germany. In the Credit Information & Management Business Unit, Innolva and its subsidiaries offer services supporting decision-making processes (chamber of commerce and real estate information, aggregated reports, synthetic ratings, decision models, credit assessment and recovery), while RE Valuta offers real estate services (appraisals and valuations). In the Innovation & Marketing Services Business Unit, Warrant Hub is a leader in subsidized finance and industrial innovation consulting, and Co.Mark provides Temporary Export Management consulting to SMEs to support their commercial expansion. On October 12, 2020, the signing of binding agreements to acquire majority stakes in three companies, creating a new Cybersecurity Business Unit, was announced. As of December 31, 2019, the Group had 1,293 employees.

* * *

For more information:

InfoCert – Press Relations Advisor, BMP Comunicazione per InfoCert: team.infocert@bmpcomunicazione.it – Pietro Barrile +393207008732 – Michela Mantegazza +393281225838 – Francesco Petrella +393452731667 – www.infocert.it

Tinexta S.p.A. – Corporate & Financial Communications: Carla Piro Mander, Tel. +39 06 42 01 26 31, carla.piro@tinexta.com

Media Advisor – Barabino & Partners S.p.A., Foro Buonaparte, 22 – 20121 Milano, Tel.: +39 02 7202 3535 – Stefania Bassi: +39 335 6282 667, s.bassi@barabino.it

Specialist – Intermonte SIM S.p.A., Corso V. Emanuele II, 9 – 20122 Milano, Tel.: +39 02 771151

The post InfoCert joins the GLEIF International Foundation program to promote the vLEI (verifiable Legal Entity Identifiers) appeared first on InfoCert.digital.


UbiSecure

Towards Self-Sovereign Identity with Tykn Co-Founders, Khalid Maliki and Jimmy J.P. Snoek – Podcast Episode 35

Let’s Talk About Digital Identity with Khalid Maliki, Co-Founder & Managing Director, and Jimmy J.P. Snoek, Co-Founder & CEO at Tykn.

Khalid and Jimmy join Oscar for episode 35 of the podcast, discussing everything Self-Sovereign Identity (SSI) and the SSI company they co-founded, Tykn. The conversation details the ‘three pillars of SSI’ (verifiable credentials, decentralised identifiers and blockchain), how SSI fits with existing processes, what it should appear as to end users (and what level of education they need around the technology), the importance of accessibility for inclusivity, and what’s next for Tykn.

“In 5 years, people should take [SSI] for granted”

Khalid Maliki

After many years working in UX at the Dutch Ministry of the Interior, Khalid’s keen product design knowledge, combined with a passion for social impact, led him to put all his time and effort into co-founding the award-winning digital ID company Tykn. Khalid believes Self-Sovereign Identity will positively impact billions of people’s lives and has advocated for its adoption on the most important stages, from the Economic Forum in Africa to the United Nations in NYC. He counts co-founding a happy family among his biggest achievements.

Find Khalid on LinkedIn and on Twitter @Khalidworks.

Jimmy J.P. Snoek

Jimmy J.P. is a musician, business developer and entrepreneur, currently residing in The Hague, The Netherlands. After having worked as a professional musician in Spain and having started his first company in The Netherlands before the age of 20, Jimmy was accepted into the prestigious McGill University in Montréal, Canada and co-founded the now award-winning digital ID company Tykn. As an evangelist of data privacy and an early adopter of crypto, Jimmy has spoken about the merits of blockchain and self-sovereign identity at conferences and institutions worldwide since 2017, and has been featured in multiple publications, including The Guardian.

Find Jimmy J.P. on LinkedIn and Twitter @idforgood.

Tykn leverages blockchain technology to bring trust, privacy, and interoperability to identity. Tykn’s Ana platform allows organisations to issue tamper-proof digital credentials which are verifiable anywhere, at any time. Users can prove their ID to access services while remaining in full control of what personal data is viewed, shared & stored.

Find out more at tykn.tech.

We’ll be continuing this conversation on LinkedIn and Twitter using #LTADI – join us @ubisecure!

 

The post Towards Self-Sovereign Identity with Tykn Co-Founders, Khalid Maliki and Jimmy J.P. Snoek – Podcast Episode 35 appeared first on Ubisecure Customer Identity Management.


Okta

One Identity Across Salesforce.com and Mulesoft

Today, I’m going to show you how to plug Okta into a Force.com application (along with Mulesoft’s API gateway) using the OpenID Connect protocol (aka ‘OIDC’).

By the end of this tutorial, your Force.com application will have user authentication backed by Okta and will be able to securely call your backend APIs through Mulesoft’s API gateway. Sound good? Let’s get to it!

What’s Force.com?

Force.com is Salesforce’s Platform-as-a-Service (aka ‘PaaS’), which allows you to develop and build custom applications.

In this tutorial, you’ll be using the Visualforce framework to build your front end and the APEX development framework to build your back end.

What’s an API Gateway?

An API gateway is a firewall that sits between your API and your users. They range from the simplest proxies which apply throttling and white/blacklisting to fully configurable platforms with fine-grained access mapping individual permissions to specific HTTP verbs and endpoints. Realistically, using an API gateway is not necessary, but it makes some things faster, easier, and more reliable, which allows you to focus on your API.

~Keith Casey, API Problem Solver

In this tutorial, you’ll be using the MuleSoft API Gateway to protect your API, and you’ll use an access token to securely call this API through your Salesforce application.
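Before diving in, it helps to see the end state: once everything is wired up, any client that holds an Okta-issued access token can call the proxied API by presenting it as a Bearer credential. Here is a minimal Python sketch of that final call (the proxy URL and token value are placeholders, not values from this tutorial):

import urllib.request

# Placeholders: substitute your own Mulesoft proxy URL and a real
# access token issued by your Okta authorization server.
PROXY_URL = "https://your-proxy.cloudhub.io/contacts"
ACCESS_TOKEN = "eyJraWQiOi..."  # truncated placeholder

req = urllib.request.Request(
    PROXY_URL,
    headers={"Authorization": "Bearer " + ACCESS_TOKEN,
             "Accept": "application/json"},
)
# The gateway forwards the call only if the token is valid.
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())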

Set Up Your Okta Developer Instance

To continue, you’ll need a free Okta account. Head on over to developer.okta.com and create an Okta account if you haven’t already.

For this example, I’ll be using my own Okta tenant, https://phdemo.oktapreview.com.

Once you’ve got an account, log into the dashboard and click Applications -> Add Application. Select Web and click Next.

This will create an OIDC application that represents your Salesforce application. Change Name to “Salesforce Custom Application” and, within the Grant Type allowed section, make sure you have Authorization Code and Implicit (Hybrid) checked. As for the other fields, leave them as they are for now.

Click Done.

Take note of the Client ID and Secret as you will use these values shortly.

Set Up a Salesforce Developer Instance

You can sign up for a free Salesforce developer edition instance through this link.

Once you have signed up and successfully created your instance, you should be able to navigate to your Salesforce admin console. This will usually look something like: https://ap5.salesforce.com/lightning/setup/SetupOneHome/home.

Configure a Custom Domain for Your Salesforce Instance

From your Salesforce dashboard, navigate to Setup -> Settings -> Company Settings -> My Domain.

Enter the custom domain you wish to use and click Check Availability. If the domain you’ve chosen is still available, you should get a green check saying the domain is available. Then click Register Domain. You should see the message below:

Once the custom domain configuration has been successfully applied, navigate back to the main screen and you should see some updates on the displayed page:

Make sure you click the Login button so you can test your access to the custom domain once available. Click Deploy to Users to leverage the custom domain for all your users in Salesforce.

We will revisit this page later to allow users to log into Salesforce using OIDC.

Integrate Okta as an OIDC Provider Within Salesforce

To configure Okta as an OIDC provider within Salesforce, click Setup -> Settings -> Identity -> Auth Providers from the Salesforce “Setup Admin” console.

Click New and select Open ID Connect as the Provider Type. Then fill in the fields marked in red.

- Name can be any name you wish
- Set URL suffix to your Okta subdomain name. In my environment, this will be phdemo
- Set Consumer Key to the Client ID of the application you created earlier
- Set Consumer Secret to the Client Secret of the application you created earlier

For the following endpoints, we’ll be using the default authorization server that is enabled for every Okta instance.

You should be able to get your OAuth server settings from the .well-known endpoint. This URL is of the format https://<yourdomain>.okta.com/oauth2/default/.well-known/oauth-authorization-server (e.g., https://phdemo.oktapreview.com/oauth2/default/.well-known/oauth-authorization-server).

- Set the Authorize Endpoint URL to https://<your domain>.okta.com/oauth2/default/v1/authorize (e.g., https://phdemo.oktapreview.com/oauth2/default/v1/authorize)
- Set the Token Endpoint URL to https://<your domain>.okta.com/oauth2/default/v1/token (e.g., https://phdemo.oktapreview.com/oauth2/default/v1/token)
- Set the User Info Endpoint URL to https://<your domain>.okta.com/oauth2/default/v1/userinfo (e.g., https://phdemo.oktapreview.com/oauth2/default/v1/userinfo)
- Set the Token Issuer Endpoint URL to https://<your domain>.okta.com/oauth2/default (e.g., https://phdemo.oktapreview.com/oauth2/default)
- Set the Default Scopes to profile openid email
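If you’d rather not transcribe these endpoints by hand, you can read them straight from the discovery document. A minimal Python sketch, assuming the default authorization server (replace the domain with your own):

import json
import urllib.request

OKTA_DOMAIN = "https://phdemo.oktapreview.com"  # replace with your own domain

# RFC 8414 authorization server metadata, published at the well-known URL
# described above.
url = OKTA_DOMAIN + "/oauth2/default/.well-known/oauth-authorization-server"
with urllib.request.urlopen(url) as resp:
    metadata = json.load(resp)

print("Issuer:   ", metadata["issuer"])
print("Authorize:", metadata["authorization_endpoint"])
print("Token:    ", metadata["token_endpoint"])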

Once completed, it should look like this:

Next, you need to set a registration handler as part of the process. Luckily, Salesforce allows us to click a link that will automatically create the registration handler. Click it.

Once clicked, you should have something like this:

Make sure you set the field Execute Registration As to your account:

Once done with the above, click Save and you should end up with this:

Notice that after saving the OIDC configuration, Salesforce has generated URLs below the OIDC config. You need to copy these values for the following fields:

- Test-Only Initialization URL – e.g., https://oktaoidc-dev-ed.my.salesforce.com/services/auth/test/phdemo
- OAuth-Only Initialization URL – e.g., https://oktaoidc-dev-ed.my.salesforce.com/services/auth/oauth/phdemo
- Callback URL – e.g., https://login.salesforce.com/services/authcallback/00D2v000002F7VWEA0/phdemo

Modify the Okta OIDC Application Redirect URI

Go back to the application instance you created in the Okta dashboard and update the Login redirect URIs with the values provided above.

Once done, click Save to make sure your changes reflect in the OIDC application within Okta.

Enable Login via Okta Within Salesforce

Navigate back to Setup -> Settings -> Company Settings -> My Domain.

Click the Edit button under Authentication Configuration. You should be able to see the OIDC configuration as a checkbox option under Authentication Service. Check this so users can log in to Salesforce using Okta via OIDC.

Click Save once you are done. It should look like this:

Once you navigate to your custom Salesforce domain URL, you should see additional buttons as a way to log in. I’m using oktaoidc-dev-ed.my.salesforce.com.

Set Up Your Mulesoft Gateway

You should be able to sign up for a free 30-day trial of Mulesoft through this link. Once you’ve signed up, you should be able to log into the Mulesoft Anypoint Platform.

Deploy a Sample API in Mulesoft

Once you are logged into Mulesoft, navigate to the Design Center from the sidebar navigation.

- Click Create new
- Click Create API specification
- Provide any name you wish to use
- Select the Visual Editor radio button
- Click Create specification

You should be taken to a new page within the Design Center. Click Import Example then check the Contact API Tutorial option. Now, click the Import Asset button.

Then click Import & Replace. On the left side of the page, there is one resource already available, called /contacts.

Click Publish -> Publish to Exchange. Provide a version number and click Publish to Exchange.

Set Up Mulesoft as an OAuth Service Application in Okta

Return back to your Okta dashboard. Click Applications -> Add Application. Select Service and click Next.

Provide a name then click Done.

Take note of the Client ID and Client secret here, as you will use them later in the steps below.

Integrate Okta as an OIDC Client Provider in Mulesoft

Return to the home page of the Anypoint Platform and navigate to Management Center -> Access Management.

Navigate to Access Management -> Client Providers.

Click Add Client Provider -> Open ID Connect Dynamic Client Registration and provide the following:

- Name: Any name you wish
- Description: Any description you wish
- Under Dynamic Client Registration:
  - The Issuer should be: https://<your domain>.okta.com/oauth2/default (e.g., https://phdemo.oktapreview.com/oauth2/default)
  - The Client Registration URL should be: https://<your domain>.okta.com/oauth2/v1/clients (e.g., https://phdemo.oktapreview.com/oauth2/v1/clients)
  - Leave the Authorization Header blank
- Under Token Introspection Client:
  - Client ID: The client ID generated from the last step
  - Client Secret: The client secret generated from the last step
- Under OpenID Connect Authorization URLs:
  - The Authorize URL should be: https://<your domain>.okta.com/oauth2/default/v1/authorize (e.g., https://phdemo.oktapreview.com/oauth2/default/v1/authorize)
  - The Token URL should be: https://<your domain>.okta.com/oauth2/default/v1/token (e.g., https://phdemo.oktapreview.com/oauth2/default/v1/token)
  - The Token Introspection URL should be: https://<your domain>.okta.com/oauth2/default/v1/introspect (e.g., https://phdemo.oktapreview.com/oauth2/default/v1/introspect)
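For context, the Token Introspection Client configured above is what Mulesoft uses to ask Okta whether an incoming access token is still valid (RFC 7662 token introspection). Conceptually, that check looks like this minimal Python sketch; the domain, client ID, and secret are placeholders for the values from your Okta service app, and the real enforcement is done by Mulesoft’s policy, not by code you write:

import base64
import json
import urllib.parse
import urllib.request

OKTA_DOMAIN = "https://phdemo.oktapreview.com"    # replace with your own domain
CLIENT_ID = "your-service-app-client-id"          # placeholder
CLIENT_SECRET = "your-service-app-client-secret"  # placeholder

def introspect(access_token):
    # RFC 7662 token introspection against the default authorization server.
    url = OKTA_DOMAIN + "/oauth2/default/v1/introspect"
    body = urllib.parse.urlencode({
        "token": access_token,
        "token_type_hint": "access_token",
    }).encode()
    creds = base64.b64encode((CLIENT_ID + ":" + CLIENT_SECRET).encode()).decode()
    req = urllib.request.Request(url, data=body, headers={
        "Authorization": "Basic " + creds,
        "Content-Type": "application/x-www-form-urlencoded",
    })
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# introspect(token)["active"] is True for a valid, unexpired token.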

Next, click Save then navigate to Access Management -> Environments.

Click Sandbox and make sure you add the client provider you created earlier to the environment.

Click Update.

Set Up a Proxy and Apply a Security Policy in Mulesoft

Return to the Anypoint Platform home page and navigate to Management Center -> API Manager. Then click Manage API -> Manage API from Exchange.

Search for the Mulesoft API you created earlier.

Within the Managing type section, select the Endpoint with Proxy radio button.

Select CloudHub as the default option and click Save.

You’ll be redirected to a new page. Under Deployment Configuration, set the Runtime version to 3.9.x and provide any name you wish to use for your proxy.

Take note of the Proxy Application URL as you will use that shortly.

Once the deployment is successful, you should see this screen:

On the left-hand side, click Policies.

Click Apply New Policy. Search for Open ID Connect access token enforcement and tick the box.

Click Configure Policy.

Enter openid profile email as scopes and select Skip Client ID validation. Click Apply.

You should now see a new record within the API level policies.

You have completed the configuration setup with Mulesoft. In the next step, you will integrate the Mulesoft protected API with Salesforce.

Get the Auth Provider ID from Salesforce

As per the Salesforce documentation, you need to provide an 18-character identifier to get the access token from your third-party identity provider.

If you noticed, in the earlier steps where we created an auth provider in Salesforce, the ID only had 15 characters, meaning it is missing three.

To get the 18-character full value, you can go through the steps below.

Navigate to Setup -> Platform Tools -> Custom Code -> Apex Classes then select the Developer Console.

Navigate to File -> Open Resource.

Search for AuthProvider.obj then click Open.

Select Id and click Query twice.

You should now see a new text area. Click Execute. You should now see query results.

Take note of the Auth Provider ID value, as you will use this later on within the code snippet of the Auto-created Registration Handler.
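As an aside, you don’t strictly need the Developer Console for this: Salesforce derives the 3-character suffix from the capitalization of the 15-character ID, so you can compute the 18-character form yourself. A minimal Python sketch of that well-known checksum scheme:

# Convert a case-sensitive 15-character Salesforce ID to its
# case-insensitive 18-character form. Each group of 5 characters
# contributes one suffix character encoding which of them are uppercase.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ012345"

def to_18_char(sfid):
    assert len(sfid) == 15, "expects a 15-character Salesforce ID"
    suffix = ""
    for start in (0, 5, 10):
        chunk = sfid[start:start + 5]
        # Bit i of the index is set when character i is an uppercase letter.
        index = sum(1 << i for i, ch in enumerate(chunk) if ch.isupper())
        suffix += ALPHABET[index]
    return sfid + suffix

# Example: to_18_char("0SO2v000000XjXA") returns "0SO2v000000XjXAGA0",
# the 18-character Auth Provider ID used in the code below.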

Modify Logic for the Auto-Generated Registration Handler in Salesforce

Now, return to Salesforce. Remember the Autogenerated Registration Handler which was created earlier? You’ll need to modify it so it can do the following:

- Show the access token obtained from Okta
- Call the Okta userinfo API endpoint to show all the user details using the access token
- Add the access token in the Authorization header of an HTTP request so that the Mulesoft API Gateway can verify if the user is allowed to call the API protected by Mulesoft

Navigate to Setup -> Platform Tools -> Custom Code -> Apex Classes.

You should see the registration handler that was auto-created earlier. Click the Edit link to access an in-line code editor, then overwrite the existing code with the following code inside your class:

public String accessToken;
public String callOut { get; set; }
public String sub { get; set; }
public String name { get; set; }
public String email { get; set; }
public String firstName { get; set; }
public String lastName { get; set; }
public String userName { get; set; }

public String getAccessToken() {
    HttpRequest req = new HttpRequest();
    Http http = new Http();
    // Change this to your Okta developer instance userinfo URL.
    String url = 'https://<yourOktaDomain>/oauth2/default/v1/userinfo';
    req.setEndpoint(url);
    req.setMethod('GET');
    req.setHeader('Accept', 'application/json');
    // Change the first parameter to the Auth. Provider ID obtained earlier.
    req.setHeader('Authorization', 'Bearer ' + Auth.AuthToken.getAccessToken('0SO2v000000XjXAGA0', 'Open ID Connect'));
    HTTPResponse resp = http.send(req);
    if (resp.getBody() != null) {
        Map<String, String> values = (Map<String, String>) JSON.deserialize(resp.getBody(), Map<String, String>.class);
        sub = values.get('sub');
        name = values.get('name');
        email = values.get('email');
        userName = values.get('preferred_username');
        firstName = values.get('given_name');
        lastName = values.get('family_name');
    }
    // Change the first parameter to the Auth. Provider ID obtained earlier.
    return Auth.AuthToken.getAccessToken('0SO2v000000XjXAGA0', 'Open ID Connect');
}

public PageReference fetch_data() {
    HttpRequest req = new HttpRequest();
    Http http = new Http();
    // Change this to the Mulesoft API proxy endpoint you created earlier.
    String url = 'http://protectedmulesoftapi.us-e2.cloudhub.io/contacts';
    req.setEndpoint(url);
    req.setMethod('GET');
    req.setHeader('Accept', 'application/json');
    // Change the first parameter to the Auth. Provider ID obtained earlier.
    req.setHeader('Authorization', 'Bearer ' + Auth.AuthToken.getAccessToken('0SO2v000000XjXAGA0', 'Open ID Connect'));
    HTTPResponse resp = http.send(req);
    System.debug('INSIDE CALLOUT:' + resp.getBody());
    if (resp.getBody() != null) {
        Map<String, String> values = (Map<String, String>) JSON.deserialize(resp.getBody(), Map<String, String>.class);
        String append = '';
        append = values.get('FirstName') + ':' + values.get('LastName') + ':' + values.get('Email') + ':' + values.get('Company');
        callOut = append;
    }
    return null;
}

global boolean canCreateUser(Auth.UserData data) {
    // TODO: Check whether we want to allow the creation of a user with this data.
    return true;
}

global User createUser(Id portalId, Auth.UserData data) {
    if (!canCreateUser(data)) {
        // Returning null or throwing an exception fails the SSO flow.
        return null;
    }
    // The user is authorized, so create their Salesforce user.
    User u = new User();
    Profile p = [SELECT Id FROM Profile WHERE Name = 'Standard Platform User'];
    // TODO: Customize the username. Also, check that the username doesn't already
    // exist and possibly ensure there are enough org licenses to create a user.
    // Must be 80 characters or less.
    u.username = data.username;
    u.email = data.email;
    u.lastName = data.lastName;
    u.firstName = data.firstName;
    String alias = data.username;
    // Alias must be 8 characters or less.
    if (alias.length() > 8) {
        alias = alias.substring(0, 8);
    }
    u.alias = alias;
    u.languagelocalekey = 'en_US';
    u.localesidkey = 'en_US';
    u.emailEncodingKey = 'UTF-8';
    u.timeZoneSidKey = 'America/Los_Angeles';
    u.profileId = p.Id;
    insert u;
    return u;
}

global void updateUser(Id userId, Id portalId, Auth.UserData data) {
    User u = new User(id = userId);
    u.email = data.email;
    u.lastName = data.lastName;
    u.firstName = data.firstName;
    // Persist the changes so the update actually takes effect.
    update u;
}

Click Save.

Navigate back to the Apex Class home page.

Click Security on the AutocreatedRegHandler.

Make sure you add Standard Platform User as an enabled Profile. Click Save.

Configure Trust Over the Remote URL Called Within Salesforce

Navigate to Setup -> Settings -> Security -> Remote Site Settings. Then click New Remote Site.

Add your Okta URL (e.g.: https://{yourDomainHere}.okta.com) and click Save & New.

Add your Mulesoft Proxy Endpoint URL (e.g., https://domain.cloudhub.io).

Click Save. You have now successfully added your Okta domain and Mulesoft Proxy Endpoint URL as trusted remote URLs.

Create a Visualforce Page to Tie Everything Together

Within Salesforce, navigate to Platform Tools -> Custom Code -> Visualforce Pages.

Click New. Provide a Label, Name, and Description.

Generally, as per best practice, you would want to create a separate controller or APEX handler for your Visualforce custom page. For this tutorial, you’ll re-use the same one you modified earlier. Overwrite the existing markup with the following code:

<!-- change the controller value to the class name of your controller -->
<apex:page controller="AutocreatedRegHandler1380796002690">
    <!-- Begin Default Content REMOVE THIS -->
    <h1>Congratulations</h1>
    This is your Salesforce page protected by Okta OIDC and OAuth 2.0
    <apex:form>
        <apex:commandButton id="btn" action="{!fetch_data}" value="Call Protected API behind Mulesoft using OAuth JWT Token below"/>
        Token: <apex:outputLabel id="one">{!accessToken}</apex:outputLabel>
        Access Token Details from https://yourdomain.oktapreview.com/oauth2/default/v1/userinfo: <p/>
        <apex:outputText>Sub : {!sub}</apex:outputText> <p/>
        <apex:outputText>Full Name : {!name}</apex:outputText> <p/>
        <apex:outputText>Email : {!email}</apex:outputText> <p/>
        <apex:outputText>Username : {!userName}</apex:outputText> <p/>
        <apex:outputText>First Name: {!firstName}</apex:outputText> <p/>
        <apex:outputText>Last Name : {!lastName}</apex:outputText> <p/>
        <apex:outputText>Callout : {!callOut}</apex:outputText>
    </apex:form>
    <!-- End Default Content REMOVE THIS -->
</apex:page>

Click Save.

You’re now done with the Salesforce configuration and setup.

Testing Out Your Integration

To access or test the Visualforce page you’ve created, use the URL: https://salesforce-custom-domain-name-ed--c.visualforce.com/apex/pageName (e.g., https://oktaoidc-dev-ed--c.visualforce.com/apex/OktaPage).

In this tutorial, you integrated Okta into your custom Salesforce application using Force.com (APEX & Visualforce) as well as the Mulesoft API Gateway.

Credits and a shout-out to my colleague Ewan Thomas for helping me troubleshoot the Salesforce APEX handler.

If you have more questions, feel free to leave a comment below!

Learn More About Okta and Development

If you are interested in learning more about security and .NET, check out these other great articles:

- Test in Production with Spring Security and Feature Flags
- Deploy a .NET Container with AWS Fargate
- The Most Exciting Promise of .NET 5
- Create a CI/CD pipeline for .NET with the Azure DevOps Project

Make sure to follow us on Twitter, subscribe to our YouTube Channel and check out our Twitch channel so that you never miss any awesome content!


Coinfirm

Brand New Crypto Entrant, Mandala, Partners with Coinfirm for AML Compliance

The hottest new entrant to the blockchain scene, the Mandala exchange, has announced a partnership with Coinfirm.

Jelurida Swiss SA

Three Projects Showcasing the Features of Ardor


ZyCrypto: It’s now three years since the Ardor blockchain launched on mainnet, the first multi-chain platform of its kind. Ardor was built by the same developers responsible for Nxt, the first pure proof-of-stake blockchain that has been operating continuously and without interruption since 2013. Jelurida, the firm behind Nxt and Ardor, decided to develop the latter to overcome many of the problems of legacy blockchains, such as bloating, dependency on a single token, and a lack of customization opportunities. Ardor’s parent-and-child chain structure solves each of these issues. The parent chain runs on the ARDR token and is responsible for the overall security of the network. Child chains can connect into the parent chain, operating with their own native tokens. Only those transactions which affect Ardor’s proof of stake validator balances are stored on the parent chain, with other child chain transactions pruned to reduce bloating.

December 23, 2020

Tuesday, 22. December 2020

KuppingerCole

Attack Surface Reduction and XDR

by John Tolbert

Many if not most organizations have moved to a risk management model for cybersecurity and identity management. Priorities have shifted in two major ways over the last decade:

- decreasing attack surface sizes
- focusing on detection and response technologies instead of prevention only

Reducing attack surfaces inarguably improves security posture. Achieving the objective of reducing attack surfaces involves many activities: secure coding practices, vulnerability scanning and management, consolidation of functions into fewer products and services, access reconciliation, user de-provisioning, avoidance of over-provisioning, use of Privileged Access Management (PAM), OS and app patching, API security gateways, and so forth.

The realization that some attacks get past preventive measures has led to an increase in the prominence of detection and response techniques and technologies. However, deploying EDR, NDR, or XDR products doesn’t obviate the need for endpoint anti-malware, email/web security gateways, or WAFs.

In light of the Sunburst/SuperNova/Solorigate incident, the scope of detection must be expanded. For example, comms between agents on protected resources and remote sites cannot be overlooked. IT product binaries should not be excluded from anti-malware scans. Processes running signed code still need to be examined for signs of malicious behavior. Security log retention periods must increase. 30, 60, or even 90 days’ worth of logs is not enough to keep on hand when faced with investigations that need to go back 9+ months.

The SolarWinds incident highlights the need for Endpoint Detection & Response (EDR), Network Detection & Response (NDR), or their union, XDR. These tools are the best means for determining if your organization has had any compromises after the event. In the case of the Sunburst/SuperNova/Solorigate malware, most endpoint protection (EPP, or Next Generation Anti-Malware) didn’t initially recognize the software as malicious. These detection-focused tools can look for Indicators of Compromise (IOCs) once they are identified and shared as threat intelligence.

No tool is 100% perfect, though. The attackers often acquire admin privileges and use them to remove or cover their tracks. XDR tools can help to automate the environment-wide searches to uncover evidence attackers may have inadvertently left behind.

Both approaches, reducing attack surfaces and increasing detection/response, remain valid after SolarWinds. In fact, this global cybersecurity event shows that most organizations, public and private sector, need to re-double their efforts on both objectives. Reducing the attack surface means additional vetting is needed on IT and cybersecurity tools, which primarily must happen at the vendors. Improving the operational effectiveness of XDR, SIEM, and SOAR tools will require those who implement these tools to extend their data retention times significantly beyond the default periods, even if it increases the cost of services.

Addressing and mitigating risks in the IT supply chain, especially cybersecurity products and services, will be front-and-center for CISOs in 2021. KuppingerCole will continue to monitor and advise as warranted.


Evernym

SSI Roundup: December 2020

Welcome back to the last SSI Roundup of 2020. Today, we’re sharing a few examples of just how far the self-sovereign identity category has come in the last year and reflections on what COVID-19’s lasting impacts mean for verifiable credentials.

The post SSI Roundup: December 2020 appeared first on Evernym.


Elliptic

Unhosted Wallets: What You Need to Know About FinCEN’s Proposal

The US Treasury’s Financial Crimes Enforcement Network (FinCEN) issued a notice of proposed rulemaking (NPRM) on Friday, December 18, 2020, regarding transactions with unhosted wallets and wallets hosted at financial institutions in certain jurisdictions. In this blog, we've summarized the current state of play including:

- What is FinCEN's proposed rule change for compliance when dealing with unhosted wallets?
- What could the proposed FinCEN rules mean for my business and crypto compliance operations?
- How is Elliptic responding to FinCEN's proposed rule changes for unhosted wallets?


HYPR

HYPR

The 6.7 release brings support for more enterprise use cases, improved user onboarding, and security enhancements. The highly anticipated Roaming User Support functionality is now available as a technology preview to enable users to securely unlock any company machine without requiring individual device pairing.

Take Workstation Unlock Anywhere

Not all office environments include assigned seating or even assigned machines. With a cluster of domain-joined computers and unique accounts, users can log into any desktop and start working. But how do you secure that login if you have to pair your mobile device with a specific workstation?

With HYPR’s support for roaming users, a single registration can be used to authenticate into any workstation, as long as the workstations are on the same domain. In this flow, users can choose to log in with a QR code; once it is scanned by the HYPR mobile app, they get access to the machine with their own account. This removes the need to register with multiple workstations and eliminates the burdensome overhead of account management in the mobile app.

What does this mean for your business?

The HYPR Cloud Platform 6.7 frees you from requiring a 1:1 relationship between workstation and device to securely unlock your machines. Instead, you can equip your users with the HYPR mobile app and a variety of domain-joined computers to enable safe, simple, and truly passwordless authentication.

Not only will your end users benefit from the efficient authentication process, but your administrators will also appreciate the reduced management overhead, as registration is no longer tied to the machines but rather to the users themselves.

Other Notable Enhancements

In addition to roaming user support, HYPR now enables authentication using Feitian K9 security keys for an even more flexible solution. Feitian keys support FIDO authentication, adding another passwordless login option.

Administrators can improve the device pairing and user onboarding process with dynamic magic links, which automatically adjust where the user is taken based on their device. For example, if the user is on a mobile device, the link will take them into the HYPR mobile app to pair a new web account, or to the app store to download the HYPR mobile application; if they click the link on a desktop or laptop, they will be redirected to the Device Manager page to pair the device.
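Conceptually, the routing behind such a dynamic link boils down to a device check. A toy Python sketch (the URLs and detection rules here are invented for illustration, not HYPR's actual implementation):

def resolve_magic_link(user_agent, app_installed):
    # Toy device-based routing for a pairing link; all URLs are hypothetical.
    is_mobile = any(marker in user_agent for marker in ("iPhone", "Android"))
    if is_mobile:
        # Deep-link into the mobile app when present, otherwise the app store.
        return "hyprapp://pair" if app_installed else "https://example.com/get-the-app"
    # Desktop and laptop browsers go to the Device Manager pairing page.
    return "https://example.com/device-manager"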

To learn more about what 6.7 has to offer, read our latest release notes for Workforce Access 6.7 and Customer Authentication 6.7. Or, contact us and our team will be in touch.


Evernym

What makes Verity the platform of choice for production at scale?

Our best-in-class technology, stability, uptime, performance, and support will help you deploy real-world SSI implementations with confidence.

The post What makes Verity the platform of choice for production at scale? appeared first on Evernym.


Secure Key

Touch Free ID and Digital Advancements in 2020

By: Andre Boysen. The digital ID evolution that was gradually advancing prior to 2020 accelerated exponentially for individuals, governments, and businesses alike amid the global pandemic in March. COVID-19 increased the need to drastically evolve Canada’s digital ecosystem and forced nearly a decade of innovation into a span of just nine months – years of […]

The post Touch Free ID and Digital Advancements in 2020 appeared first on SecureKey Technologies Inc..


Elliptic

Crypto Regulatory Affairs: US Treasury Proposes New Rule For Unhosted Wallets

Race against the clock: US Treasury proposes a rule-change to monitor and restrict US-registered businesses from transacting with unhosted wallets

For several months now, rumors have been brewing with regards to the intention of the US Treasury to take action and impose new reporting and recordkeeping requirements for cryptoasset transactions involving unhosted/self-hosted wallets. This past Friday, December 18th, the Financial Crimes Enforcement Network (FinCEN) released a 72-page Notice of Proposed Rulemaking (NPRM) titled "Requirements for Certain Transactions Involving Convertible Virtual Currency or Digital Assets".


Global ID

The GiD Report#140 — ApplePay finally gets the antitrust attention it deserves


Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

Note: There was late breaking news about the SEC lawsuit against Ripple—WSJ and Fortune. We’ll cover that in next week’s report.

This week:

- Coinbase IPO + new FinCEN wallet rules
- Google antitrust lawsuit
- Facebook v. Apple (full newspaper ad)
- ApplePay antitrust scrutiny (it’s about time!)
- Stuff happens

1. It was another big week, this time led by the news that Coinbase had filed for its IPO — a week after Bitcoin reached new all-time highs over $20K.

Here’s the WSJ:

Coinbase’s filing is the culmination of the yearslong development of both the company and the cryptocurrency industry, moving from an anarchist experiment in alternative money to a mainstream asset class that has attracted hedge funds, mobile-money providers and insurance companies.
Coinbase, founded in 2012 by Brian Armstrong and Fred Ehrsam, has grown into one of the most recognizable names in the cryptocurrency industry. It currently has about 35 million users, more than Charles Schwab Corp.’s platform.
FinCEN did its best to spoil the party, however, making good on recent rumors about new wallet rules (revealed not long ago in a series of tweets by Coinbase CEO Brian Armstrong).

Here’s Dealbook:

Late on Friday, the Treasury Department proposed a new disclosure rule for certain digital currency transactions “aimed at closing money laundering regulatory gaps.” Critics of the move in the cryptocurrency community decried the proposal.
Critics are focused on the process. While one crypto executive said that the rule, which requires disclosure for certain big transactions with the Treasury Department and had been rumored for some time, “could’ve been worse,” critics took particular issue with its rollout. The department is allowing only two weeks over the holidays for public comment, which they say isn’t enough time.

Relevant:

- Bitcoin Exchange Coinbase Files for Initial Public Offering
- Coinbase files to go public confidentially and we’re hyped — TechCrunch
- Largest U.S. cryptocurrency exchange Coinbase files for IPO as bitcoin soars past $23,000

2. The antitrust hits also kept coming. It was Facebook again — but also Google.

Here’s CNBC (via Jakob):

Google is set to face a new antitrust lawsuit from a group of state attorneys general led by Texas, this time targeting its advertising technology services.
The announcement follows a separate complaint from the Department of Justice claiming Google has illegally maintained a monopoly in online general search services by cutting off competitors from key distribution channels.
The complaint claims Google and Facebook, which it names a “co-conspirator,” harmed competition through an unlawful agreement to rig auctions and fix prices.

Relevant:

- Via /j: Texas antitrust lawsuit against Google alleges unlawful agreement with Facebook
- Ten States Sue Google, Alleging Deal With Facebook to Rig Online Ad Market
- What the Google Ads Antitrust Lawsuit Doesn’t Say
- How New European Rules Could Affect Amazon, Google and Other Tech Icons
- Google Dominates Thanks to an Unrivaled View of the Web
- An Amazon Rival in France Awaits New Europe Regulations
- FTC launches sweeping privacy study of top tech platforms

3. Facebook made news elsewhere as well, though this time from the other side of the aisle, attacking Apple in a full-page newspaper ad about new iOS platform rules regarding tracking and privacy.

We’ve talked before about how legal action is only one prong in a multi-pronged attack when it comes to antitrust. The other is competition as well as showing people/consumers a better way. Facebook’s latest skirmishes with Apple show that, well, hey, a little bit of healthy competition isn’t all that bad now is it?

Here’s the ad (via /m):

Relevant:

- Via /m: Facebook attacks Apple in full-page newspaper ads — 9to5Mac
- Via /m: Apple rebuffs Facebook criticism, says iOS anti-tracking features are about ’standing up for our users’ — 9to5Mac

And here’s the Information’s take:

It’s hard not to feel that Apple CEO Tim Cook has only himself to blame for the battle he now finds himself in with Facebook. Over the past few years he spent an inordinate amount of time preaching about the evils of the advertising-based business model of Facebook and others. It’s not a surprise Facebook CEO Mark Zuckerberg would retaliate. For all of Cook’s sermonizing, Apple is not free of sin when it comes to how its customers are treated (as anyone buying one of Apple’s high-priced cables could attest.)
And whatever the merits of Facebook’s argument — which focuses on how Apple uses its control of the App Store to hurt developers and advertisers — one thing seems clear. The dispute between the two companies foreshadows a loosening of the gatekeeper powers enjoyed by all the big tech companies, whether Apple, Google or Facebook.
After all, it’s one thing for small companies to complain about the power of big companies. That happens all the time and sometimes feels like sour grapes. But when members of the big tech cartel are also speaking out about the gatekeeper status of one of their colleagues, you know something is wrong.
The whole point of scale in business is that it empowers you to fight others with scale. What Facebook’s complaints suggest is that even great scale isn’t enough to deal with one company’s control of the mobile ecosystem. And Facebook isn’t the only big tech company to complain about Apple’s App Store — Microsoft has also raised concerns about it.
There used to be a saying about the hazards of attacking a newspaper: “Never pick a fight with someone who buys ink by the barrel.” Perhaps Tim Cook should have realized that he shouldn’t pick a fight with someone who has three billion users. — Martin Peers
4. Speaking of Apple, ApplePay is also getting caught in the antitrust crossfire.

Which makes sense. We’ve all about had it when it comes to Big Tech leveraging their platforms to force us into anti-competitive payment rails.

Here’s the FT:

Photo: Shinya Suzuki
Apple’s next antitrust battle is shaping up to be over Apple Pay, the company’s digital wallet, as the Covid-19 pandemic turbocharges use of contactless payments.
The Cupertino company has spent much of 2020 fighting allegations of anti-competitive behaviour in its App Store. By contrast, its financial services business has attracted little attention in the US, meriting only a passing mention in a 450-page report by Congress’ antitrust subcommittee in October.
But Apple Pay is growing fast as consumers try not to touch buttons or handle cash because of the coronavirus. Jennifer Bailey, head of Apple Pay, said this month that contactless payments have gone from “being a convenience to a matter of public health.”
Jason Gardner, chief executive of California-based payments platform Marqeta, said: “They are really building out, within the Apple Pay team, a financial services juggernaut.”
“Apple Wallet is a killer app — more of a killer app than much of the world really understands right now. And it’s absolutely going to become a battleground for regulators in the future.”
In June, the European Commission opened a formal antitrust probe into Apple Pay and this month competition regulators in the Netherlands launched their own investigation. Australia’s central bank chief Philip Lowe recently said Apple’s approach to payments was “raising competition issues”.

Relevant:

- Apple Pay draws antitrust attention

5. Stuff happens

- Ben Thompson: Social Networking 2.0
- The brewing workplace debate over vaccinations
- Webinar: IATA Travel Pass — Evernym
- Even with vaccine, COVID tests will be the passport to travel in 2021
- American Express Dives Deeper Into Crypto With Trading Platform Investment
- U.S. accuses Switzerland and Vietnam of currency war
- “A grave risk”: As Trump remains silent, massive cyberhack increasingly looks like act of war
- Via /ctomc: No cookie for you — The GitHub Blog
- Guggenheim’s Scott Minerd Says Bitcoin Should Be Worth $400,000
- ASC X9 Issues First Blockchain Terminology Standard
- Pornhub: Now Accepting Crypto Only — Decrypt
- DoorDash & Payfare Partner To Launch DasherDirect Visa Card & Mobile Banking App
- JPMorgan Using Blockchain to Move Billions in Repo-Market Trades
- U.S. Congress bans anonymous shell companies
- Central Bank Digital Currency and the future: Visa publishes new research
- China-Based Executive at U.S. Telecommunications Company Charged with Disrupting Video Meetings Commemorating Tiananmen Square Massacre
- Via /pstav: Square’s Cash App Now Lets Customers Get Bitcoin Back on Purchases — CoinDesk
- Why we won’t see sweeping mandates for coronavirus vaccines
- Why Ledger Kept All That Customer Data in the First Place — CoinDesk
- Via Jeff: Natalie Luu on LinkedIn: 2021 will be the year for FinTech infrastructure startups to disrupt | 59 comments

The GiD Report#140 — ApplePay finally gets the antitrust attention it deserves was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Authenteq

2020 in Review: KYC and AML by the Numbers

The post 2020 in Review: KYC and AML by the Numbers appeared first on Authenteq.

Forgerock Blog

The Future of Identity: ForgeRock Shares 2021 Predictions

While 2020 has been a roller coaster of a year, there has been one note of certainty: digital transformation has accelerated at an unprecedented rate, and identity and access management (IAM) is a big part of that evolution. In anticipation of 2021, we asked four of our experts to share their perspectives on what you can expect in the new year.

Allan Foster, Chief Evangelist: “Move over multi-factor authentication. User managed access (UMA) will reign supreme in 2021.”

With more services online than ever before, users have come to expect amazing digital experiences. To keep up with these expectations, digital experiences will need to involve more than one identity as more organizations start to embrace the idea of delegation. Often, authorized users are geographically separated or using a variety of devices – and these accounts or devices may not even be connected. 

Here’s a common example. When renting out Airbnb lodging, the host needs to grant guests access to connected things like the thermostat, smart TV, and other devices for the duration of their stay. To make this happen, the host needs to be able to work with more than one identity associated with each device and delegate access to guests. Of course, once the guest departs, access to the devices needs to be revoked. Situations like this drive the need to focus on how to work with a collection of identities.

Enterprises have a similar situation, with a population of users needing access to specific applications at certain times, but not necessarily all the time. Many applications rely on traditional means of authentication, like multi-factor authentication (MFA), but transactions that involve more than one person or identity are not really an authentication problem. This is where advanced technologies like user-managed access (UMA) can help customers and employees manage who is allowed access to their resources, for how long, and under what circumstances. Essentially, UMA facilitates the connections of the identities while optimizing the user experience. 
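
UMA 2.0 is a published OAuth extension, so the shape of such a delegation flow can be sketched concretely. The sketch below is illustrative only: the endpoint paths, resource ID, and tokens are hypothetical placeholders, not ForgeRock’s API.

```python
# A minimal sketch of the UMA 2.0 grant flow for the Airbnb-style example.
# All URLs, IDs, and tokens are hypothetical placeholders.
import requests

AS_BASE = "https://as.example.com"

# 1. The resource server records the guest's attempted access and receives
#    a permission ticket from the authorization server's permission endpoint.
ticket = requests.post(
    f"{AS_BASE}/uma/permission",
    headers={"Authorization": "Bearer <protection-api-token>"},
    json=[{"resource_id": "thermostat-42", "resource_scopes": ["view", "adjust"]}],
).json()["ticket"]

# 2. The guest's client exchanges the ticket (plus claims about the guest)
#    for a Requesting Party Token (RPT) at the token endpoint, using the
#    UMA grant type defined by the spec.
rpt = requests.post(
    f"{AS_BASE}/oauth2/token",
    data={
        "grant_type": "urn:ietf:params:oauth:grant-type:uma-ticket",
        "ticket": ticket,
        "claim_token": "<guest-id-token>",
        "claim_token_format": "http://openid.net/specs/openid-connect-core-1_0.html#IDToken",
    },
).json()["access_token"]

# 3. The guest presents the RPT to the device API. When the host later
#    revokes the policy, newly issued RPTs stop covering these scopes.
requests.post("https://iot.example.com/thermostat-42/adjust",
              headers={"Authorization": f"Bearer {rpt}"}, json={"celsius": 21})
```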

In 2021, solutions that provide a convenient central management system for organizing digital resources that reside in many locations, delegating access to others, and monitoring and revoking access when necessary will take over from traditional authentication and MFA controls. 

Ben Goodman, Senior Vice President: “2021 will be the year of ambient identification methods as organizations shift to ‘zero login’.”

Now that passwordless authentication technology, such as biometrics, is widely used, we will see a shift toward a “zero login” process, which removes friction for the user unless there is an issue with the initial authentication. 

There are huge upsides to this. There will be no credentials to remember, and MFA will be silent on the back end. Zero login will be more secure than using a password, username, or MFA because it can use factors such as device enrollment and device reputation, fingerprints, keyboard typing patterns, the way the phone/device is held, and other markers to verify identity in the background while the user enjoys a frictionless experience.

For zero login to be successful, all these identity verification factors must be measured and combined in a transparent way so that consumers don’t feel their privacy is being compromised. Organizations should also have the option to introduce authentication steps into the process if they prefer to introduce more friction for bigger or riskier actions. This is the approach online retailers like Amazon take when customers want to purchase big ticket items. By not allowing the “buy in one click” option for purchases over a certain dollar amount, they are adding friction to the purchase process to ensure the buyer is who they say they are. Rather than only authenticating at the “front door” with passwords or MFA, extra security steps will be added right at the point of potential fraud during the transaction to create a better digital experience for users. Zero login enables smarter authentication that adjusts as necessary for a more seamless login experience across an individual's devices. 
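
As a thought experiment, a minimal risk-scoring sketch shows how such ambient signals might be combined, with friction added only for risky or big-ticket actions. Every signal name, weight, and threshold below is invented for illustration:

```python
# Hypothetical "zero login" risk scoring. Signals, weights, and thresholds
# are illustrative only, not any vendor's actual model.
from dataclasses import dataclass

@dataclass
class Signals:
    device_enrolled: bool        # device previously registered by this user
    device_reputation: float     # 0.0 (bad) .. 1.0 (good)
    typing_pattern_match: float  # similarity to the user's keystroke profile
    grip_pattern_match: float    # similarity to how the user holds the phone

def risk_score(s: Signals) -> float:
    """Higher score means higher risk; weights are made up for the sketch."""
    score = 0.0 if s.device_enrolled else 0.4
    score += (1.0 - s.device_reputation) * 0.2
    score += (1.0 - s.typing_pattern_match) * 0.2
    score += (1.0 - s.grip_pattern_match) * 0.2
    return score

def decide(s: Signals, amount_usd: float) -> str:
    risk = risk_score(s)
    # Deliberately add friction for big-ticket actions, as with disabling
    # one-click buying above a dollar threshold.
    if risk < 0.2 and amount_usd < 500:
        return "allow"    # frictionless: no visible login step
    if risk < 0.5:
        return "step-up"  # silent checks inconclusive: prompt for a factor
    return "deny"

print(decide(Signals(True, 0.9, 0.95, 0.9), amount_usd=40))    # allow
print(decide(Signals(False, 0.6, 0.5, 0.4), amount_usd=1200))  # deny (high risk)
```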

Eve Maler, CTO: “IT will infuse access governance with intelligence to protect workforce cybersecurity in 2021.”

Accelerating changes in enterprise technologies, cyberthreats, and the user landscape are increasing pressure on traditional identity governance and administration (IGA) solutions and, in turn, on security and compliance teams. On top of growing compliance risks, enterprise IT environments become more complex every year, increasing the number of applications and systems that need to be accessed by their users. These challenges are driving organizations to seek out artificial intelligence (AI)-driven solutions that simplify and automate the access request, access approval, certification and role-modeling processes.

In 2021, we will see AI increasingly employed to enable an autonomous identity approach. AI-infused authentication and authorization solutions will be layered on top of, or integrated with, existing IGA solutions, providing contextual, enterprise-wide visibility by collecting and analyzing all identity data and enabling insight into different risk levels of user access at scale. The use of AI will allow systems to identify and alert security and compliance teams about high-risk access or policy violations. Over time we will see these AI systems produce explainable results while increasing automation of some of the most complex cybersecurity challenges within the enterprise. 

Mary Writz, Vice President, Product Management: “National identities will become more prevalent."

There are two fundamental shifts in the way we view and define digital identity on the horizon for 2021.

First, non-human identities will be just as important as human identities. While we often associate digital identity with a person, many “things” will need identities – from watches to wristbands, from supervisory control and data acquisition (SCADA) sensors to medical equipment, and even DevOps containers and Kubernetes resources. While the number of human identities may grow at a slow pace, the number of non-human identities will explode. For example, enterprises want to attach identities to machines, such as virtual machines, hosts, or containers in order to control security, as well as spend on cloud computing. The ratio of humans or developers to machine identities is 1:200 and still growing.

Second, national identities will become more prevalent as national, state and local governments transform to provide services primarily in digital format. COVID-19 is driving the need for new services like contact track and trace and remote access to benefits services, which will continue globally. For example, the new Japanese prime minister has aggressively called for the digitization of government and a new digital agency that will be established to drive “e-everything.” In the U.K., we saw the emergence of an NHS COVID-19 contact tracing app that citizens could use to enter pubs and restaurants. These examples show how this trend is already evolving.

We hope that you find our predictions insightful and that they help you to uncover new ways to look at the power of identity. We’re looking forward to an exciting year ahead, full of new innovations that will continue to shape the ever-evolving digital identity landscape. 


API Security in Action With the ForgeRock Identity Platform


To celebrate the launch of my book, API Security in Action, which was just published by Manning Publications, I've teamed up with my employer, ForgeRock, to demonstrate how some of the techniques in the book can be accomplished with less effort using the ForgeRock Identity Platform. 

API Security in Action discusses five primary security mechanisms you can use to strengthen your application programming interfaces (APIs) against common threats:

Encryption of data in transit and at rest ensures confidentiality and integrity.
Rate-limiting reduces the damage of denial-of-service (DoS) attacks.
Authentication checks users are who they say they are.
Authorization ensures that users can't do anything they aren't allowed to do.
Audit logging allows users to be held accountable for their actions.

 

In the book, you'll learn in detail how to build these features into your APIs and really understand how they work and why you need them. ForgeRock's comprehensive Identity Platform can get you up and running with all of these security controls in no time. Let's take a closer look at a few examples.

API Protection at the Edge With Identity Gateway

One of the core security controls in API Security in Action is the use of rate-limiting to protect against distributed denial-of-service (DDoS) attacks. To get the most from this protection, you really want to push rate-limiting as far out to the edge of your network as possible to reject requests early before they consume significant resources. ForgeRock Identity Gateway (IG) provides a suite of functionality for protecting your APIs at the edge, including sophisticated rate-limiting and throttling filters.
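
For readers unfamiliar with the mechanics, the classic algorithm behind this kind of throttling is the token bucket: requests spend tokens that refill at a fixed rate, so short bursts pass while sustained floods are rejected early. A minimal sketch (not IG's actual implementation):

```python
# A minimal token-bucket rate limiter, illustrating the idea behind
# edge throttling filters. Not IG's implementation.
import time

class TokenBucket:
    def __init__(self, rate_per_sec: float, burst: int):
        self.rate = rate_per_sec       # tokens refilled per second
        self.capacity = burst          # maximum burst size
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False  # reject early, before the request consumes resources

# One bucket per client key (e.g. API key or source IP), enforced at the edge.
buckets: dict[str, TokenBucket] = {}

def handle(client_key: str) -> int:
    bucket = buckets.setdefault(client_key, TokenBucket(rate_per_sec=10, burst=20))
    return 200 if bucket.allow() else 429  # 429 Too Many Requests
```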

However, IG can do much more than just rate-limiting, and is a veritable Swiss Army Knife of API security. The Cross-site request forgery (CSRF) protection discussed in Chapter 4 is built right into IG as a general-purpose filter. It can also handle sophisticated authentication and single-sign on flows, protecting APIs with JSON Web Tokens, OAuth2, and OpenID Connect. If that's not enough, you can extend IG with scripts and Java plugins to implement almost any API security pattern described in the book.

Powerful Authorization Options

Perhaps the most important topic in the book is authorization: working out who is allowed to do what. You'll learn about three important topics in depth in authorization:

Identity-based authorization, including role-based access control (RBAC) and attribute-based access control (ABAC).
Capability-based access control, which uses fine-grained tokens that act a bit like keys in the real world.
Delegated authorization, using OAuth2, allowing a user to delegate some of their authority to a third-party application or service.

One of the strengths of the ForgeRock Identity Platform is its comprehensive support for authorization technologies, through the ForgeRock Access Management product (AM) and policy enforcement points, such as IG or our dedicated policy agents. AM's powerful policy engine can be used to implement sophisticated access control decisions, fully integrated with its mind-blowing intelligent authentication.

As you might expect, AM also provides one of the most advanced OAuth2 authorization server implementations available today, with excellent support for the latest best practices. IG's Resource Server Filter makes accepting access tokens at your APIs easy and secure, including support for advanced features like OAuth mutual TLS discussed in Chapter 11. The latest release of AM also supports issuing access tokens as Macaroons, a powerful new token format described in Chapter 9, which brings many of the benefits of capability-based security within the framework of existing OAuth2 standards.
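
The macaroon construction itself is simple enough to sketch: each caveat is folded into an HMAC chain, so any holder can attenuate a token by appending restrictions, while only the issuer, who knows the root key, can mint or verify one. The sketch below is a stripped-down illustration of that idea, omitting the encoding and discharge-macaroon machinery a real implementation needs:

```python
# Stripped-down macaroon construction (after Birgisson et al.).
# Illustrative only; AM's OAuth2 macaroon support handles encoding,
# discharge macaroons, and verification details.
import hmac, hashlib

def mint(root_key: bytes, identifier: str) -> tuple[str, list[str], bytes]:
    sig = hmac.new(root_key, identifier.encode(), hashlib.sha256).digest()
    return identifier, [], sig

def add_caveat(macaroon, caveat: str):
    # Anyone holding the macaroon can attenuate it; the chained HMAC
    # means caveats cannot later be stripped off.
    identifier, caveats, sig = macaroon
    new_sig = hmac.new(sig, caveat.encode(), hashlib.sha256).digest()
    return identifier, caveats + [caveat], new_sig

def holds(caveat: str, context: dict) -> bool:
    key, _, value = caveat.partition(" = ")
    return str(context.get(key)) == value

def verify(root_key: bytes, macaroon, context: dict) -> bool:
    identifier, caveats, sig = macaroon
    expected = hmac.new(root_key, identifier.encode(), hashlib.sha256).digest()
    for caveat in caveats:
        if not holds(caveat, context):  # every caveat must be satisfied
            return False
        expected = hmac.new(expected, caveat.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

# A client hands a weaker version of its token to a third party by
# appending caveats, with no round trip to the server.
token = mint(b"server-root-key", "access-token-123")
weaker = add_caveat(add_caveat(token, "scope = read"), "account = 42")
print(verify(b"server-root-key", weaker, {"scope": "read", "account": 42}))  # True
```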

Kubernetes-Ready and Rocking the IoT

The final third of the book looks at protecting APIs in two increasingly important deployment scenarios:

Microservice architectures running in a Kubernetes container orchestration environment.
Constrained devices operating in the industrial or consumer Internet of Things (IoT).

ForgeRock has invested heavily in recent years in ensuring that our Identity Platform works well in a Kubernetes (k8s) environment, and provides recipes for deploying our products to k8s. Our ForgeRock Identity Cloud runs on Kubernetes, so we've got deep knowledge on how to scale and secure deployments in this platform, some of which is distilled into Part 4 of the book.

IoT environments are very different from the comfortable world of servers running in data centers, both due to the constrained nature of the devices involved and how exposed they can be to external threats. ForgeRock has spent several years investigating the challenges of these environments and developing our ForgeRock Things offerings that can help secure IoT applications and integrate devices with our identity platform.

Get 40% off your copy of API Security in Action here using this code: forgerock40. I hope you enjoy it!

 


Civic

Civic Milestones and Updates: Second Half of 2020


During the second half of 2020, Civic has continued charging ahead, making strong gains with our Civic Wallet app, building out Health Key by Civic, and providing our users with a best-in-class experience by transitioning Civic Secure Identity users to Civic Wallet.

Key Milestones

Following the app store launch of Civic Wallet in June 2020, thousands of users have created Civic Wallets. Civic Wallet users love that their cryptocurrency is protected by our $1M Cryptocurrency Protection Guarantee (see Terms & Conditions) and is hassle-free, without the need to remember complicated passwords. In the words of one Civic Wallet reviewer: “[It’s]…the most secure way to store your crypto, guaranteed.”

Enhancing the value of Civic Wallet for our users is a top priority. This is why we’ve contacted many of our users to get their unfiltered feedback and improved our app to give them the features they need to make crypto easier. We were also proud in the second half of 2020 to expand our global coverage, which now includes 195 countries. Our customers live around the world, and Civic has them covered where they need to be.

We’ve also expanded Health Key by Civic, our health status verification service, which will help create safer working environments. Health Key by Civic is now available to help reopen daily life in any use case where health information is required for access, and across borders. This summer, Civic presented an overview of the service to a forum convened by the United Nations World Tourism Organization. Sharing health status is easy for consumers, who can access Health Key by Civic through their Civic Wallet. With Covid, navigating border regulations has never been harder, but we’re making the process easier for businesses and other organizations. Our new demo video and the recent deep-dive article appearing in BeInCrypto help explain how the system works. Finally, we made strides in laying a strong foundation for future health status legislation through our work with the State of California.

When we designed and launched Civic Wallet earlier this year, we took into account lessons learned from Civic Secure Identity to build a better, more flexible experience for users in Civic Wallet. With the improved user experience in place, we’ve removed Civic Secure Identity from app stores and asked our users to download Civic Wallet. By streamlining our processes, we’re better able to serve all of our users.

Civic in the News

During the second half, Civic was covered in a number of publications, including Cointelegraph, Biometric Update, Identity Review, and Crowdfund Insider. As a first-mover in the blockchain space for health status, Civic was also featured in an industry profile in Cointelegraph. Civic CEO Vinny Lingham was also featured in podcasts and events, such as Blockspeak, The Curious Cult, and Blockdown.

What’s Next for Civic

As we take a look around the corner into 2021, Civic is already hitting the ground running with our stand-alone identity verification service for businesses, including more flexible options and pay-as-you-go pricing. Compliance has never been more critical than in 2020, where a remote world needs secure systems to work. Civic Identity Verification does just that by ensuring that commerce with customers is compliant across KYC, AML, GDPR, CCPA and HIPAA regulations. 

In addition, we’re focused on enhancing the value of Civic Wallet for our users. As they continue to bring in and hold BTC, our users are assured knowing that their crypto is protected with a $1M Cryptocurrency Protection Guarantee. We’ll be introducing more features in 2021 that are sure to appeal to our loyal users.

The post Civic Milestones and Updates: Second Half of 2020 appeared first on Civic Technologies, Inc..


Caribou Digital

Female livelihoods in the gig economy: tensions and opportunities

Diagnostic: Female livelihoods in the gig economy
Tensions and opportunities

Diagnostic is a series of essays and hosted conversations exploring the challenges of building more inclusive digital economies. For hosted (virtual) conversations, like the one described below, Caribou Digital convenes a diverse set of experts and thought leaders with unique insights on an issue, and facilitates the exploration of a topic together.

For this conversation, we had 12 experts on women and platform work in emerging markets, ranging from those running platforms to donors, the private sector, academia, and advocacy. The conversation was held under the Chatham House Rule, so this is an edited summary of the key themes, not attributed to any one person. However, we have a list of names and resources discussed at the end of this piece.

A still from a promotional video for HelloTask, illustrating the importance of identity and professionalism

What’s the landscape in a COVID-19 world?

As we began our Diagnostic discussion, we acknowledged that COVID-19 illustrated the worst-case scenario for many platform workers, and even more so for women. Research by Caribou Digital and Qhala in partnership with Mastercard Foundation has shown that platform work for women holds many tensions, and this has been further heightened by the lack of a safety net during COVID-19. The myth of flexible work may in many cases be no more than cheap labour in disguise, and self-employment a way for employers to shift responsibility to workers. This can impact women more than men. COVID-19 also brought home the reality (perhaps obvious) that women still carry most of the household duties while also working, another flip side of flexibility which puts more strain on women trying to combine both kinds of work.

Below, we distill our discussion into three tensions, three actions and the way forward on women and platform work, especially in emerging economies.

The tensions of formal, flexible work

On the one hand, the formality of platforms was seen as a plus point. Being given an identity (e.g. a badge, a relationship) was something women valued, according to one platform provider. It had a similar appeal for the customer: “I think, people use a platform rather than go the traditional way for the responsibility a platform always takes. The way Uber verifies rides, indemnifies any problems users face and takes users’ feedback seriously…every platform should take the responsibility of ensuring a digital identity, verifying ID cards and including for finance”. However, too many restrictions have made platform work unfair. A report on Kenya, for example, showed that drivers were “worse off” after the introduction of Uber. Platform lock-in can also be disadvantageous to the worker (for example, when a woman works across different companies or platforms, shouldn’t she be able to have ownership over her identity and ratings?).

Equally, in theory, platforms provide an opportunity for women’s emancipation, notably by providing “flexible” work, i.e. flexible hours, working from home etc. Yet, even pre-COVID, many platforms fell short of protecting their workers. Now, the flexibility benefits of platform work, especially for women, are being questioned, as a kind of “bogus self-employment”. From a gender perspective, real change is unclear. One participant stated that “platforms do nothing to change the gendered occupational segmentation that characterises offline labour markets around the world”.

Building identity…or having to fake it

What do platforms do regarding workers’ sense of identity? On the one hand, platforms provide a sense of identity and belonging: “helpers are motivated to work on HelloTask’s platforms because of the identity they get on our platform rather than working like a ghost.” On the other hand, workers can fake their identity to counter the racial or gender biases that exist on platforms. For example, some female workers in Kenya were adopting “white, male” identities to get (more) work, depending on the type of work. In some sectors, being a woman will be preferred, (e.g. beauticians, domestic work). So how much does one need to identify oneself?

Broad reach versus high-tech

Despite the significant opportunity for women’s emancipation mentioned above, how can we ensure we move beyond only higher-skilled women having the opportunity for digital work? One study mentions the increasing integration of social commerce for female-headed small and medium enterprises. However, at the same time, the risk of “too much tech” was brought forward, with a large part of potential gig workers (the “jua kali” type workers, as described in Kenya’s informal economy) not being digitised, not having smartphones, let alone computers. Women are particularly affected by the digital gap, often only accessing feature phones and either unable to participate or participating through intermediaries, for example giving someone else’s contact number or bank account for payment, which increases the risk of dependence.

What are the priority actions and who is responsible for them?

Making platforms accessible to all — including the lower-income and often unemployed communities, especially women — was one of the key points. How to ensure that inclusion, and who is responsible for it, is a harder “hot potato” question. We agreed that platforms had a significant role to play, in some cases along with funders/investors, regulators/policy bodies, workers themselves, and workers’ organisations.

More feature phone integration

One way of encouraging scale and inclusion is by integrating basic feature phone access to platform services. For example, HelloTask used IVR to connect with domestic workers when there is a job for them, as they rarely have smartphones. Similarly, using SMS and USSD services is key to ensuring platforms reach all workers — including at the last mile. However, there was skepticism about interest and investor funding for this: “I wonder what chances you’d get in 2020 at getting funding for an SMS-based service when most money seems to be drawn to the web and apps”.

Address analogue gaps such as ID

Women are particularly at risk of not having their own ID credentials, resorting to a family member’s ID (often a father’s or brother’s), which raises a first question: can platforms change the existing informal power relationships, or do they leave them unchanged? Intertwined with ID requirements are payments and financial inclusion, as not having an ID may often lead women to being financially excluded. Platforms’ payment structures often do not favour workers: “wage management and payment processes are set up for the convenience of the “employers” who are upper/middle class clients.” This not only limits access but also impacts other issues such as credit scoring or financial independence. Platforms and digital financial service providers could push for these types of processes to be put in place.

Invest in education and training

Education and training of workers is required to ensure that they can fully leverage the opportunities platforms provide — especially in order to get more skilled and better paid gig work online. Currently, to fill the existing training and knowledge gaps, workers organically regroup on social media platforms to create self-help groups. However this may not be sufficient and women may not have the time or possibility to join these groups in the first place.

Platforms seem to take on the responsibility of trust in the worker for the customer’s sake (e.g. vetting a domestic helper) but not necessarily responsibility for the worker’s health and safety or their training: “platforms avoid training as it can be used for a claim that they have misclassified independent contractors as employees/workers…this is why algorithmic control is preferred to explicit methods of control / direction / training in many cases”.

The way forward

As a group, we agreed that all stakeholders had a responsibility, especially platforms themselves — be it in training or security. Platform work should not just be work but meaningful and dignified work. This is obviously not just an emerging markets issue. Governments around the world — the Canadian government was mentioned as an example — and regulators do not always seem to know what to do as this is still a nascent area. As a whole, we could devise metrics on women’s inclusion and safety on platforms (see the work of the Fairwork Foundation for example). One suggestion was that: “development agencies and investors could play the role of the regulator to ensure certain metrics are met when it comes to ensuring gender parity”. Worker rights movements or organisations, women’s rights bodies and unions could exert pressure. Private bodies such as Perks could also create sustainable models for gig workers’ benefits for safety nets and guidance on how to ensure that platforms provide decent work to workers, in particular women. In short, there is still much work to be done on women and platform work worldwide, including in emerging markets.

Participants

Anjali Banthia, Tala
Dr. Jamie Woodcock, Open University, https://www.jamiewoodcock.net
Jessica Colaço, Brave Venture Labs
Dr. Kathi Kitner, Payments, Google
Katie Highet, FinEquity, convened by CGAP
Ken Banks, Yoti
Mahmudul Hasan Likhon, HelloTask
Rani Deshpande, CGAP
Shikoh Gitau, Qhala
Shirley Mburu, BFA Global
Shyam Krishna, Royal Holloway, University of London
Dr. Tricia Williams, Mastercard Foundation

Other participants chose to remain anonymous

Female livelihoods in the gig economy: tensions and opportunities was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


Jelurida Swiss SA

Four innovative crypto companies to watch in 2021


CryptoNewsFlash: 2020 was a landmark year for Jelurida, the software development firm behind the Ardor and Nxt blockchains. Back in June, at the height of the pandemic, it was announced that the Austrian government would leverage its Ardor multi-chain architecture to facilitate secure communication between authorities, institutions and citizens, particularly as pertaining to data exchange relating to COVID-19 tests. The same government also funded HotCity, a sustainable energy project that launched on Ardor in May.

With a series of novel projects building on both platforms, and post-Covid management likely to be the focus of many governments’ attention, it’ll be fascinating to see how Jelurida’s solutions develop in the year ahead. Of course, enterprise use is just part of it; the company recently developed an on-chain version of Bridge on its child chain Ignis, with Jelurida Director Lior Yaffe opening the door to tokenized gaming. Might we see more of this in 2021?

December 22, 2020

Monday, 21. December 2020

HYPR


It’s shaping up to be a strong month for cybersecurity rom-coms and hacker action movies.

*SATIRE WARNING*

Everyone likes a good hacker flick. And ever since Mr Robot wrapped up there’s been a void in quality infosec dramas.

The good news is January 2021 is shaping up to be a strong month for cybersecurity rom-coms and hacker action movies. Here’s a peek at what’s coming to Hulu throughout the month of January.

Passwords on a Plane
An FBI agent boards a flight from Boston to San Francisco, escorting the country’s most wanted script kiddie. As he tries to connect to the $49 WiFi, our hero finds out the hardcoded password has been changed – and no one knows what it is. Panic ensues at 30,000 feet as hundreds of Bay Area residents become suddenly unable to access Slack or Gmail. Can they band together to crack one WiFi password?

13 Going On 30 Character Passwords
In this feel-good fairy tale, a teenage security engineer dreams about starting a company to eliminate passwords. Suddenly, her secret desire becomes a reality and she is transformed into a 30-year-old Cybersecurity CEO. But entrepreneurship, with its own set of growing pains, isn’t as easy as it sounds…

Locked Up
Some guy forgets his laptop password. This Seth Rogen classic needs no introduction.

Crazy Rich Vendors
A CISO is eager to accompany her best friend, a startup founder, to the RSA Conference in Singapore. She’s also shocked to learn that his company is worth billions, and he has spent the entire marketing budget on the event. Thrown into the chaos of the world’s biggest security conference, the two of them must fight through family, friends, and founders to prove that it’s worth spending $5,000,000 on a booth.

Mission Impossible: Password Protocol
Blamed for a phishing email attack on the Kremlin, Ethan Hunt (Tom Cruise) and his team are disavowed by their top software reseller. Forced to go “off the grid” — Hunt must somehow clear their name and prevent another large-scale password reset. You won’t believe the twist in this Oscar-nominated thriller.

Not Another Hacker Movie
An aspiring DevOps engineer who wears paint-splattered overalls is outcast by her peers for rebelling against the use of Kubernetes. The team that brought you American Spy is back with this lighthearted comedy and a large array of zany takes on cybersecurity! 

There you have it. We’ll update the list as more items become available. Meantime, if you’re eager for more security content and can’t wait until January, check out our library of passwordless demo videos.


Smarter with Gartner - IT

Top 10 Smarter With Gartner IT Articles in 2020


Whether top trends across the organization or learning how to lead through COVID-19, executives were looking for insight and guidance on what step to take next. In a year of disruption, here are the top 10 IT articles for Smarter With Gartner.

No. 1:  5 Trends Drive the Gartner Hype Cycle for Emerging Technologies, 2020

This is the most popular Gartner Hype Cycle each year, and acts as a roadmap for CIOs and other leaders looking to be aware of and knowledgeable about new and emerging technologies. This year’s Hype Cycle highlighted 30 different technology profiles — from a list of 1,700 — organized into five different must-know trends, from “beyond silicon” to “digital me.” Read now. 

No. 2:  4 Actions to be a Good Leader During COVID-19 Disruption

It’s difficult to lead during times of great disruption. Mary Mesaglio, Gartner Distinguished VP Analyst, offers four key actions for leaders based on her work with executive leaders and their teams. She specifically focuses on actions for leaders at any level, and especially those that might be overlooked. Read now.

No. 3:  Gartner Top 10 Trends in Data and Analytics

From smarter, faster, more responsible AI to the decline of the dashboard, these top 10 trends for data and analytics will help guide leaders as their organizations move forward into 2021. Given the sharp shifts in the market and changing needs of the enterprise, these trends are key for a successful post-COVID world. Read now.

No. 4:  Gartner Top Strategic Technology Trends for 2021

One of the biggest branded pieces of Gartner research for the year, the Gartner Top Strategic Technology Trends highlights nine technologies that will have a significant impact on organizations in the next five to 10 years. This year’s list, released during the virtual 2020 Gartner IT Symposium/Xpo®, features a video, infographic and e-book download with all the details. Read now.

No. 5:  Gartner Top 9 Security and Risk Trends for 2020

The shortage of technical security staff, the rapid migration to cloud computing, regulatory compliance requirements and the unrelenting evolution of threats continue to be the most significant ongoing major security challenges in this top trends list. Read now.


No. 6:  4 Trends Impacting Cloud Adoption in 2020

As cloud computing rapidly proliferates across enterprise IT, CIOs must pay attention to four aspects of cloud computing that will affect their adoption of services in 2020. Read now.

No. 7:  Coronavirus: CIO Areas of Focus During the COVID-19 Outbreak

From leveraging technology to addressing customer demand to expanding digital workplace resources and access, CIOs had a lot to focus on during the onset of the COVID-19 outbreak, including how to prepare systems to safely and reliably handle a vast increase in remote workers and digital fulfillment. Read now.

No. 8:  6 Trends on the Gartner Hype Cycle for the Digital Workplace, 2020

From the new work nucleus to the distance economy, the Gartner Hype Cycle for the Digital Workplace, 2020 highlighted top digital workplace trends for the upcoming years. COVID-19 drove a lot of changes in 2020. From meeting solution software to enterprise chat platforms to desktop-as-a-service, the pandemic rapidly elevated many digital workplace technologies from nice-to-have to must-have status. Read now. 

No. 9:  Gartner Top 10 Strategic Predictions for 2021 and Beyond

Based on one of the most popular sessions at Gartner Symposium IT/Xpo, the Gartner Top Strategic Predictions for 2021 and Beyond highlight areas leaders should be focused on over the coming years. This year’s list included DNA storage trials, CIOs as COOs, voice of society metrics and using on-site childcare as a way to entice employees. Read now.

No. 10:  2 Megatrends Dominate the Gartner Hype Cycle for Artificial Intelligence, 2020

Despite the global impact of COVID-19, 47% of artificial intelligence (AI) investments were unchanged since the start of the pandemic, and 30% of organizations actually planned to increase such investments, according to a Gartner poll. Two megatrends drove the technologies on this Hype Cycle. Read now. 

The post Top 10 Smarter With Gartner IT Articles in 2020 appeared first on Smarter With Gartner.


Forgerock Blog

Answers to the SolarWinds Hack Date Back a Decade


So here’s the story. A nation-state attacks a technology company, leveraging a backdoor in a piece of software to infect computers. Then using the infected machines as jumping-off points to move laterally across what was previously thought to be a secure network, the threat actors take aim at targets of interest to the U.S. government. You may think I am talking about the SolarWinds hack. In fact, I am talking about the 2009 attacks on Google and at least 20 other technology companies commonly known as Operation Aurora.

Operation Aurora exposed what many of us already knew – that securing the perimeter of a network through firewalls left the heart of the network vulnerable. A single compromised node on a secure network can put all protected resources at risk, especially when faced with the reality that it’s a challenge to secure every endpoint. Google published an approach that would address this, known as BeyondCorp. It moved the perimeter from the edge of the network to each application and treated every application as unprotected, as if it were connected directly to the internet and needed to be secured with authentication and authorization. BeyondCorp assumes that no user or device can be trusted until it has been evaluated. This approach is better known as Zero Trust, or what Gartner calls CARTA (Continuous Adaptive Risk and Trust Assessment). 

There is so much we still don’t know about this latest attack and the damage it has done, but what we do know is chilling. Nation-state hackers, believed by experts to be from Russia, used SolarWinds’ build system to infiltrate their digital supply chain, which was used to deliver software updates. They then distributed a “security update” that included a backdoor. A follow-on “security update” leveraged the backdoor to add a malicious payload to a select group of targets. Once those devices were compromised, they were used to deliver additional malware and move laterally across the network to attack other systems.

It’s easy to hear the echoes of these two attacks over a decade apart. It shows us that history repeats itself and will continue to if we don’t break the cycle. Cybercriminals exposed more than 5 billion records in 2019, costing U.S. organizations over $1.2 trillion. It also shows us that many of the answers to this challenge exist today.

With more than 20 years of industry experience, I look at most things through an identity lens. When you spend years using hammers, a lot of things look like nails. However, the reality is that some problems are uniquely nail-shaped. That is why I see a layered, identity-centric approach that builds on the principles of Zero Trust and BeyondCorp as our best answer to prevent malicious actors from obtaining the initial foothold within an environment as well as reducing the possibility of further lateral attacks. With this model, you can:

Minimize attack surface by leveraging modern Privileged Access Management (PAM) and identity governance and administration (IGA) techniques. This means enlisting artificial intelligence (AI) and machine learning (ML) to understand and manage appropriate levels of access. By limiting users’ access to the bare minimum needed to do their jobs, risk can be dramatically reduced. According to our 2020 ForgeRock Consumer Identity Breach Report, unauthorized access was the most common attack vector used in 2019, responsible for 40% of all breaches.

Monitor user and device behavior to create a baseline for “normal” behavior so that anomalous and risky behaviors can be flagged and analyzed quickly and at scale. This is often referred to as user and entity behavioral analysis (UEBA) and is another opportunity to enlist the help of AI and ML. These behavioral changes may be subtle – for instance, accessing systems that aren’t typically accessed, logging in at times that don’t make sense, or moving more data than usual. They can also be distinctive at the technology layer, requiring automation and AI to define and make visible – such as logging in from multiple IPs in disparate geographical locations. If an organization understands how a user or device typically behaves, it can quickly address a compromised system before the threat can move laterally across a secure network, dramatically reducing the risk of this type of attack (see the sketch below).

It may be cliché to say that identity is the new perimeter, but if the perimeter moves to each application, the existence of a firewall becomes moot. Scalable technology that can continually validate device posture, user authorization, and authentication is a game changer for preventing a compromised device or user from doing damage via a lateral attack. A compromised system is a perfect place to steal usernames and passwords – those credentials can be used to access and attack other systems. This is why passwordless and multi-factor authentication (MFA) can eliminate a massive attack vector leveraged in lateral attacks. 
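
To make the UEBA baselining idea concrete, here is a deliberately simple sketch that tracks login hours and geo-IP countries per user and flags logins that are rare on both axes. The features and thresholds are hypothetical; real systems use ML models rather than two counters:

```python
# Toy per-user behavioral baseline; thresholds and features are invented
# purely to illustrate the UEBA concept described above.
from collections import Counter

class UserBaseline:
    def __init__(self):
        self.login_hours = Counter()  # hour-of-day -> count
        self.countries = Counter()    # geo-IP country -> count
        self.total = 0

    def observe(self, hour: int, country: str):
        self.login_hours[hour] += 1
        self.countries[country] += 1
        self.total += 1

    def is_anomalous(self, hour: int, country: str) -> bool:
        if self.total < 50:
            return False  # not enough history to judge yet
        rare_hour = self.login_hours[hour] / self.total < 0.02
        rare_country = self.countries[country] / self.total < 0.02
        # Flag when both signals are unusual, e.g. 3 a.m. from a new country.
        return rare_hour and rare_country

baseline = UserBaseline()
for _ in range(100):
    baseline.observe(hour=9, country="US")           # normal working pattern
print(baseline.is_anomalous(hour=3, country="RO"))   # True: alert for review
```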

There is no silver bullet for solving these problems, and hackers will continue to get more sophisticated. However, these recommendations represent concrete steps that CISOs and other security executives can take to mitigate risks. When we encounter a problem, it is human nature to try to find a new tool to deal with it. But sometimes, the tool has been there all along (or for about ten years).

For more information about how to protect your organization against lateral attacks, visit our Zero Trust solutions page. 

---------------------------------------------------------

Editor’s Note: 

ForgeRock Identity Cloud and ForgeRock’s corporate infrastructure were not impacted by the SolarWinds breach. ForgeRock has conducted a full review of its environment and has confirmed that neither the ORION product nor the modules that make up the ORION product exist within our corporate infrastructure. In addition, we are conducting reviews of all critical suppliers to establish any potential risk to operations via our supply chain.


Evernym

Getting to Practical Interop With Verifiable Credentials


Standards compliance is a noble ideal, but by itself, it’s not going to get us to practical interop on verifiable credentials (VCs). That’s because there’s not enough agreement on which standards we’re talking about, and the ones we all like aren’t clear enough to force alignment. The W3C’s Verifiable Credential standard defines a data model, […]

The post Getting to Practical Interop With Verifiable Credentials appeared first on Evernym.


KuppingerCole

AWS – A new Vision for Hybrid IT?


by Mike Small

Attending AWS re:Invent is always an exceptional experience and, despite it being virtual, this year was no different. As usual, there were the expected announcements of bigger, better, and faster services and components. AWS always shows a remarkable level of innovation, with many more announcements than it is practical to cover comprehensively. Therefore, in this blog, I will focus on what I think are some of the highlights in the areas of hybrid IT, edge computing, machine learning, as well as security and compliance.

There is an old adage – “Keep it Simple Stupid” – and this is excellent advice. In his keynote, Andy Jassy described the need for organizations to avoid complexity in order to achieve agility. This is the thinking that lies behind how AWS is evolving its services towards platforms focussed on simplifying the solutions to real-world problems. Jassy also described how he believes the definition of the hybrid cloud needs to be rethought.

Hybrid IT

What now exists in most organizations is a complex mixture of IT services some delivered on-premises and some delivered through the cloud. This is currently defined as hybrid IT.

However, there is now an increase in computing outside of the data centre and at the edge where much of the useful data exists. In addition, advanced networking capabilities provided by technologies like 5G will increase access to this data as well as augment the possibilities for remote control and automation.

In his keynote, Jassy stated that people have become too settled on the definition of hybrid as meaning on-premises plus cloud. AWS’s vision is that there are now so many IT components outside of the data centre, in offices, factories, on ships and elsewhere that cloud plus the edge will become the dominant elements in the future IT infrastructure. This is a vision for clouds of things being the infrastructure for the organization of tomorrow.

However, managing and securing this increasingly complex set of services, upon which the world has now become dependent, is now the critical factor. Just consider the impact of the recent short Google outage, with people complaining that they could not switch the lights on in their homes.

Edge Computing

Given the importance placed on this area by Jassy, there were several announcements.

Like other vendors, AWS offers a “cloud in a box” which the customer can deploy wherever it is needed. There may be several reasons that customers choose this, including proximity to data, compliance related to service/data location, and network latency. AWS announced that AWS Outposts will be offered in 2 new sizes, including a smaller size in 2 flavours: Graviton and Intel.

AWS says that they now have thousands of customers using this. One key benefit of this product is that it enables the full range of Machine Learning services close to where they may be needed. Since most of this is managed by AWS, it effectively extends the cloud to wherever it is located. However, beware - this introduces some extra customer responsibilities – for physical security, power and local infrastructure as shown in figure 1. It therefore adds a little more complexity to the already complex hybrid management challenges.

Figure 1: Customer responsibilities for IaaS and Cloud on Premises

IoT is an important element of AWS’s vision of this new hybrid IT environment and there were several announcements in this area. AWS IoT Greengrass release 2.0 is now available - this provides an open-source edge runtime, which includes a set of pre-built software components, tools for local software development, and new features for managing device software on large fleets of devices.

Another area is integration with 5G to achieve very low network latency. AWS achieves this through what they call “Local Zones”. Here AWS infrastructure is deployed closer to large population, industry, and IT centres where no AWS region exists.  AWS announced a preview of AWS Local Zones in Boston, Houston, and Miami, with plans to launch 12 additional AWS Local Zones throughout 2021 in key metro areas in the United States including Atlanta, Chicago, and New York. There are currently no plans for these in Europe.

5G is not the only communication type supported. AWS also announced AWS IoT Core for LoRaWAN, a fully managed capability that allows AWS IoT Core customers to connect and manage wireless devices that use low-power long-range wide area network (LoRaWAN) connectivity with the AWS cloud. It enables them to set up a private LoRaWAN network by connecting their own LoRaWAN devices and gateways to the AWS cloud - without developing or operating a LoRaWAN Network Server (LNS).

Industrial Edge

AWS says that their innovation always follows customer demands and so it is interesting to note how AWS is expanding their services with a focus on an industrial plant.

AWS announced Amazon Lookout for Equipment, a service which provides a way for customers with existing sensors on their industrial equipment to send their sensor data to AWS to build machine learning models and return predictions to detect abnormal equipment behaviour. This enables predictive maintenance that allows them to act before machine failures occur and avoid unplanned downtime.

Amazon Lookout for Vision is a new machine learning service to find visual defects in industrial products, accurately and at scale. It uses computer vision to identify missing components in products, damage to vehicles or structures, irregularities in production lines, and even minuscule defects in silicon wafers — or any other physical item where quality is important.

In addition, AWS announced AWS IoT SiteWise Edge (Preview), a new feature of AWS IoT SiteWise providing software that runs on-premises at industrial sites and makes it easy to collect, process, and monitor equipment data locally before sending the data to AWS Cloud destinations.

Machine Learning

In line with AWS’s intentions to make it simpler to use their services, several new useful tools have now been fully integrated into Amazon SageMaker.

One interesting example is the announcement of Amazon SageMaker Clarify to help machine learning developers achieve greater visibility into their training data and models so they can identify and limit bias and explain predictions.

Training machine learning models is hard because a neural network does not provide an explanation of why a conclusion was reached. In addition, this also makes it difficult to ensure that the neural network has not been trained on biased data. There are mathematical models from game theory that can help in this area – notably the use of Shapley values, which help to show the relative contribution made by different elements. The challenge is that some of the tools available are not robust or well-integrated.

 Amazon SageMaker Clarify provides an integrated approach to detect potential bias during data preparation, after training, and in the deployed model by examining attributes specified. SageMaker Clarify also includes feature importance graphs that help to explain model predictions and produces reports that can be used to identify issues with the model that can then be corrected.
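
To make the Shapley idea concrete: a feature's Shapley value is its marginal contribution to the model output, averaged over every order in which features could be revealed. The brute-force sketch below, using a made-up linear scoring model, computes exactly the quantity that tools like SageMaker Clarify approximate at scale:

```python
# Exact Shapley values by brute force, for illustration only. Real tools
# use efficient approximations (e.g. Kernel SHAP) because the number of
# orderings grows factorially with the number of features.
from itertools import permutations

def shapley(features: dict, value) -> dict:
    """value(subset_dict) -> model output when only those features are known."""
    names = list(features)
    contrib = {name: 0.0 for name in names}
    orderings = list(permutations(names))
    for order in orderings:
        present = {}
        for name in order:
            before = value(present)
            present[name] = features[name]
            contrib[name] += value(present) - before  # marginal contribution
    return {name: c / len(orderings) for name, c in contrib.items()}

# Hypothetical linear "credit score"; missing features fall back to a
# neutral population-mean default.
def model(f: dict) -> float:
    income = f.get("income", 50)  # in $k
    debt = f.get("debt", 20)
    return 0.6 * income - 0.4 * debt

phi = shapley({"income": 120, "debt": 5}, model)
print(phi)  # {'income': 42.0, 'debt': 6.0}: each feature's average effect
```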

Security and Compliance

Stephen Schmidt, AWS CISO, in his keynote reviewed the AWS security capabilities that customers had found most impactful during 2020. Top of this list was Amazon GuardDuty S3 Protection. This emphasizes the importance organizations place on taking care of their data, and the fact that the first sign of a cyber-attack is often unusual access to data.

While AWS takes great care to secure the services it delivers, organizations still have concerns over how they can prove that their use of the service meets their security objectives and compliance obligations.

This often involves manually collecting and collating evidence from multiple sources to respond to queries from internal and external auditors. To simplify this process, AWS announced AWS Audit Manager.

This is a new service that helps to continuously audit AWS usage to assess risk and to demonstrate compliance with regulations and industry standards. It automates the collection of evidence on whether the controls are operating and effective. It includes predefined templates for common compliance needs such as CIS AWS Foundations Benchmark, the General Data Protection Regulation (GDPR), and the Payment Card Industry Data Security Standard (PCI DSS).

It is expected that internal and external auditors will be the main users of this service.

My Opinion

AWS has a unique approach to innovation with a very strong focus on providing what the customer wants. Accepting the need for simplification in the complex world of IT services is very welcome and the announcements show how AWS is achieving this for its platform.

AWS’s alternative vision for the hybrid cloud recognises that the future of the IoT is inescapably linked with the cloud. The IoT will actually become the clouds of things.

However, while it is right to emphasize this aspect of hybrid IT for the future, many organizations are struggling today with the problems of securing and managing their mixture of SaaS, IaaS, and on-premises services, and a solution to this problem is urgently needed.

The release of AWS SageMaker Clarify provides a useful out-of-the-box solution for some of the current challenges of bias and explanation related to machine learning. However, while it provides a robust implementation of the state of the art in this area, it does not provide a revolutionary solution.

AWS Audit Manager will provide welcome capabilities for organizations to reduce the costs of demonstrating the effectiveness of their AWS controls and how these are meeting their compliance obligations. The challenges of doing this for the whole heterogeneous IT stack involved in many business-critical applications remain to be solved.

For more detailed research on these topics:

European Identity & Cloud Conference 2021

Explainable AI

The Story of Edge AI

5G - How Will This Affect Your Organization?


MyKey

MYKEY Weekly Report 30 (December 14th~December 20th)


Today is Monday, December 21, 2020. The following is the 30th issue of MYKEY Weekly Report. In the work of last week (December 14th to December 20th), there is mainly 1 update:

1. We are carrying out a topic activity with rewards, ‘Me and BTC’, on http://bihu.com until December 23.

For details, click to view: https://bit.ly/3nqxsiJ.

!!! If you encounter any abnormal situation while using MYKEY, remember not to uninstall the MYKEY app; please contact MYKEY Assistant: @mykeytothemoon on Telegram.

!!! Remember to back up the 12-word recovery phrase from [Me] — [Manage Account] — [Export Recovery Phrase] in MYKEY, even if your account is not real-named.

About Us

KEY GROUP: https://keygroup.me/

MYKEY Web: https://mykey.org/

BIHU: https://bihu.com/people/1133973

Telegram: https://t.me/mykey_lab

Twitter: https://twitter.com/mykey_lab

Medium: https://medium.com/mykey-lab

Github: https://github.com/mykeylab

Youtube: MYKEY Laboratory

MYKEY Weekly Report 30 (December 14th~December 20th) was originally published in MYKEY Lab on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 20. December 2020

Identosphere Identity Highlights

Identosphere Weekly #11: More from MyData • SSI vs Santa • VCs make Forbes

Videos, clips, and presentations from MyData; esatusAG shares 4 future scenarios for IAM in 2040; DHS Wallet UX finalists interviewed.
Welcome to the Identosphere Weekly

Feel free to Share\Forward this message and Subscribe for Updates!

This publication is 100% reader-supported! Thanks to our Patrons! 

We stop just short of the length when e-mail providers begin truncating. Advertisements would mean less info!

For those who haven’t already: Consider supporting its creation on Patreon! 

Upcoming Events

Thoughtful Biometrics Workshop

The Thoughtful Biometrics Workshop is creating a space to dialogue about critical emerging issues surrounding biometric and digital identity technologies. It’s happening the 1st week of February: Monday, Wednesday, and Friday, 9am to 1pm PST / noon to 4pm EST.

Register on EventBrite!

Identiverse - Call for Papers is Open 

This year at Identiverse, discover how modern identity systems are the enabler for services and businesses that put people first. Join us June 21-23, 2021 in Denver, Colorado. Identiverse 2021. This time, it’s personal.

News & Updates

Verifiable Credentials featured on Forbes

Berners-Lee recently suggested that the web needs a midcourse correction. Part of that change involves making systems accountable and making it easy for users to find where information comes from. Verifiable credentials promise major strides in that direction. 

Mattr Introduces OpenID Connect Credential Provider!

OIDC Credential Provider is “an extension to OpenID Connect which enables the end-user to request credentials from an OpenID Provider and manage their own credentials in a digital wallet.”

FIDO & DIDs

MMM...not sure where this is going but it feels like an interesting development

This presentation from Day 1 of 2020 FIDO is from Team Dr. Who, consisting of a project manager, 2 developers, and a public healthcare specialist from the World Health Organization (WHO). Their Proof of Concept introduces smart health insurance card services that link Distributed IDentity technology and FIDO Authentication. The team aims to solve the problem of existing physical cards, which are an inferior way of identifying someone’s actual identity.

Here is a link to the video presentation (in Korean) 

Trinsic Releases Studio 2.0

2.0 comes with a simplified pricing model based on credential exchanges, that is, credentials issued and credentials verified. Subscription plans start at Free (50 credential exchanges a month), Developer at $18/month (100 credential exchanges), and Production at $112/month (500 credential exchanges).

It also comes with a fresh clean UI, and improved performance.

we migrated the Studio from server-side rendering to client-side via WASM. [...] the end result is that pages load 2x faster.

Digital Identity Wallet UI Competition

Kathleen Kenyon & Anil John believe that blockchain-based identity wallets are designed for engineers, not users, and created the Digital Identity Wallet UI Design Competition to address that challenge. Hear from the finalists: Jeff Stephens of Dignari, Josh Welty of Trinsic, along with Ken Ebert and Scott Harris of Indicio.

GlobalID: What we learned at IIW

GlobalID shares their takeaways, including enthusiasm about KERI, Chained Credentials, and Guardianship. 

One of the hottest topics at IIW 31 — in part because of how much it offers — was our new friend KERI, which stands for Key Event Receipt Infrastructure. 

GlobalID Introduces Trustees for Key Recovery

Trustees can be friends or family members from your contact list. Once selected, each Trustee is granted a shard of your private key. Restoring your lost Identity requires approval from the majority of your Trustees.
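The described majority-of-trustees recovery maps naturally onto threshold secret sharing. GlobalID has not published its construction here, so the following is only a minimal Shamir-style sketch in Python; the field, the 5 trustees, and the threshold of 3 are all chosen for illustration.

```python
# Minimal sketch of majority-threshold key recovery via Shamir secret
# sharing over a prime field. Illustrative only; not GlobalID's scheme.
import secrets

PRIME = 2**521 - 1  # a Mersenne prime comfortably larger than a 256-bit key

def split(secret: int, n_trustees: int, threshold: int):
    """Split `secret` into n shards; any `threshold` of them recover it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation of the polynomial
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, n_trustees + 1)]

def recover(shards):
    """Lagrange interpolation at x=0 over a threshold of shards."""
    secret = 0
    for xi, yi in shards:
        num, den = 1, 1
        for xj, _ in shards:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = secrets.randbelow(PRIME)
shards = split(key, n_trustees=5, threshold=3)  # majority of 5 trustees
assert recover(shards[:3]) == key
```

Each Trustee would hold one (x, y) point; any three of the five can interpolate the polynomial at x = 0 to recover the key, while two or fewer learn nothing about it.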

EPS for SSI (Self-Sovereign Identity)

In my earlier post, I failed to refer specifically to the people working for Self-Sovereign Identity and the likes of blockchain that support the distributed/decentralised storage of secrets. [...] you might all be interested to hear that the key function of Expanded Password System is to convert images to high-entropy codes that work as very long passwords and also as the seeds of symmetric/asymmetric cryptographic keys.
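As a rough illustration of the quoted idea, turning a user's image choices into a code with enough entropy to serve as a long password or key seed, here is a small sketch. The actual Expanded Password System is not specified here, so the identifiers, separator, salt handling, and KDF parameters below are all assumptions.

```python
# Sketch: derive a 256-bit seed from the ordered set of images a user
# recognizes. Illustrative only; not the Expanded Password System itself.
import hashlib

def images_to_seed(chosen_image_ids: list, salt: bytes) -> bytes:
    """Hash the user's ordered image choices into key material."""
    material = "\x1f".join(chosen_image_ids).encode("utf-8")
    # A slow, salted KDF so the derived code resists brute force better
    # than the raw image choices would on their own.
    return hashlib.pbkdf2_hmac("sha256", material, salt, 600_000)

seed = images_to_seed(["cat_042", "lighthouse_007", "gran_photo_3"],
                      b"per-user-salt")
print(seed.hex())  # usable as a very long password or as a key seed
```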

trustbloc/hub-router

DIDComm mediator and router with mailbox features.

The TrustBloc hub-router is a working implementation of the Mediator Coordination and the Pickup protocols built using Hyperledger Aries Framework - Go.
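For context, the mediation handshake such a router implements starts with a mediate-request from the agent and answers with a grant carrying the routing endpoint and keys. The shapes below follow the Aries Mediator Coordination RFC as best I recall; the exact type URIs and fields should be verified against the RFC and the hub-router docs.

```python
# Hypothetical message bodies for the Mediator Coordination protocol.
# All ids and values are placeholders.
mediate_request = {
    "@id": "69f5a9f3-0000-0000-0000-000000000000",  # placeholder id
    "@type": "https://didcomm.org/coordinate-mediation/1.0/mediate-request",
}

mediate_grant = {
    "@id": "30be4b76-0000-0000-0000-000000000000",  # placeholder id
    "@type": "https://didcomm.org/coordinate-mediation/1.0/mediate-grant",
    "endpoint": "https://router.example.org",   # where senders deliver mail
    "routing_keys": ["did:key:z6Mk..."],         # keys to wrap messages to
}
```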

@OlfertSarah of @esatusAG shares:

Four Future Scenarios about Identity & Access in 2040. We talk about Total Surveillance, Mega Corporations, Identity Chaos & Self-Sovereign Identity 2.0.

After a look into the past and present, we now fast-forward to the year 2040, where we meet Julia and accompany her through her everyday life. In four different IAM future scenarios, which can be viewed separately, we will understand how life under total surveillance feels for Julia. In times where identity chaos prevails, we can see what it means when Julia can no longer be sure about her digital identity, with only her physical identity being certain. In a world dominated by mega corporations, Julia’s experiences with her employer show us the far-reaching significance and influence such corporations have gained in relation to IAM. Finally, we experience how Julia is able to regain her informational self-determination thanks to her identity wallet - Self-Sovereign Identity 2.0.

Webinar: Trust over IP and Government

@trustoverip shares:

Recordings are now available for our webinar (Dec 15) - “Trust over IP and Government”

Session 1: bit.ly/3mrqV6a

Session 2: bit.ly/3rbzlST

SSI vs Santa

Phil Wolff shares:

In 2021 Santa decentralizes his list, no longer relying on children’s Real Names in compliance with kid privacy laws. Self-sovereign identity lets kids ask Santa, confident their identities are authenticated (right toys to the right kid) & that he uses verified naughty/nice data.

Identity not SSI

WOOP-WOOP-WOOP - NOT SSI but using DIDs?

GADI presented at the Vienna Digital Identity Meetup (now virtual, very good, much recommend). The GADI architecture is a federated identity ecosystem where Digital Address Platforms (DAPs) issue unique individual identifiers controlled by the GADI ecosystem.  This is the fundamental difference in identity philosophy between GADI and SSI based systems.  The Digital Address is a lifetime connected identifier and under the control of the DAP. The video is here.

U.S. Treasury breached by hackers backed by a foreign government

@Cred_Master shares

“The hackers ... have been able to trick the Microsoft [Office 365] platform's authentication controls.”

#SSI #VerifiableCredentials

OASIS releases KMIP 2.1 

The Key Management Interoperability Protocol (KMIP) is a single, comprehensive protocol for communication between clients that request any of a wide range of encryption keys and servers that store and manage those keys. By replacing redundant, incompatible key management protocols, KMIP provides better data security while at the same time reducing expenditures on multiple products.
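The description above is abstract, so here is a small client-side sketch using the open-source PyKMIP library's documented "pie" client API. The endpoint and credential paths are placeholders, and PyKMIP's supported protocol versions may lag behind KMIP 2.1, so treat this as illustrative of the client/server model rather than of 2.1 specifically.

```python
# Sketch: ask a KMIP server to create, fetch, and identify a key.
# Connection details are placeholders; requires a reachable KMIP server.
from kmip.pie.client import ProxyKmipClient
from kmip.core import enums

client = ProxyKmipClient(
    hostname="kmip.example.org", port=5696,
    cert="client.pem", key="client.key", ca="ca.pem",
)
with client:
    # The server creates and stores a 256-bit AES key...
    key_id = client.create(enums.CryptographicAlgorithm.AES, 256)
    # ...and the client retrieves it by its server-assigned identifier.
    key = client.get(key_id)
    print(key_id, key)
```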

Exploring Facial Biometrics

This is a fantastic article by the DIACC (Digital Identity and Authentication Council of Canada) 

for the purposes of allowing a user to positively identify themselves from their own device, only face verification and face authentication are employed. Face verification creates trust, while face authentication maintains it. Both functions are covered in the Pan-Canadian Trust Framework™ that is intended to support a robust digital identity, trust ecosystem that will allow all Canadians to do more online, in a safer, more secure, and confident way.

Transformation in a Digital Age

Digital Caribou shares their thoughts on Digital Transformation and inclusion - very good thinking for all of us working on digital identity. 

We believe that the emphasis on transformation as both process and effects is particularly important, especially as although digitization and digitalization are well underway, accelerated by the response to COVID-19 (remote working, payments, etc.), these are not inevitable processes. They are the results of human decisions. Similarly, the effects of these are not inevitable, either.  

#linkeddata and crypto goodies

@BartHanssens shares

proofs: w3c-ccg.github.io/ld-proofs, cryptosuite: w3c-ccg.github.io/ld-cryptosuite-registry, #GnuPG: signatures gpg.jsld.org/contexts
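For readers new to those specs, this is the general shape of a Linked Data Proof attached to a JSON-LD credential, with field names as I understand them from the ld-proofs document and cryptosuite registry linked above; all values are placeholders, so check the specs for the normative definitions.

```python
# For orientation only: a JSON-LD document carrying a Linked Data Proof.
import json

signed_doc = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:issuer",
    "credentialSubject": {"id": "did:example:subject"},
    "proof": {
        "type": "Ed25519Signature2018",      # one suite from the registry
        "created": "2020-12-20T00:00:00Z",
        "verificationMethod": "did:example:issuer#key-1",
        "proofPurpose": "assertionMethod",
        "jws": "eyJhbGciOiJFZERTQSJ9..",     # detached JWS placeholder
    },
}
print(json.dumps(signed_doc, indent=2))
```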

MyData 2020 Online good sessions continued…

MyData4Children-OpenSpace2020

Three questions, to try to understand how MyData may lead a way to create a safe, enjoyable and empowering digital world for children. 

What is the main challenge(s) we face today regarding children’s rights in a digital world? 

What would be the ideal digital experience (safe, enjoyable, feasible and viable) for children, parents & educators? 

What needs to be done to enable that ideal experience?

#THEGLASSROOM - Misinformation Edition

The website above is a cool interactive page that was shared during the conference. It includes a neat infographic called How your phone is designed to grab your attention, as well as a video.

In this edition we explore how social media and the web have changed the way we read information and react to it. We include our animations:

Trackography: You Never Read Alone

Serious Profiling: Have you been profiled yet?

Personal Data: Political Persuasion, Inside the Influence Industry, What’s for sale?

Living with Algorithms: Why should you care about algorithms?

Clips from the conference

@mydataorg shared some video clips from the conference in a few tweet threads:

"20% of average family budget goes to mobility services. With better understanding through #MobilityData the costs and the CO2 impacts could be managed much easier,” @Paultheyskens. #PersonalData is an important enabler of sustainable mobility in the future!

Better use of #mobilitydata could empower also citizens with special needs to move easier. “When data starts to flow, we can build tailored mobility applications,” says @Rafke from @info_vlaanderen

@BeyerMalte explains how to go from strategy to practice with the @EU_Commission's new #EUDataStrategy & #DataGovernanceAct and what is the role of trusted data intermediaries like #MyDataOperators.

To share or not to share your personal data. Benefits include free service, better service or moral satisfaction. But the risk is manipulation, Professor @MaxGrafenstein

There should be a way for our #data to gain value, be it in example monetary or ethical. So how valuable is “my data”? @nlaout answers the million(or billion)-dollar question

A traditional implementation of creating trust concerning data use is cookies. However, it’s a “hell of a user experience”.  Now we have the opportunity to build something completely different that really inspires and keeps trust! @arionair89.

Dr. Mawaki Chango on understanding the origins of identity

#Identity management is not a new problem. Mawaki Chango, PhD briefly explains its interesting history, starting from the Roman Catholic Church keeping records of their believers, leading all the way from passports to the current situation we are in with digital identity credentials! At the #MyDataOnline2020 conference. Read more of his work at https://digilexis.com

@esatusAG shares

Excellent session #SSI in action with @KudrixD, @Claudia94601743, and @doerkadrian at #MyDataonline2020 – here is a quick wrap-up of the discussions. We foresee lots of new SSI use cases becoming operative next year.

Data as competitive advantage & control mechanism in platform economy

Presenters: Sangeet Paul Choudary, Molly Schwartz. Session host: Riikka Kämppi.

Molly Schwartz chats with Sangeet Paul Choudary - best-selling author of Platform Revolution and Platform Scale and founder of Platformation Labs - as he unpacks the ethics and economics of data.

MyData Strategy of Global Enterprises

Visionaries from around the world will present success stories and explain why it is important to align with MyData’s human-centric principles in the data economy.

Slides:

Jyrki Suokas – Opening and Closing

Alex David – Exploring MyData concept from a Korean Perspective

Ilona Ylinampa – Why Technology Is A Method and Not An Intrinsic Value? Examples of Finnish Human-Centric Data Cases

Junseok Kang – Korea gaining momentum and what we are doing

Pascal Huijbers – How Trust in Digital Data can make our world a better place

Vincent Jansen – Consumer Data Rights in Australia

Thanks for Reading

If you find this publication to be valuable, consider supporting its creation on Patreon!

See you next week! 

Saturday, 19. December 2020

Rohingya Project

R-Academy: Providing Education & Skills to the Stateless

Introduction

The Rohingya Project is a grassroots initiative to uplift and empower the stateless and financially excluded Rohingya people by creating a secure and transparent digital ecosystem. The R-Academy is a new initiative under the Rohingya Project which aims to provide Rohingya people with skills that they would find applicable and necessary in their lives. …



Caribou Digital

Transformation in a Digital Age


Digital transformation is everywhere. It is used to describe so many things that it risks losing its meaning, and with it, the potential for a common approach to supporting it in pursuit of equitable, sustainable development. We’re sharing this blog, then, as part of defining this important area.

Digital transformation should include both the processes and outcomes of mindful, intentional, and shapeable efforts that lead to intended, prosocial effects. Definitions of digital transformation at the firm level tend to focus on existing or new business models. For McKinsey, it is to “enable existing business models by integrating advanced technologies,” while for Salesforce, it’s “using digital technologies to create new — or modify existing — business processes, culture, and customer experiences.”

Definitions at the national level tend to take a broader perspective. The European Commission characterizes digital transformation as “a fusion of advanced technologies and the integration of physical and digital systems.” The OECD defines it as “the economic and societal effects of digitization and digitalization.” This is important; while the first three definitions focus on process, the last focuses on effects and the consequences of these significant changes.

Development-focused definitions emphasize this outcome-focused perspective but highlight challenges and risk. The Pathways Commission at Oxford University cautions that “The use of digital technologies will not automatically lead to the inclusion of the poor and marginalized.” UNCTAD points to the challenge of measuring the broader national digital economy and emphasizes how digital transformation is “highlighting the significant divides that exist.” The United Nations Foundation Digital Impact Alliance, having reviewed over 30 frameworks, concludes that “more work is required to attempt an overarching articulation of the most important elements” of digital transformation. This blog is a contribution to this continuing effort.

We believe that the emphasis on transformation as both process and effects is particularly important, especially as although digitization and digitalization are well underway, accelerated by the response to COVID-19 (remote working, payments, etc.), these are not inevitable processes. They are the results of human decisions. Similarly, the effects of these are not inevitable, either. We agree that technologies amplify intent, and left unchecked, digital transformation may amplify the most powerful, strengthening autocracy and dystopia. With strategic engagement and clear intention, it is possible to build ethical and inclusive digital economies, societies, and politics that mitigate risk and prioritize protection.

Digital transformation that works for all requires values-based intentional focus on the promises of this new era and mindfulness to mitigate their perils. The following five considerations are insights that help us ensure we are intentional in our support to governments, donors, and other partners in their work to ensure digital transformation prioritizes people, protects fundamental rights, and promotes equality of opportunity.

Digital transformation as development in a digital age. Since we made this phrase central to our work in 2016, we’ve only become more convinced that while digital is fundamental to processes of change in politics, economy, and society, the emphasis needs to be not on digital but on the changes and developmental outcomes. It is vital to recognize that this is not simply focusing on technology to drive changes, and that there are particular characteristics and challenges to achieve developmental goals in a digital age. As we wrote for one donor, “Digital transformation introduces new risks as well as opportunities,” so building inclusive, ethical transformation demands engaging “with complex, nuanced concepts around agency, literacy, power, and structure in the digital age.”

Digital transformation requires an ecosystem approach. Most digital development projects still exist in sectoral silos and/or are driven by a relentlessly optimistic approach to “connecting the last billion.” This is despite all the evidence pointing to the significant cross-cutting — both negative and positive — impacts of digital technologies on economies and societies as the excellent 2016 World Development Report on “Digital Dividends” laid out. We have found it helpful to firstly define and bind digital ecosystems, at the national or firm level, for example, and to engage at this level — for example, on mapping a digital identity or on entrepreneurial ecosystems. Although the map is not the territory, it is vital to know the terrain before plotting a course to successfully reach your goal.

Digital transformation must be inclusive, leave no one behind, and prioritize the protection of vulnerable communities. Otherwise, at best, it will leave them behind, or at worst, it will increase vulnerability and exclusion. The promise of digital transformation in terms of added value to government and the economy is increasingly recognized, but there is a growing realization of the harms these transformations can bring, particularly on the most vulnerable. We have found that for groups such as refugees, women, and informal, gig, and platform workers, there are increased risks to privacy, personal protection, and secure jobs that enable social wellbeing — not forgetting the risks to political process and social polarization. In supporting clients to develop digital transformation strategies, we have emphasized the importance of incorporating “responsible digital” and “digital inclusion” as critical components of strategy to support ethical and inclusive digital transformation.

Digital transformation should be integrative, not additive. In our work with partners, we are often asked to develop a digital program — building a new team and developing a program of work. We have found that to effectively influence the arc of digital transformation, development outcomes can be better achieved if programs and organizations incorporate digital into existing teams, programs, and work. We advised on ways that digital information systems might enable humanitarian to social protection transitions while emphasizing protection and security. We worked with Oxfam to help their team conceptualize, develop, and sustain an approach that emphasized digital technologies that “are not an end in themselves; but … integrated into existing programs.”

Digital transformation requires leadership and ownership. All change is hard, and without leadership, it is often sidelined and withers. Efforts to direct digital transformation are no different, and require leadership to own and push for the changes that are required. Globally, efforts such as the Digital Public Goods Alliance provide helpful normative framings. With national governments, we’ve found it to be critical to identify champions who take ownership and invest political capital in attaining success. However, champions cannot be isolated figures. One international development donor that we supported in developing a team of digital champions found they were unable to gain traction because they had no ownership over the processes they tried to influence.

There are many different interests advancing the processes and effects of digital transformations, but for these to lead to inclusive development outcomes requires purposeful, intentional, and considered engagement. This will be particularly important as the challenges of equitable progress, social inclusion, and climate change place ever greater demands on the international community and its capacity to respond.

Transformation in a Digital Age was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


Fission

Fission Demo Day December 2020

Watch the full video from Fission's December 2020 Demo Day. Learn about the platform, and check out the launch of Drive+ on Open Collective.

We invited everyone to join us for a Demo Day to learn about the company, meet the Fission team, and share everything we've built in the past year. We shared our technical building blocks and the roadmap that we have planned. We ended with demos of apps built on our platform so far.

Fission: Fast App Publishing Platform

Fission is building an app publishing platform that supports developers in building great apps, connecting to users, and getting app makers paid.

Rather than "just" being next-gen cloud infrastructure and technology that powers great apps, we're saying that the Fission platform will help developers reach their ultimate goal of building an app: having app users, and having those users support the app.

In short:

🚢 Ship Apps, 🗣 Talk to Users, 💵 Make Money

Is that too bold or blunt? We are a platform that supports developers, and ensuring that digital makers are successful means that more great software can get built.

Here are some of the further themes, movements, and beliefs we're thinking about:

Open Collective

Fission is now a Fiscal Host on Open Collective. We are doing this in order to work closely with open source developers who want to collaborate with their users to build and maintain apps sustainably over time.

We are also placing our own open source software on Open Collective, starting with Fission Drive.

Try Fission Drive »

The Drive+ membership tier is an annual support tier that supports building out Drive and the associated Fission account. Backers get special badges on our Discord chat and Discourse forums, and will have members-only access to communication channels, including voting on features and other opportunities. Drive+ members will also get a one-year Fission Premium account when we launch support for it.

If you like what we're doing, and want to support more of it, sign up for Drive+ »

Thank you so much to Gyuri, Helder, Brian, and the Omo Earth team for being backers on the very first day. We appreciate your early support in this!

Full Demo Day Video

The full 2-hour video is split into chapters so you can jump to different sections. We'll be sharing smaller portions of the video after a little bit of editing.

Slides are available on Notist »

Open Source Collaboration

We are very proud of all the technical building blocks we've shipped so far, and have had some early discussions with other teams about adopting some of the protocols we've defined. During Demo Day, it was fantastic to hear that a lot of you want to get more involved!

We've scheduled a community kick-off meeting for January 7th, 2021 to work on the WNFS Database layer. Full details on the forum, please RSVP »

Thank you to everyone who joined us. It was great to see you, and we really appreciate your interest and the excitement you showed. The team has put a lot of work into what we've shipped so far, and we can't wait to see what you build with Fission.

We shared and announced a lot during this one event, so we'll be posting more from the event over the next week. We wish everyone a restful time as we head into the winter holidays, and look forward to more in 2021!

Friday, 18. December 2020

Coinfirm

Germany’s First Regulated Crypto ATM Provider Leverages Coinfirm’s AML Platform

LONDON, 18 December – Germany’s leading Bitcoin ATM provider, spot9, will leverage top RegTech Coinfirm’s AML platform for cryptocurrencies and blockchain to increase transparency and strengthen compliance in the crypto industry. spot9’s users can find the nearest ATM to conveniently buy Bitcoin and other cryptocurrencies with cash – no prior knowledge needed. spot9 enables customers...

IBM Blockchain

Blockchain Newsletter for December: Building the COVID-19 vaccine supply chain


This time of year brings holiday treats, a global sigh of relief that 2020 is almost over, and a spate of predictions for 2021. Martha Bennett, IT industry analyst at Forrester Research, predicts that 30 percent of blockchain pilot projects will make it into production. IBM Blockchain general manager Alistair Rennie confirms that the 30 […]

The post Blockchain Newsletter for December: Building the COVID-19 vaccine supply chain appeared first on Blockchain Pulse: IBM Blockchain Blog.


Nuggets

2020: The year in review


Firstly, I hope you’re keeping well and looking forward to the holiday season. This has been a difficult year for many, but I’m pleased to say progress has continued unabated here at Nuggets. In fact, we’ve made some significant strides forward.

We’ve all been working remotely for the past 10 months, with no in-person events since early March. But everyone has adapted well, and the pandemic restrictions haven’t held us back.

Looking back at some of the highlights from the last 12 months, I’m extremely proud of what we’ve been able to achieve — and enormously excited for 2021.

Here’s a quick recap…

Product

Our big product news this year was the iOS and Android releases across Europe in May, and shortly after in Australia. This was a huge step for us: making Nuggets available to any business that wants to protect its customer data. Now, anyone can enjoy the privacy, security and simplicity of Nuggets.

We also launched our own demo store, which allows businesses to test the platform in a live environment. Any business can download Nuggets and experience the onboarding, login and payment flows — without sharing or storing their data. It gives potential partners a full overview of exactly how the platform works, and allows them to run through the user experience.

Our engineering team grew this year — and will again in 2021. As always, there’s been a huge amount of work going on behind the scenes. The team has been hard at work on UI/UX, infrastructure and architecture updates, continually improving the platform’s resilience, robustness and functionality. This puts us in a great position for the exciting things we have lined up for 2021.

In June we created a liquidity pool for NUG on Uniswap, meaning NUG is now available to swap with any other ERC20 token available on the platform.

Partnerships and memberships

We announced our commercial partnership with LexisNexis Risk Solutions in November, after working with them for some time. This is a big one. LexisNexis is one of the world’s largest protectors of private and confidential data, and one of the biggest risk and fraud companies in the world. And this partnership has huge, positive implications for us. Specifically, we’re working together to deliver self-sovereign digital identity (SSI) solutions for existing and prospective customers. There’s more on this on Alastair’s blog.

We joined the Open Identity Exchange (OIX) earlier this year, complementing our existing membership of the Decentralized Identity Foundation. Formed in 2010 to address the increasing challenges of building trust in online identity, OIX is uniquely dedicated to ID Trust. As a member, we’re collaborating on thought leadership initiatives and getting involved in sector-specific working groups. Fellow members include companies such as Barclays, Microsoft, LexisNexis and HSBC. We’ll also be able to support pilot projects, working alongside other members. For more on OIX, read my blog from September.

Awards and recognition

We’ve done well on this score in the past, and thought last year would be hard to beat. Little did we know! We don’t celebrate these accolades just to show off — they’re important evidence of the impact we’re making.

Let’s start with the awards:

Most recently, we won ‘Best Use of Blockchain in Financial Services’ at the Emerging Payments Awards.

We won ‘Mobile Innovation of the Year’ at the Retail Systems Awards, and ‘Security Innovation of the Year’ at the Payments Awards — where we were nominated for five awards in total.

Nuggets also won the Women in Payments Unicorn Challenge, where I got to showcase the platform to a panel of judges from Santander, JP Morgan and American Express. Competing against five other fintechs, we were especially proud to win both the main judges’ award and the audience choice award.

On a personal note, I was honoured to receive some individual awards this year, including Entrepreneur of the Year at the Booking.com Technology Playmaker Awards, and the Female FinTech Competition 2020 held by Atos, Deutsche Bank, Google and TechQuartier.

We were nominated for other awards too:

We were named as one of the 10 Best Payments Startups globally in Efma-Capgemini’s Financial NewTech Challenge 2020.

We were nominated for ‘Best Enterprise Security Product’ at Computing.co.uk’s Technology Product Awards, and were up for the Financial Services Forum’s Product and Service Innovation Award, in the Security category.

There was plenty of high-profile recognition for Nuggets too:

We were named one of the ‘Most Influential Firms in Financial Technology for 2020’ by Financial Technologist Magazine. Judges from EY and Lloyds Banking Group selected the shortlist based on innovation, growth, product and team.

We were listed in BusinessCloud’s 100 FinTech Disrupters, at number 42. That put Nuggets alongside some of the UK’s most established names such as Revolut, Starling Bank, Monzo and Checkout.com.

We were also included in the Efma-Capgemini Global Financial NewTech 2020 Watchlist. The list is made up of 100 Financial NewTechs that will help reshape financial services in 2020.

Alastair and I appeared in the UK’s ‘Top 32’ fintech leaders by Business Leader News.

I was also named one of the Most Influential Women in Payments 2020 by PaymentSource, and selected for the Female Founders First programme, created by Barclays and Techstars. Just two weeks ago, I made it onto the Top 100 Asian Stars in UK Tech list.

Team

Our sales team grew this year, and we were delighted to welcome Ben Geleit as Director of Partnerships and Alliances. Ben’s been driving all the end-to-end go-to-market activity and commercialisation. We’re now working through a number of great opportunities, and hope to reveal more in the new year.

Events and media

We haven’t been to as many events as in previous years — for obvious reasons! But we did manage to get to a few at the start of the year.

I took part in a fireside chat as part of the European Women Payments Network Meetup, held at legal firm Allen and Overy’s offices in London in February. In January, I was invited by MasterCard to speak at Paris FinTech Forum. I showcased Nuggets during a session entitled ‘Building trust in the future with…Data’. I was also involved in a panel discussion at February’s Women in Identity Meetup, where the discussion focussed on ‘Diversity and Identity’. Our last event before the lockdown was in March, when Alastair took to the stage at London Blockchain Week — twice! On Day One he was on a panel on Decentralised Finance. Day Two saw him discussing data sovereignty on the Decentralised Data Marketplace panel.

Once again, there’s been a huge amount of media coverage throughout the year. We’ve been in titles like CoinDesk, RetailWeek, Cointelegraph (here and here), TechRadar, The Fintech Times, Ledger Insights and Biometric Update, to name but a few. Visit the Media page on our website for a full list of this year’s coverage.

Alastair continued to publish his influential thought leadership articles on Forbes. This year’s highlights included:

Contactless Deliveries With Doorstep Verification

Deliveries To People Not Place

Payment Tied Digital IDs Can Solve Amazon’s Fake Reviews Problem

What’s next…

With positive news on vaccines, we’re all hoping for a return to some form of normality in 2021. What that will look like is still uncertain, but we’re more sure than ever of the need for Nuggets.

Millions of people have moved large parts of their lives online this year. And this accelerated switch from physical to digital is unlikely to be reversed.

Unfortunately, if unsurprisingly, this has led to an increase in various types of fraud across sectors like banking, payments and logistics. The need for Nuggets is more pressing than ever. We’re in a fantastic position to help businesses and consumers transact safely online, protect their privacy, and take control of their data.

Nuggets has gained real momentum in 2020, and we’re incredibly excited about 2021. We’ll be enabling more partners and customers, and expanding the team. As consumers’ demand to take back control of their data grows, we’ll be able to offer our platform to new customers in more territories.

Thanks again for your continued support, and here’s to a healthier, happier, and even more successful new year.

2020: The year in review was originally published in Nuggets on Medium, where people are continuing the conversation by highlighting and responding to this story.


One World Identity

State of Identity Rewind: Top 10 of 2020

Rewind & Replay! OWI’s Top 10 most impactful 2020 State of Identity Podcasts highlight the growth and evolution across digital identity this year. Tune in to hear top industry leaders candidly discuss their experiences and provide groundbreaking insights during this transformative time in the identity space.

Subscribe & Listen Now!

1. Mastercard and the Future of Identity and Trust

May 21, 2020

Mastercard SVP of Digital Identity Products Charles Walton, The Commons Project CEO Paul Meyer, and Deakin University Emergent Technologies Manager Alan Longmuir join State of Identity to discuss the vital importance of digital identity, the role it can play in addressing the COVID-19 crisis, and the challenges and opportunities across the healthcare and educational sectors in a digital world.

Listen Now 

 

2. Au10tix: Automated IDV

June 4, 2020

Au10tix President & COO Carey O’Connor Kolaja joins State of Identity to discuss her career across the Fintech industry, what sets the Au10tix platform apart from other competitors in the identity verification space, and why no one-size-fits-all approach can be successful when building out customer onboarding flows.

Listen now

 

3. Evernym: They Wrote the Book on Self-Sovereign Identity!

February 20, 2020

Authors Drummond Reed and Alex Preukschat join State of Identity to offer a sneak preview of their new book on Self-Sovereign Identity, a deep-dive on the current industry landscape across both startups and standards organizations, and their predictions for the future of self-sovereign identity in 2020 and beyond.

Listen Now

 

4. Trulioo: COO Zac Cohen

February 25, 2020

Trulioo COO Zac Cohen returns to State of Identity to share an update on Trulioo’s latest identity verification technology, and their new research on consumer attitudes, expectations and behavior toward IDV and digital onboarding experiences in the US and UK.

Listen Now

 

5. Auth0: Extensibility in Identity

June 25, 2020

Auth0 Principal Architect Vittorio Bertocci joins State of Identity to discuss his pivot from computational geometry to digital identity, what sets Auth0 apart from competitors in the crowded authentication space, and the growing importance of developers and extensibility in identity.

Listen Now

 

6. Clearsale: Preventing Fraud and False Declines

May 28, 2020

Clearsale Executive Vice President Rafael Lourenco joins State of Identity to discuss the impact of COVID-19 on eCommerce, why preventing false declines is just as important as fighting fraudulent transactions, and the challenges he’s faced in shifting to a remote workforce nearly overnight.

Listen Now

 

7. WSO2: The Power of Open-Source IAM

January 30, 2020

WSO2 Vice President of Security Architecture joins State of Identity to discuss WSO2’s API-first, decentralized approach to identity, the benefits of taking an open-source approach, and the continued convergence of the CIAM and IAM markets.

Listen Now

 

8. SecureKey: A Primer on Decentralized Identity

July 23, 2020

SecureKey Chief Identity Officer Andre Boysen returns to State of Identity to discuss the impact of COVID-19 on digital identity and SecureKey’s latest whitepaper offering a primer on decentralized identity.

Listen Now

 

9. Veriff: Scalable Identity Verification 

April 14, 2020

Veriff SVP of Revenue Guy Zerega joins State of Identity to discuss the growing need for scalable identity verification solutions across industry sectors, the impact of COVID-19 on the identity verification market, and his predictions for the future of the sector after the world returns to normalcy.

Listen Now

 

10. Socure: Fighting the Uptick in Identity Fraud

September 17, 2020

Socure Senior Counsel & Privacy Lead Annie Bai joins State of Identity to discuss why an uptick in identity fraud usually follows economic downturns, the new types of fraud proliferating in the COVID-19 era, and how financial institutions and fintechs can protect themselves.

Listen Now

The post State of Identity Rewind: Top 10 of 2020 appeared first on One World Identity.


Infocert (IT)

Private companies arriving on SPID: Directa launches trading account opening with Digital Identity


Already a few years ago, ever since we began talking about the Public Digital Identity System (SP