Last Update 3:52 PM October 25, 2021 (UTC)

Company Feeds | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Monday, 25. October 2021

Finicity

Open Banking Helps Clear Stipulations and Close More Auto Loans

Not every customer who walks through the door of your dealership on a sunny weekend afternoon can breeze through the auto financing process with no issues. This group could comprise as much as 18% of the borrowers who step into your dealership. In our quickly-evolving economy, many borrowers can have multiple sources of income, thin credit, or a compromised FICO profile. Verifying income with consumer-permissioned data is a digital solution that can help clear stips and smooth out those tricky loan approvals in real-time.

If you have a borrower sitting in your financing department, the team at your dealership has already invested time with them. They’ve shown them several vehicles, developed rapport, and maybe taken them on a test drive. Don’t lose the sale just because the customer isn’t carrying a stack of freshly-printed pay stubs. Using FinicityLend’s income verification, the borrower permissions access to the data in their bank account, generating a verification of income (VOI) report. The results that are returned show a high level of accuracy in determining actual income. This low-friction experience cuts down total time spent on each loan application, finishing the verification process in real time. The borrower doesn’t have to scramble to collect documentation from various sources just so they can leave with the car.

Extend Credit With Confidence. Get More Approvals.

Tightened auto lending guidelines and a pandemic-influenced drop in consumer demand are causing a slump in subprime originations. When your customer falls in love with a car and is ready to buy, you don’t want limited lending options to cost you the sale. Traditional FICO models are a good start, but when credit scores aren’t enough to get the green light, FinicityLend augments credit decisioning with real-time income data that makes the process faster and more efficient for lenders.

Today’s lending process is in the midst of a vast digital transformation, placing Finicity, a Mastercard company, and its open banking technology at the forefront of this data revolution. 62 million Americans have a thin credit file, with only one to four tradelines listed. While they may be more than capable of financing a car, traditional credit scoring models could result in a denial of these applications. With FinicityLend’s income verification, permissioned data access to their bank accounts gives a more robust picture of their actual financial health. The borrower can be approved for a loan that they can actually repay. Finicity’s open banking network covers 95% of direct deposit accounts in the United States, greatly increasing the number of buyers you can assist in buying a car. Accurate income data helps satisfy loan stipulations and lets buyers purchase a vehicle right away.

Reducing Risk

When a borrower misrepresents their income, it’s nearly always to gain access to more credit than they can handle. 2020 was a record year for fraud in the auto industry, reaching a whopping $7.3 billion in losses. One of the biggest increases came in income and employment misrepresentation. With a 100% year-over-year jump in falsified incomes, one of the most important investments you can make is in prevention. Be sure that you have accurate data with FinicityLend’s income verification. 

Higher Verification Success Rates Translate to Fewer Lost Sales

With the increased confidence and efficiency that comes with FinicityLend’s income verification, good sales don’t walk out the door just because there’s not enough paperwork on hand. Verified, real-time data gets the sale closed. Over the course of a fiscal year, lost sales and fraud can cause extensive damage to your bottom line. Finicity income verification is fast and secure. The speed of the process means you can invest time and effort into higher-return work activity. This positive snowball effect can show up in the win column of your balance sheet. 

Our team will be attending the Auto Finance Summit in Las Vegas, October 27-29. To learn more about Finicity’s income verification for auto, check out our overview or request an auto lending demo.

The post Open Banking Helps Clear Stipulations and Close More Auto Loans appeared first on Finicity.


Infocert (IT)

5 November: Danilo Cattaneo will be the guest of Maurizio Pimpinella (President of A.P.S.P.) for an online event on Fintech.

On 5 November, from 11:00 to 11:30, Danilo Cattaneo, CEO of InfoCert, will be the guest of the Centro Studi A.P.S.P. (Associazione Italiana Prestatori Servizi di Pagamento) for a one-to-one conversation with Maurizio Pimpinella, President of A.P.S.P.

The event is part of the series “Le nuove frontiere del Fintech – L’evoluzione dell’Open Banking e del mondo dei pagamenti. Big Data, tecnologie abilitanti, PNRR e normativa vigente” (“The new frontiers of Fintech – the evolution of Open Banking and the payments world: Big Data, enabling technologies, the PNRR and current regulation”) and can be followed via the web. A Q&A session will follow.

***

A.P.S.P. was founded with the aim of fostering the development, awareness and understanding of electronic money and, more generally, of all payment service providers, promoting the related cultural activities through round tables, conventions and conferences.

As the title suggests, the new web-meeting format hosted by President Maurizio Pimpinella, starting on 28 October, will explore with its various guests the themes of Open Banking and electronic payments, Big Data, related technologies and the PNRR, and will sit alongside the training webinars already offered by the association.

How to follow the event

You can register to follow the event via the web, referencing the date and/or the guest, by sending an e-mail to:

francesca.rossetti@apsp.it o centrostudi@apsp.it

The post 5 novembre, Danilo Cattaneo ospite di Maurizio Pimpinella (Presidente A.P.S.P) per un evento online sul tema Fintech. appeared first on InfoCert.


auth0

How Social Engineering Has (And Hasn’t) Evolved Over Time

There’s no patch for humanity, but that’s not a bad thing

Onfido Tech

How we improved our project and file structure in Figma

Like many design teams, Onfido made the transition from Abstract + Sketch, to Figma in the last couple of years. One of the common resistance points internally before making that switch, was Figma’s lack of Git-style branching that Abstract enabled. The design team had come to rely on branching and had built a lot of our processes around it.

During the transition we tried to structure our files, projects, and design system in a way that would help maintain some (though not all) of the benefits we got from branches.

I hope some of the ideas here spark inspiration for ways you could improve your project and file structures. Keep in mind, however, that this system was designed for our specific needs (yours are probably different) and that these decisions should change over time in a culture of continuous improvement. I highlight some of our learnings at the end of the article. As always, your mileage may vary!

Structure

We are on a Figma Organisation plan, and structure our teams in a way that roughly maps to our cross-functional product teams.

Each team has three main types of “projects” (folders, if you prefer):

- Source files
- Work in progress (WIP)
- Archive

Source files

A source file is like our main branch. They are one or more files that contain a reasonably up to date version of each screen in the product, and typically not a lot more. We try to keep these files as slim as possible, with as little screen duplication as possible. We do this to reduce the impact of breaking changes further down the road, and to simplify manual effort when merging.

The rule of thumb is that if someone can look at a screen in the live product, and put together an accurate version of that screen in Figma in under 2 minutes, then what’s in the source file is enough. We don’t need an exhaustive list of all the error states for a form, as long as there’s one solid example of what an error looks like.

These files are often the starting point for design work on existing products. Streamlined source files enable us to quickly duplicate flows and have them be immediately useful without a lot of busy work removing edge and error cases.

Work in progress

This is usually where the Figma part of the design process starts. We have a project template file which specifies a basic page structure, and a standardised cover sheet. This will be copied into the work in progress folder and named with the project name, and JIRA ticket number where applicable.

This file acts as our branch. Branching is unfortunately very manual right now, but not too difficult at our scale of two to three designers per product.

Screens are copied from the source file (which should be up to date), and design exploration is done from there. Our template specifies an Exploration page, which carries the expectation of no pre-defined structure, as messy and exploratory as the designer wants to be. We don’t critique people’s exploration pages.

We also specify a Handoff page, where engineers know to look for final screens, design specs and flows. Redlines (using Figma Measure), and custom annotation components are commonly used here.

Between those two pages, the designer can add as many pages as they need. Typically these progress in order, and new pages are often created at points where feedback is required. We’ll limit pages to only the stuff we want feedback on, so design dead ends and exploratory screens don’t distract people from what actually needs feedback.

UI kits

In our design system team we have a UI Kits project. These UI kits contain any commonly used UI components that are more product-specific than our Design System supports. Each product has a UI kit that’s owned and managed by that product team.

A note on naming: You’ll see the UI kits have an 05 in front of their name. This is part of our design system’s naming convention that I’ll dig into more deeply in another article. We use this numbering system throughout our files to keep libraries in a consistent order in the library panel. Themes first, then assets, components, patterns, and UI kits.

When it comes to how we structure our files, one of our few hard rules is:

WIP and Source files should not contain any local components

The reasons we do this are to:

- Reduce library management noise: If every WIP or Source file is published as a team library, trying to find the right libraries to enable/disable to get a particular version of a particular component becomes a nightmare very quickly.
- Make all screens easily portable/reusable by designers: If local components are in a file, but aren’t published as a team library, when someone copies a screen to another file, all links to that main component are lost. The result is an orphaned instance that doesn’t update when the main component is updated because changes aren’t published & pushed. This is totally invisible to the designer, as it visually appears like a standard instance.

(Image captions: “Unpublished local components lead to broken instance links”; “Having a single published team library allows screens to be easily portable”.)

Merging

Once a design has been approved and development is in progress or completed, the new screens will be copied back to the Source file, replacing existing screens where necessary. Because no published components exist here, screens can be replaced safely without worry of affecting other work in progress.

For changes in UI kits, components will often be labelled as [BETA] while in progress, or given a version number, or something similar that lets us manage multiple versions of a component existing simultaneously while the work is in progress. When the screens are merged and we want to clean up those UI kit duplicates, we will follow the process here: How to deprecate old components in Figma design systems.

Archiving

Once complete, the work in progress file has the cover status changed to Archived and it is moved to the Archive folder. The archive then acts as an easy way to see the process behind a piece of work, the comment discussions, and document experiments and directions that could potentially be drawn on later.

Things we’ve learned so far

- Cover status badges are a huge pain to maintain and keep up to date. No one remembers to change them to the right status, so they’re effectively useless. Occasionally we’ll do a big batch-change of everything in the Archive folder, as having these visibly well marked is really beneficial for searches. Some kind of background API integration that synced files with JIRA in real-time would be amazing here. *dreams*
- There can be some confusion around what the right level of detail is to keep in a source file. We suggest keeping an open mind, and working with your teams to find the right balance for each team, as their products may have different needs.
- Figma’s new Branches feature does not really meet our needs as it is currently designed. Once merged, the process part is a lot more hidden and difficult for others to find than our archived files. That archival side may not be an issue for you, which is fine. The other issue is that our styles and components live in separate libraries. If we need to make a change that includes both a style and component change, that can‘t be done in a single branch. There’s also no good way of testing the effect your branch changes will have on linked instances in other files right now. Again, a multi-file branch system would solve these.
- Having a JIRA ticket number in file names is really useful for search, however search only returns results from teams you are a member of, making cross-team discoverability a lot worse than it should be. This is especially true for larger orgs with more teams, or stricter permissions than we have (anyone is free to join any team at Onfido).

I hope that’s sparked some ideas about how you could structure your own teams and files. If you have any questions, or there are things you’d like to see deeper dives on, please reach out on Twitter or find me in the Design Systems, or Friends of Figma slacks.

Thanks to Mark Opland, Raemarie Lee, and Luis Ouriach for helping make this article better.

How we improved our project and file structure in Figma was originally published in Onfido Tech on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Ontology Harbinger Interview Series: John

This is an ongoing series featuring Ontology’s Harbingers. Harbingers are exceptional leaders that help Ontology grow its community and ecosystem, making it an ever-stronger blockchain, and decentralized identity and data leader.

In the ninth interview in this series, Ontology speaks with John Francisco, administrator of our Spanish community.

1. How did you hear about Ontology? What drew you to Ontology?

I heard about Ontology when I met Sesameseed. They had recently launched staking with Ontology. I really liked their ecosystem and started to learn more about the project.

2. What made you become a supporter of Ontology and a champion of its brand?

When I first got to know Ontology as an end user, I was fascinated by its infrastructure and its staking. I later learned that it is aiming to provide the digital identity solutions that companies and day-to-day users need, today. I like technology, so this led me to become even more interested in Ontology.

3. What do you see as the key role of a Harbinger? What is your favorite thing about being a Harbinger?

The community is very important in every project. For this reason, building communities and giving them the tools they need to understand the project is vital for the future and its success. Being a Harbinger is having the responsibility of spreading the word about the future, leading new generations to contribute and generate change in all aspects of technology, and that is what Ontology does, day after day.

4. Why is being part of the Ontology community important to you?

As I have mentioned before, the community is very important; it is the heart of every project. It is the starting point because they direct the course of the project anyway; they help to build a better future. For that and many more reasons, being part of Ontology is being that new paradigm of improvements towards the future.

5. How is Ontology’s community different from other blockchain communities, is there anything that makes it stand out?

Each community has its own character; however, Ontology allows all its members to be part of the project in a more direct way, such as via the Ontology Harbinger program. This program is where you can contribute your knowledge about Ontology’s technology and long-term objectives more directly. Being part of Ontology’s community and ecosystem makes you feel special; as an end user you feel that you have done something important for the present and the future of Ontology.

6. What do you use as your key channels for engagement with the Ontology community and why? Would you like to see any others?

Telegram is a very important channel for Ontology; it is the place where the community feels welcomed by the moderators and administrators. There is a more direct interaction and it is the perfect place to disseminate what Ontology is. As a secondary channel, Twitter is an essential part of the communication ecosystem because it interacts with the most technical and intrepid users who also contribute as the Ontology community.

7. Can you share a memorable experience or something you’ve learned from being a Harbinger? What advice would you give to someone looking to become a Harbinger?

In the first instance it was an arduous path, but I did it. Now my responsibilities do not merely include being a single guide to a member of the community but also being an ambassador for Ontology anywhere and for everyone. To be a Harbinger, my biggest advice is, be part of Ontology, understand Ontology and be more than a community member. Understand Ontology’s potential and how you can contribute to the project in a healthy way. Learn every day from Ontology because technology advances every day and without stopping.

8. How would you describe the Ontology community in three words?

Amazing, fascinating, visionaries.

9. How do you think Ontology could expand its community going forward? What would you like to see more/less of? What kinds of things do you see community members do that you think help our community grow?

Communication is vital for a community to grow and prosper. Ontology and the Harbinger program open the doors for the community to grow and have solid foundations for its expansion to other horizons.

The members of the community are the visionaries who hold the project in their hands. With their contributions, their knowledge, and their way of understanding Ontology, they leave an enriched legacy of pure knowledge and make, and will keep making, Ontology and its community even stronger over time.

10. What do you see as the key milestones for Ontology and how can the community help with achieving these?

Its decentralized identity solutions, which return control over their data to the end user. It is important that community members utilize these tools and be an example to others who may later come to adopt them, too. We need to demonstrate to those around us how Ontology’s tools benefit our everyday lives. We can use ONTO to manage our digital identities or build wealth through staking, and show our friends and family how convenient and easy it is.

To learn more about Ontology’s Harbinger Program and how you can get involved, check out our updated GUIDE.

Want more Ontology?

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Ontology Harbinger Interview Series: John was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Tokeny Solutions

Guarantee Asset Ownership with Blockchain Passports


In the DeFi world, digital assets are stored in blockchain wallets. Obviously, wallets are not identities; they are anonymous addresses and contain no data on the owner’s identity. A private key is the ‘password’ of the wallet: whoever knows it can access the wallet. It cannot be changed, but it can be stolen. In other words, a wallet is more like a browser for blockchain, where you can see what is inside, but you cannot be sure that you are the only one that has access to your assets. This causes a number of issues:

Lack of ownership guarantee: If the private key of the wallet is accessible by another user, then who owns the assets? 
Lack of compliance: For token issuers or DeFi protocols, it is impossible to comply with regulations if the owner of the digital asset is unidentifiable and eligibility criteria cannot be enforced.  

So, what is the solution? Digital identities. Just as our assets are secured by tying our identity to them in the real world, digital identities can serve as blockchain passports in the DeFi world to guarantee the ownership of digital assets. 

Here is how it works:

- Link digital identity to tokens: Use a token standard that codes ownership of tokens with a digital identity into the token, such as ERC3643, the official permissioned token standard. With this standard, token recovery is even possible due to the use of ONCHAINID, a decentralized digital identity system.
- A functional blockchain passport: Once the tokens are issued, they will be allocated to a wallet associated with the digital identity. Furthermore, digital identity functions go far beyond this, and can allow you to recover/transfer tokens, add/delete wallets (e.g. Metamask, Ledger), etc.
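As a rough, illustrative sketch (not code from Tokeny), the snippet below shows what an identity-gated transfer check could look like from an application’s point of view, using ethers.js. The ABI fragments, addresses and function names are simplified assumptions rather than the authoritative ERC3643/ONCHAINID interfaces; consult the standard itself for the exact contract surface.

```typescript
// Minimal sketch: gate a permissioned-token transfer on identity verification.
// ABI fragments and endpoints below are illustrative placeholders, not the full ERC3643 interface.
import { ethers } from "ethers";

const provider = new ethers.providers.JsonRpcProvider("https://rpc.example.org"); // placeholder RPC endpoint

const tokenAbi = [
  "function identityRegistry() view returns (address)",
  "function transfer(address to, uint256 amount) returns (bool)",
];
const registryAbi = ["function isVerified(address userAddress) view returns (bool)"];

async function transferIfVerified(
  tokenAddress: string,
  wallet: ethers.Wallet,
  to: string,
  amount: ethers.BigNumber
) {
  const token = new ethers.Contract(tokenAddress, tokenAbi, wallet.connect(provider));

  // The token points at a registry that knows which wallets are bound to a verified digital identity.
  const registryAddress: string = await token.identityRegistry();
  const registry = new ethers.Contract(registryAddress, registryAbi, provider);

  const verified: boolean = await registry.isVerified(to);
  if (!verified) {
    throw new Error(`Recipient ${to} has no verified digital identity; transfer would be rejected on-chain.`);
  }
  return token.transfer(to, amount); // the token contract re-checks eligibility on-chain as well
}
```

The point of the pattern is that eligibility is checked against a registry of verified identities rather than against the wallet address alone; the off-chain check simply fails fast, while the contract enforces the same rule on-chain.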

More interestingly, when personal data (e.g. country of residence, age…) is verified by trusted parties (e.g. government, big four, KYC providers…), the proof of the verification is stored onchain in the digital identity, making these proofs reusable. For instance, identity owners can grant access to their proof of age to a DeFi protocol that only accepts users above the age of 18.

Without revealing their personal information, the DeFi protocol can determine if the identity holders are over 18 years old via the verified proof. Digital identity is thus like a privacy-protected blockchain passport, making it easier, safer and faster to explore DeFi.

Market Spotlight

PRESS RELEASE

T-REX Protocol Recognized as ERC-3643: An Official Standard for Permissioned Tokens

Read More

MARKET INSIGHTS

Where Do Custodians Stand in the Age of Digital Assets?

Read More

TOKENY’S TALENT

DevOps Engineer Cyrille’s Story

Read More

Market Insights

ERC3643: Tokeny Comments on New Protocol for Tokenized Assets

The Ethereum community has officially recognized the T-REX protocol as ERC3643, making it an official standard for permissioned tokens.

Crowdfund Insider

Read More

Why Civic Technologies Is Building DeFi Identity Tools on Solana 

Solrise Finance will host the Civic Pass, which aims to make DeFi chime with institutions.

Coindesk

Read More

First Bitcoin ETF Begins Trading

Eight years after the first application for a Bitcoin ETF was filed by the Winklevoss brothers, the first Bitcoin ETF in the United States begins trading at the New York Stock Exchange (NYSE).

Investopedia

Read More

The Well-Known Welt Der Wunder Tokenized Its 25-Year-Old Media House

Through tokenized equity, Welt der Wunder will further develop its national television networks in Germany and Switzerland, which currently has 15 million viewers via TV alone and another 12 million online viewers per month.

Welt Der Wunder

Read More

Archax subsidiary Montis Digital to launch blockchain based digital post-trade infrastructure

Archax, the first FCA-regulated digital securities exchange, broker, and custodian, has today announced its subsidiary, Montis Digital, which is building digitally native, blockchain-based post-trade infrastructure.

The Tokenizer

Read More

German financial authority BaFin grants third permission to provide crypto custody services to Munich-based company Tangany

The Munich-based technology and financial services provider Tangany has received a license from the German Federal Financial Supervisory Authority (BaFin).

The Tokenizer

Read More

Bitcoin-based security token offering approved in Germany

Germany joins countries such as France, Luxembourg, Spain and Portugal by greenlighting the Bitcoin-based EXOeu token.

Cointelegraph

Read More

Revolut to Launch Crypto Token

Revolut, a fintech company with a $33 billion valuation that offers cryptocurrency buying as part of its services, is looking to launch its own cryptographic token, according to two people with knowledge of the plans.

Coindesk

Read More

 

Credit Suisse acts as depositary for tokenized stocks on public Ethereum 

Ski and sports resort Alaïa used the Taurus platform to tokenize its stock. Credit Suisse took on the role of depositary.

Ledger Insights

Read More

Compliance In Focus

G7 Finance Ministers and Central Bank Governors’ Statement on Central Bank Digital Currencies (CBDCs) and Digital Payments

A statement on exploring how digital innovation could maintain access to, and enhance the benefits of, central bank money in the form of Central Bank Digital Currencies.

HM Treasury

Read More

Founders of Crypto ICO Plead Guilty to Tax Evasion After Raising $24 Million from Investors

The owners of a cryptocurrency company have pleaded guilty to tax evasion, announced Acting U.S. Attorney for the Northern District of Texas Chad E. Meacham.

U.S. Department of Justice 

Read More

The SEC May Be Coming for Stablecoins, While Congress Aims to Treat Cryptos More Like Stocks

The country’s top securities cop is talking tougher on cryptos, especially stablecoins, “staking,” and lending services.

Barrons

Read More

Subscribe Newsletter

A monthly newsletter designed to give you an overview of the key developments across the asset tokenization industry.

The post Guarantee Asset Ownership with Blockchain Passports appeared first on Tokeny Solutions.


MyDEX

DEPLOYING PERSONAL DATA STORES AT SCALE

An important change is beginning to sweep the world of personal data. For many years, people have debated the question ‘what if individuals — e.g. citizens, customers — were able to assert genuine control over their own data?’

Now the debate is moving on to how to make this happen, at scale.

Look at some recent developments. In recent weeks:

- The UK Government’s proposed legislation to reform UK data protection laws includes (amongst many less positive things) new provisions for data intermediaries, including for “Personal information management systems, which seek to give data subjects more control over their personal data.”
- The UK Department of Culture Media and Sport’s latest paper on its new National Identity and Attributes Trust framework specifically mentions citizens being able to hold verified attributes in their own personal data stores.
- The Scottish Government’s proposed Scottish Attribute Provider Service includes a provision “where people can choose to save their personal information securely to an individual ‘locker’ (a digital attribute store), in order to reuse when they wish to apply to other services”.
- The UK Government, via its Government Digital Service, is to provide a One Log-in for Government which includes the concept of a personal data store to enable citizen control over how their data is being shared across Government. Tom Read, GDS Chief Executive Officer, said that “One Log-in for government is currently the organisation’s most important piece of work.”
- The Scottish Government has just signed a contract with Mydex CIC to improve recruitment of citizens to participate in projects to co-design public services that ensures privacy via the use of its Inclued platform and personal data stores.
- The UK NHS and BBC are now experimenting with personal data stores for health and media consumption records.

In other words, multiple different parties and people are converging on the same solution — of providing citizens with their own personal data stores — to solve multiple different problems.

The big question now is how to enable this to happen at scale, safely, securely and efficiently. One key element of this is useful, easy-to-use interfaces, the taps and switches that mean people can use the infrastructure without having to think much about it. We’ve written about this here.

But operational deployment at scale presents its own challenge. It’s one thing to build something in a lab to illustrate an idea’s potential. It’s quite another to make the transition to 24/7/365 operations, working at scale in the real world. Answering the question ‘how’ requires robust answers to many hard questions relating to infrastructure resilience, security, system architecture, governance, trustworthiness, business model and legal compliance. Here’s a checklist of these questions in a little more detail.

Are its underlying design principles fit for purpose, robust and built to last?

We talk about this issue in detail here.

Is the individual’s data really secure?

It’s very easy to make promises about data security, but very difficult to keep these promises permanently, especially when a system is operating at scale. Here are some of the safeguards that need to be built.

- All data should be encrypted in motion and at rest.
- Distributed architecture: every PDS is separately and individually encrypted (which means the system is not creating a massive centralised database that becomes a honeypot for hackers).
- No knowledge operations: every individual should hold their own private key to their own data. The PDS operator does not have access to this private key, and cannot look into the personal data stores it provides or make any changes to who can send or collect data from it; only the individual can.
- Everything the company does relating to the security of information management should be independently assessed and certified. To be legitimate, the PDS provider should be certified under ISO 27001 for information security management.

The Mydex platform displays all these criteria.
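As a minimal sketch of the ‘no knowledge operations’ principle above (illustrative only, not Mydex’s actual implementation), client-side encryption with a key held only by the individual might look like this:

```typescript
// Illustrative only: client-side encryption where the key stays with the individual,
// so the store operator only ever handles ciphertext. Not Mydex's actual implementation.
import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

// The individual's key lives on their side (e.g. derived from a passphrase or held in a key store).
const personalKey = randomBytes(32); // 256-bit key, never sent to the PDS operator

export function sealRecord(plaintext: string): { iv: Buffer; tag: Buffer; ciphertext: Buffer } {
  const iv = randomBytes(12); // fresh nonce per record
  const cipher = createCipheriv("aes-256-gcm", personalKey, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), ciphertext }; // only this opaque blob is stored server-side
}

export function openRecord(sealed: { iv: Buffer; tag: Buffer; ciphertext: Buffer }): string {
  const decipher = createDecipheriv("aes-256-gcm", personalKey, sealed.iv);
  decipher.setAuthTag(sealed.tag); // integrity check: tampered ciphertext fails to decrypt
  return Buffer.concat([decipher.update(sealed.ciphertext), decipher.final()]).toString("utf8");
}
```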

How does the PDS operator cover its costs and make its money?

Is the PDS’s business model open and transparent? Does it, for example, publish a public price tariff, where what organisations are paying, for what, is open for all to see? How does this business model affect the PDS provider’s incentives? For example, some PDS providers have generated business models where they take a ‘cut’ every time data is shared. This generates an incentive for the PDS provider to maximise the amount of data that is shared, thereby creating a potential conflict of interest between it and the citizens it is supposed to be serving.

To make their offerings attractive to organisations that want to retain control, other PDS providers have created halfway-house ‘personal data stores’ which remain inside the organisation’s operational boundaries, where the individual signs in using the organisation’s systems, and where the organisation places restrictions on what data the individual can share with who. Such faux personal data stores may generate revenue streams for the technology provider, but they generate a conflict of interest with the citizen that defeats the object of having a personal data store in the first place.

Does the PDS provider’s business model create revenue streams that are stable, e.g. designed to last in perpetuity?

Mydex’s business model is designed to be open, to avoid conflicts of interest and to be stable. The model is very simple. Organisations pay a fee to connect to the platform, to enable safe, efficient data sharing with individuals. There is no limit on what data can be delivered or accessed, by whom, or for what purpose; that remains under the control of the individual at all times.

Does the PDS provider have governance structures designed to ensure its trustworthiness in perpetuity?

In a new ‘market’ like this, many would-be PDS providers are start-ups that are hungry for funding. Many of them seek funding from venture capitalists who, by definition, are seeking ‘an exit’ in the form of an IPO or trade sale.

This brings two dangers. First, it incentivises the PDS provider to create a business model that focuses on financial extraction — making money out of citizens and/or organisations — rather than genuine service provision.

Second, it means that any promises it makes in terms of commitments to privacy, data protection, business model or anything else may only last until the venture is sold. For a PDS provider to be legitimate, its business model and governance must include legally enforceable guarantees that mean it cannot simply go back on its promises in event of ownership of the organisation changing hands.

That is why Mydex has chosen to be a Community Interest Company — because CIC status builds in legal requirements on the company to stay true to its mission of empowering citizens with their own data.

Is the IT infrastructure robust and capable of operating at scale?

Many people operating in IT today have a ‘hacker mindset’. They love writing bits of code that do cool things, and every time they come across a new technical challenge they automatically start writing another, separate, bit of code. Hackers are often brilliant at creating cool point solutions. But as these point solutions add up, they generate complexity and chaos. People with the hacker mindset are not good at building robust, integrated, efficient solutions that operate at scale. For that, we need an engineering, infrastructure-building mindset that is always asking ‘how does this piece fit with everything else that has already been built? Will it work stably and reliably, at volume?’ This requires an engineering mindset, not a hacker mindset.

Can the system scale without generating mounting risks, costs or complexity?

Providing a million personal data stores that are being used to store and share data, day in and day out, is very different to building a demo in a lab. Having robust software development, testing and deployment systems is essential if the flaws of the hacker mindset are to be avoided.

If the system can only work on a particular device such as a smartphone, everyone has to have access to such a device, these devices need to be designed so that they can ‘talk’ to each other, and problems arise if the device is lost, stolen or malfunctions. The only way millions of people can access their data from multiple different devices is if their data is stored (safely) in the cloud.

Some ways forward, such as Open Banking, envisage individuals giving permission for their data to be ported from one service provider to another without it being deposited in the individual’s personal data store. This, proponents claim, cuts out the unnecessary extra step of having a data ‘middleman’. The approach works fine for just one or two transactions. But it creates complexity and cost catastrophes as volumes rise. It’s why (for example) telephone exchanges were invented rather than every telephone line trying to create its own unique connection with every other line.
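A quick back-of-the-envelope illustration of that scaling argument (our own arithmetic, not figures from Mydex): point-to-point sharing needs a connection for every pair of participants, while a hub model needs one connection per participant.

```typescript
// Point-to-point sharing needs a link for every pair of participants: n * (n - 1) / 2.
// A hub-and-spoke model (a personal data store, or a telephone exchange) needs only n links.
const pointToPointLinks = (n: number): number => (n * (n - 1)) / 2;
const hubLinks = (n: number): number => n;

console.log(pointToPointLinks(1_000)); // 499,500 bespoke integrations
console.log(hubLinks(1_000)); // 1,000 connections to the hub
```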

Independent scrutiny and certification

It’s very easy for start-ups to make grand claims about what their technology can do or what their beliefs are. Selling ‘brochureware’ and ‘vaporware’ is a time-honoured practice in software: step 1, sell what you intend to make; step 2, use the money made from these sales to actually make what you promised. But an operation that works day in, day out, at scale cannot be fed by ‘vision’ and the apparent confidence of the salesman. What’s needed is independent scrutiny and certification.

That’s why Mydex is independently certified for data management security under ISO27001 and with Fair Data and why it has met the requirements to be listed on UK Government procurement frameworks like G-Cloud and to gain an Open Banking licence.

Built-in regulatory compliance

For any system to scale efficiently it has to make it easier, not harder, for service providers to comply with data protection regulations. This requires dedicated tools and infrastructure that a) ‘designs in’ compliance with key principles such as data minimisation under GDPR and b) enables both citizens and service providers to manage related processes simply, quickly and where possible, automatically.

Leaving compliance with data protection regulations to a ‘different department’ creates a gap and disconnect between ‘legal’ and operations, and is not something that can work efficiently and effectively at scale.

Summary

We know, because we’ve been there. Bringing new ideas to life in a lab environment is a positive, necessary thing to do. But making sure they can be implemented at scale, robustly, reliably and resiliently involves another — very different — set of considerations. This blog sums up our experience of what these considerations are.

DEPLOYING PERSONAL DATA STORES AT SCALE was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


Affinidi

Affinidi’s Presence at IIW

The Internet Identity Workshop (IIW) is a biannual gathering of identity enthusiasts where ideas and problems related to Internet identity are exchanged and discussed.

The 32nd edition of IIW took place from October 12th to 14th, 2021, and it was a mix of online and offline audiences.

Affinidi had a presence at IIW as it was a unique opportunity to showcase Affinidi’s work in the Self-Sovereign Identity (SSI) space and to understand the latest trends in this sphere.

Here are some things we did at IIW.

Demo of Schema Manager

Affinidi has built a Schema Manager to manage, create, clone, update, and reuse verifiable credential types and existing schemas, and this was showcased at IIW.

This tool is sure to come in handy for developers, product managers, and engineers who are looking to build an SSI application and face the problem of ensuring trust between actors (mainly, Issuers and Verifiers).

Essentially, the Schema Manager allows you to create any custom VC to meet the needs of a given situation or application. You also get to choose the attributes for each VC, and once you’re done, the tool generates a JSON schema that you can include as part of your code.

The schema you create can be public or private, depending on your preference. Also, you have the choice to publish it as a searchable schema so others can reuse it.

You can try our Schema Manager here.
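For illustration only (this example is invented for this write-up, not output copied from Affinidi’s Schema Manager), a generated credential-subject schema and a quick validation against it might look something like the following, using the Ajv JSON Schema validator. The schema URL and field names are hypothetical.

```typescript
// Hypothetical example of a credential-subject schema and how an app might use it.
// The schema content and field names are invented for illustration, not Schema Manager output.
import Ajv from "ajv";

const employmentCredentialSchema = {
  $id: "https://schemas.example.org/EmploymentCredential/v1", // placeholder URL
  type: "object",
  properties: {
    employeeName: { type: "string" },
    employerName: { type: "string" },
    startDate: { type: "string" },
  },
  required: ["employeeName", "employerName", "startDate"],
  additionalProperties: false,
};

const ajv = new Ajv({ strict: false });
const validateSubject = ajv.compile(employmentCredentialSchema);

// An issuer (or verifier) can check a credential subject against the shared schema,
// which is what gives both sides a common definition to trust.
const subject = { employeeName: "Alice Doe", employerName: "Acme Ltd", startDate: "2021-01-04" };
if (!validateSubject(subject)) {
  console.error(validateSubject.errors);
}
```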

Group Discussion: What does NOT get adoption for SSI? AKA Failures in SSI

Kamal Laungani, the Global Developer Ecosystem (GDE) Manager at Affinidi, was the convenor of a group discussion on the failures of SSI and the reasons for its non-adoption.

This group of eminent participants talked about how hard it is to adopt SSI and deliberated on the reasons why.

Some of the barriers discussed were:

Onboarding for humans

There is a lack of understanding on why privacy is important and what can be done to preserve one’s privacy.

Also, there is not a whole lot of simple content that explains what Self-Sovereign Identity (SSI) is and how it can be a potential game-changer for the security and privacy of online identities.

Onboarding for IT and DevOps and IAM

There is a gap between community knowledge and the information available for IT, DevOps, and IAM professionals. Further, the lack of a well-defined ecosystem and a limited or non-existent incentive model are making adoption more difficult.

Short term barriers vs clarity of immediate benefits

Since SSI is an emerging field, there are many short-term barriers, such as development costs, the lack of a well-developed and easy-to-understand tech stack, and scarce Web 3.0 skill sets. Also, there is little clarity on the immediate benefits because the ecosystem is not in place and there seems to be a lot of uncertainty around government legislation.

Legacy integration

The big jump from Web 2.0 to Web 3.0 raises questions on integration with existing systems and apprehensions about building a complete infrastructure from scratch.

Who goes first?

Right now, this is a classic chicken and egg problem. Companies have few incentives to build applications and are waiting for greater adoption. In turn, users are apprehensive as well because there aren’t enough applications in this space.

So, this question of who goes first is impeding the growth of SSI greatly.

High Project Costs

The overall uncertainty, coupled with high development costs, is a barrier to SSI adoption: because project costs and risks are high, wrangling internal buy-in is not easy.

All these reasons were believed to contribute to the low rate of SSI adoption, and this has led every attendee to ponder over what can be done to address these concerns.

Based on these discussions, we, at Affinidi, have also started pondering over questions on the high costs, how to make it easy for people to know the importance of preserving their privacy through SSI, and what we can do as an organization to drive its adoption.

Attending Other Programs

Besides the Schema Manager demo and the group discussion, we attended many discussions, workshops, demos, and events, all of which have greatly enriched our knowledge and understanding. We hope to convert these insights into actionable items shortly, so make sure you stay abreast of all that’s happening at Affinidi by being a part of our mailing list and following us on LinkedIn, Twitter, and Facebook.

If you have any specific questions or feedback, reach out to us on Discord.

The information materials contained in this article are for general information and educational purposes only. It is not intended to constitute legal or other professional advice.

Affinidi’s Presence at IIW was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.

Saturday, 23. October 2021

FindBiometrics

Payments, Financial Services, and the PKI Fallacy: This Week’s Top Biometrics Stories

Money is top of mind in this week’s roundup of FindBiometrics’ top stories, with three of the most popular articles concerning payments or financial services. But readers also showed strong interest this week in airport biometrics, and in a forceful argument about how authentication should be approached from an industry leader.

The industry leader, FaceTec, was represented by North American Operations VP Jay Meier in a new interview with FindBiometrics. The discussion touches on a number of important topics, but its primary focus is on what Meier calls ‘the PKI Fallacy‘, an approach to identity verification that FaceTec’s leadership feel has some serious drawbacks:

INTERVIEW: FaceTec’s Jay Meier Wants the Identity Industry to Understand ‘the PKI Fallacy’

As for the topic of airport biometrics, it was broached by SITA, which got some attention this week with its argument that in order to recover from the hit delivered by COVID-19, airports must accelerate their digital transformations. That means embracing tech-driven innovations like mobile self-screening apps and facial recognition to enable passenger identification:

SITA Highlights Urgent Need for Digital Transformation in Air Travel

Turning to fiscal matters, Fingerprint Cards got a lot of reader attention this week with survey data indicating that consumers are getting more interested in biometric payment cards. Survey polling was focused on France, and found that 59 percent of respondents were willing to use the kind of fingerprint-scanning payment cards that FPC has been working on with its financial services partners:

FPC Survey Points to Accelerating Biometric Payment Card Market

There was also some news about a biometric payments system that doesn’t even require a card. In Moscow, commuters can now take advantage of a system that lets them pay their subway fares with a simple face scan – a development that has prompted some serious concerns about privacy and surveillance:

Moscow Metro Moves Forward With Launch of Face Pay Naked Payments System

And finally, a guest post from HID Global’s Wladimir Alvarez saw strong, sustained reader interest this week. Alvarez stepped up onto the FindBiometrics platform to draw some attention to the enormous opportunities in the Latin American financial services market – and the serious threat of fraud as criminals have sought to exploit accelerating financial activity online:

How LATAM Banks Can Face Down Fraud Amid Booming Digital Business

*

Stay tuned to FindBiometrics for the latest news from the exciting world of biometrics. You can also visit our sibling site Mobile ID World to read the latest about digital identity.

October 23, 2021 – by Alex Perala

The post Payments, Financial Services, and the PKI Fallacy: This Week’s Top Biometrics Stories appeared first on FindBiometrics.

Friday, 22. October 2021

FindBiometrics

IDEMIA I&S North America Appoints Chief Technology Officer

IDEMIA’s North American identity and security arm is welcoming an important new member to the executive team, naming Douglas Harvey its new Chief Technology Officer.

In announcing the appointment, IDEMIA noted Harvey’s previous work with Computer Science Corporation and L3Harris, both companies that operate in the defense technology space. IDEMIA highlighted Harvey’s experience at L3Harris in particular, explaining that he led the company’s efforts working with the Federal Aviation Administration to develop its System Wide Information Management Cloud Distribution Service.

Harvey’s appointment comes after the promotion of Donnie Scott to the position of IDEMIA I&S North America’s CEO earlier this year. At the time of his appointment, Scott noted that the IDEMIA arm has “ambitious growth objectives”; now, it appears that Harvey will be an important figure in helping to meet them. In his new role with IDEMIA, Harvey will be tasked with accelerating product design and innovation, as well as establishing new business channels for IDEMIA I&S North America.

“IDEMIA is thrilled Doug has joined our leadership team, boosting our collective expertise in digital technology and product innovation,” Scott said. “Doug’s experience spans more than two decades and multiple technology sectors. His insights will be critical as we work to elevate IDEMIA’s position in industry and across the technology sector.”

For his part, Harvey called it “an honor” to join the ranks of IDEMIA, praising the company for its position as an innovator and industry leader. “I am excited to put my experience to work and collaborate on an accelerated technological vision for the company as it solidifies its place as the global leader in security and authentication,” he said.

IDEMIA I&S North America is, among other things, the provider of technological solutions for TSA PreCheck, the Transportation Security Administration’s expedited passenger processing solution, which reached the important milestone of 12 million enrollments earlier this month.

October 22, 2021 – by Alex Perala

The post IDEMIA I&S North America Appoints Chief Technology Officer appeared first on FindBiometrics.


Authenticate Conference Gains Momentum in 2021

This article is a guest contribution by identity industry expert Darrell Geusz, Senior Product Manager, Ping Identity.

Highlights and Trend-Setter Standouts

The Authenticate 2021 conference, only in its second year, has wrapped and the plenary sessions for standards body work under the FIDO Alliance are underway. The obvious questions that came to mind for the conference, especially given it was a conference started during a pandemic: How well was it attended? Were there any trend-setting end-user organizations that presented? Were there any new products, technologies or use cases that were featured? The short answers are: well, yes, and yes!

In-person attendance reached approximately 250 attendees, with good safety measures in place. No doubt we had to quickly get over the surreal nature of networking among security and identity experts while donning masks – symbology frequently used in the industry to represent the hackers! Virtual attendance was more than three times the in-person numbers, and the tech and orchestration of the virtual speakers was very smooth. Considering the conference got its start only last year during a pandemic, the numbers are not bad at all. Some standout presentations are highlighted here:

Dave Kleidermacher, VP of Engineering for Android Security & Privacy at Google, presented four solutions for key challenges of digital safety for users. Kleidermacher sees a day when each user retains exclusive access to their private information processed by systems and services, so that no one unauthorized has access to the plain-text information, not even the service providers (Private Computing). Safety in public content platforms would be ensured via ‘Radical Curation’, where subject matter experts review apps (that opt in) related to their space to ensure ‘good nutrition’ (e.g. teachers reviewing apps for kids). Transparency regarding the security and privacy ‘ingredients’ inside of products would come via certification labels that ‘raise the tide’ so users know how the apps and devices are protecting them.

And finally, and most exciting to this geeky nerd, Kleidermacher emphasized how digital ID wallets on smartphones, tablets, laptops and other devices can bring easy to use and strong access to websites, applications and IoT devices. Kleidermacher called this the “miracle of unphishable authentication” and reinforced that FIDO passkeys will be available in Google’s digital wallet, including APIs for developers, linked to or in addition to government and other ID types.

Anand Bahety, Software Engineering Manager at eBay, presented a sobering and refreshing view on the challenges of implementing FIDO within a production environment. Many presenters would gloss-over these challenges. Instead, Bahety helped the audience understand how to approach these challenges head-on and mitigate them before going live. Bahety also revealed key gaps in the FIDO specification that need to be addressed through updated specifications in the future, including key APIs that should exist but don’t, and the inability for a user to register using the same device with the same single service provider (relying party) across various applications or websites. Today, using FIDO, you have to re-register for each and every app or website that the relying party hosts.
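For readers newer to FIDO, the registration ceremony being described is, on the web, the WebAuthn navigator.credentials.create() call sketched below; every value shown is a placeholder. Because the resulting credential is scoped to the relying party identifier (rp.id), separately hosted apps or sites generally each need their own ceremony, which is the re-registration pain point Bahety raised.

```typescript
// Minimal WebAuthn registration sketch (browser-side). All values are placeholders.
// The credential created here is scoped to rp.id, which is why an organisation's
// separately-domained apps each need their own registration ceremony today.
async function registerPasskey(challengeFromServer: ArrayBuffer, userId: ArrayBuffer) {
  const credential = await navigator.credentials.create({
    publicKey: {
      challenge: challengeFromServer, // random challenge issued by the relying party
      rp: { id: "example.com", name: "Example Corp" }, // placeholder relying party
      user: { id: userId, name: "alice@example.com", displayName: "Alice" },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
      authenticatorSelection: { userVerification: "preferred" },
      timeout: 60_000,
    },
  });
  // The attestation response is sent back to the relying party for verification and storage.
  return credential as PublicKeyCredential;
}
```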

Itai Zach, Senior Product Specialist, and Anthony Bahor, Customer Success Manager, from Ping Identity reminded the audience that implementing passwordless within an enterprise or for customers can bring major benefits if it is treated as a “journey” rather than just a program or project. Zach gave a very logical high-level walkthrough of the journey template to succeed in passwordless deployments. Bahor followed up by providing first-hand experience in the details when planning and executing each step along the way. The nuances of how to avoid obstacles and glide over the speedbumps during activities along the journey to get to a passwordless experience were very helpful. There is no doubt taking these experience-based recommendations when it comes to FIDO and implementing passwordless will make life a lot easier.

Tom Sheffield, Senior Director of Cybersecurity at Target, gave us an impressive overview of how FIDO was rolled out for the company’s workforce. Amazingly, Target discovered how important creating an internal brand and related iconography was to help educate users and drive adoption. Target also learned that trying to force adoption, including mandating a particular FIDO authenticator for all users (e.g. a particular hardware device or methodology), would not work in their environment. Users had to be convinced to opt in on their own and had to be afforded flexibility for their specific work environment and habits. Some excellent program metrics were also presented by Target, including how and when to measure each metric, useful to anyone implementing passwordless for their workforce. Very impressive indeed to see a giant like Target working smart deploying FIDO throughout their ecosystem.

Kayla Shapiro, Production Engineer at Facebook, highlighted the work her team did building an end-to-end trusted environment for employees that also incorporates FIDO. Shapiro masterfully delivered understandable and meaningful content around topics like “key exfiltration” and components like “Merkle trees,” and artfully presented these aspects in the context of identity (as opposed to generic PKI or blockchain, as is typically the case). It was also clear to the audience how these aspects differentiate the solution presented. Shapiro mentioned their work is being considered for open-source release – something to definitely keep an eye out for.

These kinds of presentations certainly were refreshing, as compared to only listening to how a standard workflow happens or has been improved. Highlighting concrete information regarding real-world implementations of FIDO, including metrics, really builds confidence that the FIDO standard has come into its own and is ready for primetime. Organizations looking for the “miracle of unphishable authentication” to strengthen their security posture, while simultaneously improving the user experience, should consider FIDO to get things started. It’s clear the next step is for those same organizations to demand that FIDO be combined with the strong root-of-trust incorporated in mobile IDs and driver licenses now being provisioned by government issuers and being inserted in various wallets. This combination has a real chance to truly digitize and revolutionize how we engage daily with in-person, online and IoT systems and applications, and then yes, maybe… just maybe… the password will be dead. 

We can’t wait to attend next year’s Authenticate conference, when hopefully someone will be showcasing this exact combination! Until then, we are pleased to see FIDO stepping up to the table to begin independent lab testing of ID verification vendors starting early next year (i.e. UX/UI and image quality assurance and back-engine ML/AI processing for physical government ID authentication and selfie-to-ID photo matching with liveness). This will be very helpful as a stopgap until mobile IDs become more ubiquitous across jurisdictions.

We would be remiss not to also mention the important activities sponsored by the OpenID Foundation (OIDF) that took place the same week as Authenticate 2021, inviting FIDO Alliance members to engage and participate:

Global Assured Identity Network (GAIN) – An initiative kicked off by a white paper with 150 authors, focused on a common-sense approach to the challenge of international interoperability of digital identity. An overview of the program and the planned proof of concept was presented, along with an invitation for FIDO Alliance members to collaborate and participate.

Interconnecting Mobile Driver Licenses and OIDF Protocols with FIDO – Established standards (e.g. ISO 18013-5 and OpenID Connect SIOP), and standards in development such as ISO 18013-7 and the workgroup effort under ISO 23220, can benefit greatly from considering or interconnecting complementary FIDO standards to enable safer and more secure online experiences that take advantage of the mobile driver license data and root-of-trust.

Shared Signals and Events Framework (SSE) – OIDF recently launched a new standard designed to fight fraud, securely and privately, through the sharing of security events, state changes, and other signals between related and/or dependent systems. By standardizing how organizations share various fraud, risk, and identity proofing/vetting signals, along with built-in access control mechanisms and additional FIDO standards, errors and omissions would be reduced, consent and approval can be realized, and dealing with exceptions becomes a lot faster and easier.

These sessions were icing on the cake for the week. All-in-all, Authenticate 2021 packed a good punch of industry progress backed up with real-world results and metrics, topped off with a glimpse of the future where things can go when mashing up key standards. I look forward to what’s in store at Authenticate 2022!

The post Authenticate Conference Gains Momentum in 2021 appeared first on FindBiometrics.


FIDO Authentication Barometer Shows Rise of Biometrics

The FIDO Alliance has launched a new Online Authentication Barometer to better understand what the world thinks about different authentication methods. The initial barometer is based on the feedback of […] The post FIDO Authentication Barometer Shows Rise of Biometrics appeared first on FindBiometrics.

The FIDO Alliance has launched a new Online Authentication Barometer to better understand what the world thinks about different authentication methods. The initial barometer is based on the feedback of 10,000 consumers in 10 countries all over the world, and will be updated in the future to track changes in people’s attitudes over time.

In that regard, FIDO believes that people will show a greater interest in biometric authentication as they become more familiar with the technology. However, passwords remain the method of choice at the current moment. That’s especially true in financial services, where more than half (56 percent) of the respondents had accessed an account with a password sometime in the past two months. Meanwhile, a full 19 percent of the public still believe that passwords are the strongest security method available online (despite the considerable evidence to the contrary).

Thankfully, there is evidence that suggests that those perspectives are shifting. Eighty-four percent of the public have taken steps to strengthen the security of their online accounts, indicating that most people are at least aware that passwords alone are not a sufficient security method. Biometrics (regardless of modality) are now the second most popular form of authentication, and were used by 35 percent of the respondents.

The shift toward biometrics reflects the fact that people are starting to place more trust in the technology. Just under a third (32 percent) regard biometrics as the strongest authentication method, while 28 percent would prefer to use biometrics over another form of verification. The overall trend was consistent for all 10 countries captured in the survey.

FIDO went on to suggest that education may be the best way to reach out to those who are still clinging to passwords. Of those who had not updated their security practices, 37 percent indicated that they did not know what steps to take, while 26 percent said that the process was too complicated. The increased support for passwordless technologies amongst tech giants like Apple and Microsoft is expected to help normalize alternative authentication methods.

The first Online Authentication Barometer survey was conducted in September, and captures the responses of consumers in the UK, France, Germany, Australia, Singapore, Japan, South Korea, India, China, and the United States. The Alliance’s first in-person Authenticate conference, meanwhile, is taking place this week in Seattle, Washington.

(Originally posted on Mobile ID World)

The post FIDO Authentication Barometer Shows Rise of Biometrics appeared first on FindBiometrics.


TruNarrative Provides Onboarding Services for Al Rayan Bank

Another financial institution is turning to TruNarrative for customer onboarding and risk screening. The company’s latest customer is Al Rayan Bank, which was founded in 2004 and remains the oldest […] The post TruNarrative Provides Onboarding Services for Al Rayan Bank appeared first on FindBiometrics.

Another financial institution is turning to TruNarrative for customer onboarding and risk screening. The company’s latest customer is Al Rayan Bank, which was founded in 2004 and remains the oldest and largest Islamic bank in the United Kingdom.

The TruNarrative partnership is in keeping with the bank’s broader digital transformation efforts. Al Rayan will integrate TruNarrative’s technology into its existing tech stack and its digital banking app to enable remote onboarding for businesses and individual customers. TruNarrative will also provide ongoing monitoring services to watch for potential signs of fraud.

On the onboarding front, the TruNarrative platform offers both face and document recognition. Each user’s personal information will be cross-referenced with various third-party databases to make sure that the data is accurate, and to ensure that the user is not facing any sanctions and is not on any lists of politically exposed persons. Facial recognition, meanwhile, will match a selfie to the official ID to ensure that the person is indeed the true owner of the identity document.

TruNarrative will then assign a risk score to each transaction after the account has been opened. The company will watch for anomalous behavior that could point to the presence of a fraudster, which in turn will help prevent fraud and other forms of financial crime. The solution will help Al Rayan comply with Know Your Customer and Anti-Money Laundering requirements. Al Rayan will also benefit from TruNarrative’s centralized case management tools, and from its integration with the UK’s Cifas fraud database, which will speed up fraud investigations.

“[TruNarrative’s] technology means that we do not have to piece together this part of our tech stack from multiple different suppliers, giving us end to end onboarding and fin-crime in one place with full audit trail and reporting,” said Al Rayan COO Imran Pasha.

The deal builds on TruNarrative’s strong customer base in the UK. The company is already providing onboarding and fraud prevention services for the banking start-up MoneeMint, the gambling platform The Tote, and for the RegTech firm ieDigital.  

October 22, 2021 – by Eric Weiss

The post TruNarrative Provides Onboarding Services for Al Rayan Bank appeared first on FindBiometrics.


Florida Seeks Apple Collaboration Ahead of Mobile ID Launch

“…the Florida Smart-ID app will not function in exactly the same way as Apple’s mobile ID.” Florida can be added to the list of states working with Apple on a […] The post Florida Seeks Apple Collaboration Ahead of Mobile ID Launch appeared first on FindBiometrics.
“…the Florida Smart-ID app will not function in exactly the same way as Apple’s mobile ID.”

Florida can be added to the list of states working with Apple on a mobile, virtual ID solution.

The news, first reported by Florida Politics, came by way of a state Senate panel last week, at which representatives of the Florida Department of Highway Safety and Motor Vehicles (FLHSMV) confirmed the Apple collaboration.

The official acknowledged that Florida hadn’t been on the list of states officially working with Apple when the tech giant offered an update on its mobile ID efforts at the end of summer. At the time, the list included Arizona, Connecticut, Georgia, Iowa, Kentucky, Maryland, Oklahoma, and Utah.

Terrence Samuel, Motorist Modernization Director at the FLHSMV, reportedly expressed surprise that Florida hadn’t been included on the list. But the explanation may lie in the fact that Florida’s mobile ID efforts were not being led by Apple. Rather, the state has been working on its own digital driver’s license system since 2014, and is now preparing to launch its Florida Smart-ID app – on iOS and Android – in mid-November. The FLHSMV had been presenting to the state Senate on that effort generally, rather than on the Apple Mobile ID program in particular.

As described in the Florida Politics report, the Florida Smart-ID app will not function in exactly the same way as Apple’s mobile ID. While the latter appears to communicate ID holder data via NFC, Florida Smart-ID will require the ID holder to scan a QR code from any official asking to see ID (such as a law enforcement officer), and will then bounce the request to a third party server to retrieve the information needed.

However, like Apple’s mobile ID, the Florida Smart-ID app will share information selectively, based on the context and the requesting party. A clerk asking for proof of age at a store will not be sent the same biographic information as a police officer, for example.

It isn’t yet clear whether the Florida Smart-ID app will have a similar identity confirmation process as that of Apple’s mobile ID, which will require users to submit selfie videos in which they perform specified face and head movements during the onboarding process. Apple will use facial recognition to match end users to images of their photo IDs.

In any case, the FLHSMV appears keen to work with Apple to get Florida’s mobile ID integrated into the Apple Wallet, with Samuel having told the Senate panel that there is “nothing that we see that would prevent us from being on the list” of Apple mobile ID partners.

Whether Florida’s app is adapted into Apple’s mobile ID platform or not, the state’s plans for an imminent launch of a mobile ID system reflect growing enthusiasm over the technology, with the government of Ontario – Canada’s most populous province – also currently working on its own mobile ID solution, as another example.

Sources: Florida Politics, 9to5Mac, MacRumors

(Originally posted on Mobile ID World)

The post Florida Seeks Apple Collaboration Ahead of Mobile ID Launch appeared first on FindBiometrics.


Touchless Technology to Drive $37.6 Billion Gesture Recognition Market

MarketsandMarkets is anticipating major growth in the Gesture Recognition and Touchless Sensing Market. The firm believes that the market will nearly triple in the next few years, jumping from $13.6 […] The post Touchless Technology to Drive $37.6 Billion Gesture Recognition Market appeared first on FindBiometrics.

MarketsandMarkets is anticipating major growth in the Gesture Recognition and Touchless Sensing Market. The firm believes that the market will nearly triple in the next few years, jumping from $13.6 billion in 2021 to $37.6 billion in 2026. Those numbers correspond to a CAGR of 22.6 percent.
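
For readers who want to sanity-check the headline figure, the growth rate follows directly from the two endpoints using the standard CAGR formula; the short TypeScript snippet below is illustrative only and simply reproduces the arithmetic.

// Illustrative only: verify the implied CAGR from the two forecast endpoints.
// CAGR = (endValue / startValue) ** (1 / years) - 1
const startValue = 13.6; // USD billions, 2021
const endValue = 37.6;   // USD billions, 2026
const years = 5;
const cagr = Math.pow(endValue / startValue, 1 / years) - 1;
console.log(`Implied CAGR: ${(cagr * 100).toFixed(1)}%`); // prints roughly 22.6%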

According to MarketsandMarkets, the demand for touchless technologies will outpace the demand for touch-based gesture recognition solutions. That trend reflects the impact of COVID-19, which has made people more aware of the health risks associated with devices that require physical contact. Touchless user interfaces are expected to be particularly popular in the next generation of smart vehicles, where they can help people navigate menus more quickly, and without taking their eyes off the road.

Touch-based gesture recognition technologies will still turn up in consumer electronic devices such as gaming systems, due in large part to their low cost and ease of integration. North America will be the biggest market for gesture recognition tech, though the Asia Pacific region will display a higher CAGR. Countries like China, India, South Korea, and Japan are using gesture recognition in government-backed smart city projects, while consumers have expressed interest in devices (such as TVs and smartphones) with gesture recognition capabilities.

The growing demand for touchless technologies in the APAC region will coincide with a growing demand for biometric technologies more generally. Biometric tech can improve security, while touchless tech will help meet the demand for more hygienic solutions.

MarketsandMarkets listed HID Global, Infineon, Cognitec, and iProov as some of the biggest players in the gesture recognition space, placing them alongside perennial tech titans like Microsoft, Qualcomm, Apple, and Google. The numbers are in keeping with the firm’s 2020 report on the gesture recognition market, which predicted a $32.3 billion market for 2025. Valuates similarly expects the market to reach $34.3 billion in the same time frame.

October 22, 2021 – by Eric Weiss

The post Touchless Technology to Drive $37.6 Billion Gesture Recognition Market appeared first on FindBiometrics.


Onfido Survey Finds Consumers Getting Comfortable With Online Services

Onfido and Okta have released a new report that suggests that companies are now racing against the clock when they try to onboard new customers online. The report found that […] The post Onfido Survey Finds Consumers Getting Comfortable With Online Services appeared first on FindBiometrics.

Onfido and Okta have released a new report that suggests that companies are now racing against the clock when they try to onboard new customers online. The report found that while people are more comfortable using online services, they expect the onboarding process to take less than 10 minutes, and will lose confidence in businesses that exceed that time.

The findings in the Digital by Default report are based on the responses of over 1,000 people in France, and another 4,000 people in the US, the UK, and the Netherlands. The survey itself was conducted in September of 2021.

Digging into the details, French consumers were the most enthusiastic about digital services, with the overwhelming majority (94 percent) indicating that they feel comfortable when accessing things online. Many (43 percent) said that their comfort level increased during the pandemic, which suggests that people become more receptive to digital services the more that they have to use them. In that regard, COVID-19 led to increased adoption of digital services as people turned to remote service options during pandemic lockdowns.

The French respondents cited convenience as the biggest advantage that online channels have over their in-person counterparts. The majority (64 percent) appreciated the fact that they didn’t have to leave their home, while nearly half (44 percent) liked the fact that many online services are available 24/7. Many (48 percent) also felt that online services were safer than physical venues during the pandemic.

On the security front, 70 percent of the respondents expressed a willingness to use biometric authentication instead of a password. Meanwhile, 85 percent of those who have already used a biometric to open an account would be willing to do so a second time.

The trend was true across industries and countries, though consumers are particularly keen to use the internet while banking, when booking hotels, and when accessing telecom accounts. Seventy-eight percent of those in France want to be able to open an account in less than 10 minutes, regardless of the nature of that service.

“From the moment a consumer visits a service provider’s website or downloads an app, they’re evaluating whether the business can deliver a trusted digital service, providing security and keeping their data private,” said Onfido CEO Mike Tuchen. “Those that can offer low or zero friction during verification and authentication will positively differentiate themselves in a market where digital services have become the norm.”

Onfido itself has enjoyed record growth during the pandemic, a fact that reflects the rising demand for secure remote onboarding technology. Tiger Brokers and the BUX investment platform are some of the company’s most recent customers.

October 22, 2021 – by Eric Weiss

The post Onfido Survey Finds Consumers Getting Comfortable With Online Services appeared first on FindBiometrics.


Elliptic

DarkSide bitcoins on the move following government cyberattack against REvil ransomware group

$7 million in bitcoin held by the DarkSide ransomware group is on the move, five months after the attack on Colonial Pipeline that crippled fuel supplies along the US East coast. These funds had remained dormant since the group shut down on May 13.



Okta

A Quick Guide to Angular and GraphQL

Over the past five years, GraphQL has established itself as the most popular alternative to REST APIs. GraphQL has several advantages over traditional REST-based services. First of all, GraphQL makes the query schema available to the clients. A client that reads the schema immediately knows what services are available on the server. On top of that, the client is able to perform a query on a subset

Over the past five years, GraphQL has established itself as the most popular alternative to REST APIs. GraphQL has several advantages over traditional REST-based services. First of all, GraphQL makes the query schema available to the clients. A client that reads the schema immediately knows what services are available on the server. On top of that, the client is able to perform a query on a subset of the data.

Both of these features make the API much more flexible. The server can freely extend the API or make different parts of it available to different clients without breaking any client code. Another major advantage is the ability to perform complex queries. This reduces the number of API calls a client has to perform, and therefore improves performance.

In this tutorial, I will show you how to consume GraphQL in an Angular client. First, I will create a simple GraphQL server using Express. Then I will use the Apollo library to create a service in Angular that connects to the API and performs queries on it. I will implement a simple application that lets you browse the characters of the Star Wars franchise and look up details about their races and home planets. I will not assume any prior knowledge of Express, Angular, or GraphQL. I will assume that you are familiar with JavaScript and Node and have an up-to-date version of the npm package manager installed on your computer.

Prerequisites:

Node 14
Okta CLI

Table of Contents

Implement a GraphQL server with Express
Create an Angular GraphQL client
Integrate OIDC for auth
Add JWT authentication to the server
Add Okta to Angular
Learn more about Angular, GraphQL, and single-page applications

Implement a GraphQL server with Express

In this section, I will show you how to implement the GraphQL server using the Express framework. To start, open your terminal in an empty folder of your choice. This will be the folder that will contain the server project. Inside that folder, run the following command to initialize your Express project.

npm init -y

This will create a package.json in your folder containing the project information and dependencies. Next, you need to install the dependencies required for this project. In your terminal, run the following command.

npm i -E express@4.17.1 graphql@15.6.1 express-graphql@0.12.0 cors@2.8.5 \
  body-parser@1.19.0 csv-parse@4.16.3

To make the server work, you will need some data. I have chosen to keep it simple and use existing CSV data from Kaggle, a machine learning and data science community. Joe Young has put together Star Wars character data from the Fandom Wiki and made it freely available at this link https://www.kaggle.com/jsphyg/star-wars. You will need only two files from his collection for this project: characters.csv and species.csv. You might have to create a Kaggle account to download the data. Place the files into your project folder.
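
If you are curious what the parsed data will look like later on, csv-parse with the columns option turns each CSV row into a plain object keyed by the header row. The snippet below is just a sketch; the sample row is made up, and the real column names come from the downloaded files.

// Sketch only: csv-parse's `columns: true` option keys each row by the header line.
// The sample row is made up; real rows come from the downloaded characters.csv.
const parse = require('csv-parse/lib/sync');

const sample = 'name,homeworld,species\nLuke Skywalker,Tatooine,Human\n';
const rows = parse(sample, { columns: true });

console.log(rows[0].name);      // "Luke Skywalker"
console.log(rows[0].homeworld); // "Tatooine"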

GraphQL uses a global schema to define the queries allowed on the API and the data they return. Create a file schema.graphql and paste the following contents into it.

type Query {
  characters(offset: Int = 0, limit: Int = 10): CharactersResult
  character(name: String!): Character
  species(name: String!): Species
}

type CharactersResult {
  count: Int
  characters: [Character]
}

type Character {
  name: String
  height: Int
  mass: String
  hair_color: String
  skin_color: String
  eye_color: String
  birth_year: String
  gender: String
  homeworld: String
  species: String
}

type Species {
  name: String
  classification: String
  designation: String
  average_height: String
  skin_colors: String
  hair_colors: String
  eye_colors: String
  average_lifespan: String
  language: String
  homeworld: String
}

The Query type defines the entry point of the schema. It defines three queries: characters, character, and species. The data returned by these queries is defined in separate types in the schema file. Now, create a new file app.js, and fill it with the code below.

const express = require('express');
const cors = require('cors');
const { json } = require('body-parser');
const fs = require('fs');
const { graphqlHTTP } = require('express-graphql');
const { buildSchema } = require('graphql');
const parse = require('csv-parse/lib/sync');

const app = express()
  .use(cors())
  .use(json());

const schema = buildSchema(fs.readFileSync('schema.graphql', 'utf8'));

const characters = parse(fs.readFileSync('characters.csv', 'utf8'), { columns: true });
const species = parse(fs.readFileSync('species.csv', 'utf8'), { columns: true });

const root = {
  characters: (args) => {
    return {
      count: characters.length,
      characters: characters.slice(args.offset, args.offset + args.limit)
    };
  },
  character: (args) => {
    return characters.find((ch) => ch.name === args.name);
  },
  species: (args) => {
    return species.find((ch) => ch.name === args.name);
  },
};

app.use('/graphql', graphqlHTTP({
  schema,
  rootValue: root,
  graphiql: true,
}));

app.listen(4201, (err) => {
  if (err) {
    return console.log(err);
  }
  return console.log('Server listening on port 4201');
});

This is all the code you need to write to create the GraphQL server. The graphqlHTTP middleware registers the handlers on the /graphql route of the express server. The root object defines so-called reducers that will be called whenever a client queries the API. They are responsible for collecting and returning the data.
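
Under the hood, a GraphQL request is just an HTTP POST with a JSON body containing the query, so once the server is running (next step) you can exercise the endpoint from any HTTP client. The following is a minimal sketch, not part of the tutorial code, assuming a runtime with the Fetch API (Node 18+ or a browser).

// Minimal sketch: the GraphQL endpoint is plain HTTP, so any client can POST a query.
// Assumes the server from app.js is running on port 4201.
async function queryCharacters(): Promise<void> {
  const response = await fetch('http://localhost:4201/graphql', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      query: '{ characters(offset: 0, limit: 2) { count characters { name } } }',
    }),
  });
  const { data } = await response.json();
  console.log(data.characters.count, data.characters.characters);
}

queryCharacters();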

The nice thing about GraphQL is that you can immediately try it out. In your terminal, run the following command.

node app.js

Now open your browser at http://localhost:4201/graphql and you will be presented with an interactive console.

On the left side, you can try out queries. Give it a go by pasting the following query into the query editor in the browser and pressing the play button at the top.

{
  characters(offset: 2, limit: 3) {
    count
    characters {
      name
      homeworld
      species
    }
  }
  species(name: "Human") {
    classification
    homeworld
  }
}

You should see this result on the right side of the screen.

{ "data": { "characters": { "count": 87, "characters": [ { "name": "R2-D2", "homeworld": "Naboo", "species": "Droid" }, { "name": "Darth Vader", "homeworld": "Tatooine", "species": "Human" }, { "name": "Leia Organa", "homeworld": "Alderaan", "species": "Human" } ] }, "species": { "classification": "mammal", "homeworld": "Coruscant" } } } Create an Angular GraphQL client

Angular provides a command-line tool that makes it easy for anybody to set up and maintain an Angular project. The Angular CLI tool can be installed globally using npm by running the following command.

npm install -g @angular/cli@12.2.10

This package provides the global ng command. With this, open the terminal in a folder of your choice and create a new project by running the command below.

ng new graphql-client --routing --style=css

This will create a folder graphql-client, install the Angular framework into it, set up all the necessary toolchains, and create a default application skeleton that you can extend.

The Apollo library offers a convenient GraphQL client. Navigate into the new project folder and add it to the project by running the following command.

ng add apollo-angular

When asked to confirm the installation of the library, answer with y. Next, you will be asked to provide the endpoint URI. Type in http://localhost:4201/graphql and press enter. Now that the basic setup is done, you are ready to create a few components.

Start by creating three components: a Navbar for the page navigation, Home for the home page, and Browse, containing a page to let you browse through the query results. In the terminal, run the following commands.

ng generate component navbar --module=app
ng generate component home --module=app
ng generate component browse --module=app

In an editor of your choice, open src/app/app.component.html and replace its content with the following two lines.

<app-navbar></app-navbar>
<router-outlet></router-outlet>

There are only two components here. The first will show the navigation bar. The <router-outlet> is a special component in Angular. The Angular router is responsible for reading the navigation URL and deciding which component to show based on the requested path. The <router-outlet> will then show that component.

Next, open src/app/navbar/navbar.component.html and paste the code below into the file.

<div class="navbar"> <div> GraphQL Angular App </div> <nav> <ul> <li><a routerLink="/">Home</a></li> <li><a routerLink="/browse">Browse</a></li> </ul> </nav> </div>

Notice the routerLink properties on the anchor tags. These links are similar to relative links inside regular HTML. But instead of reloading the page, they instruct the router to replace the active component in the router-outlet.

The navigation bar could use some styling. Open src/app/navbar/navbar.component.css and fill in the CSS code below.

.navbar {
  padding: 8px 16px;
  background-color: #333333;
  color: #ffffff;
  display: flex;
  justify-content: space-between;
  align-items: center;
}

nav ul {
  display: flex;
  list-style: none;
  padding: 0;
}

nav li {
  margin-left: 8px;
  margin-right: 8px;
}

a {
  color: #ffffff;
  text-decoration: none;
}

Now, update the Home component by pasting the following line into src/app/home/home.component.html.

<h1>Consuming GraphQL with Angular</h1>

To style the component, open src/app/home/home.component.css and add the following code.

h1 {
  text-align: center;
  margin: 4rem;
}

Before you can start work on the Browse component, you need a service class that encapsulates the calls to the GraphQL API. This class is the core part of this application and demonstrates just how easy it is to communicate with the server using GraphQL. Open the terminal in your project folder and run the following command.

ng generate service characters

This will create a new file src/app/characters.service.ts. Open this file in your editor and replace its contents with the code below.

import { Injectable } from '@angular/core';
import { Apollo, gql, QueryRef } from 'apollo-angular';

export interface Character {
  name: string;
  homeworld: string;
  species: string;
}

export interface CharacterDetail extends Character {
  height: number;
  mass: string;
  hair_color: string;
  skin_color: string;
  eye_color: string;
  birth_year: string;
  gender: string;
}

export interface CharactersResult {
  count: number;
  characters: Character[];
}

export interface Species {
  name: string;
  classification: string;
  designation: string;
  average_height: string;
  skin_colors: string;
  hair_colors: string;
  eye_colors: string;
  average_lifespan: string;
  language: string;
  homeworld: string;
}

@Injectable({
  providedIn: 'root'
})
export class CharactersService {
  private charactersQuery: QueryRef<{characters: CharactersResult}, { offset: number}>;
  private findCharacterQuery: QueryRef<{character: CharacterDetail}, { name: string}>;
  private findSpeciesQuery: QueryRef<{species: Species}, { name: string}>;

  constructor(private apollo: Apollo) {
    this.charactersQuery = this.apollo.watchQuery({
      query: gql`query characters($offset: Int!) {
        characters(offset: $offset) {
          count
          characters {
            name
            homeworld
            species
          }
        }
      }`
    });

    this.findCharacterQuery = this.apollo.watchQuery({
      query: gql`query character($name: String!) {
        character(name: $name) {
          name
          height
          mass
          hair_color
          skin_color
          eye_color
          birth_year
          gender
          homeworld
          species
        }
      }`
    });

    this.findSpeciesQuery = this.apollo.watchQuery({
      query: gql`query species($name: String!) {
        species(name: $name) {
          name
          classification
          designation
          average_height
          skin_colors
          hair_colors
          eye_colors
          average_lifespan
          language
          homeworld
        }
      }`
    });
  }

  async getCharacters(offset: number): Promise<CharactersResult> {
    const result = await this.charactersQuery.refetch({ offset });
    return result.data.characters;
  }

  async findCharacter(name: string): Promise<CharacterDetail> {
    const result = await this.findCharacterQuery.refetch({ name });
    return result.data.character;
  }

  async findSpecies(name: string): Promise<Species> {
    const result = await this.findSpeciesQuery.refetch({ name });
    return result.data.species;
  }
}

This implementation of the service consists of three parts. At the top of the file, I declared the TypeScript types related to this service. Each type represents the result of a query. The second part is the constructor. Here, the Apollo queries are defined. The queries make use of the gql tag to template strings. This tag translates the strings to GraphQL documents. Finally, the member functions of the service make the queries publicly available to components in your application.

The queries are asynchronous and return promises that resolve the query result once a response from the server has been received.

With this, you can now implement the Browse component. Open src/app/browse/browse.component.ts and change its content to match the following code.

import { Component, OnInit } from '@angular/core';
import { Character, CharactersService } from '../characters.service';

@Component({
  selector: 'app-browse',
  templateUrl: './browse.component.html',
  styleUrls: ['./browse.component.css']
})
export class BrowseComponent implements OnInit {
  offset: number = 0;
  count: number = 0;
  characters: Character[] = [];

  constructor(private charactersService: CharactersService) {}

  async ngOnInit(): Promise<void> {
    await this.updateCharacters();
  }

  async updateCharacters() {
    const result = await this.charactersService.getCharacters(this.offset);
    this.count = result.count;
    this.characters = result.characters;
  }

  showPrevious() {
    return this.offset > 0;
  }

  showNext() {
    return this.offset + 10 < this.count;
  }

  async onPrevious() {
    this.offset -= 10;
    await this.updateCharacters();
  }

  async onNext() {
    this.offset += 10;
    await this.updateCharacters();
  }
}

The function updateCharacters() uses the CharactersService to obtain a list of characters from the server. The functions onPrevious() and onNext() are callbacks that can be used to page through the results. Now, open src/app/browse/browse.component.html and copy the following contents into it.

<div class="browse"> <h1>Browse Characters</h1> <table> <tr> <th>Name</th> <th>Homeworld</th> <th>Species</th> </tr> <tr *ngFor="let character of characters"> <td><a [routerLink]="['/character']" [queryParams]="{name: character.name}">{{character.name}}</a></td> <td>{{character.homeworld}}</td> <td><a [routerLink]="['/species']" [queryParams]="{name: character.species}">{{character.species}}</a></td> </tr> </table> <div class="pager"> <button class="prev" *ngIf="showPrevious()" (click)="onPrevious()"> prev </button> <button class="next" *ngIf="showNext()" (click)="onNext()"> next </button> </div> </div>

The component displays the character data in a table and provides navigation buttons to page through the results. It also provides links to routes that will display details of a character or its species. You will be implementing those routes later.

The CSS file src/app/browse/browse.component.css can be used to provide some styling to the browse component. Update the contents to the code below.

.browse {
  width: 100%;
  display: flex;
  flex-direction: column;
  align-items: center;
}

.browse > * {
  width: 100%;
  max-width: 800px;
}

table {
  margin-top: 16px;
  margin-bottom: 16px;
  border-collapse: collapse;
  border: 1px solid #bbbbbb;
}

tr:nth-child(odd) {
  background-color: #dddddd;
}

th, td {
  padding: 4px 8px;
  border-right: 1px solid #bbbbbb;
  border-left: 1px solid #bbbbbb;
}

.prev {
  float: left;
}

.next {
  float: right;
}

Now, create the two components that will display the character and species details. Open the terminal again and type in the following commands.

ng generate component character --module=app
ng generate component species --module=app

This will create the files for the two components. Open src/app/character/character.component.ts and paste the following code into it.

import { Component, OnInit } from '@angular/core';
import { ActivatedRoute } from '@angular/router';
import { CharacterDetail, CharactersService } from '../characters.service';

@Component({
  selector: 'app-character',
  templateUrl: './character.component.html',
  styleUrls: ['./character.component.css']
})
export class CharacterComponent implements OnInit {
  character!: CharacterDetail;

  constructor(private route: ActivatedRoute, private characterService: CharactersService) { }

  ngOnInit(): void {
    this.route.queryParams.subscribe(async (params) => {
      this.character = await this.characterService.findCharacter(params.name);
    });
  }
}

This is a simple component that reads the character’s name from the query parameters in the URL and then uses CharactersService to obtain the character detail. The corresponding template simply shows the queried data. Open src/app/character/character.component.html and copy the following contents into it.

<div *ngIf=character>
  <p><strong>Name:</strong> {{character.name}}</p>
  <p><strong>Mass:</strong> {{character.mass}}</p>
  <p><strong>Hair Color:</strong> {{character.hair_color}}</p>
  <p><strong>Skin Color:</strong> {{character.skin_color}}</p>
  <p><strong>Eye Color:</strong> {{character.eye_color}}</p>
  <p><strong>Birth Year:</strong> {{character.birth_year}}</p>
  <p><strong>Gender:</strong> {{character.gender}}</p>
</div>
<a [routerLink]="['/browse']">Back</a>

The component showing the species information follows the same pattern. Open src/app/species/species.component.ts and update the content to match the code below.

import { Component, OnInit } from '@angular/core';
import { ActivatedRoute } from '@angular/router';
import { CharactersService, Species } from '../characters.service';

@Component({
  selector: 'app-species',
  templateUrl: './species.component.html',
  styleUrls: ['./species.component.css']
})
export class SpeciesComponent implements OnInit {
  species!: Species;

  constructor(private route: ActivatedRoute, private characterService: CharactersService) { }

  ngOnInit(): void {
    this.route.queryParams.subscribe(async (params) => {
      this.species = await this.characterService.findSpecies(params.name);
    });
  }
}

Next, open src/app/species/species.component.html and paste in the code below.

<div *ngIf=species>
  <p><strong>Name:</strong> {{species.name}}</p>
  <p><strong>Classification:</strong> {{species.classification}}</p>
  <p><strong>Designation:</strong> {{species.designation}}</p>
  <p><strong>Hair Colors:</strong> {{species.hair_colors}}</p>
  <p><strong>Skin Colors:</strong> {{species.skin_colors}}</p>
  <p><strong>Eye Colors:</strong> {{species.eye_colors}}</p>
  <p><strong>Average Lifespan:</strong> {{species.average_lifespan}}</p>
  <p><strong>Language:</strong> {{species.language}}</p>
  <p><strong>Homeworld:</strong> {{species.homeworld}}</p>
</div>
<a [routerLink]="['/browse']">Back</a>

I already mentioned how the Angular router is responsible for deciding which component to render, based on the path in the URL. In the file src/app/app-routing.module.ts you can define these associations.

import { NgModule } from '@angular/core';
import { RouterModule, Routes } from '@angular/router';
import { HomeComponent } from './home/home.component';
import { BrowseComponent } from './browse/browse.component';
import { CharacterComponent } from './character/character.component';
import { SpeciesComponent } from './species/species.component';

const routes: Routes = [
  { path: '', component: HomeComponent },
  { path: 'browse', component: BrowseComponent },
  { path: 'character', component: CharacterComponent },
  { path: 'species', component: SpeciesComponent },
];

@NgModule({
  imports: [RouterModule.forRoot(routes)],
  exports: [RouterModule]
})
export class AppRoutingModule { }

You are now ready to test the application. Make sure your server is running as described in the previous section. Then open a terminal in the client Angular project folder and run the following command.

ng serve

Open your browser, navigate to http://localhost:4200 and you should see the homepage of your app. You can use the navigation bar to go to the browse page. You should see something like the image below.

Integrate OIDC for auth

Every good application needs some user control. Okta lets you add user authentication easily to your application.

Before you begin, you’ll need a free Okta developer account. Install the Okta CLI and run okta register to sign up for a new account. If you already have an account, run okta login. Then, run okta apps create. Select the default app name, or change it as you see fit. Choose Single-Page App and press Enter.

Use http://localhost:4200/login/callback for the Redirect URI and set the Logout Redirect URI to http://localhost:4200.

What does the Okta CLI do?

The Okta CLI will create an OIDC Single-Page App in your Okta Org. It will add the redirect URIs you specified and grant access to the Everyone group. It will also add a trusted origin for http://localhost:4200. You will see output like the following when it’s finished:

Okta application configuration:
Issuer:    https://dev-133337.okta.com/oauth2/default
Client ID: 0oab8eb55Kb9jdMIr5d6

NOTE: You can also use the Okta Admin Console to create your app. See Create an Angular App for more information.

Make a note of the Issuer and the Client ID. You will need them in the next steps.

Add JWT authentication to the server

Adding authentication to the server is easy. Open a terminal in the server’s project folder and install a few more dependencies by running the following command.

npm i -E @okta/jwt-verifier@2.3.0 express-bearer-token@2.4.0

Now, create a new file auth.js with the content below.

const OktaJwtVerifier = require('@okta/jwt-verifier');

const oktaJwtVerifier = new OktaJwtVerifier({
  clientId: '{yourClientID}',
  issuer: 'https://{yourOktaDomain}/oauth2/default'
});

module.exports = async function oktaAuth(req, res, next) {
  try {
    const token = req.token;
    if (!token) {
      return res.status(401).send('Not Authorized');
    }
    await oktaJwtVerifier.verifyAccessToken(token, 'api://default');
    next();
  } catch (err) {
    return res.status(401).send(err.message);
  }
};

This module exports an Express middleware that checks the bearer token of an incoming request and verifies its validity. To use this middleware, open app.js and add the following require statements to the top of the file.

const bearerToken = require('express-bearer-token');
const oktaAuth = require('./auth');

Then, update the creation of the Express server with the declaration of the middleware to match the following.

const app = express()
  .use(cors())
  .use(json())
  .use(bearerToken())
  .use(oktaAuth);

The bearerToken middleware extracts the bearer token from a request header. oktaAuth then checks the token. You can now run node app.js again to start the server. Only now, if you try to access the API you will get a 401 Unauthorized error response from the server.
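
If you want to confirm the protection manually before wiring up the client, you can send the same kind of request with and without an Authorization header. The sketch below is illustrative only and not part of the tutorial code; the token value is a placeholder for an access token issued by your Okta org, and it assumes a runtime with the Fetch API.

// Illustrative only: calling the now-protected endpoint with a bearer token.
// The token value is a placeholder; use a real access token from your Okta org.
const accessToken = '<your-access-token>';

async function queryWithToken(): Promise<void> {
  const response = await fetch('http://localhost:4201/graphql', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${accessToken}`,
    },
    body: JSON.stringify({ query: '{ characters { count } }' }),
  });
  console.log(response.status); // 401 without a valid token, 200 with one
}

queryWithToken();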

Add Okta to Angular

To allow the client to access the GraphQL API, you need to also add Okta authentication to the Angular application. Open a terminal in the client’s project folder and install the okta-angular dependency by running the following command.

npm install -E @okta/okta-angular@4.1.0 @okta/okta-auth-js@5.6.0

Open src/app/app.module.ts and create an OktaAuth instance by adding the following lines after the existing import statements.

import { OKTA_CONFIG, OktaAuthModule } from '@okta/okta-angular';
import { OktaAuth } from '@okta/okta-auth-js';

const config = {
  issuer: 'https://{yourOktaDomain}/oauth2/default',
  clientId: '{yourClientID}',
  redirectUri: window.location.origin + '/login/callback'
}

const oktaAuth = new OktaAuth(config);

Make sure to replace the {...} placeholders with your OIDC app settings.

Next, add OktaAuthModule to the array of imports in the NgModule configuration.

@NgModule({
  ...
  imports: [
    ...
    OktaAuthModule
  ],
  ...
})

Also, create a new providers property that configures the module by pasting the code below after the imports array.

providers: [{ provide: OKTA_CONFIG, useValue: { oktaAuth }}],

You now need to configure the router to accept the login callback from Okta. Open src/app/app-routing.module.ts and add the following import statement to the top of the file.

import { OktaAuthGuard, OktaCallbackComponent } from '@okta/okta-angular';

Update the routes configuration object to match the code below.

const routes: Routes = [
  { path: '', component: HomeComponent },
  { path: 'browse', component: BrowseComponent, canActivate: [ OktaAuthGuard ] },
  { path: 'character', component: CharacterComponent, canActivate: [ OktaAuthGuard ] },
  { path: 'species', component: SpeciesComponent, canActivate: [ OktaAuthGuard ] },
  { path: 'login/callback', component: OktaCallbackComponent },
];

Here you have added the OktaAuthGuard to the three routes that display character data. When a user who isn’t logged in attempts to access these routes, they will now automatically be redirected to the Okta sign-in page before gaining access.

In the last section, you added authentication to the server. This means that any access to the server now needs to provide an access token. To provide this token, open src/app/graphql.module.ts and replace the existing createApollo function with the implementation below.

import { setContext } from '@apollo/client/link/context';
import { OktaAuth } from '@okta/okta-auth-js';

const uri = 'http://localhost:4201/graphql';

export function createApollo(httpLink: HttpLink, oktaAuth: OktaAuth): ApolloClientOptions<any> {
  const http = httpLink.create({ uri });

  const auth = setContext(async (_, { headers }) => {
    const token = oktaAuth.getAccessToken();
    return token ? { headers: { Authorization: `Bearer ${token}` } } : {};
  });

  return {
    link: auth.concat(http),
    cache: new InMemoryCache()
  };
}

The function now expects an OktaAuth instance as the second argument. This means that you have to add it to the deps array in the NgModule provider for the Apollo options. Update the deps array to match the following.

deps: [HttpLink, OktaAuth]

Finally, you want to provide feedback to the user’s sign-in status and also provide manual ways of logging in and out of the application. Open src/app/navbar/navbar.component.ts and replace its contents with the code below.

import { Component } from '@angular/core';
import { OktaAuthStateService } from '@okta/okta-angular';
import { OktaAuth } from '@okta/okta-auth-js';

@Component({
  selector: 'app-navbar',
  templateUrl: './navbar.component.html',
  styleUrls: ['./navbar.component.css']
})
export class NavbarComponent {
  constructor(public authStateService: OktaAuthStateService, private oktaAuth: OktaAuth) {}

  async login() {
    await this.oktaAuth.signInWithRedirect();
  }

  async logout() {
    await this.oktaAuth.signOut();
  }
}

The OktaAuthStateService can be queried to provide the user’s status. The callbacks login() and logout() can be used to sign the user in and out. Now, open src/app/navbar/navbar.component.html and add the following lines as the first element inside the <nav> element.

<div>
  <button *ngIf="!(authStateService.authState$ | async)?.isAuthenticated" (click)="login()">Login</button>
  <button *ngIf="(authStateService.authState$ | async)?.isAuthenticated" (click)="logout()">Logout</button>
</div>

This will show a Login button if the user is not logged in and a Logout button if the user is logged in.

Congratulations! You have completed the implementation of an Angular app that consumes GraphQL from a Node/Express server. The application is secured with Okta’s authentication service. You can now start the app by running ng serve in your project folder. Make sure the server is also running as described above. Open your browser to http://localhost:4200 to open the app. When you click on Browse in the navigation bar, you will be taken to the Okta sign-in page. Once you have logged in, you will be redirected to the character browser. You should see something like the image below.

Learn more about Angular, GraphQL, and single-page applications

In this tutorial, I have shown you how to create a single-page application that consumes a GraphQL API using Angular. You’ve seen how to create a GraphQL schema that describes the queries and the results that the server supplies.

GraphQL provides a flexible way of implementing APIs that can be extended without breaking existing clients. Clients can specify the data they want to receive and can combine multiple requests into a single query. This makes GraphQL faster than traditional REST APIs because the number of API calls can be greatly reduced.

The app you have written consists of a server and a client. The server uses Express to parse incoming requests before passing them to the GraphQL middleware. The client is built using Angular and uses the Apollo library to interface with the GraphQL API. If you want to learn more about GraphQL, Angular, or single-page apps in general, feel free to follow the links below.

What Is Angular Ivy and Why Is It Awesome?
Quickly Consume a GraphQL API from React
Build a secure GraphQL API with Node.js
Stop Writing Server-Based Web Apps

You can find the code for this tutorial on GitHub at https://github.com/oktadev/okta-angular-graphql-example.

If you liked this tutorial, chances are you like others we publish. Please follow @oktadev on Twitter and subscribe to our YouTube channel to get notified when we publish new developer tutorials.

Thursday, 21. October 2021

KuppingerCole

Why Continuous API Security Is Key to Protecting Your Digital Business

In the era when data has replaced oil as the most valuable commodity, APIs have become an important logistical foundation of modern digital business. As a result, APIs have also become a popular target for cyber attackers, and therefore effective API security is essential. However, focusing only on the operational aspects is no longer enough. Security teams typically struggle to keep up with the

In the era when data has replaced oil as the most valuable commodity, APIs have become an important logistical foundation of modern digital business. As a result, APIs have also become a popular target for cyber attackers, and therefore effective API security is essential. However, focusing only on the operational aspects is no longer enough.

Security teams typically struggle to keep up with the increasing volume and scale of APIs, and traditional security and API management solutions do not address all API security challenges. A new approach is needed, according to Alexei Balaganski, Lead Analyst at KuppingerCole and Isabelle Mauny, co-founder and field CTO of 42Crunch.

Join these experts as they discuss the benefits of an integrated, continuous, and proactive approach to API security that combines proactive application security measures with continuous activity monitoring, API-specific threat analysis, and runtime policy enforcement.

Alexei Balaganski explains how the security and compliance risks that APIs are exposed to are shaping the future of API security solutions and provides an overview of the latest innovations in protecting the whole API lifecycle.

Isabelle Mauny shows how a continuous API security model can be achieved and shares a recent case study from a global manufacturer, with an overview of the 42Crunch Developer-First API Security Platform.




FindBiometrics

ThreatMark Brings in $3 Million and a New CEO

ThreatMark has raised some significant capital and appointed a new CEO as the anti-fraud company positions to expand its market presence. The firm has announced that it has raised $3 […] The post ThreatMark Brings in $3 Million and a New CEO appeared first on FindBiometrics.

ThreatMark has raised some significant capital and appointed a new CEO as the anti-fraud company positions to expand its market presence.

The firm has announced that it has raised $3 million in a funding round led by Springtide Ventures, indicating in a statement that the capital will help to fuel ThreatMark’s expansion in the European and North American markets. Seasoned tech executive Daniel Rawlings, meanwhile, is the company’s new CEO and President.

Rawlings comes to the company from a previous position as Chief Commercial Officer of Trustonic, another digital security specialist. He has also held Chief Revenue Officer positions with Tyfone, ID Analytics, and SMSI, and he has served in VP roles with Oracle, Perfect Commerce, and SAP. Rawlings previously served as CEO of SourcingLink from 2001 to 2003.

In taking his position with ThreatMark, Rawlings is taking the reins from co-founder and previous CEO Michal Tresner, who will stay on as Chairman of the Board and will also take on the role of Chief Product Officer & Head of Solution Strategy. Commenting on the appointment, Tresner framed it as a very important step forward for the company.

“Dan’s appointment is the result of an extensive global talent search we conducted to find just the right person, and I am delighted to have Dan join the company,” he said. “The experience, industry expertise and relationship network that Dan brings to ThreatMark is a big step forward for us. Now we’re stronger and better equipped to help our customers around the world succeed in preventing digital fraud both today and in the future.”

As for Springtide Ventures’ interest, the firm’s COO, David Marek, explained that with digital fraud being a critical threat for the financial services industry, “ThreatMark is perfectly positioned to lead the market with their advanced solution.” That solution is designed to analyze behavioral biometrics and other data patterns to assess fraud risks in online transactions, and received praise from Forrester earlier this year in its “Now Tech: Enterprise Fraud Management” report.

October 21, 2021 – by Alex Perala

The post ThreatMark Brings in $3 Million and a New CEO appeared first on FindBiometrics.


BIO-key Brings PortalGuard to EDUCAUSE 2021

BIO-key is continuing its road trip with a stop at the 2021 EDUCAUSE Conference in Philadelphia. The company has already attended two conferences in October, after putting in appearances at […] The post BIO-key Brings PortalGuard to EDUCAUSE 2021 appeared first on FindBiometrics.

BIO-key is continuing its road trip with a stop at the 2021 EDUCAUSE Conference in Philadelphia. The company has already attended two conferences in October, after putting in appearances at the NASCIO Conference and Connect:ID earlier this month.

The EDUCAUSE Conference is dedicated to higher learning, and will run from October 26-29 at the Pennsylvania Convention Center. The event gives educators a chance to talk about the issues that are affecting the industry, and to view solutions that can potentially address them.

For its part, BIO-key noted that cybersecurity has become a pressing concern, since academic institutions (like virtually every other institution) faced a growing volume of cyberthreats when they transitioned to a remote work and remote learning environment. Academic institutions now need a way to protect their students without disrupting their core educational mission.

That’s why BIO-key will specifically be showcasing its PortalGuard IAM platform, which can help simplify an organization’s cybersecurity set-up. To that end, the platform enables Single Sign-on capabilities for a number of different applications, and supports more than 16 different authentication options (including biometric options) for those who want to move away from passwords. Those features have made PortalGuard extremely popular with academic institutions, to the point that more than 200 higher learning facilities are now using the service.

“Now more than ever we need to come together to transform the education sector and ensure that the right cybersecurity safety measures are put into place,” said PortalGuard President Mark Cochran. “We look forward to connecting with higher education IT professionals to discuss how we can help evolve their cybersecurity strategies together, giving them IAM solutions that are built for them, delivered with the resources they have today.”  

BIO-key will be conducting demos of PortalGuard at booth 638, in addition to participating in Q&A sessions throughout the event. The platform also has a self-service password reset feature that can help reduce the number of calls to the IT department.

October 21, 2021 – by Eric Weiss

The post BIO-key Brings PortalGuard to EDUCAUSE 2021 appeared first on FindBiometrics.


IBM Blockchain

Fueling the financial industry with open source cross-border payments

Financial services, payments, streamlining inefficient processes is in our blood. So, it’s no surprise my colleague and contributor to this article, Nitin Gaur and his team, tackled the unthinkable four years ago by building a global payment network addressing remittance and interbank payments. As if that wasn’t challenge enough, the team also designed it for […] The post Fueling the financial i

Financial services, payments, streamlining inefficient processes is in our blood. So, it’s no surprise my colleague and contributor to this article, Nitin Gaur and his team, tackled the unthinkable four years ago by building a global payment network addressing remittance and interbank payments. As if that wasn’t challenge enough, the team also designed it for […]

The post Fueling the financial industry with open source cross-border payments appeared first on IBM Blockchain and Supply Chain Blog.


auth0

Should You Give Users Access Before They Register

The business case for lazy registration

Tokeny Solutions

Tokeny’s Talent|Eloi’s Story

Eloi Garrido is a Full Stack Developer at Tokeny Solutions.

Who are you?

My name is Eloi Garrido. I graduated as an electronics engineer with a master’s degree in embedded systems from Delft University of Technology (NL) and right away moved into programming. I lived in the Netherlands for several years, where I met my partner, and after several attempts I managed to convince her to come to the sunny south. Since my early years, I’ve been interested in technology and tinkering with anything that has a spark. Still, to this day, my parents blame me for any appliance that has stopped working at home.

How did you land at Tokeny Solutions?

In early 2017 I fell into the crypto rabbit hole and started preaching to my friends about it. However, at that point it was mostly a hobby and a bit of an obsession, and still something that, at the time, I never imagined could become a profession.

Soon after returning from the Netherlands to Barcelona, I started looking for my next professional adventure and landed at Tokeny. It felt like the planets aligned; the rest is history.

How would you describe working at Tokeny Solutions?

Working at Tokeny has been a pleasure: the team has been very welcoming since day one, and I’m surrounded by very talented and engaged people. The team has always been flexible and adaptable, even in the face of our current world situation, allowing me to grow and partake in interesting projects.

What are you most passionate about in life?

I’m a sucker for anything tech related. When I’m not checking out the latest toys, I like to gather friends and family for some relaxing long meals or gaming sessions.

What is your ultimate dream?

From a personal point of view, I dream about having a comfortable and joyful life with my loved ones, without any complications. On a global scale, I would love for all of us to stop fighting among ourselves over every detail and work together towards our general betterment; what couldn’t we achieve then?

What would you change in the world if you could?

A bit of a generic statement, but I would love to see hate disappear, or at least be greatly reduced, so that people would focus more on what makes us similar than on what differentiates us. That would leave us all with more time to enjoy life instead of wasting it on squabbles.

He prefers:

Coffee over Tea
Book over Movie
Work from home over Work from the office
Cats over Dogs
Text over Call
Burger over Salad
Mountains over Ocean
Beer over Wine
Countryside over City
Slack over Emails
Casual over Formal
Crypto over Fiat
Night over Morning


The post Tokeny’s Talent|Eloi’s Story appeared first on Tokeny Solutions.


OWI - State of Identity

The Buzz Behind Zero Trust


The Zero Trust model is the belief that no one should be trusted from inside or outside your network until their identity has been verified. Zero trust refers to the alignment of maturing identity practices, an established understanding of user behaviors, and the application of least-privilege access security policy decisions to trust boundaries. In this week's State of Identity, host Cameron D’Ambrosi debunks the buzz around zero trust architecture with Ryan Case, Chief Cybersecurity Architect at SailPoint. They provide first-hand guidance on identity-defined security architecture and how it enriches authorization with both contextual and intelligent access information to make objective decisions. In this session, you'll learn how to reduce friction for end users while enabling deliberate identity verification and policy decisions.


KuppingerCole

CSLS Speaker Spotlight: Deutsche Telekom CSO Thomas Tschersich on His Cybersecurity Predictions for 2022


by Fabian Süß

Thomas Tschersich, Chief Security Officer at Deutsche Telekom, served as an advisor in the preparation for the Cyber Council Panel on Cybersecurity Predictions 2022, which will see CISOs, CIOs, and CSOs discuss next year's cybersecurity threatscape on Wednesday, November 10 from 09:30 am to 10:10 am at Cybersecurity Leadership Summit 2021.

To give you a sneak preview of what to expect, we asked Thomas some questions about his predictions.

Could you give us a sneak peek into the “Cybersecurity Trends 2022”?

Yeah, actually, well, when I look at the recent developments, I believe that DDoS attacks will definitely be an ongoing trend next year as well. We see a couple of thousand attacks every month. And the second thing which will drive us will be, in my prediction, the ransomware topic, as this is still growing. We might see some new deepfakes: cases of CEO fraud with really deepfaked videos instead of just faked email conversations or phone calls. But that's, in a nutshell, what I do expect.

Which three Cybersecurity types of attacks do you expect being most threatening to your organization and why in 2022?

So I have no special type of attack in mind here. I would turn that around a little. What I expect to be most threatening, not only for us but for all organizations, is still that we don't have the basics under control. What do I mean by that? Having the basics under control means, for me, nothing more and nothing less than cleaning up our infrastructures by introducing software updates in a timely manner when new vulnerabilities are detected. We're all not good at doing so. And the result is that more than 95 percent of attacks are successful just because of missing software updates or bad configurations of systems.

This should keep us awake at night, and this should be priority number one, two, and three for next year: really getting it fixed. And so we need to bundle our forces; we need a better exchange between enterprises to share whether an update is working or not, so that not every company has to do the testing in every case, and we can become better just by sharing that knowledge. But we also need better support from vendors. There are still a lot of vendors, including huge ones, which are not really open and honest about their vulnerabilities and which are not really supporting their clients on a daily basis.

Which three Cybersecurity topics are most important to you and why in 2022?

Look, as the whole industry is now moving more and more into cloud services, the perimeter of the company is dissolving. So there is no perimeter to protect any longer, at least no perimeter like in the past; there is no longer the one and only perimeter. Therefore, I believe concepts like Zero Trust are becoming increasingly important, to bundle security more with the endpoint and, on the other side, with the data itself, and no longer with the corporate network infrastructure. So this is, for me, one definite trend we have.

The second thing, which is closely aligned with that, is that prevention alone is not sufficient for the future. It's more about having capabilities to monitor the status, to monitor your infrastructure, and then to be able to react fast if anomalies are detected. This is the biggest challenge for most organizations, as a lot of organizations were pretty much focused on shielding themselves, building a fence around their infrastructure, and that was more or less one hundred percent of their cyber protection. Nowadays, you have to assume that attackers will be successful at some point in time, and therefore you need to have these better monitoring capabilities, without, for sure, losing the preventive approach totally. So it's shifting the forces, and that's definitely one of the priorities for me and my team.

What are your three biggest challenges in implementing Cybersecurity and why?

I would say challenge number one is the shortage of resources, of skilled resources, in the market, and it's also challenge number two, three, four, five, six, seven, eight, and ten, and so on. That's the biggest issue currently: really finding the right people for your team, finding the right resources to deal with security, as the market is demanding so much at the moment. It is totally empty. So we're investing a lot in training and education for our people, to train them ourselves. That's the only way out at the moment. And yeah, I guess this is what we need to solve; otherwise, we get lost in the cybersecurity arena. It's not so much a technical issue, because there are a lot of technical solutions available and in place. It's more about the people to run those infrastructures.


ValidatedID

The importance of the legal evidence in electronic signatures

A court in California has ruled inadmissible a document signed with a large eSignature company's service. Validated ID collects evidence so that this doesn't happen.

The digital transformation of the education sector

The electronic signature improves the experience in education for students, teachers, parents, guardians and other school staff.

ValidatedID is now part of the AS4EDI2020 consortium

Validated ID participates in the AS4EDI20 project for the implementation of the CEF eDelivery AS4 profile in Europe.

Wednesday, 20. October 2021

KuppingerCole

Governance over hybrid SAP Environments – the ANZ Story


by Graham Williamson

Cloud adoption, and migration of on-premise applications to cloud services, is increasingly being undertaken by organisations wanting to leverage the business efficiencies that cloud infrastructure affords. For organisations with SAP environments there are impediments to a smooth journey. This report presents the responses to a survey of companies, located in the Australian-New Zealand region, that are using SAP for their enterprise resource planning. The current state of cloud adoption is explored and options for providing governance across hybrid environments are presented.

Enabling the Digital Business with Identity for the Internet of Things


by Martin Kuppinger

The Internet of Things (IoT) is ubiquitous. It is driving innovation in many industries, from automotive to retail or healthcare. However, without a strong approach to digital identities for connected things, digital transformation will struggle. Managing identities and the relationships between things, humans, and other entities is essential for successfully delivering new digital services. Accenture Memority delivers a platform that enables this, providing better time-to-value in delivering to the digital business.

Shyft Network

Shyft Network’s Guide To Compliant DeFi

A look into how we’re ensuring compliance without removing the “De” from DeFi.

Decentralized finance is the new frontier of investment and value propagation on the internet. It offers a new, innovative method for investing in assets. It removes the hassle of going through regulated channels and submitting many documents for verification of personal data. On the surface, this seems like an ideal solution for many people since it gives them a way to invest and keep their assets and earnings anonymous to those it doesn’t concern. It also introduces an added element of security for many individuals since they are in complete control of their finances.

Unfortunately, DeFi also opens up a nest of issues, including users bypassing Anti-Money Laundering (AML) laws and Countering the Financing of Terrorism (CFT) guidelines. Given the decentralized nature of the sector, however, how can compliance even be implemented? Compliant DeFi, something we’re proud to be pioneering at Shyft Network, would deal with these issues and other regulatory elements such as Know Your Customer (KYC) rules without forcing users and companies to conform to a centralized data store. Recent regulatory constraints, such as the FATF Travel Rule and stipulations in the Infrastructure Bill, mean that DeFi needs to start taking compliance seriously.

Not As Counter-Intuitive As It First Seems

Compliance in the financial sector helps to bring confidence to the consumer, financial institutions, and regulators. Unfortunately, the side effect of this added consumer confidence is the financial institution’s storage of personal user data, shared at will with governments. Since there’s no decentralized methodology for collecting and storing information in decentralized finance, implementations of KYC, CFT, and AML rules must become more innovative. Luckily, Shyft has been looking into the problem, and we’ve developed several solutions that fit the needs of decentralized finance platforms the world over.

The biggest issue we’ve seen with compliance is that it tries to impede the very soul of decentralization. Regulators are afraid that decentralized applications (DApps) don’t have a central body responsible for their function. The FATF Travel Rule (among others) underlines the reasoning behind regulators’ insistence on establishing a compliance framework for cryptocurrencies.

In essence, the most significant concern regulators have boils down to accountability. If someone does something illegal within the realm of DeFi, who is responsible for the fallout? Is it the decentralized autonomous organization’s (DAO’s) members? Is it the DApp that facilitated the illegal action or collected the proceeds on behalf of the perpetrators? Is it the stakeholders of the blockchain where the illegal proceeds were stored? There exists a need for this sort of accountability in the cryptoverse, but the implementation cannot go against the core foundation of why decentralized finance exists. We cannot create a decentralized platform and then centralize its data for compliance purposes. The solution, therefore, is to find a way to decentralize this data collection and verification process.

A Way Forward

Shyft Network has spent a considerable amount of time assessing, analyzing and researching these challenges, developing a solution that works well with current compliance regulation, and marrying it to an approach that maintains the spirit of decentralization. The solutions we are developing examine the existing framework of decentralization and use those as a base for creating a network that exists in harmony with current implementations. The Shyft Network doesn’t force any users to make fundamental changes to how their underlying systems work. Instead, it’s a modular implementation that allows for secure financial transactions with minimal risk to users. Solutions we’ve developed to deal with the compliance problem in DeFi include:

Veriscope GDPR/AML Compliant Discovery Mechanism: Veriscope is one of Shyft Network’s most recent innovations to offer Virtual Asset Service Providers (VASPs) like crypto exchanges a way to deal with the FATF Travel Rule and reach compliance. Veriscope is a decentralized solution (in keeping with Shyft’s focus on keeping the decentralization in DeFi) that helps VASPs deal with the regulatory requirements instituted by authorities.

Proof of Identity and Whitelisted Addresses: The Shyft Network can work with both centralized and decentralized entities to bring about equality in verifying user information. DeFi participants can submit their data to a verifying oracle, which will be used to safelist their address for transactions (a small illustrative sketch follows below). Once the address has been whitelisted and linked with the user’s identification, they can participate in DeFi transactions freely. Recently, Shyft Network partnered with Kylin to develop these decentralized oracles, in keeping with their goal to provide a safe, secure way for financial institutions to interact with the DeFi space.

Risk Assessment Mechanisms for DeFi Projects: To ensure that bad actors don’t abuse the system or take advantage of users, wallet transactions with smart contracts will be continually monitored for signs of untoward behavior. DeFi platforms that want to know more about their users can easily access this information to ensure that they have a more trustworthy network.
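To make the whitelisting flow above concrete, here is a minimal, hypothetical Python sketch of how a service might gate transfers on an address allowlist maintained by a verifying oracle. The class and function names are invented for illustration and are not Shyft Network APIs; a real implementation would live in smart contracts and signed on-chain attestations rather than application code.

    # Hypothetical sketch of oracle-backed address whitelisting (not a Shyft Network API).
    class VerifyingOracle:
        """Records which addresses have passed identity verification."""

        def __init__(self):
            self._whitelist = {}  # address -> identity reference (e.g. a hash)

        def attest(self, address: str, identity_ref: str) -> None:
            # In practice this attestation would be signed by the oracle and
            # published on-chain; here it is simply recorded in memory.
            self._whitelist[address] = identity_ref

        def is_whitelisted(self, address: str) -> bool:
            return address in self._whitelist

    def execute_transfer(oracle: VerifyingOracle, sender: str, recipient: str, amount: float) -> str:
        # A compliant service checks both counterparties before the transfer proceeds.
        for addr in (sender, recipient):
            if not oracle.is_whitelisted(addr):
                return f"rejected: {addr} has not completed identity verification"
        return f"ok: transferred {amount} from {sender} to {recipient}"

    if __name__ == "__main__":
        oracle = VerifyingOracle()
        oracle.attest("0xAlice", identity_ref="id-hash-demo")
        print(execute_transfer(oracle, "0xAlice", "0xBob", 10.0))  # rejected: 0xBob not verified
        oracle.attest("0xBob", identity_ref="id-hash-demo-2")
        print(execute_transfer(oracle, "0xAlice", "0xBob", 10.0))  # ok

The point of the sketch is only that verification happens once, off to the side, and every subsequent transaction can be checked against the resulting allowlist without re-collecting personal data.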

Expanding Markets Mean More Compliance is Necessary

DeFi is a space that will see exponential growth in the coming years. Early adopters already have their hands full, trying to keep abreast of the burgeoning growth and development of decentralized applications. However, as the popularity of DeFi grows, the threat of regulation will always linger, and facing it head on is the best way to deal with it. Compliance allows decentralized platforms to stay one step ahead of regulators, giving users the peace of mind they want. Shyft Network will be continuing to lead this charge to create proactive, innovative compliance in its network. Stay tuned to find out more!

Shyft Network aggregates trust and contextualizes data to build an authentic reputation, identity, and creditability framework for individuals and enterprises.

Join our Newsletter
Telegram (https://t.me/shyftnetwork)
Follow us on Twitter (https://twitter.com/shyftnetwork)
Check out our GitHub (https://github.com/ShyftNetwork)
Check out our website (https://www.shyft.network)
Check out our Discord (https://discord.gg/ZcBNW37t)

Anonym

Growing Privacy Inequality Yet Another Reason for National Regulation and Privacy-First Tools


This quote from TechCrunch commentator Cillian Kieran caught our eye: “Alongside regulation, every software engineering team should have privacy tools immediately available. When civil engineers are building a bridge, they cannot make it safe for a subset of the population; it must work for all who cross it. The same must hold for our data infrastructure, lest we exacerbate disparities within and beyond the digital realm.”

The quote comes from an article on “widening inequality on the digital frontier” which poses valuable questions about whether we now need “a material perspective on widening privacy disparities — and their implication in broader social inequality — to catalyze the privacy improvements the U.S. desperately needs.” Kieran raises the material argument since, he says, the law doesn’t always recognize the emotional perspective of privacy as harm or as a driver for regulatory change.

Kieran makes two great points: First, almost all users in the US (except those who live in California under the CCPA) have no right to opt out of aggressive data collection and targeted advertising. But the fact that Californians can opt out implies there are harms from targeted advertising, which we know to be true and have discussed here, here and here. So why aren’t we protecting the millions of users who live anywhere other than California from these harms?

Kieran cites the example of targeted advertising leading to discrimination in housing and employment opportunities, “sometimes in violation of federal law” where it “impedes individuals’ autonomy, preemptively narrowing their window of purchasing options, even when they don’t want to.” 

Second, Kieran points out money talks, including in digital privacy protection. He points to Apple making noises about privacy being a human right all while building and marketing luxury products only some can afford. 

He argues: “… if those declaring privacy as a human right only make products affordable to some, what does that say about our human rights? Apple products skew toward wealthier, more educated consumers compared to competitors’ products. This projects a troubling future of increasingly exacerbated privacy disparities between the haves and the have-nots, where a feedback loop is established: Those with fewer resources to acquire privacy protections may have fewer resources to navigate the technical and legal challenges that come with a practice as convoluted as targeted advertising.” 

“We deserve meaningful privacy protections that everyone can afford. In fact, to turn the phrase on its head, we deserve meaningful privacy protections that no company can afford to omit from their products. We deserve a both/and approach: privacy that is both meaningful and widely available.”

Kieran says the only way to close the disparity gap and get meaningful protections is via privacy legislation and privacy tools. Lawmakers must set the standards and engineers must have no reason not to embed user data privacy controls in products they develop. 

He says: “We need privacy rules set by an institution that is not itself playing the game. Regulation alone cannot save us from modern privacy perils, but it is a vital ingredient in any viable solution.”

Which brings us back to where we started:

“Alongside regulation, every software engineering team should have privacy tools immediately available. When civil engineers are building a bridge, they cannot make it safe for a subset of the population; it must work for all who cross it. The same must hold for our data infrastructure, lest we exacerbate disparities within and beyond the digital realm.”

Read the full article.

If you’d like to know how Anonyome Labs is trying to reduce privacy inequality and build tools to empower all users with the capabilities they need to protect and control their digital information, check out Sudo Platform and MySudo.

Photo By Smile Studio

The post Growing Privacy Inequality Yet Another Reason for National Regulation and Privacy-First Tools appeared first on Anonyome Labs.


auth0

Top 6 Cybersecurity Predictions for 2022

Security leaders will increase visibility and diversity of thought to meet consumer (and business) expectations

Ontology

Ontology Weekly Report (October 12–18, 2021)

Highlights

On October 11, Messari, a leading crypto data analysis company, published a research report entitled “Ontology: Layer 1 Trust Network for Decentralized Identity and Data”. The report introduces Ontology’s suite of decentralized identity and data services, as well as its technical specifications.

Latest Developments

Development Progress

We have completed the launch of Ontology’s EVM TestNet and are 75% done with testing. At the same time, a large number of community developers are actively participating in the “Security Vulnerabilities and Threat Intelligence Bounty Program”, launched by Ontology and SlowMist.
We have completed Ethereum RPC support and are 100% done with internal testing. The TestNet has been synchronized online; we are 79% done with testing.
We have completed 100% of Ontology’s Ethereum account system development and the TestNet has been synchronized online; we are 78% done with testing.
The EVM/OEP-4 asset seamless transfer technical solution, which facilitates the efficient conversion between OEP-4 assets and EVM assets, is complete and the TestNet has been synchronized online; we are 81% done with testing.
We are 40% done with the survey on the improvement of the decimals of ONT and ONG.

Product Development

ONTO hosted a campaign with Mars Ecosystem. Participants used ONTO for a chance to win rewards. The activity is ongoing, with about 500 participants.
ONTO hosted a limited edition NFT event with NFTBomb. Participants were asked to complete tasks such as downloading ONTO for the chance to win a NFTBomb Limited Edition NFT. The number of participants exceeded 2,000.
ONTO is hosting a limited edition NFT event with World Games. The event has had participants download ONTO for the chance to win a limited edition NFT. The activity is ongoing, with about 1,000 participants.

On-Chain Activity

121 total dApps on MainNet as of October 18, 2021.
6,790,552 total dApp-related transactions on MainNet, an increase of 5,904 from last week.
16,570,370 total transactions on MainNet, an increase of 33,831 from last week.

Community Growth

1,314 new members joined our global community this week. The newly launched community activities have attracted a large number of new members to join us.
We held our weekly Discord Community Call, led by Humpty Calderon, our Head of Community. He introduced the DID solutions integrated on different public chains; community members also discussed the DID competitors and technical solutions in the same industry.
We held our new Telegram weekly Community Call, led by Astro, an Ontology Harbinger from our Asian community. He introduced the research report “Ontology: Layer 1 Trust Network for Decentralized Identity and Data”, written by Messari Hub Analysts, and answered questions from members of the community.
As always, we’re active on Twitter and Telegram where you can keep up with our latest developments and community updates.

Global News

Ontology’s Chief of Global Ecosystem Partnerships, Gloria Wu, introduced her work experience at Ontology in the article “Women in Fintech: Greatest Achievements with Goodbox, Pensionbee, TomoCredit, Ontology and LXME”, published in The Fintech Times, a leading financial technology newspaper. She said blockchain and crypto are founded on egalitarian principles and she is very glad to be representing women in a more male-dominated industry.
Ontology published #EVM 101 series articles “What Are Ethereum Virtual Machines and Why Should You Care?” on Twitter, introducing the features and advantages of Ontology EVM: increasing interoperability between Ontology and Ethereum, so developers will be able to seamlessly migrate assets between both blockchains; and allowing users to access EVM-based projects deployed on Ontology, whilst enjoying lower gas fees and faster block production.

Ontology in the Media

Messari — Ontology: Layer 1 Trust Network for Decentralized Identity and Data

Trust has been the centerpiece of commerce, community, and happiness. In the course of human history, trust has changed its meaning from a person’s word to a bill of exchange, to a legal document and, most recently, in the 21st century, to technology. In the last decade, blockchain has created “trustless” systems where individuals do not need to trust anyone except software to verify transactions. Blockchain brought not only trust to the masses through shared access of decentralized information but has also fundamentally changed the future of trust ecosystems.

Ontology, with its MainNet launch in June 2018, is building infrastructure for a peer-to-peer trust network with a decentralized identity framework that allows users to have full control over their data for real-world use cases. While traditional economies rely on third party service providers to facilitate trust, Ontology enables decentralized trust that aims to integrate the blockchain and different business sectors through partnerships to provide distributed services including distributed communities, data verification, data exchange, and credit across industries.

Cointelegraph — Likes out: Facebook blackout sparks ideas for Web 3.0

Over the past few years, the influx of moral concerns surrounding privacy breaches, data gathering, censorship and fake news has fuelled the conversion for a renovation of the social media platforms that have come to dominate our democracy. The prevalence and severity of such issues have even begun to deter conscious users away from centralized behemoths like Facebook, YouTube and Twitter, in favor of more liberating alternatives. The rapid emergence of cryptocurrency and blockchain technology, specifically its native features of decentralization, transparency and community reward, has empowered the rise of these next-generation initiatives.

Going forward, a move toward decentralized identity solutions is essential to ensure data privacy and security. By facilitating end-to-end technology run on blockchain, decentralized solutions enable private information to be shared securely, while users remain in full control of their data. In contrast to centralized solutions, decentralized systems ensure private data remains immutable and secure and is only able to be shared when users consent to provide information.

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

Ontology Weekly Report (October 12–18, 2021) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Imageware

Managed Security Service Provider (MSSP), Uses, and Benefits

In the late 1990s, Internet Service Providers (ISPs) were the forerunners of Managed Security Service Providers (MSSP). Some ISPs began supplying firewall equipment to their customers at that time, and if the customer desired, the ISP would also manage the firewall. The post Managed Security Service Provider (MSSP), Uses, and Benefits appeared first on Imageware.

MyDEX

DESIGNED FOR INTEGRITY


In our last blog we talked about diverse, practical ways in which we help people and organisations use personal data to get stuff done — to enrich peoples’ lives and improve service quality in a way that cuts friction, effort, risk and cost for both sides.

The tools and capabilities we discuss in that blog are great. But to be truly valuable, they also need to be robust and ethical. They need to be based on sound design principles.

Below are some of the design principles that underpin our infrastructure and services — principles designed to ensure that what we do truly does serve citizens, today and into the future.

Safe, efficient data logistics

Our personal data stores (PDSs) use APIs to enable safe, efficient data sharing between individuals and organisations. Organisations can deposit data in an individual’s PDS, and individuals can share data with these organisations. Our PDSs don’t only provide safe storage facilities. They also provide safe, efficient data sharing capabilities which keep the individual in control. We call this ‘personal data logistics’.

The individual as the point of integration

Our personal data stores enable individuals to aggregate data about themselves that is currently dispersed across many different service providers. For example, most UK citizens have over a dozen relationships with different financial service providers: one or more banks and building societies, loan providers, mortgage providers, savings providers, investment services, pensions, insurances and so on.

It’s only by aggregating data from all these service providers (and by adding additional information that only the individual knows) that it’s possible to gain a true, fully-rounded picture of an individual’s financial circumstances, and therefore to be able to give truly personalised, relevant advice. That is why we applied for and got an Open Banking licence.

Our infrastructure is designed to enable the provision of such genuinely personalised advice across every walk of life: money, health, career, etc.

An asset for life

By enabling individuals to aggregate data (including verified attributes) about themselves in their own personal data store, we provide them with an asset for life: an asset that grows in value over time as more data is added to it.

Seek win-wins

Many data relationships are adversarial, where one side seeks to extract value from another. We seek to enable mutually beneficial, productive relationships between citizens and bona fide service providers. For example, as the data in an individual’s personal data store grows in richness and value, citizens can bring this data with them to relationships with service providers, helping both sides access and use the data they need for better services at lower cost.

Neutral

In line with the above, our platform is not designed to help one organisation gain competitive advantage over another. It is designed to enable all sides to improve the way they operate, by helping everyone involved reduce friction, effort, risk and cost, for example. So our charging structure doesn’t favour one organisation over another, and it doesn’t incentivise us to try and make money out of individuals’ data either. (For example, if we charged a fee per data transaction, our revenues would grow with the volume of data sharing and that could incentivise us to pressurise individuals into sharing more data than they want to. So we don’t.)

Truly independent

Our personal data stores are truly independent and under the control of the individual. They do not sit inside any data-holding service provider’s organisational boundaries and do not depend on any service provider’s systems and technologies. Individuals don’t have to use any organisation’s identity processes to access their data. Organisations which deposit copies of an individual’s data in that individual’s personal data store cannot tell the individual what they should or shouldn’t do with it, or who they can or can’t share it with.

This is important because the more fashionable personal data stores become, the more people there are out there pretending to offer personal data stores where, if you look a little more closely, the organisation actually retains control of individuals’ data (by doing one or more of the above).

Safety and security

Each individual’s personal data store is separately and individually encrypted. This means our infrastructure is distributed, not centralised, so it is designed to be the opposite of a honeypot for hackers. If a hacker wants to access the records of millions of people in a centralised corporate database, they only need to succeed in one hack. For a hacker to access the records of a million people on our infrastructure, they would have to succeed at a million separate, different hacks, each time accessing the data of only one individual. That makes the hacker’s incentives a million times a million less attractive: a million times harder, for a millionth of the reward.

No knowledge

Each individual holds their own encryption key to their PDS, which we don’t have. We have designed our systems so that they do not look at an individual’s personal data or at who is delivering it or requesting it. We have provided the tools needed for the individual to be in control and able to see who is using their data for what purpose, and to approve such uses. We cannot see and don’t want to see their data. As a Community Interest Company we want to help people, not exploit them or intrude upon them. (See the sketch below.)

Interoperable

There is no point in doing all the above good things if the end result is to trap people inside a system that they can’t get out of. People have to have choices; genuine alternatives. Competition over who can provide the best genuine data empowerment can only be a good thing. For this reason, we expect many personal data store providers to emerge, and we are fully committed to enabling users to transfer their data from our systems to other systems and vice versa. We believe in interoperability.

Commercial integrity

Our business model is designed to support all the above principles. We are a Community Interest Company. Yes, we are commercial: to fulfil our mission we have to cover our costs. But we are not in this to maximise the profits we can make, but to maximise the benefits citizens can get from their data. So we don’t charge individuals for having a PDS. Instead, organisations pay us for enabling safe, efficient data sharing relationships with customers.
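To illustrate the ‘Safety and security’ and ‘No knowledge’ principles above, here is a minimal Python sketch of per-individual encryption, in which each personal data store is sealed with a key held only by its owner and data can be deposited into it. It is a toy example built on the widely used cryptography library, offered as an illustration of the principle rather than a description of Mydex’s actual implementation.

    # Illustrative only: each personal data store gets its own key, held by the
    # individual, so there is no central honeypot of data encrypted under one key.
    from cryptography.fernet import Fernet

    class PersonalDataStore:
        def __init__(self, key: bytes):
            # The key stays with the individual; the operator never sees it.
            self._cipher = Fernet(key)
            self._records: list[bytes] = []

        def deposit(self, plaintext: str) -> None:
            # Organisations (or the individual) deposit data; it is stored encrypted.
            self._records.append(self._cipher.encrypt(plaintext.encode()))

        def read_all(self) -> list[str]:
            # Only someone holding the key can read the records back.
            return [self._cipher.decrypt(c).decode() for c in self._records]

    if __name__ == "__main__":
        alice_key = Fernet.generate_key()   # generated and kept by the individual
        alice_pds = PersonalDataStore(alice_key)
        alice_pds.deposit("verified attribute: date of birth 1990-01-01")
        print(alice_pds.read_all())

Because every store is encrypted under a different key, compromising one store reveals nothing about any other, which is the property the ‘million separate hacks’ argument above relies on.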

Summary

As the concept of personal data stores grows more fashionable, we’ve got no doubt that clever people will invent many exciting new tools that do wonderful things. That’s great. It’s how it should be.

But for such creativity to really deliver value, it must be built on solid foundations. We believe design principles like the ones listed above provide the foundations our society needs to put personal data on a new, trustworthy footing.

DESIGNED FOR INTEGRITY was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


Let's Talk about Digital Identity

Identity management in Mergers & Acquisitions with Keith Uber, Ubisecure – Podcast Episode 53

Let’s talk about digital identity with Keith Uber, VP Customer Success at Ubisecure.

In episode 53, Oscar and Keith explore the role of Identity and Access Management (IAM) in Mergers and Acquisitions (M&A). With the importance of customer experience at the centre, Keith and Oscar discuss standards considerations, available options and practical steps for successful consolidation of IAM systems.

[Scroll down for transcript]

“The most important part of mergers and acquisitions is that the customer is the value of the company.”

“Take advantage of the opportunities that moving to a new identity and access management system can provide for customers.”

Keith is VP Customer Success at Ubisecure. As an Identity and Access Management product expert, he leads the Sales Engineering team and is involved in many stages in the planning and design of demanding customer implementation projects. Keith is active in various industry organisations and has a keen interest particularly in government mandated digital identity systems. He holds a bachelor’s degree in I.T. and a master’s degree in Economics, specialising in software business.

Check out Keith’s blog and comprehensive white paper on the topic of IAM in M&A:

Blog – The critical role of Customer IAM in M&A
White Paper – Mergers & Acquisitions: Enabling identity integration and opportunities with IAM

Connect with Keith on LinkedIn and follow him on Twitter @keithuber.

We’ll be continuing this conversation on Twitter using #LTADI – join us @ubisecure!

 

 

Podcast transcript

Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla.

Oscar Santolalla: Hello and thanks for joining today. After some time, we are having a guest from the house, from Ubisecure, a guest who has been at Ubisecure for 12 years. So let me introduce to you Keith Uber.

He is the VP Customer Success at Ubisecure. As an identity and access management product expert, he leads the Sales Engineering team and is involved in many stages in the planning and design of demanding customer implementation projects. Keith is active in various industry organisations, and has a keen interest particularly in government mandated digital identity systems. Having been involved in dozens of IAM implementation projects, he is quick to identify organisation’s needs, and provide suitable configuration, integration and roadmap guidance.

Hello Keith and welcome.

Keith Uber: Hello, Oscar. And thank you very much for having me. It’s a pleasure to be here.

Oscar: It is really great talking with you. You have really long experience at Ubisecure and in the industry, so you have super interesting things to tell us. We will talk about mergers and acquisitions today. But before that, we’d like to hear a bit more about yourself, so please tell us your journey into this world of digital identity.

Keith: For me, digital identity became part of my career when I moved to Finland in 2000. So this was the height of the .com boom. I got a job working for Sonera, which is now Telia, one of the largest Telco operators in the Nordic countries. As part of that role, one of my jobs was to help Telia to combine the login systems for various small start-up companies, various small projects that they had acquired during the .com phase. They acted as a kind of a technology incubator for many small companies too so they had a huge portfolio of disparate services, all with different ways to sign in and authenticate. That’s where my journey started.

So I have a background as a software engineer. I have a Bachelor of IT and previously worked in the logistics field as a software developer. After moving to Finland, I later studied software business and then, after graduation, joined Ubisecure, and I’ve been working with identity and access management, particularly customer identity and access management, ever since. In various roles I have been involved in so many different customer projects: not only individual customers, but also entire industries or government services that are going online. So I’ve been really fortunate to have a wide range of experience in quite a short amount of time.

Oscar: Very interesting. So it sounds like your first contact with digital identity was there at Sonera, the previous name for what is Telia today. You mentioned several small companies that had these challenges of authentication, login, registration, that kind of thing. I’m guessing, since we are talking about 2000, that there were no standards like we have today for digital access management. So I can imagine how challenging that time was. Today it’s still challenging, but we have standards, and that is great.

But that also brings us to the topic of today, right? A company can acquire smaller organisations. And one of the challenges is, for instance, if a company – let’s move from Telco to, let’s say, e-commerce – a big e-commerce site acquires a smaller one, another e-commerce site, the customers, knowing that because they read it in the news, will expect that “OK, now I can log in from one place to the other automatically, right?” That’s what the customer would expect. I think that’s one of the challenges, and I guess you are going to tell us more. But in your opinion, why is identity an important topic when it comes to mergers and acquisitions?

Keith: I think the most important part of mergers and acquisitions is that the customer is the value of the company. They’re the consumer who is buying the services or the people interacting with the company. So in a merger or acquisition or divestiture, when a company splits into two, it’s important that you retain those customers, that you don’t by mismanagement or mistake accidentally isolate your customers, make it difficult for them to sign in, or make it difficult for them to start to enjoy the services of the company that you’re also acquiring. So that’s all about keeping that valuable asset, keeping the customer happy.

Oscar: Yeah, I agree, because it might be that, as a customer, one can be super happy with a service, with a company. And after the merger or acquisition, things don’t work so smoothly anymore for the customer, right? And that can, as you said, make the customers unhappy.

Keith: In most cases, mergers are all about acquiring some type of complementary company or company that offers a service that is enjoyable for the current user population. So it’s all about allowing them to start to consume more and more services so you get more revenue for every customer.

Oscar: Exactly. So now, put yourself in the shoes of a company that knows it is going to be merged, or where an acquisition is already in place, and it starts to plan how to merge the systems, et cetera. What, for you, are the top, the key identity considerations that this organisation has to go through?

Keith: You spoke about the importance of standards and how the industry has really standardised on some key identity standards. So for companies who are looking to grow through either being acquired or acquiring other companies, it’s important that they have their identity management under control. If they’re using standardised systems, using the same standards that other services use, that makes the migration and integration process much, much easier.

And part of that, underlying everything, is data quality. So make sure that you’re keeping your user information, your user database, not only secure but up to date, so that it’s full of verified and valued information that can be easily used to match accounts as you go through that merging process.

Another really important thing is to plan carefully how that merger will look from the end user perspective: what will they retain, which account will they retain or become their core account, or will you allow the customers to sign in using both types of credentials that they already have? And then what’s the timeline, for example for sunsetting the unused system, the system that will be turned off?

Generally, in terms of mergers and acquisitions, you want to simplify the IT systems by merging them and then sunsetting or turning off the system that you don’t choose to continue with. And that’s all about reducing the overall costs, not only for licensing of the system, but also for upkeep and management of the system, and simplifying the IT architecture to reduce, for example, simple things like customer support, when somebody has trouble logging in, or resetting a password, or enabling a new authentication method that is common across the whole organisation.

There’s a slight risk there that by combining multiple systems, even two or three into one, you’re also putting your eggs into one basket so there’s of course a risk of having a single point of failure. But on the flip side, you have more people, more IT resources looking at one system, focusing on one system and having expertise in that one system and keeping that up-to-date and smoothly running. And all modern identity and access management systems are fully reliable at scale in that type of scenario.

Oscar: And I guess another point is that we’ve been talking mostly about customers, but the workforce, the employees, are also an important consideration.

Keith: Oh, absolutely. The backend staff need to access these systems, and as the internal employees also merge, you want to make that as smooth as possible, so that the employees of the company that’s being acquired also feel like fully-fledged employees as soon as possible.

During this mergers and acquisitions process, the process is often followed very, very closely, especially for publicly listed companies, and you want to try to merge both IT systems as quickly as you can, so that you can show that the corporate acquisition has been a success on both a technical level and a sales level. So in terms of planning for an IT organisation, it’s important to start that planning very early in the process and to get ready and plan well, so you can do it quickly and efficiently and move to that new common platform as quickly as you can.

Oscar: Yeah. You mentioned the fact that the systems have to be consolidated, and that’s the biggest part of these identity and access management projects in cases of mergers and acquisitions. So to start with, what are the drawbacks and the benefits of consolidating these systems?

Keith: The benefits I would hope would be improved information security, probably moving away from an older legacy style identity access management system to a newer one. You bring with that new functionality, which might simplify the IT architecture. So for example, rather than having custom registration processes or custom tools for user management, you might move to out of the box solutions where those facilities are provided as a standard part of the package. In many cases, you might combine the move with a move from on-premises to cloud. Most companies are moving towards the cloud, if they have that on their corporate strategy for IT then they can make that change either at the same time or prepare to do that shortly afterwards.

The risks, as you say, can be very simple things. For example, in consumer facing services, a lot of users are already logged in on devices, or they’re already logged in with their browser, and different password managers, browsers and applications remember the user in different ways. And if you start to upset the flow of how those passwords are remembered and presented to applications, you can very easily lock your users out. For a wider consumer population, the sheer frustration of having to remember their password or go through a password reset process can be the difference between staying and giving up and downloading your competitor’s app next, so it requires a bit of careful planning to make that flow as smooth as you can.

Typically in that scenario, you’d allow the users who are migrating off the old system to sign in once using the old system. Ideally, you would be able to migrate any existing password hashes from the old system to the new system, or have a very smooth onboarding flow to allow them to either set a new secure password or log in using a strong authentication method, either a government or banking ID used by users in that region, or some other existing strong authentication method that the user already has.
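To make the lazy migration approach described here concrete, below is a minimal, hypothetical Python sketch: on a user’s first login after the merger, the password is verified against the legacy hash and then re-hashed with the new system’s scheme. The hashing choices (salted SHA-256 on the legacy side, bcrypt on the new side) and the data shapes are assumptions for illustration, not a description of any particular product.

    # Hypothetical sketch: verify against the legacy hash once, then re-hash
    # with the new scheme so the old credential format can be retired.
    import hashlib
    import bcrypt  # third-party: pip install bcrypt

    def legacy_hash(password: str, salt: str) -> str:
        # Assumed legacy scheme: salted SHA-256, common in older systems.
        return hashlib.sha256((salt + password).encode()).hexdigest()

    def migrate_on_login(password: str, user: dict) -> bool:
        """Return True if login succeeds; upgrade the stored hash if needed."""
        if user.get("bcrypt_hash"):
            return bcrypt.checkpw(password.encode(), user["bcrypt_hash"])
        # Still on the legacy hash: verify it, then upgrade in place.
        if legacy_hash(password, user["salt"]) == user["legacy_hash"]:
            user["bcrypt_hash"] = bcrypt.hashpw(password.encode(), bcrypt.gensalt())
            user.pop("legacy_hash", None)  # retire the old credential
            return True
        return False

    if __name__ == "__main__":
        user = {"salt": "s1", "legacy_hash": legacy_hash("correct horse", "s1")}
        print(migrate_on_login("wrong guess", user))     # False, still on legacy hash
        print(migrate_on_login("correct horse", user))   # True, hash upgraded to bcrypt
        print("bcrypt_hash" in user)                     # True

Once every active user has signed in at least once, the remaining legacy hashes can be expired and those users pushed through the password reset or strong authentication flow Keith mentions.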

Oscar: That reminds me of some cases I’ve been involved in where the passwords had been hashed, as they have to be of course, but a long time ago, when there were fewer agreed standards for that. And that can be tricky if the new system that is importing these hashed passwords is not compatible with the old system. That can of course create problems, so that’s why following the current standards, and of course also having some backward compatibility, is super important here.

Keith: There are really simple benefits. For example, in many older legacy systems, the username is based around either a randomly generated number or letter combination, or it’s provided by the users themselves. And for many end users, it’s a challenge even to remember what their username is.

In extreme situations, we see corporate systems, for example B2B systems, where the customer needs to put in not only the customer number, but also a user ID as well as a password. Migrating to a system where you simplify that, more like consumer facing services, means you could either centralise around using the email address as the identifier, which is much, much easier to remember, or allow the user to link their account to an existing account, such as a social media account, LinkedIn or a Microsoft account, for example. That allows users to move away from having a proprietary username format and move to having, for example, a corporate email address or their personal email address as their account identifier.

Oscar: Yes, and now moving to how to make this consolidation in a good way, how to have success in this CIAM consolidation, what are the – what’s a good approach? Could you tell us the main points about that?

Keith: Yeah, I touched earlier on the importance of data quality. So make sure that the data is clean before you start to do any type of migration; it’s really important to understand what the data in each system is and how it’s validated, and to make sure that those formats are valid in the new system. There’s a very clear decision point about which system you’re going to migrate to, which system will be the ongoing system in the future. Typically, that would be one of the two existing systems: the more modern of the two, the one with more capabilities, more features and a longer lifecycle.

But it may be that you choose a third system, a new system, to replace both and migrate those systems to the new one. That provides, for example, a jump forward to support newer scenarios, such as not only different types of strong authentication techniques or identity protocols, but also things like identity delegation or electronic power of attorney: the types of services which existing legacy systems might not support out of the box.

From the end user perspective, so whether they’re B2B users or consumer users, it’s really important to have very clear communications that, OK, the company has been acquired, the name is possibly changing, the brand is changing. Please be aware of this from the very beginning. And then very clear communication about when things change in the IT system. So allow the users to be aware that when they log in their login screen might change, might look different. They might have to set a new password or re-verify their email address, for example. It’s really important to have that communication in as clear user focused language as possible.

Oscar: Yes.

Keith: A lot of IT teams make the mistake of using the language that they understand and not the language that their user understands.

Oscar: Yeah, I have actually seen some examples like that. I don't remember the names, but sometimes an email comes saying, “OK, we are having this transition to a new brand,” and it's so simple that you think, “OK, yeah, nice.” And other times it's complete techie jargon, and then, “Oh, my God, what are you talking about?”

Keith: That's right, yeah. Then you have to decide whether to move everything in one go — have a certain date where you migrate everything at once, a big-bang approach — or migrate applications or user groups one by one. Some of that can be done as the user logs in, so you migrate users on a user-by-user basis. It also depends on how much the applications are shared across the different companies — how many users use applications from both of the companies involved in the merger.

Oscar: So those are the main points for consolidation of identity and access management systems.

Keith: From a technical perspective, it's important to try to identify whether there's a common attribute between those two systems. If you have a trusted attribute from each of the user repositories — for example, an email address that has been verified, a phone number that has been verified, or even an identity number such as a social security number or national identity number — these can be used as common attributes to allow automated migration of users, so that they can log in automatically. That also works in cases where, for example, password hashes can be migrated, or where the user is signing in using a government or national ID platform.
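To make that concrete, here is a minimal, hypothetical sketch of the migrate-on-login pattern described above: match the incoming user to a legacy record by a verified common attribute, check the password against the legacy hash, and re-provision the account in the new system with a stronger hash. All type, field, and function names are illustrative assumptions, not any specific IAM product's API.

import { createHash, randomBytes, scryptSync } from "crypto";

// Hypothetical legacy record, keyed by a verified common attribute (the email address).
interface LegacyRecord {
  verifiedEmail: string;
  sha256PasswordHash: string; // the legacy system is assumed to store unsalted SHA-256 hashes
}

interface NewAccount {
  email: string;
  scryptHash: string;
  salt: string;
}

// Match on the trusted attribute shared by both systems.
function findLegacyRecord(legacyDb: Map<string, LegacyRecord>, email: string) {
  return legacyDb.get(email.toLowerCase());
}

// Migrate a single user at the moment they sign in.
function migrateOnLogin(
  legacyDb: Map<string, LegacyRecord>,
  email: string,
  password: string
): NewAccount | null {
  const legacy = findLegacyRecord(legacyDb, email);
  if (!legacy) return null;

  // Verify the password against the legacy hash format...
  const legacyHash = createHash("sha256").update(password).digest("hex");
  if (legacyHash !== legacy.sha256PasswordHash) return null;

  // ...then immediately re-hash it with a stronger scheme for the new system.
  const salt = randomBytes(16).toString("hex");
  const scryptHash = scryptSync(password, salt, 64).toString("hex");
  return { email: legacy.verifiedEmail, scryptHash, salt };
}

The same shape works when the trusted attribute is asserted by a government or bank ID platform instead of a password: the lookup stays the same, only the verification step changes.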

Oscar: Yes. And in terms of compliance, what are the compliance points that are normally relevant in this type of project?

Keith: Yeah, particularly it's around collecting consent from the user where required — about how their data is going to be processed, or whether the data controller will change during that process. So you use, again, your IAM tools to collect that as the user signs in, explaining it and communicating any changes to the terms and conditions. In cases where the systems are in different geographical areas or different legal jurisdictions, there may be requirements, for example, for storage of user data within certain countries' boundaries. So you may have to replicate data or have data stored within a region — you have to think carefully about those situations, especially in global or cross-regional mergers and acquisitions.

Oscar: Yeah, yeah, certainly. And we've already touched on several aspects beyond the technical side, of course. We talked about standards, we talked about security, but we also talked about compliance, for instance, and about branding and how to communicate these things. Many people are involved in these projects connected to the migration of identity and access management — it goes way beyond what identity people alone work on. So to close, could you tell us your final piece of advice for companies that are planning mergers and acquisitions today?

Keith: I'd say definitely take advantage of the opportunities that moving to a new identity and access management system can provide for customers. For example, if you haven't been able to take advantage of things like linking accounts to existing user accounts, or federation — meaning that when users are signed into your system, they can also sign into complementary services — this is the chance to give users new tools for managing their own account: a really good user dashboard and a way to agree to or modify their permissions for the system.

I think beyond just the raw savings of turning off one system, or reducing the support costs of having to deal with multiple different systems, there are real advantages in moving to one common system that has the new corporate brand, is smooth and secure for the end user, and also gives them new functionality.

They see that, “OK, the advantage is I get to sign in — but oh, I also get this new thing. I also get a more secure account or an easier way to sign in.” I think that's something to use as an advantage. Another big move and a real opportunity is to think at that stage about moving to the cloud — if you've been hesitating about moving your identity and access management system to a cloud-based service, a merger and acquisition can be a good time to do that.

Oscar: Indeed. And one of the last things you're talking about is also one of the first things you talked about today: the customer — taking the opportunity to bring a better experience to the customers.

Keith: The way you sign in is often the front door to the service — the first place the user ends up when they click on a link from marketing or when they're trying to log in to their own account — and you want to make that as smooth a process as possible.

Oscar: Yeah, exactly. Well, thanks a lot, Keith, for telling us about mergers and acquisitions. And for the people who would like to continue this conversation with you, what are the best ways to follow up or get in touch?

Keith: Yeah, the best way to get hold of me is on LinkedIn. Look for my name, Keith Uber — you'll find me there — and please send me a message via LinkedIn. It's been a real pleasure to talk to you today, Oscar. I look forward to hearing more episodes in the future.

Oscar: Thanks a lot. It was a pleasure talking with you, Keith. All the best.

Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at ubisecure.com/podcast or join us on Twitter @ubisecure and use the #LTADI. Until next time.

[End of transcript]


Okta

Flying Into Okta


“Just follow your heart and keep smiling.”

Kiki’s Delivery Service

I’m embarking on a new adventure and entering the wide world of Developer Advocacy at Okta! Much like Kiki setting out on her journey, I’m full of enthusiasm and curiosity and am ready to fly. I’m thrilled to be here at Okta and looking forward to everything. Now, I just need to get a talking cat…

“Smile. We have to make a good first impression.”

Kiki’s Delivery Service

I’m a military brat who grew up in Japan and then moved to Kansas City. When I’m not diving deep into technical docs or hacking on a project, I love engaging with dev communities and relaxing at home. You might find me sharing knowledge as a Google Developer Expert in Angular, organizing ngGirls workshops as a core team member, co-organizing AngularKC meetups, or creating a curriculum for women to learn web development skills at Coding & Cocktails. You might also find me leaning into my homebody ways by reading, enjoying a glass of wine, playing word puzzle games, or watching K-Dramas. Oh, and yes, I do love watching Studio Ghibli films.

Photo by Raychan on Unsplash

“Coding and magical powers seem very similar to me.”

Kiki’s Delivery Service, revised by Alisa

My professional technical background is over ten years of enterprise software development working on B2B products in various languages, including C++, C#, Java, and JavaScript. From image rendering to creating web services and APIs to implementing event orchestration to developing mobile-friendly web apps, I’ve had an opportunity to learn a lot. But I took a shine to all things JavaScript. Of course, I love Angular and am also excited about both front-end and back-end JavaScript frameworks. It’s incredible what we can create with code, and I am always in awe of all the innovations in software development! We, as devs, are pretty magical! ✨

“I came to deliver this letter.”

Kiki’s Delivery Service

I’m looking forward to sharing knowledge, sparking your imagination, and hearing about your aspirations and difficulties using Okta developer tooling. Let’s keep in touch so we can do so! You can find me on GitHub, dev.to, and Twitter at @AlisaDuncan.


Matterium

Countering Marketplace Deception with Mattereum’s Trust-as-a-Service Platform


Marketplace deception is everywhere, at great cost and risk to consumers and businesses. Regulation alone won’t fix it. Can Mattereum Asset Passports and Product Information Markets help secure trust in B2B and B2C trade?

On October 13, 2021, the Federal Trade Commission issued a Notice of Penalty Offenses to over 700 companies, putting pressure on American businesses to disengage from deceptive practices such as fake reviews and false endorsements or else face civil penalties.

FTC Puts Hundreds of Businesses on Notice about Fake Reviews and Other Misleading Endorsements

The list of companies on the notice includes some of the largest companies in the world across a range of industries, such as Alphabet, Inc. (Google), Amazon, Apple, Microsoft, Shell Oil, Starbucks, McDonald's, and many others. A quick skim through the list gives the impression that almost any household-name company actively deceives consumers as part of its ongoing business strategy, at least according to the FTC.

This form of marketplace deception is not limited to B2C relationships. On October 14, 2021, Reuters reported that aerospace giant Boeing had notified the Federal Aviation Administration (FAA) that it had discovered defective parts for its 787 Dreamliner fleet which were sourced by a supplier and manufactured by another company.

Boeing finds new defect in continuing struggle to produce Dreamliner 787

These forms of marketplace deception are seemingly omnipresent in trade at all scales. While regulation may be able to get many businesses to more authentically engage with consumers and other businesses, some of these entities are of such a size that they can simply absorb civil penalties en masse and proceed with business as usual.

To combat this endemic deception of consumers, we need a combined effort of effective regulation and technological solutions to secure trust in digital commerce. More specifically, we need to establish standards for consumer protection, and implement the protocols capable of meeting them.

The Mattereum Protocol is well-suited for tackling the challenge of holding companies to account for their stated claims, specifically by offering buyers warrantied claims around their purchased goods powered by an incentivized network of third-party expert certifiers. Let’s explore how Mattereum as a trust-as-a-service platform can help create more authentic relationships between businesses and consumers and between businesses themselves.

How do we build Trust-as-a-Service?

Ultimately, Mattereum is building a system to secure truth in trade at all scales: documenting and offsetting negative externalities, creating a circular economy of reuse, recycling, and upcycling of goods, and designing incentives which align profitability with sustainability. Let’s unpack the Mattereum approach and explore how it would work in B2C and B2B contexts.

Asset Passports: Living Product Documentation

The Mattereum Asset Passport (MAP) is the core mechanism of the Mattereum Protocol.

In short, a MAP is a bundle of legal warranties tied to an object. While these warranties can vary with the object in question, the initial warranty is often some form of identification. Other warranties in a MAP may include authentication methods, carbon offsets, anti-slavery certification, tokenization (or connections to any smart contract or software system), and many others. These warranties are essentially “legal lego” of various contract terms that will range greatly between different asset classes and will accrue around assets over time.

All claims are cryptographically signed, secured, and backed by financial stake, giving all warrantors accountability and skin-in-the-game for their assessments. This framework also provides access to dispute resolution protocols in the event of systemic or commercial fallout via an arbitration clause in the contract.

Asset Passports are not static structures but dynamic, living documentation that evolves throughout the product lifecycle. Once this initial documentation is established, we need a suitable incentive mechanism in place to supply and secure warrantied product information without relying wholly on centralized institutions.
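As a rough mental model only — this is not Mattereum's published schema — an Asset Passport can be pictured as an append-only bundle of signed warranty claims attached to a single asset identifier. All type and field names below are illustrative assumptions.

// Illustrative model of an asset passport as a growing bundle of signed warranties.
// This is a hypothetical sketch, not Mattereum's actual data format.

type WarrantyKind =
  | "identification"
  | "authentication"
  | "carbon-offset"
  | "anti-slavery"
  | "tokenization";

interface WarrantyClaim {
  kind: WarrantyKind;
  claim: string;             // human-readable statement being warranted
  warrantor: string;         // identifier of the expert or firm making the claim
  stakeWei: bigint;          // financial stake backing the claim
  signature: string;         // cryptographic signature over the claim by the warrantor
  arbitrationClause: string; // reference to the applicable dispute-resolution terms
  issuedAt: string;          // ISO 8601 timestamp
}

interface AssetPassport {
  assetId: string;           // identifier of the physical object
  warranties: WarrantyClaim[];
}

// Warranties accrue over the asset's lifecycle: new claims are appended rather than replaced.
function addWarranty(passport: AssetPassport, claim: WarrantyClaim): AssetPassport {
  return { ...passport, warranties: [...passport.warranties, claim] };
}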

Product Information Markets: Breaking Out of the Silos of Separate “Truths”

In both B2C and B2B contexts, trust is heavily centralized. We source our product knowledge from companies, either directly or through ratings programs they design, and companies source their materials and product information from trade partners or distant, multiple-degree connections in their supply chains.

Instead of relying on companies to secure trust and accountability when they are incentivized to stack the deck in their favor and shield themselves from liabilities, we propose a more decentralized, networked solution that can bring in third-party expertise to supply and secure warranties in a structure we call product information markets (PIMs).

In short, a PIM is a method of incentivizing truth in markets by allowing industry experts to make money from their niche knowledge and face consequences for erroneous claims. For the crypto-savvy, the PIM model makes use of a cryptoeconomic system — or protocol-based mechanism design — to incentivize a particular set of behaviors within the network, in this case the supplying and securing of product information over time.

Together, the living product documentation and bundled warranties of MAPs and the incentive structure enabled by PIMs can help create more authentic B2C and B2B relationships that don't rely on deceptive business practices at the expense and detriment of many.

Mattereum Trust-as-a-Service: B2C

The Mattereum model is a win-win for businesses and their customers. Any of the 700+ companies listed in the FTC Notice of Penalty Offenses — ranging from tech giants to telecoms to food services — would benefit from embracing a more decentralized approach to securing information and accountability around the sale of goods and services. Fake reviews and shady endorsements simply don't work well within the Mattereum Protocol. By design, any and all faulty information has consequences.

Companies can take initiative by integrating the Mattereum Protocol into their launch process or wait for their customers to do so down the line. The former option is certainly ideal.

An Asset Passport can be generated at any point throughout a product’s lifecycle. Of course, having a MAP at the beginning of the cycle at the point of manufacturing or even design stage would allow for much more information-rich documentation over the course of time, but MAPs can be created even years after initial product release.

Instead of putting the burden on companies to create and implement their own trust frameworks, they can instead plug their operations into an existing protocol. This makes adoption easier than a patchwork, disconnected solution.

There is a potentially huge long-term effect in this approach. If a credibly-neutral, autonomous, decentralized third-party system for warrantying product information takes off, it will put pressure on businesses to improve the quality and authenticity of their offering. Failure to adapt to the new paradigm will result in a flight of customers to more provably trustworthy competitors.

This is key: product information markets turn the trustworthiness of an enterprise into a competitive advantage in the marketplace while also maintaining regulatory compliance. All in the same system.

Mattereum Trust-as-a-Service: B2B

As above, so below.

While the FTC notice highlights a severe misalignment in the average business-consumer relationship with a list of companies that looks like a library catalog, this trust problem also extends to the deals which happen much farther upstream to the corporate supply chain.

Between mineral and materials sourcing, manufacturing, and distribution, the sheer scale of supply chains makes it difficult to document product information before it reaches digital or physical storefronts.

The only other sources of truth available beyond the manufacturer are the specialist firms which rate and certify objects of a particular domain: fine art, collectible cards, instruments, vehicles, etc. However, these institutions are limited in their capacity by their lack of a shared record of an object’s history. Best case there’s an entry in a single database. Worst case: a single paper certificate.

This disconnected certification system and lack of initiative and coordination in securing product information creates opportunities for even the world’s largest companies — such as Boeing — to be supplied defective or counterfeit parts, components, or ingredients at real risk to public health and safety.

Had Boeing integrated MAPs within their supply chain and production process, they could have paired their incredibly detailed product specifications with warranties supplied by third-party engineering firms and other entities. Clear lines of accountability throughout a vast web of B2B deals.

While we delineate B2C and B2B for explanatory purposes, ultimately the benefits of provable authenticity cascade throughout the entire system. If a business sources materials from verifiable and transparent sources, the company will be less likely to perpetuate faulty parts or information downstream to its own customers.

The goal of Mattereum’s trust-as-a-service approach is simple in theory but profound in its potential: to power a market economy that doesn’t prey on individuals and institutions, while aligning profitability with sustainability.

The cost and optics of civil penalties will get us nowhere. Let’s try something different.

About Mattereum

London-based Mattereum was established in 2017 by a trans-disciplinary team with a track record in designing and launching nation state-level infrastructure and headed by former Ethereum release coordinator Vinay Gupta. Mattereum is building an innovative trust-as-a-service platform for securing trust and liquidity in the sale of physical assets, creating durable secondary markets, and removing negative externalities of trade.

Follow us as we bring the Mattereum Protocol to an expanding variety of markets and industries.

More at: http://www.mattereum.com

Twitter: https://twitter.com/mattereum

Telegram: https://t.me/mattereum

Countering Marketplace Deception with Mattereum’s Trust-as-a-Service Platform was originally published in Mattereum - Humanizing the Singularity on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 19. October 2021

IBM Blockchain

Picturing the digital assets momentum, framed through a European lens


The birth and rise of the Bitcoin cryptocurrency paved the way for several waves of innovation that are gradually shifting paradigms in the financial ecosystems. The digital asset momentum, empowered by tokenization and underpinning blockchain technologies, is now undeniable. Amongst the recent milestones that made the headlines; the ever-growing rise of Bitcoin value, the incredible […]

The post Picturing the digital assets momentum, framed through a European lens appeared first on IBM Blockchain and Supply Chain Blog.


Blockchain newsletter: Emerging coronavirus variants spur blockchain innovations in healthcare


Get a first look at the Gartner report for decentralized identity and verifiable claims. Access promising use cases, risks and considerations, and expert recommendations on creating value for a fully decentralized future. Here’s your complimentary access to Gartner’s Innovation Insights. Delta variant refocuses attention on vaccine passports The surge of COVID-19 cases due to the […]

The post Blockchain newsletter: Emerging coronavirus variants spur blockchain innovations in healthcare appeared first on IBM Blockchain and Supply Chain Blog.


Blockchain newsletter: Real-world success stories from IBM Blockchain


Visionary companies are innovating how they work with IBM Blockchain, convening business networks that extend collaboration and optimization beyond organizational walls. We are laser focused on the areas delivering significant business value for our customers and their industry partners today: Supply chain. Bolster your supply chain with multi-tier visibility and workflow automation. This matters more […]

The post Blockchain newsletter: Real-world success stories from IBM Blockchain appeared first on IBM Blockchain and Supply Chain Blog.


Why open source isn’t free: Support as a best practice


The use of open source code is on the rise. Red Hat’s 2021 Enterprise Open Source Report found that 90% of companies use open source code and 79% of IT leaders expect their business use of open source to increase. Also on the rise, unfortunately, is malware and ransomware up 158% in 2020 according to […]

The post Why open source isn’t free: Support as a best practice appeared first on IBM Blockchain and Supply Chain Blog.


Geospatial data: The really big picture


The combination of a pandemic and a record-setting year of extreme weather events has reminded leaders in every industry that the health of our people, our global economy and the environment are inextricably linked. Sustainability is now a strategic business imperative, critical to creating new levels of resiliency and responsible practices that preserve our planet […]

The post Geospatial data: The really big picture appeared first on IBM Blockchain and Supply Chain Blog.


KuppingerCole

Meeting the Identity and Access Challenges in a Multi-Cloud World


Multi-cloud deployments are becoming increasingly common as organizations seek to remain competitive in the digital economy and address demands for increased remote working. But while cloud migration is enabling business success, it is not without its identity and access challenges.

Join experts from KuppingerCole Analysts and Beyond Trust for a discussion about the risks associated with cloud environments and how to use IAM and PAM to protect this expanded attack surface against threat actors.  

Paul Fisher, Senior Analyst at KuppingerCole outlines the identity and access challenges associated with multi-cloud deployments. He explains why IAM is essential, and why PAM is also important in terms of traditional privileged access management and secrets management.  

Brian Chappell, Deputy CTO at Beyond Trust highlights the need to address the complexity and interoperability issues arising from siloed identity stores, native toolsets, and conflicting shared responsibility models. He also outlines the benefits of standardizing management and security controls across the IT ecosystem, and other cloud security best practices.  




IBM Blockchain

It’s time to break the cycle and restore trust in advertising


As part of the CES 2021 conference, IBM’s Bob Lord participated in a panel with leaders from CVS and Delta to discuss the importance of marketers using advanced technology like AI for social good and to transform the ad industry. We sat down with him for a deeper dive. Over the last year there has […]

The post It’s time to break the cycle and restore trust in advertising appeared first on IBM Blockchain and Supply Chain Blog.


Opening New York State for business with the power of blockchain


What excites me the most about being part of the team at IBM is the work we do for our clients that truly makes a difference in individual lives and provides for smarter and safer interactions with each other and our planet. The urgency to reopen all areas of the economy safely as we navigate […]

The post Opening New York State for business with the power of blockchain appeared first on IBM Blockchain and Supply Chain Blog.


Accelerating your journey to decarbonization through digitalization


Corporations around the world are increasingly focused on sustainability and implementing strategies for achieving net-zero emissions in greenhouse gases (GHG) by 2050 or sooner. To stop global warming and the impacts of climate change, we need to reduce the 51 billion tons of carbon emitted into the atmosphere annually down to zero, and limit global […]

The post Accelerating your journey to decarbonization through digitalization appeared first on IBM Blockchain and Supply Chain Blog.


Building a more sustainable, equitable future with trust


IBM has a strong heritage in social responsibility. Our technical and industry professionals across business units and research divisions develop new ways of helping to solve difficult environmental problems based upon data and today’s exponential information technologies — including AI, automation, IoT and blockchain, which also have the power to change business models, reinvent processes, […]

The post Building a more sustainable, equitable future with trust appeared first on IBM Blockchain and Supply Chain Blog.


Indicio

Identity Blockchains and Energy Consumption

Bitcoin has given blockchain the carbon footprint of Godzilla; but when it comes to identity, blockchain-based distributed ledgers are light on energy use and long on benefits

Blockchain has become synonymous with cryptocurrency, and crypto is rapidly becoming to energy consumption what crack cocaine once was to addiction. Headlines about bitcoin miners stealing electricity and claims that Bitcoin consumes “more electricity than Argentina” have generated much heat but not always a lot of light (this article from Harvard Business Review offers a nuanced view of the energy consumption controversy).

The problem is that this mental shortcut can leave the impression that the energy-intensive computation required to validate bitcoin transactions — which is known as “proof of work” — is a process required by all blockchains, thereby making the technology environmentally unfriendly in general.

It isn’t and here’s why:

An identity blockchain like the Indicio Network uses signatures rather than mathematical computation to generate proof. No complex mathematical processes are needed. You either accept the signature or you don’t.

A write to the ledger (and one write can be the basis for millions of identity credentials) or a look up on the ledger uses no more energy, and possibly less, than browsing a web page.

A decentralized network using a blockchain-based distributed ledger means you can use Peer DIDs to move most “transactions” and their cryptographic proofing off ledger. This means that for those peer-to-peer interactions, identity blockchains don’t need to do any ledger transactions at all.

As most of our digital interactions are on a one-to-one basis, there is no need for them to take place on the blockchain; the blockchain is simply the root of trust for the identities of the parties issuing credentials: once these identities have been looked up and confirmed by each party, everything else happens peer-to-peer. And with Peer DIDs, each communication is cryptographically unique — a huge advancement in privacy and security requiring no more energy than, say, using encrypted email.
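To illustrate why signature-based validation is so much lighter than proof-of-work — a generic sketch, not Indicio's actual verification code — checking who issued a credential comes down to a single public-key signature verification, which runs in microseconds on commodity hardware:

import { generateKeyPairSync, sign, verify } from "crypto";

// Generic Ed25519 signing and verification; illustrative only.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

const credentialPayload = Buffer.from(
  JSON.stringify({ issuer: "did:example:issuer", subject: "did:example:holder" })
);

// The issuer signs once when writing its DID or issuing a credential...
const signature = sign(null, credentialPayload, privateKey);

// ...and any verifier checks it with one cheap operation: accept the signature or don't.
const isValid = verify(null, credentialPayload, publicKey, signature);
console.log(isValid); // true — no mining, no energy-intensive puzzle required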

Although harder to quantify, the energy saved by using a technology that enables you to trust information online is also something to be taken into account. The same goes for more efficient and effective usability and much better risk mitigation. But the point doesn’t require this detailed analysis to hold true: not all blockchains are Bitcoin, and identity blockchains using Peer DIDs are low energy consumers.

That’s why we run the Indicio Network and believe in and advocate for this technology — and that’s why it would be a huge loss if a low-energy use of blockchain were mistakenly seen as having the carbon footprint of Godzilla.

The post Identity Blockchains and Energy Consumption appeared first on Indicio Tech.


Coinfirm

Are 44.73% of Uniswap V2 Liquidity Pools Rug Pulls?

*It should not be inferred from this analysis that rug pulls on Uniswap are more or less common than other DEXes. To have your token traded on Uniswap V2, a token creator must create a liquidity pool. To do so they need to provide the same amount of supply of wrapped Ethereum as their token....

Finicity

Secure Account Opening Wins in a Digital World


In a hyperconnected world, it’s hard to name a transaction, financial or otherwise, that takes more than a few moments. Much of our business and personal lives take place on a tiny mobile screen. Instant results are the universal expectation. Buying a pair of shoes, commenting on a social post, and paying utility bills are all part of consumers’ continuous, uninterrupted flow. If it happens on a screen, it has to be now, now, now. Secure account opening is no different. Make your customer wait, and they’re gone. 

While account opening should take only seconds, issues like manual uploads and microdeposits add delays. Finicity, a Mastercard company, is addressing this new reality aggressively, stripping away friction points in the account set-up, onboarding and funding process. Open banking introduces new ways for financial institutions to verify account ownership and authenticate credentials. Accounts are opened and funded in moments, with a full package of data that includes account owners, details and balances. This creates the perception of immediacy that the end-user expects throughout their on-screen day.

Locking Fraudsters Out of the System

Understandably the risk, compliance, and customer experience balance is delicate. Efficiency can’t compromise fraud prevention. That’s why Finicity built its payment solution behind a driving principle: Pay Confidently. This means secure, lightning-fast account credential and balance verification. Financial service providers can verify account ownership and simplify the process by implementing their own application pre-fill functionality. Access to bank account data with instant account verification boosts sign-ups, reduces non-sufficient funds (NSF) fees and lowers abandonment rates. The connectivity is also useful when customers have an existing relationship with an FI but want to move money or add services. Data services help FIs verify account details and balances in milliseconds so they can move money accurately and securely.

Throughout the 2020 COVID-19 lockdowns, online fraud attacks rose by 250%. By far, the majority of them were account takeover scams, where fraudsters steal credentials and account information to siphon funds away from account holders. These attacks rose by a staggering 650%. In 2020 alone, the FTC tracked $3.3 billion in fraud losses to consumers. The convergence of fast-paced digital banking growth and a new wave of inexperienced customers created an opportunity for criminals to exploit. Finicity built its payment solution to cancel out these threats. Finicity Pay uses secure, tokenized access that yields no meaningful data if hacked. Our account verification service instantly and accurately identifies the account holder, stamping out account takeover scams before they get started. The user experience is positive, fast, and most importantly, onboarding is completed at the exact moment that the consumer wants to complete it. 

Why Finicity Open Banking?

95% market coverage of direct deposit accounts. From the largest FIs to the smallest credit unions – Finicity has you covered. Receive fast, reliable financial data that has been permissioned by the consumer for their benefit. Finicity is leading the industry towards direct API connections, signing Data Access Agreements with the largest financial institutions, payroll providers and wealth management companies.

Added intelligence and deep learning. The analytics layer in our data services enables accurate, confident payments and verifications. Our added intelligence helps mitigate fraud risk, reduce payment failure and fees, enable onboarding, and maintain compliance.

A True Partner

The open banking wave is just beginning to rise. COVID-19 has only accelerated the shift to digital banking options that offer faster, slicker ways to set up new accounts, move money and make payments. From October 2020 to August 2021 alone, Gen Zers and Millennials doubled their adoption of digital banks as the primary holder of their accounts. Open banking is the backbone behind the innovation and lifestyle options that are driving fintech app growth.

This is where Finicity provides a differentiated experience. As an innovation partner, Finicity’s development team can identify the best tools to suit your unique use case with the transparency and control that consumers demand. If they feel they can easily set up a new account and then pay quickly and safely, they will adopt your platform. The confidence that secure account opening inspires will drive down-funnel conversions and build confidence in your organization. 

Visit our demo page to see how Finicity can help you innovate with data today.

The post Secure Account Opening Wins in a Digital World appeared first on Finicity.


auth0

A Tour Through the OWASP Top 10

A quick look at the refreshed OWASP Top 10 to celebrate Cybersecurity Awareness Month

Global ID

GiD Report#182 — Bitcoin goes mainstream

GiD Report#182 — Bitcoin goes mainstream

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

This week:

The Bitcoin ETF has finally arrived
Flare launches its Songbird test network
A primer on DeFi
Meme of the week
Stuff happens

1. The Bitcoin ETF has finally arrived

The first Bitcoin ETF is now trading on the NYSE. Photo: Jayson Photography

2021 will be remembered as the year that Bitcoin went mainstream. First came the Coinbase IPO, a watershed moment for the industry as the largest U.S. exchange touched a market cap of over $95 billion on its first day listed on the NASDAQ.

Soon after, El Salvador’s 40-year-old president Nayib Bukele announced that Bitcoin would be considered a national currency. (That experiment is ongoing.) There are rumblings that Brazil might follow suit. (Brazilians have bought more than $4 billion worth of cryptocurrency this year, according to their central bank.)

And this week, Proshares has launched the first Bitcoin ETF in the U.S. on the NYSE. (Exchange traded crypto products have already launched in Canada and across Europe.)

Countless Bitcoin ETFs have been proposed over the years. Until now, the SEC has rejected the idea, but SEC chief Gary Gensler recently softened his tone, noting that an ETF based on Bitcoin futures was more palatable to the enforcement agency. Since mutual fund rules would apply, Gensler believes such a setup would provide “significant investor protections.”

A Bitcoin ETF is a big deal because it provides institutional investors access to crypto exposure through a highly regulated investment vehicle.

Here’s Dealbook:

“2021 will be remembered for this milestone,” said Michael Sapir, the C.E.O. of ProShares. Investors who are curious about crypto but hesitant to engage with unregulated crypto exchanges want “convenient access to Bitcoin in a wrapper that has market integrity,” he said. For nearly a decade, crypto entrepreneurs and traditional finance firms have sought permission to launch a Bitcoin E.T.F. in the U.S., but their applications have been delayed or denied by the S.E.C. Many remain pending.

It’s also just the beginning:

“This is an exciting step but not the last,” Douglas Yones, the N.Y.S.E.’s head of exchange traded products, told DealBook. He foresees a range of crypto-linked E.T.F.s getting approval, eventually. Tomorrow’s E.T.F. launch is another sign of crypto’s mainstream legitimacy in a year of milestones for the industry, including the crypto exchange Coinbase going public. Critics remain wary of cryptocurrencies, as do regulators, but the digital asset craze of 2021 shows few signs of abating.

Elsewhere in crypto:

Bitcoin Comes to the Big Board
CFTC Fines Tether and Bitfinex $42.5M for ‘Untrue or Misleading’ Claims
SEC Set to Allow Bitcoin Futures ETFs as Deadline Looms
Crypto Lender Celsius Network Raises $400M in Bid to Reassure Regulators
Stripe Is Hiring a Crypto Team 3 Years After Ending Bitcoin Support
Global Finance Watchdog Says $133B Stablecoin Sector Remains Niche
ConsenSys Acquires Treum’s Team, NFT Platform — Blockworks
Coinbase Plans to Launch NFT Marketplace — Blockworks
SEC throws sop to US investors with bitcoin ‘lite’ equity ETFs

2. Flare launches its Songbird test network

The Flare Network was announced to much hoopla last November — promising to bring the power of Ethereum smart contracts to XRP, but without the scalability limitations of Ethereum’s proof-of-work or even proof-of-stake mechanisms.

Why is that a big deal? Smart contracts — programs stored on the blockchain — are the basic building blocks for the DeFi revolution we’ve witnessed over the last year. (They’re also important to NFTs and dApps or decentralized applications.)

In addition to XRP, the Flare Network should eventually be able to bring smart contract features to any blockchain. (The current launch plans also include Litecoin and Dogecoin.)

The project recently hit a new milestone with the launch of Songbird. Songbird is a Canary Network, essentially a fancy term for a test network, powered by its own scarce token (SGB), allowing developers and users to get acquainted with the Flare Network prior to launch.

I spoke to the Flare team this morning, and they expect Songbird to be feature complete by the end of the year with the Flare Network itself launching sometime in Q1 2022.

In the meantime, you can learn more about Flare here and grab the Bifrost wallet, which supports the SGB token.

3. A primer on DeFi

Speaking of DeFi, here’s a great primer for those looking for a quick 101 crash course on the topic, courtesy of the Wharton School (via /gregkidd and /junhiraga):

DeFi Beyond the Hype:
DeFi is a general term for decentralized applications (Dapps) providing financial services on a blockchain settlement layer, including payments, lending, trading, investments, insurance, and asset management. DeFi services typically operate without centralized intermediaries or institutions, and use open protocols that allow services to be programmatically combined in flexible ways.
4. Meme of the week

Via /gregkidd:

5. Stuff happens

SEC Chief to Wall Street: The Everything Crackdown Is Coming
The creator economy is failing to spread the wealth
Via /easwee — The gross payouts of the top 100 highest-paid Twitch streamers from August 2019 until October 2021
Oversight Board to meet with Frances Haugen | Oversight Board
Opinion: SEC commissioner: Investors have the right to make their own decisions without regulators standing in the way
Facebook’s Ad-Tracking Loss Is Startups’ Gain

GiD Report#182 — Bitcoin goes mainstream was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Dec 02, 2021: Urgent Need to Protect the Most Critical Business Assets: Data & People

Data is widely considered the “new oil" because data has great value and provides direct and secondary revenue streams. But, also like oil, data can leak from organizations that depend on it. Therefore, data security and the prevention of data leakage is imperative for business as well as regulatory compliance reasons.

Dock

Credenxia leaps into the future with Dock’s API


Preparing to move towards a decentralized solution for creating and managing employee credentials, Credenxia is working with Dock to build a Proof of Concept (PoC) application to create, issue, manage, and verify credentials instantly. This PoC is now complete!

Helping businesses around the world work smarter, better, and faster, Credenxia offer a digital platform that uses cloud and smart technology to verify employee identity and establish proof of qualifications. With a mission to simplify management of the workforce, Credenxia understood the need to provide alternatives to their existing centralized verification solutions.

The better solution? Verifiable credentials. These are digital versions of an individual's identity documents, academic achievements, licenses and more. Verifiable credentials use cryptography, providing assurance as to who issued the credential and who it belongs to, as well as guaranteeing its legitimacy. Verifiable credentials sit in a digital wallet on the individual’s mobile device, giving users back control over their identity.

What are Verifiable Credentials and what makes them a better option?

These digital certificates are stored in a digital wallet that can reside on the individual's mobile device. If the accreditation is only valid for a certain period, the verifiable credential can be time-stamped, anchoring it to the Dock blockchain. These credentials can be digitally presented to a third party for instant verification. Users receive the certifications and identity documents from the issuing body — for example, qualifications from educational organizations — and they own them and use them entirely of their own free will.

This saves a workforce several days in the time it takes to issue and verify an individual's identity and credentials. A university can take months to prepare, issue and print certificates and diplomas, and an employer can spend weeks verifying an employee’s identity and accreditations. With verifiable credentials, hundreds of certifications and identities can be legitimately issued and verified instantly, saving hours and days of tedious work.

In addition, verifiable credentials are tamper-proof and use cryptography to ensure that only the issuer can make changes to the credentials, and only the intended recipient can present the credentials in a verifiable way.
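For readers who want to picture what such a credential looks like on the wire, here is a minimal example shaped after the W3C Verifiable Credentials Data Model; the issuer, subject, and proof values are placeholders, and this is not Credenxia's or Dock's production format.

// Minimal credential shaped after the W3C Verifiable Credentials Data Model.
// All identifiers and proof values below are placeholders for illustration.
const employeeCredential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "EmploymentQualificationCredential"],
  issuer: "did:example:issuing-organization",
  issuanceDate: "2021-10-01T00:00:00Z",
  credentialSubject: {
    id: "did:example:employee-1234",
    qualification: "Certified Forklift Operator",
  },
  // Only the issuer can produce this signature; tampering with any field above
  // invalidates it, which is what makes the credential tamper-evident.
  proof: {
    type: "Ed25519Signature2018",
    created: "2021-10-01T00:00:00Z",
    verificationMethod: "did:example:issuing-organization#key-1",
    proofPurpose: "assertionMethod",
    jws: "eyJhbGciOiJFZERTQSJ9..placeholder-signature",
  },
};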

Dock’s work with Credenxia

To date, Credenxia has provided a centralized solution with verification through their cloud-based portal. Wanting to offer an alternative to their existing centralized solutions that provides the added benefits of using verifiable credentials with blockchain technology, Credenxia approached Dock to help them develop a Proof of Concept (PoC) application. The vision was to give back control over one’s credentials and identity and allow instant verification of credentials.

Using Dock’s API, Credenxia will also offer decentralized solutions, building an application where credentials are held by an individual and where all data held is correct and legitimate, using an identity verification service to verify details of individuals prior to registration.

Dock CEO Nick Lambert said of the collaboration “Credenxia’s use of verifiable credentials is a great use case and we’re excited to support them in bringing this product to the market. Credenxia are getting in on the ground floor with this technology which will significantly reduce verification times and costs for their customers”.

Credenxia CEO Terry Jones said of the collaboration, “We are very pleased to have completed the Proof of Concept with Dock. It is a key strategic imperative for Credenxia to offer existing and new clients a decentralised option that maintains highest levels individual data privacy and integrity. Dock has led the way, by releasing the first public W3C credential verifier, which utilises the Verifiable Credentials Data Model (VCDM). Meeting the highest global standard in what will be a rapidly evolving digital compliance landscape is essential for our future growth roadmap. Following the PoC, I am confident that the Dock framework will support these objectives.”


KuppingerCole

Dec 09, 2021: Mitigate Risks, Cut Cost, and Achieve Compliance With AI-Driven IGA

Effective Identity Governance and Administration (IGA) is becoming increasingly important as digital transformation, cloud computing, and remote working increase the scope and complexity of Identity and Access Management (IAM) to new levels. But legacy role-based access control (RBAC) solutions alone are unable to meet the changing and dynamic IGA requirements of modern enterprises.

Okta

Security.txt: Make Vulnerabilities Easier to Report


We all know that all software has bugs and that security is hard, but somehow we are still surprised when we see new vulnerabilities.

Vulnerability

A bug, flaw, weakness, or exposure of an application, system, device, or service that could lead to a failure of confidentiality, integrity, or availability.

In 2020, there were 18,395 vulnerabilities reported, which means about 50 new vulnerabilities are reported every day. These numbers only include what has been reported to MITRE’s Common Vulnerability and Exposures (CVE) database, which means the actual value is likely much higher. Every year we add more software to the world, so these numbers will only increase. I didn’t list these numbers here to scare you; rather, they are a fact of life, and we all need to be aware of how to deal with them.

Reporting Vulnerabilities

Vulnerabilities are NOT handled the same way as a typical software bug. A properly handled vulnerability is reported privately to the project’s maintainers, then fixed and released before any information about the vulnerability is made public. It may seem like a good idea to report a security issue on GitHub Issues, but it isn’t! Handling vulnerabilities privately may seem counter-intuitive, especially for open-source projects where everything is public, but this isn’t just good manners; more importantly, it reduces potential exposure for everyone using the project.

Okay, so if not a public issue tracker, then where do you report the issue?

Where do you report a vulnerability

The actual "where do you report a vulnerability" is where things get complicated; every project and company has a different process for handling vulnerabilities. Figuring out who to report an issue to is often an exercise in frustration: you browse the website, do a Google search, maybe even look on bug bounty sites like HackerOne or Bugcrowd. There is an easier way! Enter security.txt, a draft RFC that aims to standardize a method for discovering security policies.

Defining a security.txt is easy. Create a small .well-known/security.txt file at the root of your domain containing your security team’s contact information. There is even a form on securitytxt.org that will generate one for you in a few seconds!

For example, Okta’s https://www.okta.com/.well-known/security.txt is:

Contact: mailto:security@okta.com (1)
Expires: 2023-01-01T05:00:00.000Z (2)

# Optional Fields
Preferred-Languages: en (3)
Policy: https://www.okta.com/vulnerability-reporting-policy/ (4)
Hiring: https://www.okta.com/company/careers/ (5)

(1) How to contact the security team.
(2) A date when to consider this data stale.
(3) List of languages the security team prefers.
(4) Link back to a reporting policy page with more details.
(5) Even a link back to relevant job postings.

The contents of the security.txt, a quick summary of who to contact and where to go to find more information, is not a replacement for existing security policy pages. Still, it can contain additional information such as a company’s PGP keys, acknowledgments, and canonical URL information.
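Publishing the file is just a matter of serving static text at the well-known path. As one possible sketch — using Express here, though any web server or static host works equally well, and the contact details are placeholders — you might do:

import express from "express";

const app = express();

// Serve the security contact details at the standard well-known path.
// Replace the addresses and URLs with your own values.
app.get("/.well-known/security.txt", (_req, res) => {
  res.type("text/plain").send(
    [
      "Contact: mailto:security@example.com",
      "Expires: 2023-01-01T05:00:00.000Z",
      "Preferred-Languages: en",
      "Policy: https://example.com/vulnerability-reporting-policy/",
    ].join("\n")
  );
});

app.listen(3000);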

If you are familiar with OpenID Connect, the .well-known part of the above URL may look familiar to you. "Well Known URIs" are defined by RFC-8615, and there is an official IANA registry containing a variety of entries, everything from Tor Relay information to IoT protocols. The goal of all of these individual "well-known" endpoints is to make it easy to discover metadata about a specific service.

Bonus: Update your bug tracking templates

If we as developers were in the habit of reading documentation, RTFM wouldn’t be a thing, or maybe we are just busy:

In general, your users are trying to get something done, and they see reading the manual as a waste of time, or at the very least, as a distraction that keeps them from getting their task done.

— Joel Spolsky
joelonsoftware.com

If your project uses a public bug tracker (like GitHub Issues), information about handling security issues needs to be obvious to the reporter. One way to do this is to define a SECURITY.md file and create an ISSUE_TEMPLATE.md; that way it’s the first thing the reporter sees when they file an issue. For example, the Spring Security issue template starts with:

<!-- For Security Vulnerabilities, please use https://pivotal.io/security#reporting -->

What makes this so clever is the HTML comment; only the reporter sees the comment when opening an issue. It doesn’t get displayed when the Markdown is rendered!

Learn more about application security

Most folks don’t know that vulnerabilities require special handling; it isn’t taught in a typical university computer science curriculum or coding bootcamp. Information about how to report security issues needs to be obvious and easily discoverable.

If you want to learn more about application security, check out these great posts:

How to Hack OAuth

Stealing OAuth Tokens With Open Redirects

SQL Injection in PHP: Practices to Avoid

If you have questions, please leave a comment below. If you liked this post, follow @oktadev on Twitter, follow us on LinkedIn, or subscribe to our YouTube channel.


Identosphere Identity Highlights

Identosphere #54 • NFT, VC & PICO • Data Privacy Floor • Challenges to SSI

A weekly publication following the leading news in decentralized identity standardization, governance, education and development.
This weekly newsletter is possible thanks to Patrons, like yourself.

Just over a year since this weekly review began. Incredible! Thanks for sticking with us!

Consider paying us a small amount each month via Patreon

Support our work on Patreon — Get Exclusive Content!!

Read previous issues and Subscribe : newsletter.identosphere.net

Contact \ Content Submissions: newsletter [at] identosphere [dot] net

Upcoming events

Self-Sovereign-Identity & eIDAS a contradiction? Challenges and chances of eIDAS 2.0 University of Murcia/Alastria 10/19

The Business Models Made Possible By Economic Incentives 10/19

Authenticate Virtual Summit Recap and looking forward Authenticate 2021 • 10/18-20 • Fido Alliance

OpenID Foundation Sessions at the FIDO Member Plenary 10/21

Game Changers - Is Self-Sovereign Identity Going Exponential? 10/26

Does the W3C Still Believe in Tim Berners-Lee’s Vision of Decentralization? Evernym 11/3 (register)

Last month, Google, Apple, and Mozilla lodged formal objections to W3C approval of the W3C Decentralized Identifiers (DIDs) 1.0 specification.

Engineering Successful IAM Projects to Support Digital Business 11/23 KuppingerCole

Explainers

What is Self Sovereign Identity Florian Strauf

Decentralizing Identity - Taking Back Control Madigan Solutions

Self-Sovereign Identity allows individuals to manage their own identities by moving physical credentials to digital devices. An individual will receive a credential from an issuer which will be stored in their digital wallet.

The Sovereignty Stack: Re-thinking Digital Identity for Web3.0 w/ Greg KIDD [METACO TALKS #23]

In his latest venture Global ID, Greg is acting on his long-held belief that people’s identity should be truly portable and owned by individuals themselves rather than corporations or governments.

Standards Work

OIDC with SIOPv2 and DIF Presentation Exchange Sphereon

Sign in with Ethereum is being developed by Spruce

Already used throughout web3, this is an effort to standardize the method with best practices and to make it easier for web2 services to adopt it.

Why Are DIDs The Future of Digital Identity Management?

Why would you have 75 logins when you could have 1? 

Gimly ID: SSI with OpenID authentication

About Dick Hardt’s new thing 

Gimly ID is leading self-sovereign identity innovation, with the implementation of SSI with self-issued openID provider (SIOPv2) and full support for openID connect and DIF presentation exchange.

Explore Affinidi Schema Manager

Proof-of-possession (pop) AMR method added to OpenID Enhanced Authentication Profile spec Mike Jones

I've defined an Authentication Method Reference (AMR) value called "pop" to indicate that Proof-of-possession of a key was performed. Unlike the existing "hwk" (hardware key) and "swk" (software key) methods [...] Among other use cases, this AMR method is applicable whenever a WebAuthn or FIDO authenticator is used.

https://openid.net/specs/openid-connect-eap-acr-values-1_0-01.html

https://openid.net/specs/openid-connect-eap-acr-values-1_0.html

OpenID Connect Presentation at IIW XXXIII Mike Jones

Introduction to OpenID Connect (PowerPoint) (PDF)

The session was well attended. There was a good discussion about the use of passwordless authentication with OpenID Connect.

NFTs, Verifiable Credentials, and Picos Phil Windley

Summary: The hype over NFTs and collectibles is blinding us to their true usefulness as trustworthy persistent data objects. How do they sit in the landscape with verifiable credentials and picos? Listening to this Reality 2.0 podcast about NFTs with Doc Searls, Katherine Druckman, and their guest Greg Bledsoe got me thinking about NFTs. 

Development

Clear is better than clever Cheney.net

“why would I read your code?” To be clear, when I say I, I don’t mean me, I mean you. And when I say your code I also mean you, but in the third person. So really what I’m asking is, “why would you read another person’s code?”

Survey Finds Customers Frustrated With Passwords, Open to Biometrics FindBiometrics

Passwords were a major point of contention in that regard, with a strong majority (68 percent) of consumers indicating that it is difficult to remember and key in a large number of passwords. Nearly half (44 percent) believe that biometric authenticators are easier to use, while 34 percent would prefer to use them as their primary means of identity

Ecosystem

These competitors joined forces to allow readers to use a single login across their news sites Nieman Lab

The founding media partners all agreed, however, that having more first-party data and increasing the share of registered visitors would allow them to build better relationships with readers and more relevant news products.

Self-sovereign identity use cases Cheqd 

While self-sovereign identity (SSI) sounds like an unfamiliar concept for some, others are actively leveraging the technology to address industry-specific challenges — take the KYC trial of the Financial Conduct Authority or the IATA Travel Pass.

Challenges to Self-Sovereign Identity Damien Bod

I based my findings after implementing and testing solutions and wallets with the following SSI solution providers:

Trinsic

MATTR.global

Evernym

Azure Active Directory Verifiable Credentials

Different Wallets like Lissi

MyData Weekly Digest for October 15th, 2021

Data Co-Operatives through Data Sovereignty
[…] This article illustrates an open debate in data governance and the data justice field related to current trends and challenges in smart cities, resulting in a new approach advocated for and recently coined by the UN-Habitat programme ‘People-Centred Smart Cities’.

Public Sector

EEMA Training Launches with Focus on eIDAS, Self-Sovereign and National Identity, Blockchain, EU Legal Frameworks and Cyber Security

Everyone who takes part in a course will have been taught by those leading their respective fields. We do not believe this caliber of training can be found anywhere else.

The US Data Privacy Law “Floor”: What Deserves Basic Protections? Anonym

The New York Times recently did a deep dive into the United States’ lack of a national data privacy law

1. Data collection and sharing rights
2. Opt-in consent
3. Data minimization
4. Non-discrimination and no data use discrimination

 IDnow AutoIdent will soon be usable according to German TKG

Automated identification procedures ensure seamless processes without media disruption and increase cost efficiency. By modernizing laws that allow these procedures, many cases can be simplified and modernized in the future.

DIDAS provides extensive commentary to the target vision for e-ID in Switzerland

Our submission (in German) is available here

The vital role of LEI Issuers in facilitating wider adoption of globally recognized business identities across Africa GLEIF

We spoke with Alberta Abbey, LEI Analyst, Data & Analytics, LSEG to discuss how this initiative will support wider adoption of globally recognized business identities, in the form of Legal Entity Identifiers (LEIs), across Africa and how to encourage more entities across Africa to obtain LEIs.

Use Cases

On solving the worldwide shipping crisis Doc Searls

“The supply chain is essentially in the hands of the private sector,” a White House official told Donna Littlejohn of the Los Angeles Daily News, “so we need the private sector…to help solve these problems.” But Biden has brokered a deal among the different stakeholders to end what was becoming a crisis.

It's been 15 years of Project VRM: Here's a collection of use cases and requirements identified over the years rebooted

I categorize them by the stage of the relationship between customer and vendor:

Category 1: Establishing the relationship

Top 5 Most Interesting NFT Use Cases (Part 1) Europechain

From racehorses to virtual sushi: a dizzying NFT panoply 

The Future of Healthcare Relies on Adaptation auth0

Information security and identity management is not their core business, yet is a critical factor in compliant, secure business operations.

TrustBloc - Duty Free Shop use case (CHAPI Save + WACI Share)

This video demonstrates the TrustBloc platform to Issue a W3C Verifiable Credential through CHAPI and Share the Verifiable Credential/Presentation through WACI.

Governance

Battle of the Trust Frameworks with Tim Bouma & Darrell O'Donnell Northern Block

Levels of Assurance (LOA)

The Concept of Trust

The World of Trust Frameworks

The Importance of Open Source for Trust Creation

Personal Data

XSL Labs: Your Data Belongs to You

The SDI maintains the practicality of a unique identifier while guaranteeing the security of the data and the user's sovereignty over it.

Portpass app may have exposed hundreds of thousands of users' personal data

The federal privacy commissioner also said it has not yet received a report, and said it has contacted Portpass to seek further information in order to determine next steps, and that it is in communication with its provincial counterpart.

Company News

Building towards a decentralized European Data Economy: A Minimal Viable Gaia-X (MVG) Ocean Protocol

Gaia-X is the cradle of an open, transparent, decentralized digital ecosystem, where data and services can be made available, collated, shared, and monetized in an environment of trust. More than 300 organizations and over 2500 contributors are already supporting Gaia-X.

Node Operator Spotlight: Anonyome Indicio 

Each of the capabilities of the Sudo Platform is attached to a persona. This includes masked email and masked credit cards, private telephony, private and compartmentalized browsing (with ad/tracker blocker and site reputation), VPN, password management, decentralized identity and more.

Okta + Auth0 Showcase 2021: Identity for All

Cloud, mobile, and Bring Your Own Device (BYOD) have transformed the dynamics of the digital world over the past decade. At the same time, IT is struggling to keep up with all of these changes, and developers are more burdened than ever

Auth0 Identity Platform Now Available on Microsoft Azure

secure cloud deployment option for organizations seeking strategic fit with their technology stack, supporting regional data residency capabilities and higher control over customer data.

Introducing: Civic Pass Integration Guide

developers can plug Civic Pass into their platform and create an identity layer that allows for a permissioned dApp platform, be it a DEX, an NFT marketplace or mint, a metaverse

Equifax Launches Digital Identity as a Service CU Ledger

Equifax’s suite of identity protection products including Digital Identity Trust, Document Verification and the recently acquired Kount Identity Trust Global Network are incorporated into the new holistic solution. 

How Yoma Uses Trinsic to Help African Youth Build Digital CVs

Verifiable credentials is a beautiful set of technology that allows people and organizations to get the data in a verifiable form that still respects agency

WAYF certified according to ISO 27001

WAYF has now been certified according to the standard for information security ISO 27001.

Bloom OnRamp Has Arrived

Beyond OnRamp’s direct data integrations, the platform also supports the ability for third party credentials to enter the OnRamp platform via the WACI specification

Decentralization

GiD Report#181 — The future will be self sovereign

Just as the World Wide Web empowered people to connect and share knowledge and information, the rise of Bitcoin taught us that we could have direct ownership over our valuable assets and payments — no middle man or central operator necessary.

ID Not SSI

There's No Distributed Ledger Technology (DLT) in X-Road

All the nodes of an X-Road network have their own sequence of transactions that are not shared with any other nodes – not even in a Security Server cluster. Therefore, there’s no need for a consensus algorithm in X-Road.

Thanks for Reading!

Read more \ Subscribe: newsletter.identosphere.net

Support this publication: patreon.com/identosphere

Contact \ Submission: newsletter [at] identosphere [dot] net

Monday, 18. October 2021

KuppingerCole

Understanding the Privileged Access Management (PAM) Market


Privileged Access Management (PAM) solutions are critical cybersecurity controls that address the security risks associated with the use of privileged access in organizations and companies. To reduce the risk of privileged accounts being hijacked or used fraudulently, and to uphold regulatory compliance, a strong PAM solution is essential. But finding the right PAM solution can be challenging.

Changing business practices, agile software development and digital transformation has meant that users of privileged accounts have become more numerous and widespread, making PAM one of the most important areas of risk management and security in any organization.

Therefore, it is crucial to understand this fast-growing and increasingly important market before investing.  Attend this webinar to help you navigate the PAM market to find the best solution for your company to ensure the most powerful user accounts are safe from compromise.

Paul Fisher, Senior Analyst at KuppingerCole, provides an overview of the KuppingerCole Leadership Compass for Privileged Access Management (PAM), which examines the market segment, and evaluates the key players, their market share, products, and services.  He also looks at the Leadership Compass methodology, the criteria used for vendor ratings, and the results of the comparative analysis.




Saviynt Cloud PAM


by Paul Fisher

Saviynt Cloud PAM is a privileged access management solution engineered to work primarily as a service and sits as part of the Saviynt Enterprise Identity Cloud platform. It is a modern and competitive PAM package that performs most of the essential components of a PAM solution with a zero-footprint deployment model. It should be of interest to a wide number of organizations.

uPort

ENS names are Decentralized Identifiers (DIDs)


Decentralized Identifiers (DIDs) are a new type of unique identifiers that can be controlled solely by the user. With zero transaction costs, users can easily create their own DIDs. They will then be able to prove control over their DID and allow counterparties to find their public encryption key, signature verification key and public services. Those services can be used to interact with the user’s DID.

The DID specification has matured over the last couple of years and is about to become a formal W3C standard. It defines a universal abstract data model representation for identifiers and their verification material (e.g. public keys), relationships and services. The specification is extensible by design which means new types of services, verification materials and other features can be supported. In the core, the specification contains a simple interface to resolve a DID Document from a DID (similar to an Ethereum Account from an ENS name) by anyone who knows the DID of the user. The DID Document will then contain the relevant information to enable use cases such as sign up, sign in, data encryption, secure communication, verifiable authorship and data provenance etc. Since DIDs are URI-compliant, they also make perfect sense for web ontologies.

For the Decentralized Identity (or Self-Sovereign Identity) Community in the Decentralized Identity Foundation (DIF), European Blockchain Services Infrastructure (EBSI), Hyperledger Aries, W3C and OIDC4SSI in OpenID Foundation (OIDF) and many others, DIDs have been a central component and building block for user-centric identity solutions for years.

Other DID-like Identifiers …

However, the Ethereum community is exploring other identifiers, such as NFTs and Ethereum Name Service (ENS) names (sometimes as a byproduct), with similar goals as DIDs in mind. Both can be created and solely controlled by the user. ENS is a distributed, open, and extensible naming system based on the Ethereum blockchain. ENS's job is to map human-readable names like 'vitalik.eth' to machine-readable identifiers such as Ethereum addresses and metadata.

A lot of Web3 users have been using ENS names as their identifiers (see Etherscan). Twitter has also recently been experimenting with NFTs, and therefore with ENS names, since they comply with the EIP-721 standard. We can expect more traction in the near future.

ENS names are now DIDs …

DIDs are not a replacement for Ethereum Accounts and ENS. Instead, DIDs can be seen as an abstract representation of those which makes it easier for developers to build applications across different chains and platforms. Many developers in the Decentralized Identity Community are already building a lot of Open Source tooling/products and protocols facilitating trust, privacy, security and data sovereignty, to integrate with the Ethereum ecosystem and vice versa. Some of the Open Source components include decentralized agents (e.g. Veramo), secure communication (i.e. DIDComm Messaging), capabilities-based authorization and delegation frameworks (e.g. ZCaps), login w/ identity wallets (i.e. SIOPv2), user-controlled confidential (e.g. Kepler) or public storage (e.g. Ceramic) and more.

DID-based representations for Ethereum Accounts have already been defined and used. Examples include:

did:ethr:mainnet:0xb9c5714089478a327f09197987f16f9e5d936e8a
did:safe:0xff6229bc3655cf0204e850b54397d3651f5198c4_eip155.1
did:pkh:eth:0xb9c5714089478a327f09197987f16f9e5d936e8a

We have now defined a DID-representation for ENS names such as:

did:ens:mainnet:vitalik.eth

This has two purposes:

1. to wrap existing ENS names as DIDs to facilitate interoperability of emerging technologies in the Decentralized Identity and Ethereum community,
2. to define a canonical way to augment ENS names with DID capabilities (e.g., encryption) as mentioned above.

We have already officially registered a DID method specification for did:ens in the W3C DID registry and are now looking for an appropriate home of the specification for further development. We are happy to donate the specification including the ens-did-resolver implementation.

Now, every ENS name can be represented as a DID with no extra steps. The default DID Document will always contain the ENS registry as the public profile service and the current owner of the ENS name as the controller of the DID which already enables use cases such as issuing and proving control of verifiable attestations (i.e. based on W3C Verifiable Credentials) which can be used for sign up and sign in.

Through the ENS registry, standardized TEXT records (as defined in the did:ens specification) can be added to the ENS name to enable more DID-like features, e.g., encryption, delegation, confidential storage, communication etc.

For example, to enable encryption for a did:ens DID, just add the following TEXT records to your ENS name (see did:ens:ropsten:awoie.eth):

ENS Text Records for DIDs

You can now use did:ens with Veramo or the did-resolver in your own projects.
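As a rough sketch of what that wiring could look like (the exact export name and configuration options of the ens-did-resolver package are assumptions here; check its README for the real API):

// Sketch only: the ens-did-resolver import name and its config shape are assumptions.
import { Resolver } from 'did-resolver';
import { getResolver as getEnsResolver } from 'ens-did-resolver';

const resolver = new Resolver({
  // Assumed configuration: an RPC endpoint for the network whose ENS registry should be queried.
  ...getEnsResolver({ networks: [{ name: 'mainnet', rpcUrl: 'https://mainnet.infura.io/v3/<project-id>' }] }),
});

// Resolve the DID Document for an ENS name wrapped as a DID.
resolver.resolve('did:ens:mainnet:vitalik.eth').then((result) => {
  console.log(result.didDocument);
});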

Using DIDs as a standard representation for Decentralized Identifiers will increase interop and synergies between different platforms, chains, applications and communities.

Wanna talk about how to use ENS names as DIDs? Join our Discord

ENS names are Decentralized Identifiers (DIDs) was originally published in uPort on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Crypto Regulatory Affairs: OFAC Publishes Brochure on Sanctions Compliance Guidance for the Virtual Currency Industry

🇺🇸  Office of Foreign Assets Control (OFAC) Publishes Brochure on Sanctions Compliance Guidance for the Virtual Currency Industry

On Friday of last week, the Office of Foreign Asset Control (OFAC), which sits within the United States Department of the Treasury, released a guidance brochure (OFAC Brochure) detailing the ways in which virtual currency industry participants may effectively implement compliant sanctions controls, along with the potential consequences they may face for failing to do so. The brochure not only outlines the relevant sanctions rules and regulations, but also provides industry best practices and case studies exemplifying the points made throughout the piece.


Ocean Protocol

OceanDAO Round 11 is Live


500,000 OCEAN available for sustainable data-oriented projects!

Hello, Ocean Community!

OceanDAO is a grants DAO to help fund Ocean community projects, curated by the Ocean community. Anyone can apply for a grant. The community votes at the beginning of the month. The Ocean ecosystem becomes self-sustainable as the builders of the Web3 data economy leverage Ocean Protocol to create products, services, and resources that the community finds valuable.

The OceanDAO website has up-to-date information on getting started with OceanDAO.

OceanDAO’s next funding round — Round 11 — has 500,000 OCEAN available ($400,000 USD at today’s prices). Submissions are due November 2nd; we encourage you to submit early for better feedback and engagement with the community. The rest of this post has details about Round 11.

Thank you to all of the OceanDAO participants, voters, and proposers.

OceanDAO Round 11 Updates, Announcements and Guidelines Round 11 Parameters

Amount

There is 500,000 OCEAN in grant funding available in Round 11.

Grant Funding Categories

Building / improving applications or integrations to Ocean
Community or developer outreach (grants don't need to be technical in nature)
Unleashing data
Building or improving core Ocean software
Improvements to OceanDAO itself

Project Standing

If you have previously received a grant, you must update your current Project Deliverables to remain in a good standing to be eligible to participate in future Rounds. Follow the instructions on the Project Standing Dashboard.

Funding Tiers

The amount requested is in USD; the amount paid is in OCEAN token. The conversion rate is the market price on the Proposal Submission Deadline.

To continue incentivizing completion and outcomes, we are continuing Funding Tiers, which can be acquired by Teams delivering on their grant promises. Below are the Funding Tiers accessible by projects for OceanDAO Grants Round 11. “Unleashing Data” has two tiers ($3,000 USD, $10,000 USD max); other categories have four tiers ($3,000 USD up to $35,000 USD max).

“Unleashing Data” Category

1. Open to All

Funding Ceiling: $3,000 USD

2. Experienced Project

Funding Ceiling: $10,000 USD
Requires: $1,000 USD of legitimate consumptions in the previous month AND the team has received less than $30,000 USD funding total.

All other Categories (max per team):

1. New Project

Funding Ceiling: $3,000 USD
Requires: No one in your project has ever received a grant from OceanDAO. Open to all.
Benefits: Earmarked. Receive feedback during application process. Introduced to related projects.

2. Existing Project

Funding Ceiling: $10,000 USD
Requires: You have completed 1 or more grants.
Benefits: Same as above. Receive promotion via Newsletter, Twitter, and other channels.

3. Experienced Project

Funding Ceiling: $20,000 USD
Requires: You have completed 2 or more grants.

4. Veteran Project

Funding Ceiling: $35,000 USD
Requires: You have completed 5 or more grants.

Earmarks

“Earmarks” means that there are funds available exclusively to the first three groups listed below, without having to compete. For example, New Teams (non-outreach) have 60,000 OCEAN available as “air cover” without having to compete against other groups. Beyond that, they have to compete.

60,000 OCEAN for New Teams (non-outreach category)
30,000 OCEAN for New Teams (outreach category)
30,000 OCEAN for Core Tech Initiatives (listed below)
380,000 OCEAN for remaining General Grants

Core Tech Initiatives

Proposals that work on any of the following fall under the “Core Tech Initiatives” earmark, for air cover when voting.

Enhance Ocean's Fine-Grained Permissions to support Verifiable Credentials (vs just Ethereum addresses) in allowlist and denylist. (The schema already allows it.)
First-class integration with WebSockets as a data service. It would include a fork of Ocean Provider, plus affordances elsewhere in the stack.
First-class integration with Arweave as a data service. It may already have some support; this needs to be validated. Then the aim is to remove friction and leverage Arweave more fully. It may be as a fork of Ocean Provider, and/or elsewhere in the stack.
First-class integration with Filecoin as a data service. It already supports via IPFS; the aim here is to remove friction and leverage Filecoin more fully. It may be as a fork of Ocean Provider, and/or elsewhere in the stack.

Burn

As with DAO Rounds 8, 9, and 10, any remaining funds not granted in Round 11 will be burned. Here’s how burning relates to voting in OceanDAO.

Rules:

[Earmarked Grants] winners are counted first.
Projects that do not win an [Earmarked Grant] are then considered for funding under the [General Grant] category. [New] teams are thus eligible to be funded from both [Grant Category]. Returning teams are eligible to be funded from [General Grants].
50% or more "Yes" Votes receive a grant.
Any remaining funds not granted are burned.

Voting Parameters

There are 2 major improvements being rolled out for OceanDAO Round 11 via Snapshot. The key concept is that these changes are designed to reduce the impact of whale voters, increase the impact of voter representation, and make the process easier to navigate for all participants.

Single Ballot:

All Grant Proposals will now be listed in a single voting ballot together (as opposed to each proposal having their own Y/N ballot).
OCEAN can only be used to vote once (as opposed to each proposal's Y/N ballot being voted on with your full OCEAN amount).

Quadratic Voting (QV):

QV balances "one person one vote" (democratic ideal) with "one token one vote" (skin-in-the-game).
Each voter may spread voting power across any number of proposals (single ballot).
QV gives more voice to smaller holders, and softens the impact of whales while acknowledging higher skin-in-the-game.
Voters who are interested in learning more can view this slide deck to see a practical explanation.
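To make the whale-softening effect concrete, here is a tiny sketch of the usual quadratic-voting arithmetic. The assumption that effective voting power equals the square root of the OCEAN allocated is ours for illustration; the exact Snapshot strategy OceanDAO uses may weight things differently.

// Illustrative only: assumes effective voting power = square root of OCEAN allocated per proposal.
function effectiveVotes(oceanAllocated: number): number {
  return Math.sqrt(oceanAllocated);
}

// One whale putting 10,000 OCEAN behind a proposal gets 100 effective votes,
// while 100 smaller voters allocating 100 OCEAN each get 10 effective votes apiece (1,000 in total).
console.log(effectiveVotes(10_000));    // 100
console.log(effectiveVotes(100) * 100); // 1000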

To help communicate the difference between these systems, we have also put together the following comparison matrix to explain the different voting systems.

Town Hall

Town Hall has been running weekly since the beginning of OceanDAO. The DAO is currently expanding, and with this growth the Town Hall will expand too.

Town Hall will widen its scope and begin to include the entire Ocean Ecosystem. The call will continue to take place at the same time as already planned. Keep an eye out for new format changes as they are introduced.

Round 11 Information

As a builder, submit your project proposal on the Round 11 Ocean Port Forum. As a voter, this is where you can see all the expanded proposal details to make an informed vote decision.

Additionally, you can check out the OceanDAO Round 11 Proposal Submission Dashboard to see all the proposals in one place as they are submitted.

The grant proposals from the snapshot ballot that meet these criteria are selected to receive their Funding Amount Requested (in OCEAN) to foster positive value creation for the overall Ocean Protocol ecosystem.

Round Deadlines

Proposal Submission Deadline is November 2nd at midnight GMT
Add OCEAN to Voting Wallet by the Proposal Submission Deadline
A 2-day proposal Due Diligence Period ends November 4th, 2021 at 23:59 GMT
Voting Starts on November 4th, 2021 at 23:59 GMT
Voting Ends on November 8th, 2021 at 23:59 GMT

If your Proposal is voted to receive a grant, please submit a Request Invoice to the Ocean Protocol Foundation (OPF) for the Ocean Granted amount, if you haven’t already.

OceanDAO Ecosystem

Continue to support and track progress on all of the Grant Recipients here!

Much more to come — Thank you!

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

OceanDAO Round 11 is Live was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Launch a blockchain-based data marketplace in under 1 hour

Learn how to fork Ocean Market and get your own data marketplace up and running in 9 steps

Sections
1. Introduction
2. Before You Begin
3. Required Prerequisites
4. Quickstart: data marketplace up and running in under 1 hour!
4.1 Fork Ocean Market
4.2 Clone the market locally
4.3 Install the dependencies
4.4 Run your Market fork for the first time
4.5 Change your Market Name
4.6 Change the Logo
4.7 Change the Styling
4.8 Change the Fee Address
4.9 Build and host your Data Marketplace
5. Conclusion
6. Further Reading

1. Introduction

Have you ever thought about monetising the data that your organisation generates? You might imagine that this involves dealing with data brokers, but with Ocean you can sell your data directly to your customers with no third party in-between. Ocean makes this all super easy for you with some pretty cool tech under the hood.

Using Ocean Market is already a big improvement on the alternatives that are out there, but it gets even better. Ocean Market is completely open-source and made freely available under the Apache 2 license. This means that you can fork Ocean Market and set up your own data marketplace in just a few steps. This guide will walk you through the process; you'll be surprised how easy it is. No prior blockchain knowledge is required!

2. Before You Begin

If you’re completely unfamiliar with Ocean Market or web3 applications in general, you will benefit from reading these guides first:

To use your clone of Ocean Market, you'll need a wallet. We recommend getting set up with metamask.
You'll also need some Ocean tokens on a testnet to use your marketplace.
When you have the testnet tokens, have a go at publishing a data asset on Ocean Market.
Now run through the process of consuming a data asset on Ocean Market.

3. Required Prerequisites

Make sure that you have all of the necessary prerequisites:

Git. Instructions for installing Git can be found here.
Node.js can be downloaded from here (we're using version 16 in this guide).
A decent code editor, such as Visual Studio Code.
You'll need a Github account to fork Ocean Market via Github.

In addition to the above list, knowledge of CSS will help you make a quick start to adjusting the styling and prior experience with React will help but isn’t required.

4. Quickstart: Data Marketplace up and running in under 1 hour!

4.1 Fork Ocean Market

The first step is to navigate to https://github.com/oceanprotocol/market; you'll need to log in or create a Github account. Now you need to click "Fork" in the top right-hand corner. If you are a member of an organisation on Github, it will give you the option to clone it into either your personal account or the organisation; choose whichever is suitable for you.

4.2 Clone the market locally

Now we need to clone the market fork locally so that we can start making changes to the code. Upon forking Ocean Market, GitHub will take you to the repository page. Here, you should copy the URL of the repository. To do this, click on the green "Code" button and then click the copy icon to copy the HTTPS URL. Make sure that you have git installed and set up on your computer before proceeding; see this guide if you're not familiar with git.

Now you're ready to make a local clone of your forked version of the Ocean Market. Open up the terminal, or command prompt on Windows. At this point, you may wish to navigate to the directory where you want this repository to live (`cd Desktop`, for example), but it's also perfectly fine to clone it in the home directory.

To clone the repository, enter the following command into your terminal:

git clone https://github.com/your-profile/market.git

You need to replace the link with the HTTPS URL that you copied from GitHub. Once you enter the command, you’ll see the percentage completion reported, and then git will let you know when it’s done. Now let’s navigate into the market directory that we have just cloned:

cd market

4.3 Install the dependencies

Installing the dependencies is a vital step for running the market. It’s a super simple process, thanks to npm (node package manager). Make sure you have node.js installed, otherwise it will fail. In Ocean Market, we use node.js version 16 and it’s highly recommended that you use the same.

Enter the following command to install the dependencies:

npm install

This command will take a few minutes to complete and you’ll see some warnings as it runs (no need to worry about the warnings).

4.4 Run your Market fork for the first time

At this point, you are ready to run your data marketplace for the first time. This is another straightforward step that requires just one command:

npm start

The above command will build the development bundle and run it locally. Once it’s complete you should see the following output:

Great news — your data marketplace has successfully been built and is now running locally. Let’s check it out! Open your browser and navigate to http://localhost:8000/. You’ll see that you have a full-on clone of Ocean Market running locally. Give it a go and test out publishing and consuming data assets — everything works!

That’s all that’s required to get a clone of Ocean market working. The whole process is made simple because your clone can happily use all the smart contracts and backend components that are maintained by Ocean Protocol Foundation.

So you’ve got a fully functioning data marketplace at this point, which is pretty cool. But it doesn’t really look like your data marketplace. Right now it’s still just a clone of Ocean Market — same branding, name, logo etc. The next few steps focus on personalising your data marketplace.

4.5 Change your Market Name

It’s now time to open up your favourite code editor and start getting stuck into the code. The first thing we will be doing is changing the name of your data marketplace. A decent code editor (such as VS Code) makes this incredibly simple by searching and replacing all the places that the name appears.

Let’s start by searching and replacing “Ocean Marketplace”. In VS Code there is a magnifying glass symbol in the left-hand panel (arrow 1 in the image below) that will open up the interface for searching and replacing text. Type “Ocean Marketplace” into the first textbox, and the name of your marketplace into the second textbox (arrow 2). To make things simple, there is a button to the right of the second textbox (arrow 3) that will replace all instances at once. You can take a moment to review all the text you’re changing if you wish and then click this button.

Next up, we need to repeat the process but this time we’ll be searching and replacing “Ocean Market”. As you can see in the screenshot below, we have called our fork “Formidable Data Market”.

Now let’s change the tagline of your site. Open up the folder called “content” and then open the file called “site.json”.

On line 4 in this file, you can enter the tagline that you want for your data marketplace.

4.6 Change the Logo

The next important step to personalising your data marketplace is setting your own logo; we highly recommend using your logo in SVG format for this. The site logo is stored in the following location:

src/images/logo.svg

Delete the “logo.svg” file from that folder and paste your own logo in the same folder. If you rename your logo “logo.svg” everything will work without any problems.

At this point, it's a good idea to check how things are looking. First, check that you have saved all of your changes, then cancel the build that's running in your terminal (Ctrl + C or Cmd + C) and start it again with `npm start`. Once the build has finished, navigate to http://localhost:8000/ and see how things look.

As you can see in the screenshot above, changing the site name and tagline has worked well. However, our logo doesn’t look quite right… To fix this, open up the Logo.module.css file from src/components/atoms/Logo.module.css. In this file, you can see the styling that is being applied to your logo.

Delete lines 7, 8 and 9 and save the file. Now open up http://localhost:8000/ and refresh the page (no need to restart the build here).

Awesome! Now our logo is fixed and looking great.

4.7 Change the Styling

Hopefully, you like our pink and purple branding but we don’t expect you to keep it in your own data marketplace. This step focuses on applying your own brand colours and styles.

4.7.1 Background

Let’s start with the background. Open up the following CSS file:

src/components/App.module.css

You’ll notice in the screenshot above that we are setting our “wave” background on line 3. Here, you’ll want to use your own background colour or image. For this example, we’ll use an SVG background from svgbackgrounds.com. We save the new background image into the src/images/ folder (same folder as the logo), then we change the CSS to the file location of the new background (see line 3 in the image below).

If we save this file and view the site at this point, we get an ugly white section at the top (see image below).

To fix this we need to change the starting position of the background image. In this situation, we can just delete the "13rem" offset from line 3.

Now when we view our data marketplace we can see that the new background starts at the top and fills the whole page. Perfect!

4.7.2 Brand Colours

Next up, let’s change the background colours to match your individual style. Open up the following file: src/global/_variables.css. Here you’ll see the global style colours that are set. Now is the time to get creative, or consult your brand handbook (if you already have one).

You can change these colours as much as you wish until you’re happy with how everything looks. Each time you save your changes, the site will immediately update so you can see how things look. You can see the styles chosen for this example in the image below.

And the screenshot below shows how it’s currently looking.

4.7.3 Change Fonts

The final part of the styling that we'll alter in this guide is the fonts. This is an important step because the font used in Ocean Market is one of the few elements of the market that are copyright protected; if you want to use the same font you'll need to purchase a license. The other copyrighted elements are the logo and the name, which we have already changed.

If you don’t already have a brand font, head over to Google Fonts to pick some fonts that suit the brand you’re trying to create. Google makes it nice and easy to see how they’ll look and it’s simple to import them into your project.

The global fonts are set in the same file as the colours, scroll down and you’ll see them on lines 36 to 41.

If you are importing fonts, such as from Google fonts, you need to make sure that you include the import statement at the top of the _variables.css file (Google Fonts provides you with this import statement so you just need to copy and paste it).

As with the colour changes, it’s a good idea to save the file with each change and check if the site is looking the way that you expected it to. You can see our eclectic choices below.

4.8 Change the Fee Address

At this point, we have made a lot of changes and hopefully you’re happy with the way that your data marketplace is looking. Given that you now have your own awesome data marketplace, it’s about time we talked about monetizing it. Yeah, that’s right — you will earn a commission when people buy and sell data assets on your marketplace.

When someone sets the pricing for their data assets in your marketplace they are informed that a commission will be sent to the owner of the data marketplace. In order to receive this commission, you need to set the market to send the commission through to your Ethereum address.

This important step is the last thing that we will change in this guide. To change the address that the marketplace fees are sent to, you need to set your address as an environment variable. Create a new file called .env in the root of your repository. It is important that the file is saved in the right place; your file structure should look the same as below.

Now copy and paste the following into the file:

GATSBY_MARKET_FEE_ADDRESS="0xxx"

You need to replace “0xxx” with your Ethereum address and save the file. And that’s it, you now have a fully functioning data marketplace that will earn you revenue every time someone uses it.
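The GATSBY_ prefix matters, by the way: Gatsby only exposes environment variables that start with GATSBY_ to browser-side code, which is how the front end can pick the fee address up at build and run time. The snippet below is purely illustrative; the real Ocean Market code reads this value in its own modules.

// Illustrative only: in Gatsby, env vars prefixed with GATSBY_ are embedded into client-side code.
const marketFeeAddress: string | undefined = process.env.GATSBY_MARKET_FEE_ADDRESS;
console.log(marketFeeAddress);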

4.9 Build and host your Data Marketplace

All that’s left is for you to host your data marketplace and start sharing it with your future users. To host your data marketplace you need to run the build command:

npm run build

This takes a few minutes to run. While this is running you can get prepared to host your new data marketplace. You have many options for hosting your data marketplace (including AWS S3, Vercel, Netlify and many more). In this guide, we will demonstrate how to host it with surge, which is completely free and very easy to use.

Open up a new terminal window and run the following command to install surge:

npm install --global surge

When this is complete, navigate back to the terminal window that is building your finished data marketplace. Once the build is completed, enter the following commands to enter the public directory and host it:

cd public
surge

If this is your first time using surge, you will be prompted to enter an email address and password to create a free account. It will ask you to confirm the directory that it is about to publish; check that you are in the market/public/ directory and press enter to proceed. Now it gives you the option to choose the domain that you want your project to be available on. We have chosen formidable-data-market.surge.sh which is a free option. You can also set a CNAME value in your DNS to make use of your own custom domain.

After a few minutes, your upload will be complete and you’re ready to share your data marketplace. You can view the version we created in this guide here.

5. Conclusion

Well done for making it to the end of this guide. We've covered all of the most important steps for getting your own data marketplace set up:

Forking Ocean Market.
Changing the Logo.
Setting the name and tagline for your data marketplace.
Customising the styling using your own fonts, colours and background.
Setting your own Ethereum address as an environment variable to receive the marketplace fees.
Building and publishing your data marketplace.

Hopefully you’ve enjoyed this guide and found it productive. If you’ve found it valuable, give us a clap — if we get a good response we’ll write a follow-up guide on all of the feature toggles that you can use out of the box with your fork of Ocean Market.

If you have come across any problems or issues, reach out to us on Discord or open an issue on GitHub. We are always happy to help.

All of the code in this tutorial has been pushed to this branch on Github.

6. Further Reading

Ocean Protocol documentation has tutorials and covers the core concepts and API references.
Ocean Market readme covers the core topics to get you started with running your own fork.
The official React documentation and Gatsby documentation will help you make changes to the front end.
If you intend on making any changes to the search then the Elastic search documentation is incredibly useful.

Launch a blockchain-based data marketplace in under 1 hour was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Secure Key

Interac and SecureKey fuel transformative milestone in Canada’s digital ID ecosystem

The post Interac and SecureKey fuel transformative milestone in Canada’s digital ID ecosystem appeared first on SecureKey Technologies Inc..

Affinidi

A Detailed Guide on Selective Disclosure


Vince walks into a casino and the security at the entrance wants to know if Vince is over 21 years old, as that’s the rule to legally play in the casinos.

Now, Vince is in a dilemma. He wants to prove that he is over 21 years old, but doesn’t want to disclose any other information to the security such as his name or address. How can he do that?

For starters, it is not possible through physical IDs such as your driver’s license or passport as they will have your name on it. So, how can Vince prove his age without disclosing any other personal information?

Think about it for a moment.

This is where selective disclosure comes into play.

What is Selective Disclosure?

Selective disclosure is one of the pillars of Self-Sovereign Identity and it enables individuals to share just what is needed by the recipient to process the data and take actions based on it.

In the above example, selective disclosure allows Vince to share just his age with the security as that alone is enough to decide if he should be allowed to enter the casino or not.

Benefits of Selective Disclosure

The benefits of selective disclosure are:

Enhances the privacy of the user as minimum information is exchanged.
Empowers users to determine what data must be shared with whom.
Reduces processing time for the recipients as only a small set of information has to be verified.
No storage and related security hassles for recipients.

Now that you know what selective disclosure is and its associated benefits, let’s jump to its implementation.

How Does Selective Disclosure Work?

One of the best ways to implement selective disclosure is through Verifiable Credentials (VCs), a tamper-proof and cryptographically-verifiable way of sharing Personally Identifiable Information.

Here is what a VC looks like.

There are three entities to a VC: the issuer, the holder, and the verifier; together, they form what's called a trust triangle. A VC has three key components, namely:

Metadata that identifies the issuer and the holder.
Claim, which is the data that a holder wants to share with the verifier.
Proof that includes digital signatures of both the issuer and the holder for authenticity.

Let’s look closely at the claim, as this is where selective disclosure is implemented.

In the above example, there is only a single claim, that the holder is an alumni of a specific institution.

Let’s look at another VC.

{
  "@context": [
    "http://schema.org/",
    "https://w3id.org/security/v2",
    "https://w3id.org/security/bbs/v1"
  ],
  "@type": "Person",
  "firstName": "Jane",
  "lastName": "Doe",
  "jobTitle": "Professor",
  "telephone": "(425) 123-4567",
  "email": "jane.doe@example.com",
  "proof": {
    "type": "BbsBlsSignature2020",
    "created": "2020-04-25",
    "verificationMethod": "did:example:489398593#test",
    "proofPurpose": "assertionMethod",
    "proofValue": "F9uMuJzNBqj4j+HPTvWjUN/MNoe6KRH0818WkvDn2Sf7kg1P17YpNyzSB+CH57AWDFunU13tL8oTBDpBhODckelTxHIaEfG0rNmqmjK6DOs0/ObksTZh7W3OTbqfD2h4C/wqqMQHSWdXXnojwyFDEg==",
    "requiredRevealStatements": [4, 5]
  }
}

This contains a ton of information about the holder such as first name, last name, job title, email, etc. Many times, the holder may not have to share all this information with potential verifiers. At the same time, an issuer may find it cost-effective and convenient to issue these credentials in one go.

So, how do you balance between what the issuer offers and what the holder wants?

You guessed it — selective disclosure!

When an issuer creates and sends a verifiable credential, the holder stores it in his or her digital wallet. When it is time to share it with the verifier, the holder compiles the required credentials together into a verifiable presentation and sends it to the verifier.

Every time a holder shares a verifiable presentation with the verifier, he or she digitally signs it using public-key cryptography. Essentially, the holder signs with his or her private key and the verifier verifies the signature with the associated public key.

Likewise, the verifiable presentation will also contain the issuer’s digital signature that is signed using its private key.
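As a loose analogy, the sign-and-verify dance looks like the sketch below. It uses plain Ed25519 keys from Node's built-in crypto module rather than a VC-specific suite such as BbsBlsSignature2020, so treat it as an illustration of public-key signing in general, not of how a real wallet builds a presentation.

import { generateKeyPairSync, sign, verify } from 'crypto';

// Loose analogy only: real verifiable presentations use VC signature suites,
// not a raw Ed25519 signature over a JSON string.
const { publicKey, privateKey } = generateKeyPairSync('ed25519');

const presentation = Buffer.from(JSON.stringify({ claim: 'over 21' }));

// The holder signs the presentation with their private key...
const signature = sign(null, presentation, privateKey);

// ...and the verifier checks the signature against the holder's public key.
console.log(verify(null, presentation, publicKey, signature)); // true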

Now, you may wonder how the holder can store each credential separately and compile them separately so as to have the issuer's signature on each of them.

To make the question clear, in the above example, let's say the holder shares only the phone number with a verifier. This is selective disclosure. But how can the issuer's signature, which applies to all of the credentials, be used for just one? Isn't it a single block of content to which the entire signature applies?

Well, that’s where BBS+ signatures come in.

Implementing Selective Disclosure using BBS+ signatures

BBS+ signatures are a good way to implement selective disclosure as they allow the holder to share just a part of a verifiable credential with a verifier.

The above verifiable credential of Jane Doe supports selective disclosure because it uses a BBS+ signature. Here is the line that shows this:

"type": "BbsBlsSignature2020",

As a result, the holder can send just her phone number to a verifier, her email address to another entity, and so on, and all of it will have the same digital signature.

Affinidi’s Implementation of Selective Disclosure

At Affinidi, we have implemented the BBS group signature schema on our tech stack to give holders the flexibility to build a presentation with only the required fields from a credential. These fragments are cryptographically verifiable as they are signed by the BBS+ signature.

Specifically, we have created an API to create and share the URL for the VC fragment.

The createShareUrl API is what enables selective disclosure in Affinidi's tech stack. It takes credentialID as a required parameter and an optional array of selective disclosure fields to build fragment VCs. Of course, the chosen VC should support selective disclosure.

If this parameter is provided and the VC supports selective disclosure, the fragment VC will be built first and then the shareUrl will be created for it.
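To give a feel for the shape of such a call, here is a hypothetical sketch based purely on the description above; the real SDK's client setup, method, and parameter names may differ.

// Hypothetical sketch: the interface below is inferred from the description above,
// not copied from the actual Affinidi SDK.
interface WalletClient {
  createShareUrl(params: { credentialId: string; fields?: string[] }): Promise<string>;
}

async function shareTelephoneOnly(wallet: WalletClient, credentialId: string): Promise<string> {
  // Only the telephone claim is disclosed; the BBS+ signature still verifies for the fragment VC.
  return wallet.createShareUrl({ credentialId, fields: ['telephone'] });
}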

To learn more about this API and how you can leverage, reach out to us on Discord or email us. Also, follow us on LinkedIn, Twitter, and Facebook.

Join our mailing list to stay abreast of all interesting developments in SSI.

The information materials contained in this article are for general information and educational purposes only. It is not intended to constitute legal or other professional advice.

A Detailed Guide on Selective Disclosure was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


Tokeny Solutions

Where Do Custodians Stand in the Age of Digital Assets?

The post Where Do Custodians Stand in the Age of Digital Assets? appeared first on Tokeny Solutions.

The digital asset space continues to thrive and attract institutional investors: according to PitchBook, $17 billion worth of institutional capital has poured into the sector this year alone. Moreover, 70% of institutional investors expect to invest in digital assets in the future. Since the funding in the space has exploded, traditional custodians are trying to figure out how they fit in as new players on this value chain. Custodian banks have been slow to adapt and are now struggling with managing digital asset custody. Even worse, they mix up crypto-currencies and tokenized financial securities.

Digital assets: new opportunities for custodians

It is natural that custodians need time to understand how blockchain technology works. Once that happens, the lucrative opportunities that await on the other side will become clear. Here, we list a few benefits and opportunities:

New safekeeping service

As digital wallets are the access points for digital assets, custodians can provide custodial wallets for investors.

Reduced operational costs

When all investor positions are kept on the blockchain, it eliminates the need for reconciliation of depository in the process. This reduces the heavy reconciliation costs.

Immediate services

Custodians can provide immediate value-added services such as owning the recovery process for lost security tokens, reporting, tax certification, lending services and other collateral management services to investors.

We used the word 'immediate' because, with the blockchain, processing these services relies on a single source of truth. Issuers, agents and investors can access the real-time data at any time. Services can be provided promptly by avoiding the lengthy process of flowing information through the traditional chain of custody and the risk of human errors.

The main challenge is not regulation but the adoption of the new model 

People often think the main challenge in this space is regulation, but it is actually not true. Instead, the real struggle for financial institutions including custodians is adopting new processes and new tools. When it comes to compliance, it is mostly about verifying the eligibility of investors. This can’t be performed by identifying investor wallets, as they are browsers for the internet of value, not vaults.  That is where digital identity comes into play, but how does it work? Let’s take a look at the main characteristics in the blockchain-based model of custody to provide some clarity:

Smart contracts

The smart contracts represent the asset(s) on the blockchain infrastructure. They code compliance rules into the security tokens, and only eligible investors can hold these tokens. The serious protocols for tokenized securities usually include a token recovery function that can be triggered by authorized agents.

Tokens

The tokens that have eligibility rules embedded into themselves are permissioned tokens, and they can only be traded between eligible counterparts.

Digital identity

Investors have a digital identity that stores verified identity proofs on-chain, and smart contracts can check these proofs to verify the eligibility of investors.

Wallets

Once the smart contract code confirms the eligibility of the investor, the permissioned tokens are allocated to wallets associated with the investor’s digital identity. If the private key or access credential is lost, the permissioned tokens can be recovered to another wallet. This happens after verification of the investor’s identity by the token issuer or a third-party agent.

 

Use cases: recovery of security tokens

Custodians therefore need access to the token smart contracts, not to the wallets, to safeguard investors’ assets. For example, issuers of tokenized securities using the T-REX Protocol, now recognized as ERC3643 by the Ethereum community, can appoint a custodian bank as the recovery agent and grant that bank access to the recovery function of the smart contracts. In practice, the security tokens can be recovered in just a few steps:

1. Declare the loss of the wallet: the investor sends a request to the custodian bank to recover their tokens.

2. Verify the digital identity (ONCHAINID): the custodian bank verifies the token holder’s identity with a standardized KYC process to validate the provenance of the request.

3. Recover the same tokens in the new wallet: if the verification is positive, the custodian bank triggers the recovery function and the lost tokens are transferred to the new investor wallet. This new wallet is also added to the investor’s ONCHAINID. The custodian can perform this operation even without access to the investor’s wallet.
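To make step 3 concrete, here is a minimal sketch of the custodian-side call, assuming an ethers.js setup in TypeScript. The recoveryAddress function name and signature are assumptions based on the T-REX/ERC3643 recovery function described above; consult the actual contract ABI before relying on them.

// Hypothetical sketch: a custodian acting as recovery agent triggers the
// recovery function of an ERC3643-style security token contract.
// The function name, signature and addresses are assumptions for illustration.
import { ethers } from "ethers";

const RECOVERY_ABI = [
  // assumed ABI fragment; check the deployed T-REX / ERC3643 contract
  "function recoveryAddress(address lostWallet, address newWallet, address investorOnchainID) returns (bool)",
];

async function recoverTokens(
  tokenAddress: string,
  lostWallet: string,
  newWallet: string,
  investorOnchainID: string,
  agentSigner: ethers.Signer // the custodian bank's recovery agent key
): Promise<void> {
  const token = new ethers.Contract(tokenAddress, RECOVERY_ABI, agentSigner);

  // The custodian calls this only after the off-chain KYC check against the
  // investor's ONCHAINID has succeeded (step 2 above).
  const tx = await token.recoveryAddress(lostWallet, newWallet, investorOnchainID);
  await tx.wait(); // the lost tokens are reissued to the new, identity-bound wallet
}

Note that the custodian signs this transaction with its own agent key; at no point does it need the investor’s private key.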

Time to embrace and explore the new technology 

Traditional custodians have an important role to play in digital asset custody, but they need an updated approach. Some quick movers are already reacting and updating their models. Recently, US custodian bank State Street launched a digital finance division to expand into the space. Why? Client demand. Certainly, that demand will continue to increase, and as mentioned earlier, seven out of ten institutional investors plan to invest in digital assets.

The good news is that the technology has matured after years of use, securing billions worth of digital assets. Custodians can implement it quickly and straightforwardly. It is time to move and keep up with the technology in order to gain a competitive advantage; those who react more slowly risk falling behind the competition or being displaced by new players entering the space.

As Nadine Chakar, the lead of the new digital finance division at State Street, said, “This space is evolving at warp speed. I believe the winners will be those who embrace change versus those who stand back and watch – they will be disintermediated or disrupted. Business leaders, including myself, must set the tone that standing still is not an option.”

Are you ready for digital asset custody?

We provide you with an enterprise-grade technology shield to update your custody services quickly and straightforwardly.

Contact Us

The post Where Do Custodians Stand in the Age of Digital Assets? appeared first on Tokeny Solutions.


Dock

Proposal to Update Network Rewards Split

A new proposal has been submitted to adjust the staking rewards emissions on the Dock network to a 50/50 split with 50% of emission rewards going to validators and stakers, and 50% going to the Dock Treasury.

This would increase the share of staking rewards going to the validators and stakers who help secure the Dock network, who currently receive 40% of rewards, and reduce the Treasury’s allocation from 60%. The initial division of staking rewards was based on Polkadot’s reward split, but we feel a 50/50 split better aligns incentives across the network, increasing the rewards to validators while still adequately funding the Dock Treasury to develop and operate the network.

This proposal is live on the Dock Network and can be viewed in the Governance portal here: ​​https://fe.dock.io/#/democracy. All Dock token holders can vote to influence the outcome of this proposal and should participate in determining this important change to the network.

How to vote

1. Go to the Governance portal, where you will see the Referendum, and click Vote on the proposed Referendum.

2. Select which account you want to use to submit the vote and click Aye or Nay. Select a Conviction amount, which lets you choose to lock up tokens in order to put more weight behind your vote. You can also choose not to lock up your tokens as part of the vote.

Once the voting period ends as shown in the Governance portal, if the referendum passes, then it will move to be executed by the Association Council. Otherwise, the rewards split will remain as it currently is with a 60/40 split.
For assistance with voting on the referendum, please refer to the documentation or contact our team at support@dock.io.


MyDEX

FLICKING THE SWITCH OF PERSONAL DATA

We believe individuals should be able to access and use their own data to be able to manage their lives better. Currently, this isn’t possible because every individual’s data is dispersed across dozens (probably over a hundred) different organisations that hold it and don’t make it available. This is absurd and unfair.

Over the last 14 years we have built the infrastructure needed to make citizen data empowerment possible — infrastructure capable of providing every individual with their own personal data store, where they can safely and securely collect their own data, use it and share it under their own control. This infrastructure is now live and operational, officially recognised as a supplier to public services on procurement platforms in both England and Scotland and independently accredited for data management security under ISO 27001.

Unleashing the potential

But what we’ve also learned over these 14 years is that core infrastructure is not enough. A parallel: it is extraordinary and wonderful that we have water infrastructure that brings fresh, safe water to our homes and offices, and a national grid that does the same with electricity. But if we didn’t have taps and switches to turn the water and electricity on and off as and when we need them, they wouldn’t be half as valuable as they are.

So, over the past few years, we’ve also been building the taps and switches that are needed to make our citizen empowering personal data logistics infrastructure really useful. This blog outlines some of them.

Smart directories

One of the really big, time-consuming, frustrating and expensive things every individual and every organisation has to contend with is what we call ‘matching and connecting’. Individuals want to find the services that can help them with a particular task, but often they don’t know who provides them or where to find them. They might not even know that such services exist. Likewise, organisations offering a particular service often struggle to find and reach the particular people who really want or need it.

A huge amount of time, money and effort is wasted by both individuals and organisations trying to solve these puzzles, often in expensive and unsatisfactory ways.

With smart directories, individuals can allow selected organisations to see an anonymised profile of themselves (shorn of any data that would identify the particular individual), and the selected organisations can see whether individuals fit the criteria their service is designed for. If there is a fit, the organisation can use the platform to send a message to the individual. If the individual decides to accept the message and respond, a connection is established.

Smart Directories lie at the heart of the work we are currently doing with the Office of the Chief Designer in Scotland and Connecting Scotland to radically reduce the costs of finding and working with citizens to help co-design public services.

Automated form filling

There is little more dispiriting and irritating than having to fill in forms, especially when you have to do it time and time again, providing the same information to different people. Our platform enables the sharing of verified attributes in particular, so if an individual says ‘Yes, I would like this service’, the necessary information (and only the necessary information) can be automatically sucked out of their PDS and sent to the service provider, who then doesn’t have to waste time, money and effort checking whether it is correct.

This eliminates friction, effort, risk and cost for both sides while radically speeding up the process. Using our infrastructure, what used to take weeks or months (the individual painstakingly and manually filling out a form; the organisation painstakingly checking every piece of information within the form — often having to pay external service providers for the privilege), can now take minutes.

This approach is central to the Scottish Government’s planned Scottish Attribute Provider Service, which could revolutionise both citizens’ experience of accessing and using public services in Scotland and the costs these services incur.

Circles

In many service situations, different groups of people need to talk to each other to make arrangements, coordinate diaries, deal with changes, share results of meetings, organise follow ups, and so on. Often this is done by phone or email where the information gets lost or becomes difficult to find and where separate processes are then needed to share associated data.

To address this need, we have created what we call ‘Circle’ capabilities by which an individual can create a specific circle of contacts for a particular need (a bit like a WhatsApp group) but where a) all data generated by the conversation is automatically recorded into the individual’s PDS and b) where if related information needs to be shared it can be, automatically, again via the individual’s PDS.

Individuals having to manage their cancer journeys provide a good example. Each cancer journey requires a lot of organisation and coordination, with different groups of people for different purposes. For example, the patient will need to arrange and attend appointments and share clinical data with medics; to coordinate arrangements with carers (both paid and unpaid); to manage related domestic arrangements with friends and family (can someone walk the dog while I’m recovering from chemo?); and to connect with specialist service providers.

In our work with Macmillan My Data Store, we have created specific Circles (e.g. for friends and family or small service providers) where all of these tasks can be undertaken safely and efficiently, thereby lifting an energy and emotional burden off cancer patients while helping service providers work more productively.

The same core technologies are now also being used in the Revolutionising Healthy Ageing project, where multiple different service providers need to come together to create an integrated, joined-up service for citizens that deals safely and respectfully with multiple aspects of their lives.

Web apps

For services like these to work, people and the services they engage with need an interface. Traditionally, this interface has been provided by a single, isolated service provider via their website or app. But this requires individuals to sign in to each particular service and doesn’t enable dots to be joined between different providers. And if it’s an app, it creates a new dependency on Silicon Valley monopolists like Google and Apple with their app stores, while requiring that the individual has a smart device.

What’s more, the interface acts as a data suction device, pulling the user’s data into the systems of the app provider. In other words, the current approach is not only non-inclusive, inefficient and restrictive, it is also privacy-invading. Our answer: our web apps.

Web apps can be used on any device connected to the Internet, independently of the Silicon Valley monopolists’ app stores. Our architecture for web apps goes further, placing data that is generated in the individuals’ own personal data store.

And there is something else. With our Web Apps, front-line service providers (who know each step of a process inside out) can map each of these steps out, identifying exactly what information needs to be shared when, to create a seamless journey. We have developed the technology by which the resulting interface (the App itself) is generated without the front-line service providers having to know anything about software or code. This means the people who really know what each service needs can quickly and easily generate interfaces with service users, as and when they need them.

With each Web App directly linked to the individual’s PDS, any information that needs to be shared (for example, a form needing to be completed) can be automatically sucked out of the PDS. This makes access to service modules like Smart Directories and Automated Form Filling instant and easy.

Citizen Co-design

For any service to be really useful, the most important input is that of the user — because users are the only people who really know what it feels like to use a service.

Every service module we develop — including all of the above — has been developed working with citizens who actually use the service. We have developed skills and processes to facilitate this and we are continuing to research how to best engage with citizens to really make their contributions easy, fulfilling, informative and actionable. This is part of the work we are doing on co-design of public services discussed above.

Our Inclued platform

Many service providers don’t need just one of the above. They need them all, and more. They need channels to communicate and engage with citizens, to send and receive messages, and the ability to act on those messages.

Working with Glasgow City Council we have built a platform, called Inclued, which does all these things. Using Inclued, the Council can use the above capabilities to present citizens with incentives and offers, create communities, and seek feedback or direct engagement to gain insights and improve the targeting of their services. Inclued also eradicates form filling through secure, informed data sharing. Inclued is now being used as part of our work with the Scottish Government and Connecting Scotland and our work with Blackwood Homes and Care as part of the three year programme with three local communities working on Healthy Ageing.

Consent Management Dashboards

For all of the above to work — for citizens to remain in control of their data and to have continued trust and confidence in what’s happening — they need quick, easy, simple ways to see which service provider has had access to what information, for what purposes. They need tools which enable them to make changes — such as withdrawing consent — when and if they want to; and to exercise their rights under data protection legislation if needed.

Currently this is practically impossible. Most of us have data sharing relationships with over 100 organisations (across health, financial services, public administration, education skills and employment, travel, retail, leisure, media and so on). Currently, to do anything relating to data with them, we have to jump through hoops, signing in to each different organisation’s website or app, navigating our way through to MyAccount, scrolling through Settings and so on.

The costs and hassle of doing this in a consistent way across 100 or more different relationships are so prohibitively high that hardly anyone bothers. What we have today is an entire data ecosystem and economy built on learned citizen helplessness — the realisation that exercising genuine control over your own data is such a vast, time consuming, hasslesome task that it’s not worth even trying.

We are changing that by building consent management dashboards that enable citizens to see and manage all their data relationships simply, quickly and easily from one place within their PDS. This includes the ability to create general permission and consent settings. What if, for example, instead of being confronted with a separate pop-up to set cookie permissions each time you visit a website, your PDS could instantly inform the website that (for example) ‘I am happy with cookies for performance analytics but not for third-party advertising’?
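As a purely illustrative sketch (the field names and query function below are hypothetical, not Mydex’s actual implementation), such general consent settings could be a small record held once in the PDS and answered automatically to any website that asks, instead of the individual re-answering a pop-up on every site:

// Hypothetical sketch: cookie/consent preferences stored once in a personal
// data store (PDS) and used to answer websites' consent queries automatically.

interface ConsentPreferences {
  performanceAnalytics: boolean;
  thirdPartyAdvertising: boolean;
  functionalCookies: boolean;
}

// The individual sets this once in their PDS dashboard.
const myPreferences: ConsentPreferences = {
  performanceAnalytics: true,   // happy with cookies for performance analytics
  thirdPartyAdvertising: false, // but not for third-party advertising
  functionalCookies: true,
};

// A website asks the PDS which cookie categories it may use.
function answerConsentQuery(
  prefs: ConsentPreferences,
  requestedCategories: Array<keyof ConsentPreferences>
): Record<string, boolean> {
  const answer: Record<string, boolean> = {};
  for (const category of requestedCategories) {
    answer[category] = prefs[category];
  }
  return answer;
}

// Example: a site requesting analytics and advertising consent gets an
// instant, pre-authorised answer, with no pop-up shown to the individual.
answerConsentQuery(myPreferences, ["performanceAnalytics", "thirdPartyAdvertising"]);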

Our consent management dashboard is still work in progress. There are still many bits of functionality that need to be built. But the core is there, working to support all of the above service modules.

Summary

Giving individuals the means to exercise control over their data — to be able to access and use their own data for their own purposes — is easy to say but hard to do.

Doing it in a way that also helps service providers benefit from citizen empowerment adds another level of challenge.

We’re not finished with this task by any means. But we have made a good start. Many core capabilities and functions are built and already adding value for both citizens and bona fide service providers. We’ve got many more in the pipeline.

The opportunity to unleash the full personal, social and economic potential of personal data is immense. And we are now on our way.

However, for all these services to be truly valuable they have to demonstrate built-in integrity. That’s the subject of our next blog.

FLICKING THE SWITCH OF PERSONAL DATA was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


Okta

An Overview of Best Practices for Security Headers

Many decisions go into the process of creating a secure website. One of these decisions is selecting which HTTP security headers to implement. Today, we’ll dive into the most important HTTP security headers and the best practices that will strengthen your website’s security.

Table of Contents

The Security Headers
  HTTP Strict Transport Security (HSTS)
  Content-Security-Policy (CSP)
  X-XSS-Protection
  X-Frame-Options
  Referrer-Policy
  X-Content-Type-Options
  Permissions-Policy
Configuring a Security Header
  Nginx
  Apache
  IIS
  Firebase
Learn More About Security Headers

The Security Headers

HTTP security headers are HTTP response headers designed to enhance the security of a site. They instruct browsers on how to behave and prevent them from executing actions that would endanger your users.

HTTP Strict Transport Security (HSTS)

First, the Strict-Transport-Security header forces the browser to communicate with HTTPS instead of HTTP. HTTPS is the encrypted version of the HTTP protocol. Strictly using HTTPS can prevent most man-in-the-middle and session hijacking attacks.

This header has two configuration options: max-age and includeSubDomains. max-age is the number of seconds the browser should remember this setting. And if includeSubDomains is selected, the settings will apply to any subdomains of the site as well.

Strict-Transport-Security: max-age=31536000; includeSubDomains

Ideally, this header should be set on all pages of the site to force browsers to use HTTPS.

Content-Security-Policy (CSP)

The Content-Security-Policy header controls which resources the browser is allowed to load for the page. For example, servers can restrict the scripts browsers use to a few trusted origins. This prevents some cross-site scripting attacks that load scripts from a malicious domain.

<script src="attacker.com/cookie_grabber.js"></script>

There are many different directives in the policy, but the most important one is script-src, which defines where scripts can be loaded from. Other directives include default-src, object-src, img-src, and more. You can define a policy using the header syntax:

Content-Security-Policy: RESOURCE-TYPE ORIGIN ORIGIN ORIGIN ...

For example, this policy limits the source of scripts to the current domain, and “www.okta.com”. self represents the current domain.

Content-Security-Policy: script-src 'self' https://www.okta.com

The default-src directive defines the policy for any resource that does not already have a policy. For example, this policy tells the browsers that all scripts should come from subdomains of “okta.com”, and all other resources should only load from the current domain.

Content-Security-Policy: default-src 'self'; script-src https://*.okta.com

X-XSS-Protection

This header controls the XSS auditor on the user’s browser. There are four options for this header.

X-XSS-Protection: 0 (Turns off XSS Auditor)
X-XSS-Protection: 1 (Turns on XSS Auditor)
X-XSS-Protection: 1; mode=block (Turns on XSS Auditor, prevents rendering the page when an attack is detected)
X-XSS-Protection: 1; report=REPORT_URI (Sanitizes the page and sends a report to the report URL when an attack is detected)

XSS auditors are built-in XSS filters implemented by some browsers. However, they are not a reliable way to protect your site against XSS attacks. Many browsers have removed their built-in XSS auditor because they can help attackers bypass XSS controls implemented by websites.

The current best practice is turning the XSS auditor off and implementing comprehensive XSS protection on the server side.

X-XSS-Protection: 0

X-Frame-Options

The X-Frame-Options header prevents clickjacking attacks. Clickjacking is an attack in which attackers frame the victim site as a transparent layer on a malicious page to trick users into executing unwanted actions.

This header instructs the browser whether the page’s contents can be rendered in an iframe. There are three options: DENY, SAMEORIGIN, and ALLOW-FROM.

X-Frame-Options: DENY (Page cannot be framed)
X-Frame-Options: SAMEORIGIN (Allow framing from pages of the same origin: same protocol, host, and port)
X-Frame-Options: ALLOW-FROM https://google.com (Allow framing from the specified domain)

One of these options should be set on all pages that contain state-changing actions.

Referrer-Policy

The Referrer-Policy header tells the browser when to send Referrer information. This can help prevent information leakages offsite via Referrer URLs. There are many options for this header, the most useful ones being no-referrer, origin, origin-when-cross-origin, and same-origin. Note that “referrer” is not misspelled in this header like it is in HTTP’s “Referer”!

Referrer-Policy: no-referrer (Do not send referer)
Referrer-Policy: origin (Send the origin, no path or parameters)
Referrer-Policy: origin-when-cross-origin (Send the origin when the destination is offsite; otherwise, send the entire referer)
Referrer-Policy: same-origin (Send referer when the destination is of the same origin; otherwise, send no referer)

You should consider using one of the above options as your Referrer-Policy header. They all protect against user info leaks in a referer path or parameter. In addition to setting the correct Referrer-Policy header, you should also avoid transporting sensitive information in URLs if possible.

X-Content-Type-Options

This header prevents MIME-sniffing. MIME-sniffing is when browsers try to determine the document’s file type by examining its content and disregarding the server’s instructions set in the Content-Type header.

MIME-sniffing is a useful feature but can lead to vulnerabilities. For example, an attacker can upload a JavaScript file with the extension of an image file. When others try to view the image, their browsers detect that the file is a JavaScript file and execute it instead of rendering it as an image. Setting this header to nosniff will prevent MIME-sniffing.

X-Content-Type-Options: nosniff

Ideally, this header should be set for all content so that your website can decide how the browser renders files by setting the Content-Type response header. You could also use a separate subdomain to host user-uploaded content to prevent potential XSS attacks on the main domain.

Permissions-Policy

The Permissions-Policy header lets you enable and disable browser features. For example, you can control whether the current page and any pages it embeds have access to the user’s camera, microphone, and speaker. This allows developers to build sites that protect users’ privacy and security. The Permissions-Policy header looks like this.

Permissions-Policy: FEATURE=(ORIGIN), FEATURE=(ORIGIN)

Permissions-Policy: microphone=(), camera=()

There are three options for the allowed ORIGINs of each feature.

Permissions-Policy: microphone=(*) (Microphone will be allowed in this page and all framed pages)
Permissions-Policy: microphone=(self) (Microphone will be allowed in this page and in framed pages of the same origin)
Permissions-Policy: microphone=() (Microphone will be disallowed in this page and all framed pages)

You can also specify the specific domain where the feature is allowed:

Permissions-Policy: microphone=(self "https://example.com")

You can configure these directives according to your needs. It’s a good idea to place some control over the features that your iframes can access.

Configuring a Security Header

After you’ve determined which headers to use, you can configure your server to send them with HTTP responses.
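The examples below cover common web servers and hosting platforms. If your site is served by an application framework instead, the same headers can usually be set in application code; the following sketch (not from the original article) assumes a Node.js app using Express and applies the example values discussed above to every response:

// Illustrative sketch: setting the security headers discussed above from
// application code, assuming Node.js with Express installed.
import express from "express";

const app = express();

// Attach the security headers to every response.
app.use((req, res, next) => {
  res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
  res.setHeader("Content-Security-Policy", "default-src 'self'; script-src https://*.okta.com");
  res.setHeader("X-XSS-Protection", "0");
  res.setHeader("X-Frame-Options", "SAMEORIGIN");
  res.setHeader("Referrer-Policy", "same-origin");
  res.setHeader("X-Content-Type-Options", "nosniff");
  res.setHeader("Permissions-Policy", "microphone=(), camera=()");
  next();
});

app.get("/", (req, res) => res.send("Security headers applied"));
app.listen(3000);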

Nginx

In Nginx, you can add a header by adding these lines to your site’s configuration.

add_header X-Frame-Options SAMEORIGIN always;
add_header Content-Security-Policy "default-src 'self' https://*.okta.com";
add_header Permissions-Policy "microphone=()";

Apache

In Apache, the syntax is similar.

Header always set X-Frame-Options "SAMEORIGIN"
Header set Content-Security-Policy "default-src 'self' https://*.okta.com"
Header always set Permissions-Policy "microphone=(), camera=()"

IIS

Finally, you can configure headers in IIS by adding custom headers to your site’s configuration file.

<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <add name="X-Frame-Options" value="SAMEORIGIN" />
        <add name="X-XSS-Protection" value="0" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>

Firebase

Major cloud providers also give you options to customize the security headers you use. For instance, if you use Firebase Hosting, you can add security headers to the firebase.json file. Under the hosting configuration, add a headers entry that specifies which files it applies to and the security headers you want to send:

"headers": [ { "key": "Permissions-Policy", "value": "microphone=(), camera=()" }, {"key": "X-Frame-Options", "value": "DENY" } ]

For more information about how to configure security headers on different cloud providers, such as Heroku, Netlify, and AWS, read Angular Deployment with a Side of Spring Boot.

Learn More About Security Headers

In this post, we looked at some of the most important HTTP security headers. By using these headers on your site, you’ll be able to prevent some basic attacks and improve your site’s security! securityheaders.com is a good resource to help you implement the correct security headers. It can scan your website and point out which security headers you have implemented and which are still missing. You can even try it on this site.

Got questions or feedback about HTTP security headers and how to improve the security score of your webpages? Drop a comment below; we’re happy to hear from you. Want to stay up to date on the latest articles, videos, and events from the Okta DevRel team? Follow our social channels: @oktadev on Twitter, Okta for Developers on LinkedIn, Twitch, and YouTube.


PingTalk

8 Benefits of Multi-Factor Authentication (MFA)

Multi-factor authentication (MFA) reduces the risk of security breaches from occurring and keeps data safe. In the past, requiring a static username and password to access an account seemed sufficient for security. However, weak or stolen passwords can be used to execute fraud attacks and data breaches when they are the only form of authentication required. Using MFA to bolster password security with another form of authentication is proven to keep hackers out of your systems. According to Microsoft, MFA can “prevent 99.9 percent of attacks on your accounts.”
 

Sunday, 17. October 2021

KuppingerCole

Analyst Chat #99: Protecting OT and ICS

John Tolbert sits down with Matthias and shares his insights into current approaches for protecting and defending essential enterprise systems beyond traditional, often office-focused cybersecurity. Safeguarding Operational Technology (OT), Industrial Control Systems (ICS), and the Industrial Internet of Things (IIoT) is getting increasingly important. John explains that modern approaches like Network Detection and Response (NDR) and especially Distributed Deception Platforms (DDP) can be valuable building blocks in an overall strategy for defending, for example, the factory floor or critical clinical systems.




Veridium

One Size Doesn’t Fit All – Authentication Journeys to Digital Transformation

The 2020 pandemic and resulting work-from-home experience yielded an important conclusion regarding cybersecurity: identity is the new perimeter. Sixty-three percent of all data breaches exploit weak credentials, and incidents like SolarWinds highlight the need to focus cybersecurity on identity and access management (IAM) systems. Many organizations used the pandemic to accelerate their digital transformation projects with a focus on identity, but met additional challenges such as:

Bring-Your-Own-Device (BYOD): employees (new and existing), contractors, and freelancers require flexibility regarding the types of devices used and their capabilities to ensure proper identity verification, authentication, and credential storage.

Passwordless authentication: some devices are capable of securely storing credentials that can be used for proof-of-possession by an individual. This can be a convenience and productivity boost and reduce cybersecurity risks.

Biometrics: with new use cases for remote identity verification, and to further ensure proper identity binding to device-based credentials, biometrics can be combined with proof-of-possession to enable passwordless authentication.

Software-defined perimeters: in addition to or instead of virtual private networks (VPNs), multiple authentication flows can be combined to secure authorized access to specific resources within a perimeter. This can prevent data breaches, even in cases of VPN compromise (as in the SolarWinds attack), while enabling fine-grain audit for data protection compliance.

These challenges can make digital transformation a daunting challenge for many CISO and IT managers already under pressure to maintain their status quo password-based IAM systems. Luckily, the new concept of “auth journeys” lets them move beyond the limitations and complexities of existing AD and LDAP-based approaches. “Auth journeys” combine both authentication (authn) and authorization (authz) into processes defined at the user experience (UX) level instead of low-level auth protocols like OAuth, OIDC, SAML, etc.

An authentication journey is a workflow made up of multiple steps in the authentication and authorization processes available within an enterprise. Such steps include multi-factor authentication (MFA), biometrics, geolocation, PIN, credential checks, and even traditional methods (e.g., username & password). Most IAM systems are never fully replaced; they include old and new systems and are carefully migrated over rollout periods that last weeks or months. Journeys provide a mechanism to define rollouts, migrations, deprecated methods, upgrades, and credential recovery paths. Journeys are full lifecycle processes that include onboarding, offboarding and internal steps like Active Directory conditional access checks. IT departments can define minimal journeys for all users or for specific roles, and then let users choose their own devices and MFA options via self-service portals, which is convenient for users. Some additional benefits of authentication journeys include:

Compliance & auditing: KYC/AML laws vary globally, so journeys can be customized as remote onboarding becomes more widespread. Onboarding processes can be assessed at various IAL levels and the resulting credentials tied to auth processes to directly link the IAL and AAL levels. If logging is part of the journey (as an internal step), auditing can be directly associated with both onboarding and auth processes.

Reduction in costs & complexity: A wealth of powerful and effective authentication and authorization technologies are available in the market, but most are complex to manage from a system-level perspective. The security context is lost when dealing with sessions, roles, tokens, and other protocol-specific mechanisms at a low level of process programming. By managing journeys as high-level definitions of authn and authz processes, CISOs can elevate the management of access risks for the associated resources above the machine-level programming of OIDC and JWT tokens used to implement such processes.

Better vulnerability management: Explicit auditing, vulnerability analysis, threat modeling and accessibility enablement that actually improves security and reduces help desk password-reset costs. As in the wider cyber world, IAM vulnerabilities should be managed, shared and fixed at the journey level, not at the level of a specific protocol implementing a security policy incorrectly.

Credential-based onboarding and authorization: New verifiable credentials permit on-demand provisioning for new employees, contractors and temporary freelancers, who can present such trusted credentials upon signup instead of requiring out-of-band, prior enrollment in AD/LDAP registries. Such credentials, which have expiration dates and explicit privileges, can also encapsulate capabilities to be used in authorization flows.

FIDO: Allows storage of encrypted credentials across many devices and modalities, including security keys, biometrics, mobile phones, tablets, laptops and desktops with a password. FIDO allows developers to focus on high-level journeys in the IAM perimeter while providing device and security policy options.

Privacy, accessibility, inclusion and diversity: Ultimately, such flexibility regarding authentication journeys allows each user to choose their own devices, biometric modalities, and credentials for access to specific resources within their enterprises. Such flexibility enables new paths that help protect user privacy, enable accessibility, and promote inclusion and diversity.
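As a purely hypothetical illustration (the step names, types and evaluation logic below come from no specific product), a journey of the kind described above can be modeled as a declarative workflow that a policy engine walks step by step, with each verifier pluggable per enterprise:

// Hypothetical sketch: an "authentication journey" as a declarative workflow.
// All names and the evaluation logic are illustrative assumptions.

type JourneyStep =
  | { kind: "credential"; method: "password" | "fido2" | "biometric" }
  | { kind: "mfa"; factors: Array<"totp" | "push" | "sms"> }
  | { kind: "check"; rule: "geolocation" | "ad-conditional-access" }
  | { kind: "audit"; sink: string };

interface AuthJourney {
  name: string;
  appliesToRoles: string[];
  steps: JourneyStep[];
}

// Example: a passwordless journey for contractors on BYOD devices.
const contractorJourney: AuthJourney = {
  name: "contractor-passwordless",
  appliesToRoles: ["contractor", "freelancer"],
  steps: [
    { kind: "credential", method: "fido2" },          // proof of possession
    { kind: "mfa", factors: ["push", "totp"] },        // user-chosen second factor
    { kind: "check", rule: "ad-conditional-access" },  // legacy AD stays in the loop
    { kind: "audit", sink: "siem://auth-events" },     // logging tied to the journey
  ],
};

// A minimal engine walks the steps; the journey fails closed on the first failure.
async function runJourney(
  journey: AuthJourney,
  verify: (step: JourneyStep) => Promise<boolean>
): Promise<boolean> {
  for (const step of journey.steps) {
    if (!(await verify(step))) return false;
  }
  return true;
}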

Journeys let users define their own authentication methods that comply with GRC (Governance, Risk & Compliance) requirements while making the choices explicit for IT managers. The IAM (Identity & Access Management) landscape may seem to be getting more complex, but only because we’re trying to fit a square peg into a round hole. Old methods tied strictly to AD and LDAP registries, with groups and their associated roles, are only a narrow keyhole from the past through which we can view a broader IAM future. These systems will continue to be used in many enterprises, but they form only part of the many journeys available to users within the new IAM landscape.

The post One Size Doesn’t Fit All – Authentication Journeys to Digital Transformation appeared first on Veridium.


Europechain

Top 5 Most Interesting NFT Use Cases (Part 1)

NFTs have a wide range of use cases, many of which are being explored right now. In this article we discuss some of the most interesting ones!

It wasn’t long ago that the word ‘fungibility’ was but an obscure term understood only by economists and asset and commodity traders. Fewer even know that this peculiar word derives from the Latin verb fungi, meaning ‘to perform.’ From these arcane origins, the word is everywhere today, thanks to the meteoric rise to popularity of non-fungible tokens (NFTs), those little nuggets that can turn...

Source

Friday, 15. October 2021

Holo

Q4 2021: Update on Milestones and Progress

Lead/Org Update

You may have noticed an uptick in stories, blogs, apps, news and announcements coming from Holo recently. In large part this is simply because there is much more that can be built on Holochain than ever before. The proof is that the wider community of projects and developers is beginning to move with haste on their respective projects. Simultaneously, the Holo development team is now unblocked: certain feature development had been on pause until Holochain finished updating to a necessary level of features and stability. This is huge.

In order to make the sequencing of our interdependent work more visible to our community, we refreshed our roadmap for Holo and included some key Holochain releases.

Milestones Clarity Tied to Weekly Updates

We’ve renamed a few milestones for clarity but mostly we’ve included additional milestones to be able to more effectively tie our regular development updates to the milestones so that everyone can better understand where we are.

You will notice that we now have 9 of the Phase 3 milestones completed. An additional 7 of the milestones are what we call our work in progress, each of which is focused on by a specific sub-team. Every week, across the board, we are delivering features into test networks, usable builds for our community, and public releases.

Sharding Ready for Developers

In the past month the Holochain team released what we call the “Sharding Ready” release. If you are looking at it on Github, it is the release that is numbered 0.0.107. They have since delivered two additional releases that solidify and make those fundamental changes usable by community developers and by the Holo team. These releases of Holochain are precisely what the participants of the Dev Camp are using now for building, testing and deploying their own Holochain applications.

New Tools Make Rapid App Development Possible

Some amazing tools have also been made available on the tails of the Holochain release. The Launcher, our scaffolding tool, and the Dev Hub all make it easier for developers to adopt Holochain by saving them work and shortening their learning curve.

The Holochain Launcher is a dashboard that allows end users to install and run Holochain applications locally. This means that Holochain apps are easy to demo, easy to configure for testing by QA teams and community members — and ultimately easy for non-technical users to simply install and begin running on their own computers.

The scaffolding tool is the first in a set of Rapid Application Development tools — what’s known in the software world as RAD tools. The scaffolding tool provides a kickstart to the process of building an application. It allows new developers to lean on patterns that other more established Holochain developers have designed. It will also let non-developers who are members of product teams use business terms and logic for setting up the basics of an application. For example, if you are designing a blog, the scaffolding tool is where you would define concepts like a post and a list of posts.

Dev Campers Show RAD Tools Working

Session 3 of Dev Camp is where the RAD scaffolding tool was introduced, with homework assigned for participants to scaffold their first Holochain application. After that, they will package the application and run it from inside the Launcher.

DevHub Makes hApps Testable, Available to Devs

The DevHub is another tool that is being tested by a few of our community developers now. It’s a developer console where hApps are uploaded and tested, and where they will be reviewed/rated by other ecosystem developers. This component is critical to Holo in the sense that our Publisher Portal will look to it as the source for all Holochain applications that can be published to the Holo network for hosting.

Work in Progress

This week our team just finished getting all the various applications and services related to the Holo Platform updated to the Sharding Ready release of Holochain. This means that now Holo hApps and Infrastructure are working with a version of Holochain that exceeds all of our prior baseline performance metrics as compared to the previous releases of Elemental Chat.

There is so much underway — so we’ve added a small overlay to the roadmap that shows the following parts we are already working on.

Next Up: Elemental Chat Public Release (still just a Prototype)

Yes, with the performance gains with Holochain, we’re now preparing to let people use Elemental Chat on Holo. We’re working through testing and release processes now so stay tuned for announcements.

Publisher Portal & HoloFuel Readying for Pre-Release Testing

The Holo team is steadily working on the first pre-release of the Publisher Portal as well as a release of the HoloFuel app with basic transactions of Test Fuel. Development on these two applications had paused previously because certain features required changes in Holochain that are now available with the recent Holochain update.

Holochain 0.1.x HDK Release

Holochain is preparing to do its first bump to a 0.1.x release. This indicates a greater level of usability and reliability. Our release numbering is based on the Semantic Versioning standard, where the first number is a major release, the second is a minor release and the third is a patch. Each time a startup project moves over one decimal, it is a milestone worth celebrating, as it typically indicates increased stability and documentation for the developers who depend on the project.

HoloPort Registration Usability

Another effort currently underway in the Holo team is the new key generation process for HoloPorts. This work will revise the current HoloPort registration process, adding critical steps for security as well as making the process more extensible for both our operations team and the hosts who are registering. This will specifically set us up to build other features required for beta, like multi-port registration and identity verification.

Wrapping it up with Community & Customer Service

As we move forward today we are entering a period of increased complexity for our Community and Customer Support folks. For most of our Dev cycles up until now, we’ve had two sets of users — Hosts and Holochain developers. Others who follow the project have wanted to know what is possible with this new tech, what is available and how it works, and many others have wanted to better understand how it all relates to our token and eventually HoloFuel. This past month, however, saw us push forward to have multiple new tools and applications that are being used by many more developers and now also the end users of the applications those developers have built.

This means that non-developers are asking more questions on our support channels and forums. This will only increase as we do pre-releases of HoloFuel and the Publisher Portal and as other projects make more and more Holochain apps available on the Launcher. We’re going to be working diligently with the community on ways to direct those questions to all the right places.

So thank you for all the interest and engagement — and a special thanks too for your patience as we stretch and evolve through these wonderful growing pains.

-Mary

Q4 2021: Update on Milestones and Progress was originally published in HOLO on Medium, where people are continuing the conversation by highlighting and responding to this story.


Civic

Civic Milestones and Updates: Q3 2021

During the third quarter, the entire crypto space was abuzz, and Civic was no exception. The quarter was marked by strong performances, especially from Bitcoin, which is the best performing asset this year, beating both stocks and commodities. NFT investment also reached new heights as a flood of sales reached $10.7B during the quarter. And, FATF, the regulatory entity that sets global standards for cryptocurrency, issued its 12-month review of revised standards on virtual assets, clearing the way for countries to codify and enforce new standards.

Key Milestones

Civic has always been ahead of the curve when it comes to decentralization and identity verification. As signals from regulators became stronger over the summer, we understood quickly from our clients and prospective clients that new permissioning tools for DeFi were needed to respond to these dynamic compliance requirements. We accommodated by building a DeFi product, called Civic Pass.

In September, we partnered with Solrise Finance to announce the first on-chain decentralized exchange on Solana, Solrise DEX Pro, with permissioned access based on digital identity. Civic provides a KYC solution, through Civic Pass, that a dApp provider can use as an input to their compliance program. Solrise DEX Pro uses Civic Pass to determine which participants meet their rigorous standards for verification prior to allowing them the ability to trade. From Solrise DEX Pro, users are guided to use Civic’s identity verification technology solution. Once the user has completed the Civic Pass screening process, Solrise DEX Pro uses the results of the screening to allow trades on the DEX. The underlying technology for this product comes from Identity.com.

During the third quarter, we also partnered with Black Fire Innovation, the hospitality technology hub created by Caesars Entertainment, Inc., and the University of Nevada, Las Vegas (UNLV), to launch a vending machine demo using Age Verification by Civic. The technology is based on Identity.com’s open source identity verification ecosystem and is strategically located where advanced gaming and hospitality technology are developed.

The summer heralded new hires and promotions as the company repositioned itself for growth. Most importantly, our co-founder Vinny Lingham moved into a new role as Chairman of the Board at Civic and promoted Chris Hart to CEO from his position as COO, a role he had held for three years. We also gained a new board member, Maja Vujinovic, who previously led blockchain as a CIO at GE and has already made a tremendous strategic impact on our company’s direction. We also added a director of risk and compliance to our team, Chris Harding, who has already penned an op-ed in Cointelegraph.

Civic in the News

Civic gained its share of media attention over the summer with a total online readership of 332M. Our new CEO, Chris Hart, shared thoughts on the identity verification industry at the ReImagine Conference, on the Crypto Coin Show, and on Securities.io, while our previous CEO, Vinny Lingham, shared insight with The Information. Our new head of risk and compliance, Chris Harding also introduced himself to Authority Magazine.

Our partnership with Black Fire Innovation garnered unique features in Vending Times, The Spoon and Biometric Update, and our press release reached 83M+ viewers. Additionally, our partnership with Solrise Finance received 14 pieces of unique coverage, including a piece in CoinDesk, CoinMarketCap, U.Today, Invezz and other publications. The press release reached a total potential audience of 77M.

Coming Soon

With the strong performance of NFTs this summer and into the fall, we’ve expanded our Civic Pass offering to NFTs, so you can expect to hear more announcements on that front. We’re also working hard to make Civic Pass available across chains and will be sharing more news about partnerships on that front. Finally, we’ll be paying close attention to upcoming regulations that affect our clients and responding to their needs to make our products better.

The post Civic Milestones and Updates: Q3 2021 appeared first on Civic Technologies, Inc..


Coinfirm

Panel Discussion – Crypto Custody, Trading and Governance with CMO Sachin Dutta

Listen to Sachin Dutta, Chief Marketing Officer of Coinfirm, at the Digital Assets Realised panel on Investing in Crypto Assets – Crypto Custody, Trading & Governance on his views of the crypto regulatory landscape, methods and unique cases on how to prevent crypto crime. Who is on the panel? – Sachin Dutta Chief Marketing Officer, Coinfirm...

Coinfirm at Money 20/20 USA

We are sponsoring and attending Money20/20 in Las Vegas. Join Our CEO at the Coinfirm Kiosk K1646.

auth0

Exploring the Auth0 ASP.NET Core Authentication SDK

The new Auth0 ASP.NET Core Authentication SDK makes adding authentication and authorization to your web applications a breeze. Learn how.

Ocean Protocol

Building towards a decentralized European Data Economy: A Minimal Viable Gaia-X (MVG) powered by…

Building towards a decentralized European Data Economy: A Minimal Viable Gaia-X (MVG) powered by Ocean Protocol

Representatives from business, science, and politics on a European level are creating a proposal of a European data infrastructure for the next generation: a secure, federated system that meets the highest standards of digital sovereignty and promotes innovation. Gaia-X is the cradle of an open, transparent, decentralized digital ecosystem, where data and services can be made available, collated, shared, and monetized in an environment of trust. More than 300 organizations and over 2500 contributors are already supporting Gaia-X.

Private data can help with the research and development of life-altering innovations in both science and technology. The more data available, from a multitude of sources, the more accurate the predictions of Artificial Intelligence (AI) models become.

However, companies still hesitate to share and sell their private data, as doing so has traditionally carried unpredictable risks, such as data breaches and the related liability. This is one reason why data often remains locked away: concerns over security, privacy, and trust. Thus, even today, data silos are exploited by only a few companies, which results in an inaccessible and opaque data market.

Gaia-X tries to solve these problems and shows that data sharing and monetization are possible without sacrificing data sovereignty, data protection, privacy, and European values. Ocean Protocol offers a technical solution to these problems in that it enables creating open, transparent, and decentralized data markets.

As data has the potential to become the single most valuable asset of our time, Gaia-X will significantly contribute to the economic welfare of all Europeans in the coming decades.

“​​Ocean Protocol is working with Gaia-X to help deliver goals for the development of an efficient, competitive, secure and trustworthy federation of data infrastructure and service providers for Europe.” — Trent McConaghy, Ocean Protocol founder.

“Together with industry-leading collaborators, we continue jointly innovating and engineering a European cloud infrastructure.” — Frederic Schwill, Tech Lead at deltaDAO AG.

Ocean Protocol provides core functionalities for the Gaia-X federated services leading to a Minimal Viable Gaia-X (MVG) available today. It enables services to interact with each other and be used like lego bricks to build new and more complex products.

Using Open Source Distributed Ledger Technology to build a European Data Economy

The very first Gaia-X Hackathon took place in August 2021. More than 250 participants from over 25 countries worked on different components of a Minimal Viable Gaia-X, sponsored by Ocean Protocol, deltaDAO, Datarella, Google Cloud, and more.

deltaDAO AG co-organized the Hackathon, led the “Compute-to-Data” stream, and presented a Minimal Viable Gaia-X Demonstrator based on Ocean Protocol.

The Minimal Viable Gaia-X Demonstrator built by deltaDAO on top of the Ocean tech stack is a Web3 interpretation of the Gaia-X Federation Services and Registry. deltaDAO demonstrated that Ocean Protocol provides most of the Gaia-X core functionality today, enabling a decentralized data economy in line with the Gaia-X vision.

Gaia-X Test Network

Ocean Protocol is blockchain agnostic. Hence, it works with any EVM-compatible Distributed Ledger. deltaDAO deployed a few additional components on the Gaia-X Test Network in preparation for the Hackathon, enabling Ocean Protocol to be seamlessly used within the Gaia-X ecosystem. If you want to try it, get your OCEAN test tokens here.

Gaia-X Portal powered by Ocean Protocol: A platform to find, publish and consume Data Services in the Gaia-X Test Network.

https://portal.minimal-gaia-x.eu/

GitHub: https://github.com/deltaDAO/Ocean-Market

The Gaia-X Portal, developed by deltaDAO, is built on Ocean Protocol, as a trustless, decentralized, and privacy-preserving data and AI-sharing and monetization protocol. It connects data publishers and consumers, listing data services in the Gaia-X network. The Portal is fully open-source and you can try it here.

Any company or institution can adopt data portals. While fully customizable, all portals connect to the Gaia-X Test Network, enabling interoperability and cross-listing of Data Services by design.

Onboarding videos and tutorials

As most Hackathon participants were new to Web3, deltaDAO assembled an interactive onboarding tutorial. Try it yourself here.

Gaia-X Catalogue: A tool to find Data Service Self-Descriptions in the Gaia-X Test Network.

deltaDAO’s Web3 interpretation of a Gaia-X Catalogue demonstrates a “Catalogue of Catalogues” based on Distributed Ledger Technology (DLT). It lists all data service self-descriptions in the Gaia-X Test Network, offering an immutable ground truth to all network participants.

Self-Description Lifecycle Management

The Gaia-X Architecture document defines a lifecycle for all data self-descriptions. deltaDAO built a basic lifecycle management tool based on Ocean Protocol during the Hackathon.

Ocean’s Compute-to-Data

The Gaia-X Hackathon had a full stream dedicated to Compute-to-Data, a privacy-preserving data sharing and monetization approach that brings algorithms to the data instead of moving data around.

“Ocean Compute-to-Data is the response to solving the current tradeoff between the benefits of using private data and the risks of exposing it. Compute-to-Data lets data stay on-premise while allowing 3rd parties to run specific compute jobs on it, like building AI models. There are multitudes of applications in science, technology, and business contexts because the compute is sufficiently ‘aggregating’ or ‘anonymizing’ that the privacy risk is minimized,” explained Trent McConaghy, Ocean Protocol Founder.

“Ocean Compute-to-Data helps to make private data available for AI and business intelligence applications, while avoiding ‘data escapes’ and without the last-mile user seeing the private data directly,” remarks McConaghy. “It enables data marketplaces to buy and sell private data in a privacy-preserving fashion, for applications in healthcare, mobility, and more. We see Compute-to-Data as a new opportunity for companies to monetize and unlock their data assets.”
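To make the pattern concrete, here is a minimal, purely illustrative Python sketch of the Compute-to-Data idea; the DataProvider class and function names are hypothetical and do not reflect the actual Ocean Protocol API. The point is simply that the approved algorithm travels to the data, and only an aggregate result leaves the provider's environment.

# Illustrative sketch of the Compute-to-Data idea: the raw data never leaves
# the provider; only an approved algorithm runs next to it, and only an
# aggregate result is returned. All names here are hypothetical.

from statistics import mean

class DataProvider:
    def __init__(self, private_records, approved_algorithms):
        self._private_records = private_records      # stays on-premise
        self._approved = set(approved_algorithms)    # whitelist of compute jobs

    def run_compute_job(self, algorithm_name, algorithm):
        # The consumer never sees the raw records, only the output.
        if algorithm_name not in self._approved:
            raise PermissionError(f"algorithm '{algorithm_name}' is not approved")
        return algorithm(self._private_records)

# An aggregate-only algorithm submitted by a data consumer.
def average_income(records):
    return mean(r["income"] for r in records)

provider = DataProvider(
    private_records=[{"income": 42_000}, {"income": 55_000}, {"income": 61_000}],
    approved_algorithms=["average_income"],
)

# The consumer receives a single aggregate value, not the underlying data.
print(provider.run_compute_job("average_income", average_income))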

Compute-to-Data was the foundation for many third-party portals and industry use cases:

Third-party portals and industry use cases

deltaDAO created a satellite image service portal, enabling data monetization with Compute-to-Data. It allows running remote machine learning algorithms, generating insights and value on large Copernicus image datasets. The showcase uses open-source satellite images from Sentinel-1 SLC IW and Sentinel 2 L2A missions over the metropolitan area around Marseille, France. It provides insights into how resource-intensive machine learning algorithms can be applied to open-source data. Datasets and algorithms are available here.

Not moving the data around enables new business models for the data owner, saving immense amounts of storage and bandwidth.

In addition, another demonstrator portal showcased how machine learning can be used to predict heat demand in data centers depending on the weather. The ML model was trained on historical heat demand data from data centers together with historical weather data for the same location. Combined with weather forecast data, the model could then predict the future heat demand at a given data center location, which the operator could use, for example, to optimize their infrastructure accordingly.

This demonstrator highlighted how different data sources can be combined to generate insights that can be used e.g. to save energy and run more efficient cloud systems.
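As a rough illustration of this kind of data combination, the short Python sketch below fits a simple linear model of heat demand against outdoor temperature and applies it to forecast values. The numbers are invented, and the real demonstrator's data, features, and model are not reproduced here.

# Toy illustration: predict data-center heat demand from outdoor temperature.
# All numbers are invented; the real demonstrator combined historical heat
# demand from data centers with historical weather data for the site.

historical_temp_c   = [-2.0, 4.0, 10.0, 16.0, 22.0]       # past outdoor temperatures
historical_heat_kwh = [950.0, 820.0, 700.0, 560.0, 430.0]  # matching heat demand

# Ordinary least squares for a single feature: heat = a * temp + b
n = len(historical_temp_c)
mean_t = sum(historical_temp_c) / n
mean_h = sum(historical_heat_kwh) / n
a = sum((t - mean_t) * (h - mean_h) for t, h in zip(historical_temp_c, historical_heat_kwh)) \
    / sum((t - mean_t) ** 2 for t in historical_temp_c)
b = mean_h - a * mean_t

# Apply the fitted model to a weather forecast to plan infrastructure usage.
forecast_temp_c = [1.0, 7.5, 18.0]
for t in forecast_temp_c:
    print(f"forecast {t:5.1f} C -> predicted heat demand {a * t + b:7.1f} kWh")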

Ocean Protocol tech stack & Gaia-X

Ocean Protocol’s mission is to kickstart a Web3 Data Economy that reaches the world, giving power back to data owners and enabling people to capture value from data to better our world.

Ocean Protocol already supports a wide range of functionalities that the Gaia-X federated services (GXFS) aim to provide. It also works well with Self-Sovereign Identities (SSI) and Verifiable Credentials (VC). Ocean Protocol might allow the Gaia-X community and the emerging European data economy to save years of development and integration time. The Minimal Viable Gaia-X (MVG) created at the Hackathon was one of the first steps towards proving this claim.

The Ocean Protocol tech stack provides Gaia-X with the tools to build a decentralized network that is completely open, transparent, and interoperable, with no lock-in and with conformity and trust checked automatically.

1. Identity and Trust. Ocean Protocol relies on Self-Sovereign Identities (SSI) and works well with verifiable credentials (VC). It allows users to retain control of their identity and adds a new level of trust and security.

2. Federated Catalogue. Ocean Protocol metadata smart contracts provide a decentralized database and “ground truth” for all data asset self-descriptions. This includes a fully operational federated catalogue (FC) and inter-catalogue synchronization on the application layer (a minimal sketch of such a catalogue entry follows this list).

3. Sovereign Data Exchange. Ocean Protocol enables Data Contract Services (DCS) based on Smart Contracts. It enables providers and consumers to offer, negotiate and stipulate data contracts and execute data access rights transparently and securely. Compute-to-Data (CtD), as part of Ocean Protocol’s core features, allows data service providers to monetize their data while keeping ownership and control.

4. Data Exchange Logging. Using Blockchain, all transactions are verified, stored, and audited in an ordered, transparent, and trustless way, thus enabling the Data Exchange Logging Service (DELS).

5. Portal and Integration. A user-friendly tool to discover and interact with data assets allows for widespread adoption and acceptance of Gaia-X services. Ocean Protocol delivers an open-source, customizable web frontend and APIs to interact with it. Users can start publishing and consuming data services with not much more than a standard browser at their disposal.

6. Security and Privacy by Design. Ocean Protocol features security and privacy by design based on a trustless environment and SSI as a Blockchain application and protocol. It does not collect any more information about the participants than is needed to facilitate its services.
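As a purely illustrative aside, the snippet below assembles a minimal self-description record, loosely modelled on the W3C Verifiable Credentials layout, and registers it in an in-memory catalogue. The FederatedCatalogue class, the DID strings, and all field names are hypothetical; they are not the actual Gaia-X or Ocean Protocol schemas.

# Hypothetical sketch of a self-description entry in a federated catalogue.
# The field names are loosely modelled on W3C Verifiable Credentials and do
# not reproduce the actual Gaia-X or Ocean Protocol schemas.

import json
from datetime import datetime, timezone

class FederatedCatalogue:
    """In-memory stand-in for a smart-contract-backed metadata store."""
    def __init__(self):
        self._entries = {}

    def register(self, asset_id, self_description):
        if asset_id in self._entries:
            raise ValueError(f"asset {asset_id} is already registered")
        self._entries[asset_id] = self_description

    def find(self, asset_id):
        return self._entries.get(asset_id)

self_description = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "ServiceSelfDescription"],
    "issuer": "did:example:provider-123",                  # SSI-style identifier
    "issuanceDate": datetime.now(timezone.utc).isoformat(),
    "credentialSubject": {
        "serviceName": "Satellite image compute service",
        "accessType": "compute-to-data",
        "termsAndConditions": "https://example.org/terms",
    },
}

catalogue = FederatedCatalogue()
catalogue.register("did:example:asset-42", self_description)
print(json.dumps(catalogue.find("did:example:asset-42"), indent=2))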

Towards a New Data Economy

Simplifying privacy compliance for organizations. Data ownership remains in the hands of the owners while enabling value creation. It makes any process involving private data compliant with the most restrictive data protection frameworks.

Creating new revenue streams. No matter how much and what kind, anyone who owns data can earn revenue by granting access to those willing to pay. This applies to individuals and organizations.

Unlocking data taps. The ability to monetize data introduces an incentive to share data. This increases the amount of data breaking free from silos, ready to be blended and transformed to create value.

Making data available. With more data available for Data and AI Scientists, data models will get more accurate, increasing the market value of data products. The ability to easily search and find data from a global corpus removes barriers and friction to match data supply and demand.

Liberalizing data workflows. With data and data services organized in markets, data science workflows can be redesigned from scratch. Market actors will be able to compete and collaborate on each activity in the value chain.

Seeding the New Data Economy. Data markets will be 100% liquified. While it is difficult to describe how the New Data Economy will behave, moving from an archaic ownership model to a completely liberalized economy is expected to drive disruption. The wave of change coming to data science and AI with the adoption of distributed ledger technology is likely to be as big as the one that came with the adoption of the internet.

Conclusion: Keep Building

“Having joined the Gaia-X AISBL recently, we at deltaDAO are excited to continue working towards a decentralized, self-sovereign, and privacy-preserving European Data Economy together,” said Kai Meinke, Business Lead at deltaDAO AG.

By embracing a collaborative, open, and transparent structure, Gaia-X actively asks for more contributors to build a European data infrastructure. Gaia-X is delivering — save the date for the second Gaia-X Summit in November!

If you want to integrate your use-case or application into the upcoming Gaia-X Hackathon in early December 2021, please contact our partners at deltaDAO.

Building towards a decentralized European Data Economy: A Minimal Viable Gaia-X (MVG) powered by… was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Dec 07, 2021: Lessons From a Journey Into a Real-World Ransomware Attack

Ransomware Attacks have become the biggest single cyber risk for enterprises of any size and industry. Research indicates a steep rise not only in the number of attacks, but as well in the average damage per incident. It is therefore essential that organizations are prepared for these attacks.

Aergo

Announcement of the 1st DApp Contest Winners


Dear AERGO Community,

This summer we hosted the first DApp contest as a means of expanding AERGO ecosystem. We are blown away by the participation of developers from all over the world. Through this opportunity, we can visualize a bright future for AERGO ahead of us. We once again immensely thank all those who participated in the first ever DApp contest.

After reviewing the mind blowing entries, we chose the winners based on the following criteria:

Concept and originality of DApp
Creativity
Innovation
Ability to attract users

Prizes will be awarded as follows:

AERGO Billboards
Siren Praise
SNSChain
Real-time Voting
Blockchain based security system

500,000 AERGO token purse

First place: 250,000
Second place: 100,000
Third place: 50,000
Fourth place: 20,000
Fifth place: 10,000

Here are the winning projects.

5th Place: Hodl

HODL is a DApp that helps you deposit Ethereum and ERC-20 tokens for the long term. If you withdraw before the agreed time, a 10% penalty applies, and the penalty is distributed to the remaining depositors as a reward. Get rewarded while protecting your Ethereum and ERC-20 tokens.
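The time-lock and penalty mechanics can be sketched in a few lines of Python. This is not the HODL contract itself, only an illustration of the payout rule described above, assuming the 10% penalty is split evenly among the remaining depositors.

# Toy model of the HODL payout rule: withdrawing before the agreed unlock
# time costs a 10% penalty, which is redistributed to the remaining
# depositors. This is an illustration, not the actual contract.

PENALTY_RATE = 0.10

def withdraw(amount_eth, unlock_time, now, remaining_depositors):
    """Return (payout, reward_per_remaining_depositor)."""
    if now >= unlock_time:
        return amount_eth, 0.0                      # held long enough: no penalty
    penalty = amount_eth * PENALTY_RATE
    share = penalty / remaining_depositors if remaining_depositors else 0.0
    return amount_eth - penalty, share

# Early withdrawal of 2 ETH with 4 other depositors still locked in.
payout, reward_each = withdraw(2.0, unlock_time=1_700_000_000,
                               now=1_690_000_000, remaining_depositors=4)
print(f"payout: {payout} ETH, reward per remaining depositor: {reward_each} ETH")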

website: https://github.com/strongcontract/eth_hodl/
app store: https://raw.githubusercontent.com/strongcontract/eth_hodl/main/files/app/eth-hodl-release.apk

4th Place: Loany

Loany is a service that helps you diagnose loans and improve them to better conditions. Users who encounter Loany for the first time will diagnose the current loan, and if the loan conditions need to be improved, the user will be notified in an easy-to-understand manner.

website: https://www.loany.co.kr/
app store: https://loany.page.link/app

3rd Place: PlusFi

PlusFi is a user-friendly DeFi platform built on AERGO, offering features such as Bridge, Staking, Swap, Lending, and IDO.

website: unreleased
app store: unreleased

2nd Place: BananaClips

Banana Clips is an online marketplace where you can sell or buy clips using web and mobile applications. Anyone with a smartphone can easily upload a video, and anyone can easily purchase it.

website: https://bananaclips.net/
app store: unreleased

1st Place: Pikkle

Pikkle is an AERGO blockchain-based electronic voting service. It has been released in the global market, and its stability has been proven through the Korea Broadcasting Awards and the Seoul Drama Awards.

website: https://www.pikkle.me
app store: https://www.pikkle.me/#download

Future Plans

Once the DApps are completely ready and tested, we will submit them to the Samsung Blockchain DApp Marketplace for listing.

With the announcement of these winners, the first cycle of our DApp incubation program comes to a successful close. The program will evolve to support DeFi and eSignature applications on the AERGO ecosystem. We are looking into applications such as an Automated Market Maker (AMM), a Decentralized Exchange (DEX), electronic documents, and a lending platform.

We will be announcing an updated roadmap for supporting DeFi applications on the AERGO ecosystem, with a formal funding program for DeFi incubation (the AERGO DeFi Incubation Funding Program) and a DeFi hub for AERGO (also known as AERGO Swap) that will serve as a primary interface for DeFi applications and a launchpad for future incubation programs. We are planning a $10,000,000 USD incubation fund, pending AGORA approval.

We, the AERGO team, will do our best to help the AERGO community and ecosystem become more active and develop further. If you have any questions or ideas regarding the contest, please feel free to contact dapp@aergo.io.

Thank you.

AERGO Team

Announcement of the 1st DApp Contest Winners was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 14. October 2021

KuppingerCole

Managing Risk in Ever-Changing As-a-Service Environments

In the infrastructure and platform-as-a-service worlds, application developers are the new infrastructure superstars. With concepts ranging from containers to infrastructure-as-code, we are experiencing a paradigm shift in how tightly coupled application code and the related infrastructure are. Often security is under-represented in this formula, and for good reason.





Finicity

Steve Smith, CEO of Finicity, talks open banking with the Fintech One-on-One Podcast


Finicity CEO Steve Smith stopped by Lendit Fintech’s One-on-One with Peter Renton. Steve talks open banking’s everyday applications, the state of open finance around the globe and what we’re doing to ensure consumer-permissioned financial data is used responsibly. 

Listen to the podcast episode here.

The post Steve Smith, CEO of Finicity, talks open banking with the Fintech One-on-One Podcast appeared first on Finicity.


Jack Henry Partners with Open Banking Providers to Enhance Digital Platform


Jack Henry & Associates, Inc.® is a leading provider of technology solutions and payment processing services primarily for the financial services industry. Jack Henry announced today that Finicity, Akoya, and Plaid are among the first open banking pioneers to integrate to the Banno Digital Platform using the Banno Digital Toolkit℠. This provides better security, privacy and transparency for the nearly 6 million consumers banking with the Banno Digital Platform by removing screen scraping. 

Learn more about Finicity’s partnership with Jack Henry here

The post Jack Henry Partners with Open Banking Providers to Enhance Digital Platform appeared first on Finicity.


Civic

Introducing: Civic Pass Integration Guide


Since our earliest days, we’ve been working on identity for credential verification, based on the open-source platform, Identity.com. The identity solutions that we’ve built since are diverse – everything from establishing your age, so that you can get a beer out of a vending machine, to facilitating KYC for DeFi applications. Most recently, we collaborated with Solrise Finance on the launch of Solrise Dex Pro, the first permissioned DEX built on Solana, using a new product from Civic, called Civic Pass.

Civic Developer Resources

Now, we’re making Civic Pass available to developers everywhere. The idea is that developers can plug Civic Pass into their platform and create an identity layer that allows for a permissioned dApp platform, be it a DEX, an NFT marketplace or mint, a metaverse, or who knows what else you’ll come up with. 

Done thoughtfully, identity verification will drive a more trusted web3 ecosystem. For example, the way DEXs are currently designed and used locks out institutional players. These larger entities, which have the ability to bring a lot of liquidity to the market, cannot use DEXs due to counterparty risk. They need to meet regulatory requirements in order to keep their licenses. 

Specifically, institutional players must not interact with individuals or organizations that are sanctioned from using regulated financial institutions or platforms. Failure to comply with Anti-Money Laundering and Counter-Terrorism Financing regulations and other sanctions could result in prison time for individuals and colossal fines for participating firms.

We believe that this permissioning can and should be done on-chain and that it should be done in a privacy-first and privacy-focused way. Ultimately, it should be in the power of the individual user to share their identity or not, and they should do so only if they want to access tools and financial services that are really interesting to them. Our point of view is that composable identity on-chain is the next evolution of what is required to enable a web3 structure, whether it’s a DEX, AMM, NFT marketplace or lending protocol.

Civic Pass is our answer to this problem. With Civic Pass, many different identity elements may be permissioned. So, if for example, a lending protocol has different needs, or if a dApp in another country has a lower threshold for KYC requirements, these components may be configured as needed. Regulations are inevitable, and we need to ensure that the right systems are in place so that these nascent companies can quickly and painlessly adapt as regulations change.
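A hedged sketch of what such a configurable permission gate could look like from a dApp's point of view is shown below; the PassRequirements structure and the has_valid_pass check are hypothetical and do not represent Civic's actual API.

# Hypothetical sketch of a configurable on-chain permission gate. The names
# below are illustrative only and do not correspond to Civic's real API.

from dataclasses import dataclass

@dataclass
class PassRequirements:
    min_age: int = 18
    kyc_level: str = "basic"          # e.g. a lower threshold in some markets
    sanctions_screened: bool = True

@dataclass
class IdentityPass:
    wallet: str
    age: int
    kyc_level: str
    sanctions_screened: bool

def has_valid_pass(identity: IdentityPass, req: PassRequirements) -> bool:
    levels = {"none": 0, "basic": 1, "full": 2}
    return (identity.age >= req.min_age
            and levels[identity.kyc_level] >= levels[req.kyc_level]
            and (identity.sanctions_screened or not req.sanctions_screened))

# A lending protocol and a low-threshold marketplace can configure the same
# gate differently.
user = IdentityPass(wallet="ExampleWallet111", age=27, kyc_level="basic",
                    sanctions_screened=True)
print(has_valid_pass(user, PassRequirements(kyc_level="full")))   # False
print(has_valid_pass(user, PassRequirements(kyc_level="basic")))  # True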

Getting Started is Easy

Visit our Developer Resources and reach out to us once you are ready to start integrating. Our tools aim to make a permissioned identity layer more accessible to dApps. The guide outlines integration with projects built on Solana, but support for other chains is coming soon. 

We’re Here to Support You 

While we’re a small but mighty team, we’re committed to helping you integrate with Civic Pass. You can find us in the #dev channel on our Discord server, and we’d love to answer your questions.

We look forward to seeing what you build and hope this guide helps you better understand the potential implementations and innovations that Civic Pass can offer your dApps.

The post Introducing: Civic Pass Integration Guide appeared first on Civic Technologies, Inc..


auth0

The Future of Healthcare Relies on Adaptation

Interoperability, liberating patient data, and the identity lynchpin

IDnow

IDnow AutoIdent will soon be usable according to German TKG


Federal Network Agency confirms IDnow AutoIdent as official identification method

Munich, October 14th, 2021: the AutoIdent solution from IDnow, a leading platform-as-a-service specialist for identity verification, is now listed with the German Federal Network Agency as an officially confirmed identification method. Following VideoIdent, the AI-supported, automated identity verification method has thus also received official confirmation that it meets the applicable requirements of eIDAS, the VGV, and the VDV.

Following the upcoming amendments to the German Telecommunications Act (TKG), this means that IDnow can then also provide telecommunications service providers with an automated identification process.

The official confirmation means that AutoIdent (as defined in Article 24 (1) lit. d of eIDAS) has been determined to offer security equivalent in reliability to identification through personal presence. The IDnow Group is thus the only German provider with a total of six identification procedures confirmed by the Federal Network Agency.

“The confirmation by the Federal Network Agency that our AI-based solution, AutoIdent, complies with the applicable regulations can be a breakthrough for many companies, especially for the future of digitalization in Germany,” says Armin Bauer, Managing Director Technology and Co-Founder at IDnow. “Automated identification procedures ensure seamless processes without media disruption and increase cost efficiency. By modernizing laws that allow these procedures, many cases can be simplified and modernized in the future.”

This is an important step especially in view of the amendment to the Telecommunications Act (TKG) in December. The modernization of the TKG to implement the European Electronic Communications Code (EU Directive 2018/1972) implements adjustments in the areas of digital infrastructure expansion, consumer protection and public safety.

Furthermore, there is an innovation for identity verification. Until now, only methods involving human interaction, such as postal or video ID procedures, were permitted. The new TKG amendment, §171 (2), now also includes procedures that have been accredited by the Federal Network Agency through a conformity assessment body. A draft of a new order of the Federal Network Agency on identification procedures pursuant to §172 (2) sentence 3 TKG, published on October 13, 2021, supplements this by explicitly including automatic video identification as a possible identification procedure within the scope of the requirements regulated in the order. For telecommunications providers, this brings great advantages, as they can now also use confirmed AI-based solutions. These enable a seamless onboarding process without media disruption, which not only leads to a significantly higher conversion rate but also reduces costs through automation.


Bloom

Bloom OnRamp Has Arrived



We’re thrilled to launch Bloom OnRamp, a new enterprise product that unlocks the full potential of DeFi, establishing a key piece of the infrastructure that will help the DeFi sector achieve compliance and, in turn, attain truly global and mainstream status.

DeFi has experienced massive growth over the past year, topping both 1 million users and $100 billion in value for the first time. But this growth brings with it unforeseen challenges and limitations. Over the past month, regulators have warned that a new wave of regulations is imminent, as few DeFi companies have achieved compliance with KYC/AML and other applicable rules.

With the threat of regulation on the horizon, there is an opportunity for projects in the DeFi space to address compliance, risk assessment, and privacy in a new and more secure way.

With OnRamp, enterprises can securely access reusable, verifiable credentials (VCs), all without requiring users to expose sensitive data.

For enterprises seeking KYC and AML compliance, OnRamp offers ID verification, sanctions screening, and PEP screening. Other identity credentials include phone number and email, as well as social accounts such as Facebook, Google, LinkedIn, and Twitter.

For enterprises seeking alternative data and better risk assessment, OnRamp offers secure access to bank account activity, balances and other financial signals. Future plans include the integration of traditional credit scores, utility bill payment history, and other alternative signals that could be helpful in determining creditworthiness.


Beyond OnRamp’s direct data integrations, the platform also supports third-party credentials entering the OnRamp platform via the WACI specification. This feature can enable unique localized use cases where existing financial infrastructure does not serve local populations.
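For a rough sense of the data shapes involved, the snippet below builds a minimal verifiable-credential-style KYC record and runs a naive consumer-side check. It follows the general W3C Verifiable Credentials layout, but the issuer, fields, and checks are invented and are not Bloom OnRamp's or WACI's actual formats.

# Minimal, invented example of a verifiable-credential-style KYC record and a
# naive consumer-side check. It follows the general W3C VC layout but does
# not reproduce Bloom OnRamp's or WACI's actual message formats.

from datetime import date

credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "KYCCredential"],
    "issuer": "did:example:kyc-provider",
    "issuanceDate": "2021-10-01",
    "expirationDate": "2022-10-01",
    "credentialSubject": {
        "id": "did:example:user-456",
        "idVerified": True,
        "sanctionsHit": False,
        "pepHit": False,
    },
}

def accept_for_onboarding(vc, today=date(2021, 10, 14)):
    subject = vc["credentialSubject"]
    not_expired = date.fromisoformat(vc["expirationDate"]) > today
    return (not_expired and subject["idVerified"]
            and not subject["sanctionsHit"] and not subject["pepHit"])

print(accept_for_onboarding(credential))  # True in this invented example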

“We are excited to give enterprises the ability to leverage verifiable credentials for identity verification and risk assessment, all while respecting user privacy and mitigating the risk of data leaks,” said Jace Hensley, Head of Platform at Bloom. “But what we are even more excited about is what our WACI integration enables. Now third-party data sources around the world can enter the platform, giving people outside of the standard financial system the opportunity to prove reliability and creditworthiness. This is a major step for DeFi to go global and truly expand financial inclusivity.”

Bloom’s OnRamp will be a key piece of infrastructure for DeFi companies, whether they are in launch or growth mode. The developer-friendly application includes no integration fees, no monthly minimums, no long term contracts, and flexible pricing. OnRamp reduces the barrier to entry for new builders, allows growing companies to expand, and most importantly, opens the door a bit wider for those who deserve access to the decentralized finance markets.

Learn how Bloom is helping DeFi with compliance and growing beyond collateralized lending, all while prioritizing user privacy.


OWI - State of Identity

Gateway to the World

COVID fundamentally evolved the way we transact digitally, from healthcare to banking to online betting to opening crypto trading accounts. How will banks evolve remote account opening? How will organizations ensure people are responsibly gambling online? In this week's episode of State of Identity, host Cameron D'Ambrosi is joined by Shaun Moore, the Chief Executive Officer of Trueface, a Pangiam company, and Patrick S. Flanagan, the Co-Founder and COO of Pangiam. They tackle these questions head-on, along with the broader context of digital identity and proofing solutions, Pangiam's recent acquisition of Trueface, and how software has been the critical piece in delivering hardware-agnostic solutions.

Ontology

What Are Ethereum Virtual Machines and Why Should You Care?

A primer on Virtual Machines and Ontology’s Ethereum Virtual Machine.

As we move towards Web3, the next iteration of the internet, there has been a lot of talk about the benefits of interoperability and the need to connect different ecosystems and blockchains that currently operate in silos. Virtual Machines (VMs) are playing a big role in this. Haven’t heard of them? Never fear! We recently did a tweet series explaining their benefits, specifically related to Ethereum Virtual Machines. Check out the post below for a full recap!

A VM is a computer that runs on the blockchain and allows smart contracts from multiple sources to interact with one another. A smart contract is simply an agreement, expressed as code and executed on the blockchain, and the language it uses is platform-specific. To run smart contracts on different blockchain platforms, we need VMs that allow a smart contract written in one language to run on another blockchain, which saves time and cost compared to rewriting and re-testing smart contracts, or requiring developers to learn a new programming language.

Imagine you’re a decentralized application (dApp) developer who has just launched an app on a leading public blockchain platform after months of research and coding. The number of users has skyrocketed since the dApp’s launch. Instead of becoming complacent, you begin to think about what’s next. You might consider moving the dApp to another blockchain platform to explore new opportunities and grow your user base. That’s where VMs come in. They help developers integrate their dApps into different platforms using a programming language they are already familiar with. By giving developers access to networks, VMs facilitate decentralized ecosystems and provide more computing power. Developers can then leverage the networks they are connected to in order to create smart contracts and dApps.
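One practical consequence of EVM compatibility is that standard Ethereum tooling keeps working when only the RPC endpoint changes. The short web3.py sketch below (assuming web3.py v6) reads the chain ID and an account balance; the endpoint URL and address are placeholders rather than Ontology's actual TestNet values.

# Sketch: the same web3.py code talks to any EVM-compatible chain; only the
# RPC endpoint changes. The URL and address below are placeholders, not real
# Ontology TestNet values. Assumes web3.py v6 (snake_case API).

from web3 import Web3

RPC_URL = "https://evm-rpc.example.org"   # replace with a real EVM endpoint
ACCOUNT = "0x0000000000000000000000000000000000000000"

w3 = Web3(Web3.HTTPProvider(RPC_URL))

print("chain id:", w3.eth.chain_id)                    # identifies the network
balance_wei = w3.eth.get_balance(ACCOUNT)
print("balance:", Web3.from_wei(balance_wei, "ether")) # same units as Ethereum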

Ontology already has its own VM, and also supports Wasm and NeoVM. This summer, we announced that we had completed the development of our Ethereum Virtual Machine (EVM). The EVM is the most popular and widely used VM, supporting a huge variety of dApps.

Ontology’s EVM will bring many benefits to users by:

Increasing interoperability between Ontology and Ethereum; developers will be able to seamlessly migrate assets between both blockchains.
Allowing users to access EVM-based projects deployed on Ontology, whilst enjoying lower gas fees and faster block production.
Enabling Ethereum developers to build using Ontology’s decentralized identity and data protocols.
Increasing Ontology’s functionality by adding an Ethereum account system, Ethereum transaction types, and Web3 APIs.

Our EVM recently launched on TestNet and opened its EVM-compatible public beta for developers. To help refine our TestNet, we set up a bug bounty with Slow Mist, which allows people to earn rewards for identifying vulnerabilities, with a top prize of $12,000 of $ONG.

In addition to Ontology’s EVM TestNet, the Ontology Blockchain Explorer, Developer Documentation Center, and Ontology EVM-supported Web3 API are also being upgraded, which will incentivize Ethereum developers to deploy dApps on the TestNet. Developers can also use the Ontology Bridge to convert Ontology’s native OEP4 tokens to ORC20 tokens and add them to their MetaMask wallets and then deploy dApps.

We hope you’re excited for the full launch of our EVM and we look forward to sharing more updates. For now, if you’re interested in learning more about our EVM development, you can check out our Medium series: “Everything You Need To Know About Ontology EVM Contract Development”.

Want more Ontology?

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

What Are Ethereum Virtual Machines and Why Should You Care? was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

CSLS Speaker Spotlight: Robindro Ullah on the Impact of a Global Pandemic on People's Job Mindsets


by Fabian Süß

Robindro Ullah, CEO at Trendence Institut, will discuss the Impact of a Global Pandemic on People's Job Mindsets on Wednesday, November 10, from 17:00 to 17:20 at the Cybersecurity Leadership Summit 2021.

To give you a sneak preview of what to expect, we asked Robindro some questions about his presentation.

Why has recruitment today never been so easy and hard at the same time?

This is a very good question, because on the one hand, we clearly see in our data, for example, that direct search or active sourcing was less used by companies, which is obviously because they have less job openings to fill. And, for example, also everything around the recruitment process was also easier to access, like video calls or virtual onboarding, and all that stuff. It didn't take so much time to do things like that, where in the past you had to travel to an employer to do an interview or a case study or something. Everything went online. So, talent was more accessible than before. On the other hand, the requirements of talent increased, like more people were looking for secure jobs or something we called job security. And another thing, for example, which was very prominent in Germany, was that more and more talents were looking for essential jobs, or at least they wanted to work in an essential environment. Like for example, we didn't ask the question before the pandemic, but during the past year, it was like every second person said that if they change or if they switch jobs, they will definitely look especially for essential jobs or jobs in an essential field.

How has the pandemic changed people's mindset about jobs in the last 1.5 years?

This is a difficult question. With our surveys and the data we are gathering we tried to look into the mindsets of people to get a clear picture of what they are thinking now. Obviously, as I said before, we have a shift in the direction of essential jobs or essential fields. So, till now, we are not 100 percent clear whether this is a shift which is based on more job security, because people think that an essential job is more secure, or is it because they thought about their life and the things they are doing and whether it is really something which matters, or is it essential for the society? And so, this could also be a reason why there's a shift in the direction of essential. In Germany especially, we also had, for example, a shift in the direction of government jobs or jobs in the public sector. This is also something which could be based on more job security, as well as on a shift in the mindset. In general, we can see that work/life balance or as we call it sometimes also work/life blending got more and more important during the pandemic and the focus on the family itself.

What distinguishes IT expert recruitment from the search of other professional fields?

This is a nice question, and a lot of our customers ask us how to approach IT experts, and is this completely different to all other searches they do or all other jobs that they have to fill? If you look at it from a more modern way of recruitment, you have to admit that every target group you are trying to access, you have to access individually. So, from this point of view of modern recruitment, IT experts, of course, they have to be approached individually, but it's not different from other target groups, because nowadays you have to approach every target group in a very individual way. What especially in the field of IT is harder or more difficult is the fact that they are approached a lot, like they get offers all the time, every day. So, you have to be more creative in comparison to other target groups, and you have to find other ways to approach them.

The highways to these talents, they are frequent. A lot of companies use those ways like LinkedIn or in in the German-speaking area, Xing. So, you have to be more creative and think about other ways to approach this talent. And of course, you have to consider the requirements they have. For example, in our data, we can clearly see that remote work has a different status or it has a different impact on this target group than on others. They have higher requirements on the topic of remote work. For example, companies who have rules like you can work two days a week from home, also after our crisis. So, in future, this is something which would not be enough for IT experts. Because IT experts, in general say at least it should be 3.5 days a week I can use remote work and not just two days.

Could you give us a sneak peek into your Keynote: “The impact of a global pandemic on people’s job mindsets”?

In my keynote, I want to present to you some of the data we collected during the last one and half years, what we did in the very beginning of this pandemic was that we started a monthly survey in Germany. And from this data, we got a lot of insights I want to present to you. Especially those insights which give a little bit of insight into the mindsets of the people. And for example, one insight I was very - not shocked, but surprised about, was the fact that, for example, the social media usage in terms of career and job search changed a little bit.

Before, we had the big players who got a lot of traffic. And now more and more small players get more traffic, like very small networks in terms of career and job topics; TikTok, for example, came up in the professional area, which I've never seen before. TikTok has been on the market from, I think, 2015 on, or 2017 in the Western world too. So a lot changed due to the pandemic. And we have good reason to believe that. For example, one of the reasons is that people had more time. They were able to spend more time thinking about which social networks fit their needs best.


Okta

Spreading Some Okta Love to the DevOps World


Hello Oktaverse!

So finally, I have landed at Okta on my second attempt 😉. I’m so excited about this new chapter in my career journey and can’t wait to see what’s in store. But first, introductions.

Who am I

I’m from the south of India, a village in Kerala to be specific, but I grew up in Chennai since my parents moved there looking for work when I was 12. I like to call myself a software engineer with a soldering iron. That’s because I never intended to be in the IT industry. I wanted to be an astronomer, but that didn’t work out, so I wanted to get into robotics. I got myself an engineering degree in electronics and electrical engineering to pursue that, but apparently, the universe had different plans.

The recession of 2008 hit, and robotics companies in India weren’t hiring much. To make ends meet and repay my student loan, I applied to one of the biggest software companies in India with little to no hope. The only software experience I’d had at that point was the data-structures class (which I flunked once and had to retake) and a little bit of dabbling with JS and PHP when I ran a personal blog about modifying motorbikes. Surprisingly, I made the cut. Hence, I began my journey into software engineering with a boot camp provided by Tata Consultancy Services (TCS). That is when I realized I was pretty good at programming, and I graduated from the boot camp as the top performer! I still love learning about astronomy and astrophysics, among other science subjects, and still ride a motorcycle 😄. I also love Linux and you can read my take on the state of Linux if you are interested.

What I’ve done so far

Fast forward to 2021: I have more than a decade of experience wearing all sorts of hats in IT, working with a dozen languages and all sorts of tech. I’m a jack of all trades and master of some. At TCS, I spent 2011-2016 in Singapore working for its airline customers. I started out building Java web apps for enterprises, did a lot of JavaScript and UX engineering, built mobile apps, led some teams, managed projects, and finally was part of a pre-sales team proposing architectures, building prototypes, and so on. I was also the go-to JS person there. This is when I started getting into open source, discovered JHipster, and fell in love with it.

I now co-lead JHipster along with Julien Dubois and Pascal Grimaud. While I don’t actively code that much for JHipster these days, somehow, I still hold the number two position in terms of contributions 😉. Fun fact: Okta was an early supporter of JHipster, and thanks to Matt Raible, Okta is now the platinum sponsor for JHipster.

In 2016, I moved to the Netherlands to join a DevOps startup and helped build an analytics product, developer tooling, and cloud and container integrations. I worked mainly with Java, TypeScript, and Golang. This is where my fascination for cloud and container tech began. I had been doing some advocacy all along for JHipster and DevOps by writing content, giving conference talks, meetups, and so on. In 2020, I officially became a paid Developer Advocate for Adyen, a unicorn in the fintech space.

As part of my open-source software endeavors, I continue contributing to Java and JS communities. Recently I got fascinated with Rust and built a Kubernetes dashboard in Rust, called KDash. I’m hoping to get more involved in the Rust community as well.

What I’ll be doing at Okta

As someone who has been following Okta closely for many years, it’s an honor to be here finally. Though I have dabbled in many things and was fortunate to work with a lot of different technologies, my expertise is still around Java, JS, Golang, Rust, and DevOps. I hope to use my skills to contribute to those communities as much as possible while making Okta a well-known name in the DevOps space. Stay tuned for more content, talks, and OSS contributions in those areas. While here, I also hope to improve the developer experience of using Okta products, especially in the DevOps space.

How to get in touch

You can find me on Twitter, LinkedIn or GitHub. You can also write to deepu.sasidharan@okta.com. I have an active tech blog at deepu.tech, where you can find me writing about my love for Linux, why I like Rust, or my opinion on tech in general, among other content. You will also find me writing technical pieces on this blog very soon.

Please do connect with me in your channel of choice and shoot me a message if you have any questions or if you just wanna chat.

Wednesday, 13. October 2021

KuppingerCole

Georg Jürgens, David Mason: US Pharmaceutical Supply Chain adopts Decentralized Identity and Credentials




Joni Brennan: Verifying Assurance for Blockchain Enterprise Scenarios with a Pan-Canadian Trust Framework




Monica Singer: The Art of the Possible with Blockchain Technology

The explanation will enable CEOs and CFOs of any business to understand how this technology will impact their business.





Timo Hotti: Automated Trust and Findy




Anne Bailey: Where Blockchain is Thriving

Blockchain is a revolutionary technology, but before it redefines the organization as we know it today, it will revolutionize the way we transact in small ways first. This presentation will discuss the use cases where blockchain is moving beyond PoC to enterprise implementation.





Sebastian Manhart: Decentralized Identity and Governments - Can It Work?


Covid19 has laid bare just how far behind most governments are in digitising public administration and services for citizens. As countries around the world scramble to catch up, digital identity has emerged as one of the key building blocks. The approach, however, varies greatly. Most governments fall back on a default pattern characterised by centralised approaches and a narrow focus on the public sector only. Other countries, especially in Europe and most notably Germany, are taking a different strategy and implementing user-centric, decentralised, self-sovereign digital identity with the goal of providing a holistic identity solution that citizens can use everywhere and across borders. Join this talk with Sebastian Manhart, Advisor on Digital Identity to the German Chancellery (Angela Merkel's Office), who will share what is happening in Germany and Europe, and why this could set the stage for digital identity globally.




Peter Busch: Opportunities and Challenges for Decentralized Identity




Dr. Michele Nati: Blockchain and Telco: Opportunities, Challenges and Use Cases




European Hosted Hybrid IaaS Cloud


by Mike Small

The KuppingerCole Market Compass reports provide an overview of vendors and their product or service offerings in a certain market segment. This Market Compass report covers hosted hybrid cloud services provided from within Europe. It provides an assessment of the capabilities provided by these services including those for the tenant to ensure their secure and compliant use of the services.

Christoph Burger: The Age of Smart Contract - Illusion or Reality?


• A short status review of Blockchain: The development so far and operational challenges
• The race of application solutions
    ○ Blockchain without tokenization
    ○ Blockchain with tokenization
• Hurdles of the future




Shyft Network

Shyft Network- Solving The Internet’s Identity Problem

From early ID management systems to the birth of Blockchain, learn how Shyft Network is building the most secure future for trust and identity.

Identity has been a contentious point for humanity ever since we evolved societies. How are we sure that someone is who they say they are? Before the modern age, there were several ways ancient civilizations did identity verification. Unfortunately, the scope and viability of those methods went out the window during the information age — and even more so during the dawn of the internet, as human interaction started to become replaced by digital connection. The internet’s identity and trust problems weren’t much of an issue when it first saw use. Most people would use phone lines to dial into bulletin board systems. Phone lines were linked to addresses, which were tied to people living in the house. If someone had enough patience, they could locate who was dialing into their BBS. However, the internet has evolved since then, to put it lightly, and its identity problems have kept pace with it.

The Story of Trust Online

As more casual users started to take advantage of the internet with Web1.0, individual data would have to be batched out. As the rise of personal experience websites came onto the scene, the back end needed to know which user data it should show to the person sitting at the screen. Developers solved this by constructing credential systems. Usernames and passwords would be stored on databases (sometimes encrypted) and were handy enough to be carried across ecosystems. Unfortunately, when you have a centralized database of usernames and passwords, the attack vectors for obtaining access to them are trivial. In the early days of login authentication, many young users tested their skills attempting to break into these Web1.0 databases, and many succeeded.

These early, clunky user-password authentication efforts evolved into ID management systems run by large corporations with the rise of Web2.0. Now, users can log in with a single click, using the API to share their login verification with whatever website requests it. This approach made it easier for developers to onboard users since it had much less friction than forcing them to sign up for a new account. Users today have several dozen accounts already on sites they might only use once. The obvious drawback of this system is that you can be banned or restricted if you go against the data holders. As an added bonus, these companies routinely sell the data they collect from users to third parties for all sorts of purposes, usually without the user being aware of it. While we’d like to think these massive corporations are secure, the sheer amount of data leaks coming from businesses like Facebook shows that this is a vain hope.

The Evolution of Blockchain Trust

In 2008, Satoshi Nakamoto created Bitcoin intending to develop a trustless ecosystem for financial transactions. In essence, the Bitcoin blockchain is a conversation between two parties. The recipient asks, “Do you have the BTC I requested?” and the sender replies with a yes or a no, backed up by the blockchain’s record of their currently accessible funds. Bitcoin has its own problems, but the principle of a trustless network is sound, as has been evidenced by the widespread adoption of the world’s first cryptocurrency. Shyft Network seeks to take what Bitcoin developed and evolve it to the next level. But how do we intend to do that?

Shyft Into High Gear

A blockchain can bring about tangible change in the world. Shyft Network is doing so by providing users and third parties with a single decentralized method of verifying credentials. The Shyft ecosystem involves several stakeholders, each with their part to play.

• The Data Owners and Data Holders: Data owners are individuals or organizations that credentials refer to. They may or may not be trusted entities, but they can interact with Trust Anchors (Data Holders) to verify their identity or the validity of the data. In fact, any data can be contextualized via attestations and TAs.

• Trust Anchors (Attestors): These are trusted entities that collect data from issuers and holders. They verify the identities of these individuals or organizations and hold the data off-chain. If the data owner would like to pass that data to a third party, they wouldn’t need to pay a fee for transferring that validated data. However, third parties, such as aggregators, could potentially pay for that data before the data owner consents.

• Data Consumers: These entities provide pre-approved app services which utilize trusted data. They’re responsible for requesting data from holders, reviewing attestations, and determining whether that data is usable.

• Validators (Nodes): Nodes perform much of the same work as on other blockchains: they validate transactions and add them to the chain.

• Consent Framework: Today’s users have become used to companies hiding privacy settings inside their settings menus. Shyft Network operates on a strict opt-in system that allows users to control who gets their data.

• Proof of Individual: How do you know that the sender provably has what they’re offering? With the Shyft Network, it’s a simple matter of verifying the blockchain address. With all data pertaining to an individual linked to this address, it’s impossible to spoof identities. A minimal sketch of how consent around an attestation might work follows this list.
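To make the opt-in model above concrete, here is a small hypothetical Python sketch of a data owner granting and revoking a consumer's access to an attestation held by a Trust Anchor; the classes and fields are illustrative and are not taken from Shyft Network's actual codebase.

# Hypothetical sketch of Shyft-style opt-in consent around an attestation.
# Classes and fields are illustrative, not taken from Shyft's codebase.

from dataclasses import dataclass, field

@dataclass
class Attestation:
    owner: str            # blockchain address of the data owner
    trust_anchor: str     # entity that verified the underlying data off-chain
    claim: str            # e.g. "kyc/verified"
    consents: set = field(default_factory=set)  # consumers the owner opted in

    def grant(self, consumer: str):
        self.consents.add(consumer)

    def revoke(self, consumer: str):
        self.consents.discard(consumer)

    def readable_by(self, consumer: str) -> bool:
        return consumer in self.consents

att = Attestation(owner="0xOwner", trust_anchor="did:example:bank", claim="kyc/verified")
att.grant("did:example:exchange")
print(att.readable_by("did:example:exchange"))   # True: owner opted in
att.revoke("did:example:exchange")
print(att.readable_by("did:example:exchange"))   # False: consent withdrawn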

Reliability and Dependability

The Shyft Network aims to provide a method for users to share data with the companies they see fit. As an opt-in system, the data holder can deal with each request for their data individually. However, the Shyft Network offers more than just a method of verifying users. Some of the most current use cases have demonstrated the flexibility of the platform.

• Financial Action Task Force (FATF) Travel Rule Compliance: The FATF recently issued crypto-specific guidance recommending that jurisdictions require Virtual Asset Service Providers (VASPs), such as cryptocurrency exchanges, to provide sender and recipient data on request. Yet, until recently, there was no way to comply with the FATF guidance. By partnering with some of the world’s most trusted cryptocurrency companies, like Binance and Tether, the Shyft Network has helped design and implement a solution to this problem (a simplified sketch of the kind of payload involved follows this list).

• Compliant Decentralized Finance: DeFi has had regulators up in arms, trying to lock down much of the innovation within the space. Large liquidity providers, for example, cannot interact with the DeFi space because of a lack of clarity from regulators’ perspectives. Shyft Network is a decentralized solution that takes these outdated or obsolete regulatory perspectives into account and combines them with a way for liquidity providers to verify themselves in a manner that complies with the regulations.

• Government Digital Identity: Identity fraud is a real danger, but with a trustless verification system like Shyft, a seamless, single verification is all that any citizen or organization within a country needs. Through Shyft, governments have a platform to create Know-Your-Customer (KYC) systems that are seamless to interact with and provide easy verification for all users. With this seamless verification comes security, interoperability, and transparency. Data collected would be readily available to regulators and auditors within the confines of privacy rules.
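As a rough illustration of the Travel Rule item above, the snippet below assembles a minimal originator/beneficiary payload of the kind VASPs exchange; the field names are a simplified stand-in and do not reproduce Veriscope's or the IVMS 101 schema.

# Simplified stand-in for a Travel Rule data exchange between two VASPs.
# Field names are illustrative only, not Veriscope's or IVMS 101's schema.

def build_travel_rule_payload(tx_hash, originator, beneficiary, amount, asset):
    return {
        "transactionHash": tx_hash,
        "asset": asset,
        "amount": amount,
        "originator": originator,      # supplied by the sending VASP
        "beneficiary": beneficiary,    # supplied by the receiving VASP
    }

payload = build_travel_rule_payload(
    tx_hash="0xabc123",
    originator={"name": "Alice Example", "vasp": "Exchange A", "account": "0xOriginator"},
    beneficiary={"name": "Bob Example", "vasp": "Exchange B", "account": "0xBeneficiary"},
    amount="1.5",
    asset="BTC",
)
print(payload)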

All of these use cases only scratch the surface of what Shyft is capable of, especially as we roll out the open-sourcing of Veriscope, our latest and greatest DeFi compliance framework and smart-contract platform for VASPs to enable Travel Rule compliance. While we already have an idea of what can be done with our growing infrastructure, we’re excited to see how the evolution of the blockchain will bring about new and innovative approaches to personal verification within cyberspace. Stay tuned.

Shyft Network aggregates trust and contextualizes data to build an authentic reputation, identity, and creditability framework for individuals and enterprises.

Join our Newsletter
Telegram (https://t.me/shyftnetwork)
Follow us on Twitter (https://twitter.com/shyftnetwork)
Check out our GitHub (https://github.com/ShyftNetwork)
Check out our website (https://www.shyft.network)
Check out our Discord (https://discord.gg/ZcBNW37t)

Indicio

Node Operator Spotlight: Anonyome


A distributed ledger is a database that has copies distributed across a network of servers (nodes), all of which are updated simultaneously. A network like this is the foundation of decentralized identity, a way of generating robust trust and collaboration free of the security risks of centralized databases. We call the companies and organizations that support an Indicio Network node on a server that is under their control “Node Operators.” 

 

Recently we caught up with Paul Ashley, CTO and Co-CEO of Anonyome Labs, a current Node Operator of Indicio, to discuss their current projects, some goals for the future, and where they think decentralized identity is heading.

Tell us about Anonyome: how did it start, where did it start, and who makes up your team?

The goal of Anonyome Labs is to shift the control of personal information back to normal users. 

Everything we do is recorded, collected, mined, profiled, stored, targeted and sold. The balance of power has shifted to the cabal of tech giants and data miners who overtly or covertly monitor and control what is seen, clicked, and cared about. At Anonyome Labs we build the tools that shift control of personal and private information from the big data miners back to the user. 

Anonyome Labs was founded in 2014 and is headquartered in Woodside, California, with teams in Salt Lake City, Utah and Gold Coast, Australia. Anonyome Labs has about 70 employees – the teams have deep enterprise and consumer expertise across identity, cyber security, authentication, authorization, privacy and cryptography – with hundreds of granted patents.

What are some of the products/services (Self Sovereign Identity or not) that you currently offer? Who are your target customers? What sets you apart from the competition?

Anonyome Labs created the Sudo Platform to provide enterprise software developers with capabilities to add persona (Sudo) based identity, privacy and cyber safety features to their applications. The Sudo Platform provides these enterprise software developers with mobile and web SDKs, sample apps, documentation and UI kits to accelerate their application development.

Each of the capabilities of the Sudo Platform is attached to a persona. This includes masked email and masked credit cards, private telephony, private and compartmentalized browsing (with ad/tracker blocker and site reputation), VPN, password management, decentralized identity and more.

In addition, Anonyome Labs created the MySudo mobile application to put the same identity, privacy, and cyber security capabilities into the hands of normal users for their interactions with the online and offline world. Each user can create a number of personas (Sudos), and each persona has access to various Sudo Platform capabilities.

What Self Sovereign Identity /Decentralized Identity products/services are on your roadmap?

A key offering of the Sudo Platform is Decentralized Identity based services.  This includes both client (Edge Agent) and server (Cloud Agent) offerings.  This allows the enterprise to become a Decentralized Identity Verifiable Credential Issuer and/or Validator. And it allows the enterprise’s users to take part in a decentralized identity ecosystem – by giving them a mobile wallet/agent to manage decentralized identities, connections and verifiable credentials.
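For readers unfamiliar with the data model behind this, a W3C Verifiable Credential is ultimately a signed JSON document that an issuer's Cloud Agent produces and a holder's wallet stores. The sketch below shows the general shape only; it is not Anonyome's API, and the credential type and DIDs are hypothetical.

```typescript
// Illustrative W3C Verifiable Credential shape (not Anonyome's actual API).
const credential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  type: ["VerifiableCredential", "EmploymentCredential"], // hypothetical credential type
  issuer: "did:example:employer123",                      // hypothetical issuer DID
  issuanceDate: "2021-10-12T00:00:00Z",
  credentialSubject: {
    id: "did:example:holder456",                          // hypothetical holder DID
    role: "Software Engineer",
  },
  // In practice the issuing agent attaches a cryptographic proof here
  // (e.g. a linked-data signature) before handing the credential to the wallet.
  proof: {},
};
```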

What motivated your work in Decentralized Identity? Why did you become a node operator? What appeals to you in this field?

We believe that Decentralized Identity is the most important innovation in identity for helping normal users take control of their personal information as they interact with the online world. Given Anonyome's focus on privacy and cyber safety, adding Decentralized Identity services was a natural extension to our Sudo Platform. Anonyome Labs became a founding steward of the Indicio decentralized identity network in anticipation of using that network for our customers' enterprise applications.

Where do you see the future of Self Sovereign Identity/Decentralized Identity?

It is our belief that decentralized identity will become the core foundational technology for future privacy and cyber safety capabilities. Over time we will transition from today's privacy-invasive technologies to new systems founded on decentralized identity.

 

For more information about the Sudo platform or any of their other products, go to Anonyome.com

The post Node Operator Spotlight: Anonyome appeared first on Indicio Tech.


auth0

Okta + Auth0 Showcase 2021: Identity for All

CEOs Todd McKinnon and Eugenio Pace highlight the exciting opportunities ahead for the combined company
CEOs Todd McKinnon and Eugenio Pace highlight the exciting opportunities ahead for the combined company

Ocean Protocol

OceanDAO Round 10 Results

9 projects funded; 100mm+ votes submitted OceanDAO Grants Hello, Ocean Community! The OceanDAO is pleased to share the results of the 10th round of our community grants initiative: A total of 530,000 USD was available Conversion rate of 0.75 OCEAN/USD 702,963 $OCEAN tokens were available All funds that were not granted will be burned. $OCEAN burned is a signal of the extra room available

9 projects funded; 100mm+ votes submitted

OceanDAO Grants

Hello, Ocean Community!

The OceanDAO is pleased to share the results of the 10th round of our community grants initiative:

A total of 530,000 USD was available
Conversion rate of 0.75 OCEAN/USD
702,963 $OCEAN tokens were available

All funds that were not granted will be burned. $OCEAN burned is a signal of the extra room available for new grants.

The results are in:

Round 10 included 4 first-time projects and 5 returning projects requesting follow-up funding.

158,547 $OCEAN has been granted
544,416 $OCEAN will be burned

Burned $OCEAN will be sent to address 0x000000000000000000000000000000000000dEaD, forever decreasing total $OCEAN in circulation.

OceanDAO Round 11 and announcements will be live shortly. Ocean Protocol is dedicated to ever-growing resources for continued growth, transparency, and decentralization. Keep an eye out on Twitter @oceanprotocol and our blog for the full announcement and new highlights.

For up-to-date information on getting started with OceanDAO, we invite you to get involved and learn more about Ocean’s community-curated funding on the OceanDAO website.

We encourage the Ocean ecosystem to apply or re-apply AND to vote!

Thank you to all of the participants, voters, and proposers.

OceanDAO Round 10 Results

You can find the full overview on our Round 10 — Votes page.

Round 10 Rules

Proposals with 50% or more "Yes" votes received a grant, in descending order of votes received, until the "Total Round Funding Available" was depleted.

19% of the "Total Round Funding Available" was earmarked for New Projects. Earmarked proposals were eligible for the entire "Total Round Funding Available"; returning (general) grants were eligible for 81%.

The grant proposals from the snapshot ballot that met these criteria were selected to receive their $OCEAN Amount Requested to foster positive value creation for the overall Ocean ecosystem.
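For readers who want to see the allocation rule end to end, here is a minimal sketch of the greedy selection described above. The proposal shape and the exact interplay between the 19% earmark and the general pool are assumptions for illustration; the Round 10 — Votes page remains the authoritative record.

```typescript
// Minimal sketch of the Round 10 allocation rule (assumptions noted above).
interface Proposal {
  name: string;
  yesVotes: number;
  noVotes: number;
  requestedOcean: number;
  isNewProject: boolean;
}

function allocateGrants(proposals: Proposal[], totalFunding: number): Map<string, number> {
  const grants = new Map<string, number>();
  let generalBudget = totalFunding * 0.81;   // pool returning (general) grants may draw on
  let earmarkedBudget = totalFunding * 0.19; // reserved for new projects

  const eligible = proposals
    .filter(p => p.yesVotes / (p.yesVotes + p.noVotes) >= 0.5) // 50% or more "Yes"
    .sort((a, b) => b.yesVotes - a.yesVotes);                  // descending votes received

  for (const p of eligible) {
    // New projects may draw on both pools; returning projects only on the general pool.
    const available = p.isNewProject ? generalBudget + earmarkedBudget : generalBudget;
    const grant = Math.min(p.requestedOcean, available); // partial funding when the pool runs low
    if (grant <= 0) continue;
    grants.set(p.name, grant);
    if (p.isNewProject) {
      const fromEarmark = Math.min(grant, earmarkedBudget); // spend the earmark first (an assumption)
      earmarkedBudget -= fromEarmark;
      generalBudget -= grant - fromEarmark;
    } else {
      generalBudget -= grant;
    }
  }
  return grants;
}
```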

Voting opened on October 7th at 23:59 GMT
Voting closed on October 11th at 23:59 GMT

Proposal Vote Results:

31 proposals submitted
9 funded or partially funded
83 unique wallets voted
1002 voters across all proposals (the same wallet can vote on multiple proposals)
682 total Yes votes
320 total No votes
73,486,412.21 $OCEAN voted Yes on proposals
75,728,935.83 $OCEAN voted No on proposals
149,215,348 $OCEAN tokens voted across all proposals
158,547 $OCEAN has been granted
544,416 $OCEAN will be burned

Recipients

Congratulations to the grant recipients! These projects have received an OceanDAO grant in the form of $OCEAN tokens.

See all the expanded proposal details on the Round 10 Ocean Port Forum!

If your proposal was voted to receive a grant and you haven't already done so, please submit a Request Invoice to the Ocean Protocol Foundation (OPF) for the $OCEAN granted amount listed on the Round 10 — Votes page.

Proposal Details

Grants

Clean Docs: As the Ocean core components that power the Ocean economy are constantly evolving, I propose to maintain the documentation for a period of 3 months: regularly updated docs and improvements at https://docs.oceanprotocol.com/.

Metaverse Game Hub: Offering datasets on the Ocean Marketplace built around data from metaverse-based assets such as LANDS, generated by AI-powered analytic algorithms that allow a fair and transparent price evaluation of the NFT market.

Coral Market: Grant funds are requested to support the development of an open-source application for GDPR-compliant self-sovereign scientific data management and peer-to-peer sharing.

Walt.id: Proof of Concept: Implementing Europe’s new open-source digital identity ecosystem based on EBSI and ESSIF in Ocean Protocol.

Rugpullindex.com: Our current mission is to improve the overall safety of all Ocean data token users such that anyone can trade data without running the risk of getting the rug pulled.

DataX: Vision is a one-stop shop for all Data DeFi needs. We are helping to make data a liquid asset.

Governauts Rewards Systems Research Initiative: We develop and verify DAO improvements for Ocean DAO, as part of a community research initiative led by Token Engineering Academy.

Ocean Protocol Japan: Operation and management of local communities in Japan and promotion of adoption of OceanProtocol by Japanese individuals and organizations.

Indian Ocean Program: An extended outreach program that sets up an Indian outreach "port" within the country's large Web2 developer community, inviting newcomers and young outreach ambassadors to onboard onto the Data Ocean.

OceanDAO Ecosystem

Continue to support and track progress on all of the Grant Recipients here!

Much more to come — join our Town Halls to stay up to date and see you in Round 11. Thank you!

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

OceanDAO Round 10 Results was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Northern Block

Battle of the Trust Frameworks with Tim Bouma & Darrell O’Donnell

Listen to this Episode On Spotify. Listen to this Episode On Apple Podcasts. Introduction: Every week there seems to be mention of a new Trust Framework. People are now trying to monetize them. But to understand trust frameworks, let's first understand what constitutes trust. We must define concepts such as levels of assurances and […] The post Battle of the Trust Frameworks with Tim Bouma & Darrell O'Donnell appeared first on Northern Block | Self Sovereign Identity Solution Provider.


Listen to this Episode On Spotify
Listen to this Episode On Apple Podcasts

Introduction

Every week there seems to be mention of a new Trust Framework. People are now trying to monetize them.

But to understand trust frameworks, let's first understand what constitutes trust. We must define concepts such as levels of assurance and what distinguishes technical trust from human trust.

Once this is better understood, the conversation about where a particular organization fits into a digital identity ecosystem is simpler. And note, in user-centric digital identity (or SSI), organizations no longer need to be at the centre of the universe.

About Episode
During this conversation, we discuss:

Levels of Assurance (LOA): an introduction to LOAs as they relate to Digital Identity and why they're an important part of the recipe in achieving digital trust. Tim and Darrell give us some practical examples of LOAs.

The Concept of Trust: how do we define trust at a high level and how do we differentiate between technical and human trust? How can we build trust with credential issuers but also with credential holders?

The World of Trust Frameworks: what are trust frameworks and what are the different types of frameworks being deployed in both the public and private sectors? How are organizations trying to monetize trust frameworks? What's going right, and what's going wrong with the way trust frameworks are being implemented?

The Importance of Open Source for Trust Creation: why is open source important for achieving digital sovereignty? Is open source the only way to improve transparency, flexibility and accountability?

Mentions during episode:

Dee Hock: his book & his Twitter account
Link to episode with John Ainsworth, where we talk about Dee and payment processors such as Visa
UK's guidance on open source


About Guests

Tim Bouma is a Senior Policy Analyst for Identity Management at the Treasury Board Secretariat of the Government of Canada. His mandate is to develop a government-wide identity management strategy that spans the service delivery and security communities.


You can find Tim on Twitter here: https://twitter.com/trbouma; and on LinkedIn here: https://www.linkedin.com/in/trbouma/

 

Darrell O’Donnell is a technology company founder, executive, investor, and advisor. He’s on a mission to help organizations build and deploy real-world decentralized (#SSI) solutions. He advises numerous startups, senior government leaders, and investors.


You can find Darrell on Twitter here: https://twitter.com/darrello; and on LinkedIn here: https://www.linkedin.com/in/darrellodonnell/

The post Battle of the Trust Frameworks with Tim Bouma & Darrell O'Donnell appeared first on Northern Block | Self Sovereign Identity Solution Provider.


Anonym

The US Data Privacy Law “Floor”: What Deserves Basic Protections?

The New York Times recently did a deep dive into the United States’ lack of a national data privacy law and why it matters, and we want to drill down on some key points here. The article describes the four basic areas of users’ data privacy that should be protected as a minimum, and calls these areas the “floor” of any future national privacy […] The

The New York Times recently did a deep dive into the United States’ lack of a national data privacy law and why it matters, and we want to drill down on some key points here.

The article describes the four basic areas of users’ data privacy that should be protected as a minimum, and calls these areas the “floor” of any future national privacy law onto which other protections could be built over time. It shows you the direction in which privacy advocates are heading.

Four basic areas make up the privacy law “floor”:

Data collection and sharing rights – This is the user's fundamental right to clearly see what personal data companies collect, share and sell about them; their right to ask a company to delete any personal data they don't want them to have; and their right to demand a company stop sharing their personal data.

Opt-in consent – This is where a company, not the user, does the heavy lifting on privacy, by asking the user whether they may collect, share or sell their data to third parties. Opting out takes the user a lot of time and effort; opting in puts that effort on the company. But opt-in isn't easy to implement, which is why global opt-out functions like the Global Privacy Control are a popular stopgap, even seeming acceptable under the California Consumer Privacy Act. The GPC and other tools like it allow opt out at the browser or device level, not the site level.

Data minimization – This protection would pare back the data a company can collect to only the basics required for them to deliver their product or service.

Non-discrimination and no data-use discrimination – The final plank in the basic data privacy "floor" would protect users from being discriminated against for exercising their right to privacy. This means users couldn't be charged more for opting out (or not opting in) and couldn't be offered incentives such as discounts and coupons for opting in, for example. This requirement would also prevent companies from discriminating against users based on personal characteristics, such as religion, race or gender.

In addition to this four-plank "floor", the privacy experts interviewed for the NYT article would like to see:

a more comprehensive data breach notification law, to standardize who gets notified and some common standards for doing so
a private right of action, or the right of a person to sue a company that violates their privacy
strong, well-funded enforcement agencies and resources
privacy by default, so apps come with the strictest built-in privacy without the user having to do anything unless they want to opt in to certain settings.

Anonyome Labs agrees with the four-plank data privacy law “floor” and the regulatory extras proposed by the privacy community in this article. We also recognize that a national privacy law and a uniform approach to these requirements may still be a long way off, which is why we created our consumer privacy app MySudo and the privacy and cybersecurity capabilities in Sudo Platform that help enterprises rapidly develop and deploy branded privacy and cybersecurity solutions.

Photo By cgstock

The post The US Data Privacy Law “Floor”: What Deserves Basic Protections? appeared first on Anonyome Labs.


auth0

Auth0 Identity Platform Now Available on Microsoft Azure

New private cloud deployment option offers customers greater choice, scalability, and reliability
New private cloud deployment option offers customers greater choice, scalability, and reliability

Ontology

Ontology Weekly Report (October 1–11, 2021)

Highlights Ontology, in collaboration with AP. LLC, a Japanese consulting firm with strong ties to domestic research institutes and researchers, announced that our jointly developed blockchain-based anti-falsification program has been linked to the inventory management system of ZAICO, a cloud inventory management software company. With a suite of business-ready applications and a series of busin
Highlights

Ontology, in collaboration with AP. LLC, a Japanese consulting firm with strong ties to domestic research institutes and researchers, announced that our jointly developed blockchain-based anti-falsification program has been linked to the inventory management system of ZAICO, a cloud inventory management software company. With a suite of business-ready applications and a series of business collaborations, Ontology's presence in Japan has grown substantially.

Latest Developments

Development Progress

We have completed the launch of Ontology's EVM TestNet and are 70% done with testing.
Ontology's Security Vulnerabilities and Threat Intelligence Bounty Program has attracted a lot of developer attention, with a top prize of $12,000 in ONG rewards.
We have completed Ethereum RPC support and are 100% done with internal testing. The TestNet has been synchronized online; we are 71% done with testing.
We have completed 100% of Ontology's Ethereum account system development and the TestNet has been synchronized online; we are 72% done with testing.
The EVM/OEP-4 asset seamless transfer technical solution, which facilitates the efficient conversion between OEP-4 assets and EVM assets, is complete and the TestNet has been synchronized online; we are 69% done with testing.

Product Development

ONTO hosted a limited edition NFT event with Babylons. Participants used ONTO to take part in a chance to win rewards. About 300 people participated in the event and all NFTs sold out in 10 minutes.
ONTO hosted AMAs with MOBOX, SIL.Finance and Pinecone. The activities have all been successfully completed, with a total of about 3,000 participants.
ONTO hosted a limited edition NFT event with SOTA Finance; the number of participants exceeded 5,000, and all NFTs were sold out within 30 minutes.

On-Chain Activity

121 total dApps on MainNet as of October 11, 2021.
6,784,648 total dApp-related transactions on MainNet, an increase of 46,268 from last week.
16,536,539 total transactions on MainNet, an increase of 119,670 from last week.

Community Growth

Ontology published the next couple of interviews in its Harbinger Series. SoloTürk from our Turkish community described the Ontology community with the key words "Active, Quality, & Technology" and hopes to make his own contribution to the development of the shared global community.
We held our weekly Discord Community Call, led by Humpty Calderon, our Head of Community. He introduced the importance of Ontology's highly compatible EVM and Ontology's extensive support for developers.
We held our new weekly Telegram Community Call, led by Astro, an Ontology Harbinger from our Asian community. He introduced the latest deployment progress of the Ontology EVM.
We held our weekly DeID Summit, in which we spoke about NFTs. More than 50 participants discussed the different types of NFTs and their composability with DeFi.
As always, we're active on Twitter and Telegram where you can keep up with our latest developments and community updates.

Global News

Ontology, in collaboration with AP. LLC, announced that our jointly developed blockchain-based anti-falsification program has been linked to the inventory management system of ZAICO. Clients will be able to use blockchain-based, tamper-evident, accurate and traceable inventory management with unprocessed inventory data output from the ZAICO system at a significantly lower cost.
Li Jun, Founder of Ontology, was invited to participate in the exclusive interview Focus held by Cointelegraph China. He spoke about his history with crypto, why he founded Ontology, and his thoughts on the future of blockchain and Web3.
Ontology published #EVM 101 on Twitter, introducing some basic information about the Ontology EVM and its benefits. Ontology has achieved full compatibility with the Ethereum ecosystem. Developers can directly use EVM development tools on the Ontology TestNet for dApp deployment.

Ontology in the Media

BeInCrypto — Third Time’s A Charm: Establishing Secure Data and Identities In Web3

The internet is currently in a liminal phase. It is sitting on the precipice of changes that will shake up human life in ways that we could have only imagined ten years ago. Web3 is the internet’s third iteration. It is seeing the digital sphere become more open-source.

As a highly interoperable chain network with a multi-virtual-machine solution, Ontology is equipped for the coming transition to Web3. Whether you are a senior blockchain developer or a traditional web developer, you can quickly become familiar with the Ontology development environment, easily deploy blockchain applications, and enjoy fast block confirmation speeds and low deployment costs.
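Because the chain exposes an Ethereum-compatible JSON-RPC interface, ordinary EVM tooling can be pointed at it. The sketch below uses ethers v5 with a placeholder endpoint; ONTOLOGY_EVM_RPC_URL is an assumed environment variable, and you would consult Ontology's developer documentation for the real TestNet endpoint and chain ID.

```typescript
// Minimal sketch: talk to an Ethereum-compatible (EVM) TestNet with ethers v5.
import { ethers } from "ethers";

async function main() {
  // Placeholder URL: replace with the documented Ontology EVM TestNet endpoint.
  const rpcUrl = process.env.ONTOLOGY_EVM_RPC_URL ?? "https://example-ontology-evm-testnet.invalid";
  const provider = new ethers.providers.JsonRpcProvider(rpcUrl);

  // Any ordinary EVM call works unchanged, e.g. reading the latest block number.
  const blockNumber = await provider.getBlockNumber();
  console.log("Latest EVM TestNet block:", blockNumber);
}

main().catch(console.error);
```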

The Defiant — How Decentralized Identity Can Take Crypto to the Next Level

Over the past year, there has been a notable increase of awareness and interest towards decentralized identity solutions. Decentralized identity solutions help users, amongst other things, to control their digital identity without the input of intermediaries. As well as the individual user benefits, decentralized identity solutions have the potential to create seamless, accessible, and verifiable ecosystems. Decentralized identity holds the potential to solve many issues across the DeFi sphere and more.

Ontology brings trust, privacy, and security to Web3 through decentralized identity and data solutions. They’ve developed a comprehensive identity infrastructure including an identity wallet that allows you to easily create digital identities, a composable, cross-chain reputation protocol, and a DeFi protocol that leverages your decentralized identity and self-sovereign reputation to enable undercollateralized loans.

Want more Ontology?

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Other Resources

LinkedIn / Medium / Facebook / Reddit / Discord / YouTube

Ontology Weekly Report (October 1–11, 2021) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Auth0 Identity Platform Is Now Available on Microsoft Azure

Take advantage of cloud provider flexibility, data residency options, and a streamlined experience for your identity needs with a private cloud deployment
Take advantage of cloud provider flexibility, data residency options, and a streamlined experience for your identity needs with a private cloud deployment

Coinfirm

Powering the mass adoption of blockchain with Coinfirm’s leading AML Risk Management Platform

Zug, October 2021. BV Ventures announces a follow-on investment in its portfolio company Coinfirm, the leader in RegTech for digital currencies and the blockchain-based financial ecosystem. BVV participated in the second tranche of the current USD8M Series A fundraise, co-led by SIX Fintech Ventures and FiveT Fintech, with MiddleGame Ventures, Mission Gate and CoinShares participating...
Zug, October 2021. BV Ventures announces a follow-on investment in its portfolio company Coinfirm, the leader in RegTech for digital currencies and the blockchain-based financial ecosystem. BVV participated in the second tranche of the current USD8M Series A fundraise, co-led by SIX Fintech Ventures and FiveT Fintech, with MiddleGame Ventures, Mission Gate and CoinShares participating...

Okta

From Ops To Advocacy

Hi, I’m edunham, and I’m an infra person. Our buzzwords include DevOps, Cloud Operations, and Site Reliability Engineering, though if you ask any two of us about the difference between those terms, you’ll get at least three answers. Whatever name it’s wearing, doing ops well requires not only the “how” of changing systems, but also the “why” of making the right changes. Although I’ve always loved

Hi, I’m edunham, and I’m an infra person. Our buzzwords include DevOps, Cloud Operations, and Site Reliability Engineering, though if you ask any two of us about the difference between those terms, you’ll get at least three answers. Whatever name it’s wearing, doing ops well requires not only the “how” of changing systems, but also the “why” of making the right changes. Although I’ve always loved helping spread the information that my colleagues need in order to build the systems that work best for them, I’ve chickened out of taking the leap into pure advocacy many times, before finally diving into it here at Okta!

I got my start in open source and the tech industry as a student software developer and sysadmin at the OSU Open Source Lab. The OSL gives students hands-on expertise with skills essential to those careers, and there weren’t enough of us to satisfy all the companies who wanted to hire us as interns! To help address this problem, I created the first iteration of our DevOps Bootcamp training program. I was amazed by how good it felt to take information I was lucky enough to have special access to and share it with others who found it useful. When looking at where to steer my career after university, I considered building on my growing resume of talks to pursue evangelism roles, but it wasn’t time yet. I wanted to go out and apply my skills at scale, rather than just talking about them.

As ops for the Rust and Servo teams at Mozilla, I was again able to play the positive-sum game of sharing the most interesting and useful insights I learned from my unique access to fantastic engineers. But as the sysadmin and speaker roads diverged in that forest of mercurial trees, I stuck with the ops route.

Missing the conference circuit as growing responsibilities meant less time for talks, I started working on a transition toward advocacy in early 2020. Faced with a choice between making the switch then or building on my ops skills for a while longer, the engineer in me won out. Switching fields, into a field that was frantically reinventing itself at the time, was just too many simultaneous changes to roll out to prod without any staging environment or CI available.

Okta’s developer advocacy team is in the right place at the right time for me. I get to use my background to help useful information flow out of the company to our community, as well as bringing the community’s observations and insights back to contribute to internal decision-making. Our product has made my life better as an engineer whenever I’ve worked with it, increasing my knowledge about how to do security well.

I’ve been working remotely since 2018, and I put the time that I would otherwise spend commuting into gardening and an assortment of maker projects, which I sometimes mention on my blog. I’m a volunteer firefighter and emergency medical responder with my local fire department, and I play tuba and sousaphone poorly but enthusiastically on the now-rare occasions when we can get enough people together to make some music.

You can find me on GitHub and some IRC networks as edunham, and on Twitter and LinkedIn as qedunham, or email me, e.dunham@okta.com!


Credify

Gen Z and Millennials – The Generations Shaping Fintech in Vietnam

Millennials and Gen Z have always been the core customer segment that... The post Gen Z and Millennials – The Generations Shaping Fintech in Vietnam appeared first on Credify.

Millennials and Gen Z have always been the central customers around which fintech develops. In the Vietnamese context, however, these two generations play an even stronger role, requiring both fintech startups and traditional financial institutions to continually adapt to meet their needs. This article covers the importance of Gen Z and Millennials to fintech in Vietnam, how they think about finance, and strategies businesses can use to build loyalty with these two customer groups.

Gen Z and Millennials in the Vietnamese context

Gen Z and Millennials are the generations born roughly between 1997 and 2012 (Gen Z) and 1981 and 1996 (Millennials)[1]. Aged 9 to 40 (in 2021), they will soon make up the majority of the workforce, bringing a fundamentally different outlook shaped by growing up in the digital age. Thanks to superior labor quality and productivity, as well as favorable economic conditions, Gen Z and Millennials are also expected to earn higher incomes and become extremely promising consumers.

Vietnam, with great market potential and resources, is in a period of strong fintech growth, attracting 36% of Southeast Asia's fintech investment in 2019, second only to Singapore, according to a PWC report[2]. Yet Vietnam remains a heavily cash-dependent country where many people are still unbanked (69%, according to Merchant Machine UK, 2021[3]), so there is still plenty of market for fintech to explore. In this market context, young people in Vietnam play an even stronger role in shaping the financial services market than they do in the gradually saturating fintech markets elsewhere.

Generations forging their own path in the search for financial freedom

Although there are certain differences between Gen Z and Millennials, they still share many similarities that businesses can build on. Gen Z and Millennials in Vietnam are a special generation. They have witnessed Vietnam "transform" rapidly, from the subsidy era through the Đổi Mới reforms to a dynamic economy with annual GDP growth of up to 7%. And today, they are witnessing Vietnam's upheavals in the face of the pandemic. These changes have, to varying degrees, shaped the mindset of Gen Z and Millennials.

Growing up in an ever-changing Vietnam, Gen Z and Millennials are motivated to find financial security for themselves. Thanks to the internet, they have access to a wealth of information and to a wider variety of ways to earn extra income beyond "9 to 5" jobs – the traditional nine-hours-a-day work familiar to previous generations. This is also a generation more interested in investments such as stocks, real estate, and cryptocurrency, creating opportunities for fintech to boom.

The rapid pace of socio-economic change has also widened the generation gap between Gen Z, Millennials, and earlier generations. Their new ways of earning income are often tied to the internet and are hard to document through traditional means. This is also a major barrier when these two generations want to access traditional financial services to fund their needs.

The customer profile businesses face has become very different. Traditional thinking no longer applies to Gen Z and Millennials.

How can fintech build trust and loyalty in a volatile world?

Precisely because they live in the information age, Gen Z and Millennials are also hard customers to retain. According to Kasasa, 83% of Millennials admit they would switch their current banking service for better benefits offered by another bank[4]. Born in the age of technology, with ever-shorter attention spans, Gen Z and Millennials increasingly expect convenience and ease in their daily activities, including financial activities, which are still bogged down by cumbersome procedures.

From gamification to a steady stream of attractive promotional programs, each business has its own strategy for retaining Gen Z and Millennial customers. The general trends are as follows:

A seamless, effortless experience

Continuously improving UI/UX is a prerequisite for financial services in the digital era. Gen Z and Millennial customers need safe, convenient financial services that serve their needs. serviceX connects financial products to customers when they pay for everyday services, with easy sign-in via OpenID Connect.
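For context, "easy sign-in via OpenID Connect" typically means a standard authorization-code flow. The sketch below builds a generic OIDC authorization request; it is not serviceX's actual integration, and the issuer URL, client ID, and redirect URI are placeholders a relying party would receive from its identity provider.

```typescript
// Generic OpenID Connect authorization-code request (placeholder endpoints).
import { randomUUID } from "crypto";

const authorizeUrl = new URL("https://idp.example.com/authorize"); // hypothetical issuer
authorizeUrl.searchParams.set("response_type", "code");
authorizeUrl.searchParams.set("client_id", "YOUR_CLIENT_ID");       // placeholder
authorizeUrl.searchParams.set("redirect_uri", "https://app.example.com/callback"); // placeholder
authorizeUrl.searchParams.set("scope", "openid profile");
authorizeUrl.searchParams.set("state", randomUUID());               // CSRF protection

// Redirecting the user's browser to this URL starts the sign-in flow;
// the provider returns an authorization code that is exchanged for tokens.
console.log(authorizeUrl.toString());
```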

Creating sustainable value

Beyond their interest in "hot trends", Gen Z and Millennials also care deeply about creating sustainable value for the community. To build a bond with young customers, financial services companies therefore need to pay attention to issues such as financial inclusion, reducing poverty and inequality, and social justice.

Leveraging big data

As more of Gen Z's and Millennials' daily activities move online, businesses need new approaches to understanding their customers. Successfully understanding digital identity profiles and leveraging insights from big data will give businesses a deeper view of their customers, making it easier to build effective strategies to win them over.

Transparency in data management

According to a Salesforce survey, only 55% of Millennials and 44% of Gen Z feel comfortable with how companies use their information[5], which partly reflects their lack of trust in companies, especially fintech companies that hold sensitive data such as bank account numbers and transaction details. This reality requires fintech companies to be more transparent in how they store and manage user information, and to strengthen their data security measures.

Diversifying products and services

Fintech startups stand out for their breakthrough solutions but are limited in the number of products they can offer users. Conversely, traditional financial institutions have a broad product range but lack the innovation needed to attract Gen Z and Millennial customers. serviceX can help both groups connect to Credify's partner ecosystem, saving them resources so they can focus on product development while creating new solutions based on customer profiles across different platforms.

References

[1]: Insider Intelligence, Generation Z facts https://www.insiderintelligence.com/insights/generation-z-facts/ 

[2]: UOB group, From Startups to Scale-up Report, 2019. https://www.uobgroup.com/?redirectTrk=/techecosystem/pdf/UOB-From-Start-up-to-Scale-up.pdf 

[3]: Merchant Machine UK, The Countries Most Reliant on Cash, 2021. https://merchantmachine.co.uk/most-reliant-on-cash/ 

[4]: Kasasa, What motivates Millennials to switch financial institutions? 2019. https://www.kasasa.com/hubfs/K.com/Redirect%20Docs/documents/executive-summary/Nielsen_Executive_Summary_2016.pdf

[5]: Salesforce, How Millennials and Gen Z are different. https://www.salesforce.com/blog/how-millennials-and-gen-z-are-different/

The post Gen Z and Millennials – The Generations Shaping Fintech in Vietnam appeared first on Credify.

Tuesday, 12. October 2021

KuppingerCole

Verifiable Credentials: A Fresh Approach to Identity in the Digital Era

Establishing a verified digital identity is crucial to successful business collaboration and customer engagement in the digital economy. Verifiable Credentials provide a highly secure way of establishing digital identity. However, knowing exactly how to begin using this approach can be challenging.

Establishing a verified digital identity is crucial to successful business collaboration and customer engagement in the digital economy. Verifiable Credentials provide a highly secure way of establishing digital identity. However, knowing exactly how to begin using this approach can be challenging.




Evernym

Does the W3C Still Believe in Tim Berners-Lee’s Vision of Decentralization?

The World Wide Web Consortium (W3C) is facing a critical decision on decentralized identifiers (DIDs) that may come down to a democratic vote on how much decentralization really matters. We've analyzed what it means and why it matters. The post Does the W3C Still Believe in Tim Berners-Lee’s Vision of Decentralization? appeared first on Evernym.

The World Wide Web Consortium (W3C) is facing a critical decision on decentralized identifiers (DIDs) that may come down to a democratic vote on how much decentralization really matters.

We've analyzed what it means and why it matters.

The post Does the W3C Still Believe in Tim Berners-Lee’s Vision of Decentralization? appeared first on Evernym.


Global ID

GiD Report#181 — The future will be self sovereign

GiD Report#181 — The future will be self sovereign Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here. This week: Understanding the Sovereignty Stack The problem with Tether Chart of the week Stuff happens 1. Understanding the Sovereignty Stack The
GiD Report#181 — The future will be self sovereign

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

This week:

Understanding the Sovereignty Stack
The problem with Tether
Chart of the week
Stuff happens

1. Understanding the Sovereignty Stack

The elegance of Bitcoin is that it was the right idea at the right time. A confluence of pre-existing technologies, the concept of a decentralized ledger pushed all the right buttons in the wake of the 2008 financial crisis. From big bank bailouts to quantitative easing, here was a bottom up solution that reframed how we thought about value — value that was innately digital in the spirit of the internet.

Just as the World Wide Web empowered people to connect and share knowledge and information, the rise of Bitcoin taught us that we could have direct ownership over our valuable assets and payments — no middle man or central operator necessary.

Twelve years after Satoshi’s white paper, Bitcoin and crypto at large are at a crossroads. With a market value north of $2 trillion, crypto is impossible to ignore — with much of the participation coming from grassroots retail investors. That’s the thing about crypto. Everyone gets a seat at the table. You don’t need to be an accredited investor. You don’t need to wait for an IPO — after all the insiders already benefited from the initial upside. All you need is a non-custodial wallet that you control. (And, of course, with opportunity and access comes responsibility and risk.)

But with maturation comes a new set of expectations. Regulators and the powers that be are paying attention.

It would be easy to argue that much of Coinbase’s success in the American market is due to the fact that they’ve always taken compliance seriously — stuff like proper KYC and AML protocols (Know Your Customer and Anti-Money Laundering).

Just as the WWW made IP addresses humanistic with domain names, our blockchain public/private keys will ultimately be tied to an identity — our names. For the burgeoning DeFi space, that will be a likely next step. For crypto to grow up, we need to establish identity and trust.

But that’s also where the elegance of Bitcoin, the internet, and the decentralized movement comes back into play.

Just as you own and control the BTC in your noncustodial wallet, so should you control your identity and the data behind it.

We should have ownership over identity, our communications, and our money. That’s the Sovereignty Stack.

To learn more about the self-sovereign future, check out this recent METACO interview with GlobaliD co-founder and CEO Greg Kidd.

Here’s /gregkidd:

There’s an expression in philosophy called a bundle of sticks. And our identity is really that — that bundle of sticks. It can even include things like: I own a bicycle that I really identify with. And so it’s all those attributes that persist over time. If you can’t prove them, and if you can’t control them, then you’re really not in control of your identity. If you have to ask someone else for permission to prove or establish your own identity, once again, you’re not really in control of your identity.
So when we talk about digital identity and self sovereignty, we’re talking about a form of identity that you can control. That also should be a form that’s interoperable. It shouldn’t live within one app. You want to be able to take those credentials to any other platform. We’re not anti-company or anti-government, but any company or government that’s willing to recognize those credentials should be able to identify you, authenticate you, and then authorize you to do the things you want to do.

The full interview:

The Sovereignty Stack: Re-thinking Digital Identity for Web3.0 w/ Greg KIDD [METACO TALKS #23]

Relevant:
What you need to know from the Facebook whistleblower hearing
Via /j — Bitcoin Will Become Currency in Brazil Soon, According to Federal Deputy Aureo Ribeiro — Bitcoin News
Via /j — Meet The World's Richest 29-Year-Old: How Sam Bankman-Fried Made A Record Fortune In The Crypto Frenzy
Via /antoine — City police investigate privacy breaches involving PORTpass vaccination proof app
Via /antoine — Roblox to add opt-in age verification for players and developers — TechCrunch
Web 3 Is Where the Young People Are

2. The problem with Tether

If we ever needed an example why regulations might be a good idea, we wouldn’t have to look further than Tether.

From Bloomberg’s Zeke Faux’s investigative report:

After I returned to the U.S., I obtained a document showing a detailed account of Tether Holdings' reserves. It said they include billions of dollars of short-term loans to large Chinese companies — something money-market funds avoid. And that was before one of the country's largest property developers, China Evergrande Group, started to collapse. I also learned that Tether had lent billions of dollars more to other crypto companies, with Bitcoin as collateral. One of them is Celsius Network Ltd., a giant quasi-bank for cryptocurrency investors, its founder Alex Mashinsky told me. He said he pays an interest rate of 5% to 6% on $1 billion in loans from Tether.

Tether has denied holding any Evergrande debt, but Hoegner, Tether's lawyer, declined to say whether Tether had other Chinese commercial paper. He said the vast majority of its commercial paper has high grades from credit ratings firms, and that its secured loans are low-risk, because borrowers have to put up Bitcoin that's worth more than what they borrow. "All Tether tokens are fully backed," he said.

It’s also worth checking out Matt Levine’s take.

Relevant:
Looking for Tether's Money
Stablecoins to face same safeguards as traditional payments

3. Chart of the week

Relevant:
Telegram says it added 70M users during day of Facebook and WhatsApp outage — TechCrunch
Facebook Whistleblower Frances Haugen testifies before Senate Commerce Committee
Facebook stock tanks as troubles mount
Whistleblower testimony, global outage brew up political storm for Facebook
The long list of Facebook's insiders-turned-critics

4. Stuff happens

Colombian Fintech Movii Raises $15M in Series B Round
'Fantasy equity' NFT game wants you to spend real money buying fake shares of real startups — TechCrunch
Anyone Seen Tether's Billions?
Tech feels labor market crunch
EXCLUSIVE Apple to face EU antitrust charge over NFC chip — sources
Manifold Announces Verified NFT Studio, Funding from a16z and Initialized — Decrypt
Via /jvs — Amazon Prime members can now send gifts with just a phone number or email address
MoneyGram Partners With Stellar and USDC for Blockchain-Based Payments
WisdomTree CEO: Blockchain Could Disrupt Financial Industry Like ETFs Did — Blockworks
Andreessen Horowitz's Haun: Competition for Crypto Equity Deals 'Has Intensified'
Andreessen Horowitz Values Developer of NFT Game Axie Infinity at $3 Billion

GiD Report#181 — The future will be self sovereign was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Configuring PostgreSQL as Auth0 Custom Database

Connect Auth0 to PostgreSQL to create and maintain your own user store.
Connect Auth0 to PostgreSQL to create and maintain your own user store.
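As a rough sketch of what such a custom database connection looks like, the login script below follows the general pattern in Auth0's custom database documentation: Auth0 calls a login function with the submitted credentials, the script queries your own PostgreSQL user store, and it returns a minimal profile. The users table, its columns, and the POSTGRES_URL configuration key are assumptions about your own setup, and module availability depends on the Auth0 script sandbox.

```typescript
// Sketch of an Auth0 custom database "Login" script backed by PostgreSQL.
// `configuration` and `WrongUsernameOrPasswordError` are globals provided by
// the Auth0 script sandbox; table/column names here are hypothetical.
function login(email, password, callback) {
  const bcrypt = require("bcrypt");
  const { Client } = require("pg");

  const client = new Client({ connectionString: configuration.POSTGRES_URL });
  client.connect();

  client.query(
    "SELECT id, email, password_hash FROM users WHERE email = $1",
    [email],
    (err, result) => {
      client.end();
      if (err) return callback(err);
      if (result.rows.length === 0) {
        return callback(new WrongUsernameOrPasswordError(email));
      }
      const user = result.rows[0];
      bcrypt.compare(password, user.password_hash, (err2, isValid) => {
        if (err2) return callback(err2);
        if (!isValid) return callback(new WrongUsernameOrPasswordError(email));
        // Return the minimal Auth0 user profile for this account.
        return callback(null, { user_id: String(user.id), email: user.email });
      });
    }
  );
}
```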

Coinfirm

SHIB, GRT and 4,000+ Tokens Added to Coinfirm’s AML Platform

4059 new tokens have been added to Coinfirm’s AML Platform, bringing the overall total supported to ~5500. Most of the tokens added are Ethereum based (ERC-20 standard) as well as several dozen from the Tron chain (TRC-20). Some of the most widely traded tokens that have been added include; Shiba Inu (SHIB), The Graph (GRT), Liquid...
4059 new tokens have been added to Coinfirm’s AML Platform, bringing the overall total supported to ~5500. Most of the tokens added are Ethereum based (ERC-20 standard) as well as several dozen from the Tron chain (TRC-20). Some of the most widely traded tokens that have been added include; Shiba Inu (SHIB), The Graph (GRT), Liquid...

KuppingerCole

Nov 23, 2021: Engineering Successful IAM Projects to Support Digital Business

In the digital era, traditional approaches to Identity and Access Management (IAM) are ineffective. IAM is no longer just about managing employee identities, but about managing all kinds of identities across a wide variety of use cases. This requires a comprehensive and agile approach.
In the digital era, traditional approaches to Identity and Access Management (IAM) are ineffective. IAM is no longer just about managing employee identities, but about managing all kinds of identities across a wide variety of use cases. This requires a comprehensive and agile approach.

CSLS Speaker Spotlight: Oliver Carr on Maximizing the Value of Security

by Fabian Süß Oliver Carr, cybersecurity evangelist and strategist will discuss the Maximizing the Value of Security on Wednesday, November 10 from 12:00 pm to 12:20 pm at Cybersecurity Leadership Summit 2021. To give you a sneak preview of what to expect, we asked Oliver some questions about his presentation. Could you give us a sneak peek into your keynote: “From Burden to Benefit - How

by Fabian Süß

Oliver Carr, cybersecurity evangelist and strategist, will discuss Maximizing the Value of Security on Wednesday, November 10 from 12:00 pm to 12:20 pm at the Cybersecurity Leadership Summit 2021.

To give you a sneak preview of what to expect, we asked Oliver some questions about his presentation.

Could you give us a sneak peek into your keynote: “From Burden to Benefit - How Aligning on Business Purpose and Objectives Is Critical to Maximize the Value of Security”?

I've spent the last years working very closely in the digital transformation of many large organizations, and a topic that keeps on coming up is that the security function is seen as a necessary evil at best, and a total hindrance at worst. The question is, what does security actually provide to the business? And does the business understand the value that security tools, security processes and security methodologies can bring? What we will look at in this session is how to align what security can bring to the table, but also to understand what the understanding of the other players in the game is, so that the security as a function and the security as a unit within the organizations can move forward and show how they can provide a benefit to the business and therefore move from just being a necessary cost factor to actually providing business benefit.  

How can companies manage the competing demands of minimizing the risk of security breaches and maximizing the opportunities of agile, digital transformations?

Both security and digital transformation want the same thing at the end of the day. They want to make sure that the business is successful and that it can develop better than the competition. The ways of doing this are different, though, and especially the approach towards risk is different between security, who tend to try to minimize risk and digital transformation, who willingly take risks in order to move ahead of the competition. Bringing those two views together and understanding the necessity of being risk aware. And to decide in a specific case whether it is better to be on the side of caution or whether it is the time to be bold, to move forward is something that only works if there is an open dialog of understanding between those functions. And that is what companies need to achieve going forward if they want to be successful.

How can aligning on business purposes and objectives help to maximize the value of security?

Security is very often seen as that cost factor, as that necessary evil. If we want to show the benefit that security can bring, it needs to be expressed in terms of what the company actually produces, what the organization produces in the way of products or services. Only by aligning on that can the effort and money we need to put into security, to stay within the risk appetite the company is willing to take, become acceptable not just to the security organization but to the company overall, thereby accepting security as part of the solution and not as part of the problem.

What do you see as the biggest challenge for Security to becoming recognized as a value-adding function?

Security, in many cases, has been seen as this absolute - what one of my colleagues calls the “Ministry of No”. You're either secure or you are insecure. It is always this question of you're either with us or you are against us. The way we need to move forward is to accept that between those two extremes, there is a middle ground. It's not an either - or. It's actually an also - and. And if we can move in that area together with the people within our organizations who are driving that digital transformation, who are driving product innovation, identifying where that common ground is, and stride down that road together, then we can be successful. And that, to me, is the biggest challenge that we are facing today towards making security a business benefit rather than just a business burden.


IDnow

At IDnow, the working future is human

At IDnow our employees are family, and to us, this means ensuring they should feel safe, productive, inspired, and energised in their work life. This of course, has become a lot more important during the global corona pandemic since processes became digitalised and engaging in the digital world has become the new normal. At IDnow, […]

At IDnow our employees are family, and to us, this means ensuring they should feel safe, productive, inspired, and energised in their work life. This of course, has become a lot more important during the global corona pandemic since processes became digitalised and engaging in the digital world has become the new normal. At IDnow, we understand the need for a seamless online experience, and so it has become our number one priority to create the best working models for all employees.

Since IDnow introduced its initial remote working policy in early 2020, the benefits have by far outweighed any criticisms: “Some of the IDnow employees were sceptical in the beginning”, said Jacqueline Arlt, Recruitment Business Partner at IDnow. “But after a few months the first colleagues started asking if they could keep working remotely, even after the pandemic. They told us about the benefits they experienced and the great work-life balance,” she adds. As working from home became the new normality and employees began seeing the advantages, IDnow introduced a long-term remote policy and offered employees in administrative roles the possibility to choose remote or partially remote working models.

Employees can now decide how and where they work, as we believe effective work is not measured by the hours they spend at a desk. “We strongly believe that the future of working will be more and more diverse and international, and as a global and digital company, offering a flexible way of working without location boundaries will be the future,” says Jacqueline Arlt.

IDnow offers three ways of working for its employees, and everyone whose role allows it can choose which one suits them best. Employees can decide if and how often they come to the office. Besides 100% office-based, fully and partially home-office options, IDnow also offers the possibility to go on a “workcation”. Imagine you travel to see your family in a different country and would like to stay a little longer than your vacation days allow. This is one of the many possibilities with a “workcation”. This way, IDnow offers its employees the best solution to make them feel comfortable, be creative and motivated, and have the best possible work-life integration.

You may ask, why is this so important to IDnow? Because we believe that the future is human, and that even the most digital product and company is based on the motivation and happiness of each employee. People are happier and deliver better results in a comfortable and familiar environment. “It’s one of the most important things for us, to make sure every employee at IDnow has the opportunity to maintain a happy and healthy lifestyle, a good work-life-balance and the opportunity to find inspiration in the things they love.” says Jacqueline Arlt.

Hear what some of our employees have to say about their workcation experience:

“With the workcation option, I was able to spend 2 months with my family in Italy after the Covid-restrictions got eased.” says Francesca Palmeri, Head of Controlling at IDnow.

“I live in Estonia with my family and work fully remote at IDnow – that offers a great work life balance for me – especially since I’m a dad for a couple of months now. I have a great connection with my team, and we work together very productively.” Pedro Gallardo, Director Product Management.

Monday, 11. October 2021

Identosphere Identity Highlights

Identosphere #53 • IIW33 Tomorrow • SSI Ecosystem of Ecosystems • Self Sovereign Objects

Just over a year since we began this weekly review. Incredible! Thanks for sticking with us!
This weekly newsletter is possible thanks to Patrons, like yourself.

Just over a year since we began this weekly review. Incredible! Thanks for joining and sticking with us!

Consider paying us a small amount each month via Patreon

Support our work on Patreon — Get Exclusive Content!!

Read previous issues and Subscribe : newsletter.identosphere.net

Contact \ Content Submissions: newsletter [at] identosphere [dot] net

Upcoming Events

Internet Identity Workshop 33 • Starts Tomorrow! - Tues-Thurs

Ceramic's Sovereign Data Hackathon • 10/7-21
Ceramic is live on mainnet 🚀👩🏽‍🚀🍾🥂 [...] a chance at over $10,000 in bounties. 

‘Identity Matters’: Meeco at IDNext ’21 • 10/11-12 • Katryna Dow will speak on day two, giving a session on the challenges ahead in the identity space.

Self-Sovereign Identity & eIDAS: a contradiction? Challenges and chances of eIDAS 2.0 • University of Murcia/Alastria • 10/19

The Business Models Made Possible By Economic Incentives 10/19

Authenticate Virtual Summit Recap and looking forward Authenticate 2021 • 10/18-20 • Fido Alliance

High Level
New: Microsoft's 5 guiding principles for decentralized identities
Business
What Your Customers Really Want From Your Login Box Auth0

convenience and control: they want to choose which authentication method to use – whether it’s MFA or SSO or biometrics. They want a brand experience that resembles a concierge desk: a 24/7 service where no demand is too big. To top it off, they don’t want to see any technical glitches

Where the Intention Economy Beats the Attention Economy

There’s an economic theory here: Free customers are more valuable than captive ones—to themselves, to the companies they deal with, and to the marketplace. If that’s true, the intention economy will prove it.

The Future Now Problem Continuum Loop

When the idea of what is available right NOW is blurred among the ideas of the FUTURE, you can’t differentiate between what is feasible TODAY from what is not ready for prime time.

Development
Dangling Domain From SDK Installed in 150+ Apple Apps Putting Kids, Families and Crypto Traders at Risk

TLDR: The Me2B Alliance believes apps including the AskingPoint SDK should be safe from malicious redirects or other exploits.

The next architecture for building Web3 data apps Ceramic

We're replacing the popular IDX runtime with a more powerful set of tools for building applications on Ceramic including DID DataStore, DataModels, and Self.ID.

Organization News
A YEAR IN REVIEW: NEW BEGINNINGS AND SUCCESSES TOIP

The TSWG provides guidance and specifications that support the ToIP 4-layer model from a technical standpoint.

Launching the Global Assured Identity Network (GAIN) with Elizabeth Garber UbiSecure

fills us in on what the GAIN project is, explaining how it’s different from other trust networks and why GAIN is good for financial institutions. She also discusses the role of the Global Legal Entity Identifier Foundation (GLEIF) in the project, and what’s next for GAIN.

Kay Chopard Talks Digital Security, Diversity, and Business Advice Kantara - DigitalID NZ

Chopard’s vision for Kantara and the digital security world, her role in supporting diversity and inclusion, and other topics.

eSSIF-Lab ecosystem: the 2nd Business-oriented Programme participants

Blockchain Certified Data • Academic VCs

Upstream Dream • The LHS project

Mopso • Amlet

Credenco • Digital Certificate of Good Conduct

Stichting Cherrytwist • Decentralized Open Innovation Platform

Truu Ltd • Healthcare Professionals Digital Staff Passport

Fair BnB Network • Società Cooperativa Stay Fair, Play Fair – a co-operative habitat for music

ZENLIFE SARL-S • Zenlife eConsent

LearningProof • HonorBox-SSI

WorkPi • Work Performance Intelligence

yes.com • European Bank Identity Credentials

MYDATA TAIWAN – FROM THEORY TO PRACTICE WITH AWARD-WINNING PERSONAL HEALTH APP

MyLog/LogBoard pulls together health information on temperature, sleep, heart rate and more into a single place that can be shared with doctors and medical staff. […] Data is held on mobile devices and not in the cloud and can be shared with a one-off URL that wipes all data after 72 hours.

What bridging the $81bn trade finance gap could mean for Africa with Barry Cooper from Cenfri GLEIF

we’re catching up with our key partners to hear their thoughts on how the project will bring about greater financial inclusion for SMEs on the continent and beyond.

Company Updates
Node Operator Spotlight: IdRamp Indicio

Recently we caught up with […] IdRamp, one of the first companies to become an Indicio Node Operator, to discuss their current projects, some goals for the future, and where they think decentralized identity is heading.

Evernym: September 2021 Release Notes

A fee will be charged on a regular basis to remain an endorser on either MainNet or StagingNet.

In addition to the endorser fee, write fees will now be charged for transactions on Sovrin StagingNet in a similar fashion to the existing fees on Sovrin MainNet.

The current Sovrin Self Serve website will stop being used to become an endorser on StagingNet, and instead endorsers will be charged a fee after registering.

Talking tech and discussing data on the ‘Tech-Entrepreneur-on-a-Mission’ Podcast Digi.Me

Julian describes digi.me’s mission of empowering people with their personal data, as individuals know where all their data is, while they also have “a right for that data”. By having that knowledge and ownership, only individuals have “unlimited usage rights” to unlock the potential data has to be a force for good. 

Corporations, Capital Markets, & the Common Good — How We're Working to Reorient the Rules and Rebalance Power in Our Economy Omidyar

as part of our commitment to Reimagining Capitalism, Omidyar Network is committing $10 million to a new focus area: Corporations, Capital Markets, and the Common Good. The vision for this work is to reshape the rules that govern markets to incentivize corporations and their investors to contribute to the common good 

Imageware to add Biometrics to Blockchain Powered Self Sovereign Identity (SSI)

By joining the Decentralized Identity Foundation and Trust Over IP groups, we’ll be able to leverage their network and resources in our efforts to further develop a portfolio of SSI integrated biometric solutions.

DM Note #6 — Building the Spatial Justice Mission DarkMatter Labs

Self-sovereign objects are self-executing and self-owning; capable of determining their own lifecycle to maximise material utility and performance whilst minimizing negative environmental impacts.

Proving our Point
Facebook & Instagram outages expose the pain points of Centralized identity systems
Covid \ Healthpass
Ugh! There's an App for That! <- Phil Windley on Vaccine certificates.

Interoperability is a fundamental property of tech systems that are generative and respect individual privacy and autonomy. And, as a bonus, it makes people's lives easier!

Building an SSI Ecosystem: Health Passes and the Design of an Ecosystem of Ecosystems Windley 

This post explores the ecosystem of ecosystems that is emerging as hundreds of organizations around the world rise to the challenge of implementing a globally interoperable system that also respects individual choice and privacy.

Wallets
SSI Wallet List Gimly

Trinsic, Esatus, Lissi ID, Jolocom, SelfKey, Connect.Me, iGrant, Gataca Identity, Talao, AceID, Mattr, DataKeeper, MS Authenticator, Bloom, DID Wallet

Winners Circle
NGIatlantic.eu third Open Call: applications and winning proposals!
Digi.me is a Health Tech Challengers finalist!

Digi.me has been specifically designed to solve the current complexities and challenges around data mobility, which include difficulty of sourcing, variable quality, multiple incompatible formats and the need to apply complex and extensive data analytics to gain insights.

Indicio-SITA Pilot Named 2021 Enterprise Blockchain Award Finalist

Indicio […] and SITA, the leading provider of IT to the air transport industry, today announced they were finalists in the Blockchain Services Award: Tools & Middleware category in the Blockchain Research Institute’s Enterprise Blockchain Awards (EBAs). The partnership was recognized for their work on the Aruba Secure Health Card

Indicio CTO Named 2021 Enterprise Blockchain Award Finalist

Ken Ebert nominated in Blockchain Leadership Award category for vision and leadership in developing interoperable blockchain-based Trusted Data Ecosystems

The Blockchain Leadership awards honor people who have shown exceptional leadership in a blockchain collaboration or implementation within an enterprise, an industry, a government, or a multi-stakeholder organization.

Public Sector
Early Adopters Programme | Imagining what EBSI can do for European citizens

Each project's private and public sector partners were given early access to the pre-production environment of EBSI, and were invited to develop their own pilot project to address a specific business or government use case involving the exchange of verifiable credentials.

Three Governments enabling digital identity interoperability  Heather Vescent

On September 15, 2021, I moderated a panel with representatives from the United States Government, the Canadian Government, and the European Commission. Below is an edited excerpt from the panel

IDunion: Germany’s Bold SSI Strategy with Hakan Yildiz

What use cases should a National Digital Identity program prioritize in collaboration with the private sector? As use cases become verticals of their own, what are then some of the horizontal considerations that need to be applied to enable all of the use cases to function within their relative ecosystems?

A key place for Identity in the Digital Strategy for Aotearoa

Colin Wallis will now head Digital Identity.nz

Our government is embarking on a journey to create A Digital Strategy for Aotearoa that seeks to respond to the social, economic, education and cultural opportunities from digital technology, along with the risks that these technologies can bring.

Research
Online livelihoods and young women's economic empowerment in Nigeria

1) In what ways might platform work empower women?

2) How can we make platforms work better for women?

Literature
Digital Identities and Verifiable Credentials (PDF)

we discuss the challenges of today’s centralized identity management and investigate current developments regarding verifiable credentials and digital wallets. Finally, we offer suggestions about promising areas of research into decentralized digital identities.

Thanks for Reading!

Read more \ Subscribe: newsletter.identosphere.net

Support this publication: patreon.com/identosphere

Contact \ Submission: newsletter [at] identosphere [dot] net


Tokeny Solutions

T-REX Protocol Recognized as ERC3643

The post T-REX Protocol Recognized as ERC3643 appeared first on Tokeny Solutions.

Product Focus

T-REX Protocol Recognized as ERC3643

Last month, the Ethereum community officially recognized the T-REX suite of smart contracts as ERC3643 – the new official standard for permissioned tokens.

This content is taken from the monthly Product Focus newsletter in October 2021.

Last month, the Ethereum community officially recognised the T-REX suite of smart contracts as ERC3643 – the new official standard for permissioned tokens. Here’s a quick rundown of its main functions.

ERC3643 functionality:

Controllable – as the assets are guaranteed by digital identity, the issuer can recover tokens as long as the investor can prove they are who they say they are.

Versatile – ERC3643 tokens can represent a multitude of assets including, but not limited to, securities, cryptocurrencies, stablecoins, fiat currencies, commodities and NFTs.

Interoperable – the standard is compatible with the well-known ERC20 standard and with any Ethereum Virtual Machine (EVM) compatible blockchain, sidechain or Layer 2 solution, such as Polygon.

Secure – the standard has also been audited and confirmed as secure by cybersecurity firm Kaspersky.

 

In short, ERC3643 provides actors with everything they need to issue assets compliantly utilising decentralized technologies. It enables this through the use of permissioned tokens and digital identities, meaning tokens can only be sent or received based on the verified credentials of investors.
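To make that identity-gated transfer rule concrete, here is a minimal, illustrative sketch in Python. It is not the ERC3643 Solidity interface; the class names and the "KYC" claim topic are assumptions chosen only to model the behaviour described above, where transfers succeed only between verified identities.

# Illustrative sketch only: a toy permissioned token in Python, NOT the ERC3643
# Solidity interface. It models the core rule: transfers succeed only when both
# sender and receiver have a verified identity carrying the required claims.

class IdentityRegistry:
    def __init__(self):
        self._verified = {}          # address -> set of claim topics

    def register(self, address, claims):
        self._verified[address] = set(claims)

    def is_verified(self, address, required_claims):
        return required_claims <= self._verified.get(address, set())


class PermissionedToken:
    REQUIRED_CLAIMS = {"KYC"}        # hypothetical claim topic required to hold the token

    def __init__(self, registry):
        self.registry = registry
        self.balances = {}

    def mint(self, to, amount):
        if not self.registry.is_verified(to, self.REQUIRED_CLAIMS):
            raise PermissionError("receiver identity not verified")
        self.balances[to] = self.balances.get(to, 0) + amount

    def transfer(self, sender, receiver, amount):
        # Both parties must pass the identity check before balances move.
        for party in (sender, receiver):
            if not self.registry.is_verified(party, self.REQUIRED_CLAIMS):
                raise PermissionError(f"{party} identity not verified")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount


registry = IdentityRegistry()
registry.register("alice", {"KYC"})
registry.register("bob", {"KYC"})
token = PermissionedToken(registry)
token.mint("alice", 100)
token.transfer("alice", "bob", 40)   # succeeds: both identities are verified

In the actual standard the identity check happens on-chain; this sketch only mirrors the rule in ordinary Python for readability.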

Tokenize securities with us

Our T-REX Platform is built on top of ERC3643, providing you with institution-grade, ready-to-use software to seamlessly digitize assets on decentralized infrastructure in a matter of weeks.

Contact us Subscribe Newsletter

This monthly Product Focus newsletter is designed to give you insider knowledge about the development of our products. Fill out the form below to subscribe to the newsletter.

Other Product Focus Blogs
T-REX Protocol Recognized as ERC3643 (11 October 2021)
Gas Tank on Polygon (10 September 2021)
DvD Transfers (29 June 2021)
Blockchain Layer Enhancement (8 June 2021)
T-REX Factory Enhancements (29 March 2021)
Security Tokens Conditional Transfer (1 March 2021)
Messaging (25 January 2021)
ONCHAINID Notifications (30 November 2020)
Tokens Recovery (2 November 2020)

The post T-REX Protocol Recognized as ERC3643 appeared first on Tokeny Solutions.


MyDEX

The Right Sort of Competition

Last week we completed our submission to the UK Government’s consultation on its proposed new Digital Markets Unit. It was an interesting exercise. What is competition policy for? Competition policy is the area where the belief system called free market economics gets its purest expression. Under this belief system, there is an invisible, mystical, magical entity called ‘the invisible hand’

Last week we completed our submission to the UK Government’s consultation on its proposed new Digital Markets Unit. It was an interesting exercise.

What is competition policy for?

Competition policy is the area where the belief system called free market economics gets its purest expression. Under this belief system, there is an invisible, mystical, magical entity called ‘the invisible hand’ of market forces (e.g. innovation and competition) which, if left perfectly free from any ‘interference’ from ordinary mortals (such as policy makers and regulators), always, without fail, automatically produces the best of all possible worlds.

In such a world, the main job of a competition regulator is to intervene if and when there is a ‘market failure’: where for one reason or another, the invisible hand is being stopped from working its magic wonders. Because markets are seen as places where money is exchanged for goods and services, the main test for market failure is the existence of monopolies where, by restricting innovation and competition, monopolists get away with charging excessive prices.

But because companies like Facebook and Google offer their services for ‘free’, making their money via the different currency of personal data, they haven’t been classed as monopolies. So for decades, they have been left free to do pretty much whatever they liked.

Wrestling with reality

To its credit, a few years back, the Competition and Markets Authority looked at reality, rather than operating on the assumption that fairy stories are facts. Its final report, published in early 2021, opened the door to a new approach to regulating ‘digital markets’.

The report recognised that rather than relying solely on ‘ex post’ interventions (e.g. after a market failure has been proved to exist) regulators should be able to intervene ‘ex ante’ (e.g. before problems occur, with policy goals in mind); and that rather than tackling each isolated abuse separately, one by one, they should be able to “address a wide range of concerns holistically”.

Specifically, they identified five key issues which need to be addressed:

Network effects and economies of scale — where a platform’s power increases with the number of people using the platform, making it effectively invincible.
Unequal access to user data — “digital platforms collect vast quantities of unique user data, which gives them a significant competitive advantage when providing data-driven services”.
Consumer decision making and the power of defaults — where platforms’ control over default settings effectively gives them control over what choices consumers are presented with; which may be used to “influence the platform’s ability to collect users’ data”.
Lack of transparency around complex decision-making algorithms.
The importance of ecosystems — where large collections of integrated, complementary products and services are designed in a way that favours the firm’s own services.

What we said

The Government’s consultation was on the scope and powers of the resulting new ‘Digital Markets Unit’. Most of our response was devoted to questioning the Government’s apparent assumption that by definition, all ‘innovation’ and all ‘competition’ is a good thing. It’s just not so. For example, fraudsters constantly innovate in very clever ways to con people out of their money more effectively. Improved competition in the market for the modern slave trade is a bad thing, not a good thing.

Our particular focus was innovation and competition around the issue of ‘unequal access to user data’. Here, we pointed out, there are two possible types of innovation and competition. The first is innovation and competition around who is most efficient and effective at hoovering up individuals’ personal data and monetising it to line the pockets of their shareholders. The second is innovation and competition around who is most efficient and effective in enabling individuals to access and use their own data for their purposes.

In other words, the issue is not ‘innovation’ and ‘competition’ in the abstract, but what sort of innovation and competition the Digital Markets Unit chooses to ‘holistically’ address. One of the core suggestions we made therefore is that the DMU should be required to develop a set of criteria — open to public debate and scrutiny — that defines whether any particular innovation is desirable and therefore to be encouraged, or not.

Unfortunately, the way the UK Government’s questions were framed in this consultation gives the impression that the only market it really has in mind is that of more efficient, effective data monetisation and exploitation. Operating on this assumption would take us back to where we were before the CMA wrote its report.

We did our best to explain why this would be a terrible idea; why empowering individuals with their own data is the only sensible, fair and economically efficient way forward. This issue lies at the heart of everything the Government does in this space, including the shape and content of its new National Data Strategy. It has yet to be resolved. And it’s not going to go away.

Joining the dots

This is all the more important given the proliferation of other consultations and initiatives currently under way, all of which raise the same core question. These include:

Data: A New Deal consultation to reform data protection regulations
National Data Strategy
Single Login for Government
National Identity and Attributes Trust Framework
Open Finance
Pensions Dashboard programme
Modernisation of Lasting Power of Attorney
NHS consultation on data sharing
Scottish Attribute Provider Service

Is there anyone in Government joining the dots between these different initiatives? If not, is there any chance of the UK ever developing a coherent, positive strategy for personal data?

The Right Sort of Competition was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Crypto Regulatory Affairs: The US Justice Department Sets up Inter-Agency National Enforcement Unit

🇺🇸  Meet the Crypto-Enforcers: The US Justice Department Sets up Inter-Agency National Enforcement Unit Washington DC has been seized by a crypto-flurry of regulatory and law enforcement responses to illicit and criminal activities, and the US Justice Department (DoJ) is now growing significant teeth to effectively enforce the US regulatory and compliance framework.
🇺🇸  Meet the Crypto-Enforcers: The US Justice Department Sets up Inter-Agency National Enforcement Unit

Washington DC has been seized by a crypto-flurry of regulatory and law enforcement responses to illicit and criminal activities, and the US Justice Department (DoJ) is now growing significant teeth to effectively enforce the US regulatory and compliance framework.  


Elliptic Raises $60m Series C Funding Round

Making Crypto Safer For Financial Institutions As announced in the Wall Street Journal today, we have closed a $60m Series C funding round. The round is a milestone and a recognition of our critical role in the crypto ecosystem.
Making Crypto Safer For Financial Institutions

As announced in the Wall Street Journal today, we have closed a $60m Series C funding round. The round is a milestone and a recognition of our critical role in the crypto ecosystem.


Affinidi

What Links Identity and VCs Together Across Applications?

Protecting user data from identity fraud has become challenging for businesses in this dynamic environment shaped by privacy and security concerns. While centralized and federated identity management systems may seem like a good idea in terms of user access and convenience, it is also fraught with issues such as security threats, siloed data, poor user control, and more. A possible solution

Protecting user data from identity fraud has become challenging for businesses in this dynamic environment shaped by privacy and security concerns.

While centralized and federated identity management systems may seem like a good idea in terms of user access and convenience, it is also fraught with issues such as security threats, siloed data, poor user control, and more.

A possible solution to these problems is Self-Sovereign Identity (SSI), a revolutionary identity framework that empowers users to determine where their data is stored, how it is used, and with whom it is shared. A popular way to implement SSI is through Verifiable Credentials (VCs) and Decentralized Identifiers (DIDs), two W3C standards that create a common layer for user authentication and verification.

A DID is a unique identifier built on distributed ledger technology (DLT) and linked to a DID document that, in turn, contains information about an entity. A VC, on the other hand, is a cryptographically verifiable and machine-readable credential of an entity, such as a date of birth, government ID, vaccination record, or employment details.
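For concreteness, a minimal credential shaped along the lines of the W3C Verifiable Credentials data model looks roughly like the following. The DIDs, credential type and claim values here are illustrative placeholders, not output from any particular wallet or SDK.

{
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  "type": ["VerifiableCredential", "DriverLicenseCredential"],
  "issuer": "did:example:issuer123",
  "issuanceDate": "2021-10-01T00:00:00Z",
  "credentialSubject": {
    "id": "did:example:holder456",
    "dateOfBirth": "1990-01-01"
  },
  "proof": {
    "type": "Ed25519Signature2020",
    "created": "2021-10-01T00:00:00Z",
    "verificationMethod": "did:example:issuer123#keys-1",
    "proofPurpose": "assertionMethod",
    "proofValue": "z...placeholder-signature"
  }
}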

Now comes the big question — how are VCs tied to an entity’s identity in any application?

VC Issuance and Verification

To understand the relationship between VCs and DIDs, let’s talk briefly about the issuance and verification process.

There are three parties involved in VC issuance and they are:

Issuer — the entity issuing the VC. For example, the Motor Vehicle department issuing a driver's license.
Holder — the owner of the VC. For example, the individual who owns the license.
Verifier — the entity verifying the authenticity of the VC. For example, a police authority who wants to issue a speeding ticket.

Together, these three entities form what's called a trust triangle, and this is a key requirement for implementing SSI. The trust among these entities is established with public-key cryptography, where each party holds a pair of private and public keys that always work together.

The process begins when an issuer creates a VC about the holder and digitally signs it with its private key. The holder, on receiving this VC, uses the issuer's public key to verify the signature and stores the credential in his or her digital wallet.

When the holder wants to share this VC with a verifier, he or she creates a verifiable presentation that contains one or more VCs such as date of birth, driver’s license, address, etc, and signs it with his or her private key.

The verifier checks the presentation's signature with the holder's public key and then confirms that the enclosed credentials came from the right source by verifying the issuer's signature with the issuer's public key.
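The sketch below illustrates that sign-and-verify flow with Ed25519 keys. The library choice (the third-party Python "cryptography" package) and all identifiers are our own illustrative assumptions, not part of Affinidi's stack; real implementations use linked-data proofs or JWTs rather than raw JSON blobs.

# Minimal sketch of the sign/verify flow described above.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

issuer_key = Ed25519PrivateKey.generate()
holder_key = Ed25519PrivateKey.generate()

# Issuer creates and signs a credential about the holder.
credential = json.dumps({"subject": "did:example:holder", "over21": True}).encode()
credential_sig = issuer_key.sign(credential)

# Holder bundles the credential into a presentation and signs it too.
presentation = json.dumps({"credential": credential.decode()}).encode()
presentation_sig = holder_key.sign(presentation)

# Verifier checks both signatures against the respective public keys
# (in practice these keys are looked up via the DIDs in the credential).
try:
    holder_key.public_key().verify(presentation_sig, presentation)
    issuer_key.public_key().verify(credential_sig, credential)
    print("presentation and credential signatures check out")
except InvalidSignature:
    print("verification failed")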

Role of Decentralized Identifiers

A DID is a unique identifier that is linked to a DID document through a DID method.

Essentially, the DID method resolves the DID and helps to locate the associated DID document. You can think of it as similar to how a URL locates the web server containing the information.

This DID method is implementation-specific and varies according to the network or distributed ledger where it is implemented. The DID document, on the other hand, contains information about the entities involved in JSON format.
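A toy illustration of that resolution step is shown below. Real DID methods resolve against a ledger or network rather than an in-memory dictionary, so everything here is a simplifying assumption; the sample document reuses the key from the DID document shown further down.

# Toy DID resolver: parses a DID string and looks up its DID document in an
# in-memory "registry". Real DID methods resolve against a ledger or network.
DID_REGISTRY = {
    "did:example:123456789abcdefghi": {
        "id": "did:example:123456789abcdefghi",
        "authentication": [{
            "id": "did:example:123456789abcdefghi#keys-1",
            "type": "Ed25519VerificationKey2020",
            "publicKeyMultibase": "zH3C2AVvLMv6gmMNam3uVAjZpfkcJCwDwnZn6z3wXmqPV",
        }],
    }
}

def parse_did(did: str):
    scheme, method, method_specific_id = did.split(":", 2)
    assert scheme == "did", "not a DID"
    return method, method_specific_id

def resolve(did: str) -> dict:
    method, _ = parse_did(did)   # a real resolver would dispatch on the method
    return DID_REGISTRY[did]

doc = resolve("did:example:123456789abcdefghi")
print(doc["authentication"][0]["publicKeyMultibase"])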

Link Between VCs and DIDs

To answer the question of how VCs are linked to identity across applications, we must understand how VCs are linked to DIDs.

Earlier, we said that the DID document contains information about the entities, and this includes information about their public keys and the cryptographic algorithm used. This information is then used for verifying the identity of the entities.

Here’s a sample DID document.

{
  "@context": [
    "https://www.w3.org/ns/did/v1",
    "https://w3id.org/security/suites/ed25519-2020/v1"
  ],
  "id": "did:example:123456789abcdefghi",
  "authentication": [{
    // used to authenticate as did:…fghi
    "id": "did:example:123456789abcdefghi#keys-1",
    "type": "Ed25519VerificationKey2020",
    "controller": "did:example:123456789abcdefghi",
    "publicKeyMultibase": "zH3C2AVvLMv6gmMNam3uVAjZpfkcJCwDwnZn6z3wXmqPV"
  }]
}

In other words, DIDs contain information that can verify the authenticity of a VC.

Workflow of DIDs and VCs

To further elaborate, let’s look at how VCs and DIDs come together from an implementation standpoint.

Step 1 — Creating a DID

The first step is to create a DID using a specific DID method. The issuer can decide to create a new DID for each VC or use the same DID for all VCs issued to a holder.

Step 2 — Adding Information to the DID Document

The cryptographic keys of the DID are updated in the DID document.

Next, the issuer has to collect and/or verify the identity of the holder as required. The exact process of this step depends on the type of credential. For example, a VC proving that the holder is over 21 requires less stringent verification than issuing a driver's license.

The issuer takes this responsibility of verification and accordingly issues a VC to the holder.

Step 3 — Signing the VC

The VC issued by the issuer contains the DIDs of both the issuer and the holder and is digitally signed with the key referenced by the issuer's DID. After the signing process, the issuer sends the VC to the holder.

Step 4 — Revoking a VC

At a later point, the issuer can choose to revoke the VCs issued, especially if they are no longer related to the holder. While many implementations don’t support VC Revocation, Affinidi’s tech stack has simple methods to revoke VCs.

Step 5 — Verifying the VC

The holder creates a verifiable presentation consisting of one or more VCs and shares it with a verifier.

The verifier has to resolve the DID documents from the DIDs of the issuer and the holder present in the VC. Using the DID method, the verifier can check whether the VCs sent by the holder are authentic.

The verifier also uses the DID methods to check for revocation of the VCs.

If found authentic, the verifier can offer its services to the holder.
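Putting the five steps together, the following compressed sketch walks through create-DID, issue-and-sign, optional revocation, and verification. It reuses Ed25519 keys, an in-memory DID registry and a revocation list; every name is a hypothetical stand-in for illustration, not Affinidi's API or a production SSI implementation.

# End-to-end toy walk-through of Steps 1-5: create DIDs, sign a VC, (optionally) revoke, verify.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

did_documents, revoked = {}, set()

def create_did(name, key):
    did = f"did:example:{name}"
    # Steps 1-2: create the DID and publish its public key in a DID document.
    did_documents[did] = {"id": did, "publicKey": key.public_key()}
    return did

issuer_key, holder_key = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()
issuer_did, holder_did = create_did("issuer", issuer_key), create_did("holder", holder_key)

# Step 3: issuer signs a VC containing both DIDs and sends it to the holder.
vc_payload = json.dumps({"issuer": issuer_did, "holder": holder_did, "over21": True}).encode()
vc = {"id": "vc-1", "payload": vc_payload, "signature": issuer_key.sign(vc_payload)}

# Step 4 (optional): the issuer may later revoke the credential.
# revoked.add("vc-1")

# Step 5: verifier resolves the issuer's DID document, checks the signature and revocation status.
def verify(vc):
    if vc["id"] in revoked:
        return False
    issuer = json.loads(vc["payload"])["issuer"]
    try:
        did_documents[issuer]["publicKey"].verify(vc["signature"], vc["payload"])
        return True
    except InvalidSignature:
        return False

print("credential accepted" if verify(vc) else "credential rejected")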

From the above workflow, it's clear that VCs are bound to DIDs, which in turn reveal the identity of an entity. Since these are interoperable and not tied to a centralized authority, VCs have many use cases across different spheres.

Thus, this is how VCs and identity are linked together across applications.

For more such content, read through our blog posts, join our mailing list, and follow us on LinkedIn, Facebook, and Twitter.

The information materials contained in this article are for general information and educational purposes only. It is not intended to constitute legal or other professional advice.

What Links Identity and VCs Together Across Applications? was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ocean Protocol

On Voting In OceanDAO

Towards compound growth in Ocean ecosystem Punchline OceanDAO is a grants DAO where anyone can propose a project, and each OCEAN holder can vote “yes” or “no” for a project. It’s great for the Ocean ecosystem if OCEAN holders thoughtfully vote “yes” and “no” on OceanDAO proposals, to curate towards projects with the best chance of growing the Ocean ecosystem (“thoughtful” voting). Voting
Towards compound growth in Ocean ecosystem Punchline

OceanDAO is a grants DAO where anyone can propose a project, and each OCEAN holder can vote “yes” or “no” for a project.

It’s great for the Ocean ecosystem if OCEAN holders thoughtfully vote “yes” and “no” on OceanDAO proposals, to curate towards projects with the best chance of growing the Ocean ecosystem (“thoughtful” voting).

Voting “no” across all projects with the hope that the burn will drive OCEAN price is simply irrational. It’s too small of a burn to make a dent for the near term, and it hurts the ability to grow value in the longer term.

If you care about the value of your OCEAN, then please vote, thoughtfully.

1. Introduction

This article provides background to better understand the “punchline” presented above. Ocean is designed to mirror Amazon’s successful dynamics. The keys are decentralized funding, curation, and a focus on compound growth (vs dividends).

The rest of this article is organized as follows.

Sections 2–4 describe Amazon. Sections 2 & 3 describe the Amazon flywheel. Section 4 describes how Amazon focuses revenue on growth rather than shareholder dividends; from this, it has been able to grow like crazy and create longstanding shareholder value.
Sections 5–8 draw a parallel to Ocean. Sections 5 & 6 describe how Ocean's token design is modeled after the Amazon flywheel. Section 7 describes how Ocean's design focuses on growth to create longstanding value. Section 8 describes how OceanDAO voting plays a big role.

2. The Amazon Flywheel

Over 25 years, Amazon has grown from a struggling online bookstore to a globally pervasive company with a trillion dollar market cap.

Its flywheel was key to success: if Amazon selection grows, then customer experience can’t help but improve; then sales can’t help but grow; then the number of sellers can’t help but grow; then selection can’t help but grow. And the loop continues.

The Amazon loop. Source: Amazon.com

3. Amazon Flywheel Components

Two key components in this flywheel are (1) its decentralized workers and (2) its resource allocation algorithm.

Amazon has (1) a sea of two-pizza teams, each running fairly independently, each with its own profit & loss. It's the “workers” block in the image below. Each team is (2) funded by Amazon HQ according to a resource-allocation algorithm — the “curation” block in the image below. The more ROI a team delivers, the more funding it gets. When getting going, teams get a bit of time — typically months — to start delivering ROI. If they can't deliver, the project gets shut down and the team disbanded.

Amazon's flywheel, including curation of $ (top left), role of workers (middle), and role of dividends (bottom right)

4. Amazon Growth vs Dividends

The revenue for these teams comes as a cut of sales.

Amazon takes a cut from sales and aggressively plows it into growth (the “curate” box in the image). It explicitly does not put much into shareholder dividends (“lower-bound shareholder rewards” in the image). This is because money into growth gives benefits that compound over time, whereas dividends are a one-time payoff.

For example, if Amazon spends R&D money to reduce its logistics costs by 2%, then it can keep reaping those savings for years to come. In contrast, if Amazon did a dividend or a stock buyback, it would be a one-time benefit to shareholders and then the money would be gone.

The best way to grow shareholder value is to grow revenue. It would only be rational for Amazon to give dividends if it couldn’t efficiently allocate its capital to growth (ROI > 1). When you see Amazon start to dole out lots of dividends, it means growth has slowed.

Interestingly, many companies religiously aim to give dividends every year. IBM is famous for it. Unsurprisingly, their growth is tepid, and has been for decades.

5. The Ocean Sustainability Loop

Amazon shows how well flywheels can work, including with decentralized workers and an algorithm to allocate resources. This gives Amazon an interesting path to decentralization. More usefully for us: we can use this flywheel as a model for decentralized business models. We call it the Web3 Sustainability Loop. We use it as the beating heart of Ocean's token design.

If Ocean data selection grows, then data buyer experience can’t help but improve; then data sales can’t help but grow; then the number of data sellers can’t help but grow; then data selection can’t help but grow. And the loop continues.

6. Ocean Sustainability Loop Components

Two key components in Ocean’s loop are (1) its decentralized workers and (2) its resource allocation algorithm.

Ocean has (1) a sea of OceanDAO-funded teams, each running fairly independently, each with its own profit & loss. It's the “workers” block in the image below. Each team is (2) funded by OceanDAO according to its resource-allocation algorithm: voting. This is the “curation” block in the image below. The more ROI a team delivers, the more funding it can expect over time. When getting going, teams get a bit of time to start delivering ROI. If they can't deliver, the team should not expect to receive further grants.

The Ocean Sustainability Loop is an instance of the Web3 Sustainability Loop pattern

Ocean is an ecosystem. There are many more teams than just the OPF core team working to build and grow the Ocean ecosystem. Here are examples of work funded by OceanDAO.

Ocean documentation. These have matured greatly in the last half year, thanks to Akshay. Work includes: search, ocean.py API, Provider API, Aquarius API, and (currently) documentation for V4.
Deep Gaia-X engagement. The OPF core team has been engaged with Gaia-X for a long time. However, this engagement has become broader and stronger thanks to work by grant recipients (deltaDAO).
Vote with datatoken pools. This was accomplished by a grant (Tim Daubenshuetz).
Moderation of TG, Discord, and other channels. This is done by the Ocean Ambassadors grant recipients.
Learn about Ocean. The Ocean Academy team created tons of awesome material to onboard people about Ocean. It's become so good that doing Ocean Academy is a pre-requisite for people to become Ocean Ambassadors.
Learn about OceanDAO grants. The Ocean Pearl team has created a gorgeous frontend (oceanpearl.io) to browse, search and filter OceanDAO grant recipients. The scope is increasing to support other OceanDAO flows such as voting.

7. Ocean Growth vs Burning

The revenue for OceanDAO teams comes from OPF treasury, 51% rewards, and (with traction) Network Revenue.

Ocean is designed to aggressively plow Network Revenue into growth (the “curate” box in the image). It explicitly puts just 5% — a low amount — into burning (bottom left of the image). This is because money into growth gives benefits that compound over time, whereas burning is a one-time payoff.

For example, if OceanDAO grants R&D money to a new app that adds 100,000 participants to Ocean ecosystem, those people will stick around for years to come, assuming they’re getting value from the app. Network Revenue would flow to the Ocean community accordingly.

In contrast, in a burn situation, it is a one-time benefit to OCEAN holders and then the money is gone.

Burning actually happens in two places: (a) 5% of Network Rewards, as described, and (b) any OceanDAO funds not allocated in a given funding cycle, i.e. not enough projects with a sufficient “yes” vote to use up the funds available.

The best way to grow OCEAN value is to grow Network Revenue. It would only be rational for OceanDAO to burn its funds if it couldn't efficiently allocate its capital to growth (ROI > 1).

We’ve actually built TokenSPICE simulations that validate this (at least with the model assumptions). In scenario A, we ran at 95% of Network Revenue to OceanDAO, and 5% to burning. In scenario B, the opposite. The results: in 10 years, predicted market cap was about 10x higher for scenario A. Focus on growth creates the most value for OCEAN holders, because it compounds. Burning is a one-time payoff.
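As a back-of-the-envelope illustration of why reinvestment beats burning over time, here is a toy calculation with invented numbers; it is not the TokenSPICE model, just simple compounding arithmetic.

# Toy comparison of reinvesting network revenue into growth vs paying it out (burning).
# All numbers are invented for illustration only; this is not the TokenSPICE model.
revenue, growth_rate_per_reinvested_unit, years = 100.0, 0.20, 10

def project(reinvest_share):
    r, one_time_payouts = revenue, 0.0
    for _ in range(years):
        reinvested = r * reinvest_share
        one_time_payouts += r - reinvested                   # the part burned / paid out
        r += reinvested * growth_rate_per_reinvested_unit    # reinvested revenue compounds
    return r, one_time_payouts

for share in (0.95, 0.05):
    final_revenue, paid_out = project(share)
    print(f"reinvest {share:.0%}: final revenue {final_revenue:.0f}, cumulative payouts {paid_out:.0f}")

With 95% reinvested, revenue compounds year after year; with 5% reinvested, payouts are larger up front but revenue barely grows. That is the qualitative pattern the simulations point to: growth compounds, burning does not.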

8. OceanDAO Voting & Ocean Growth

Let’s drill in on voting in OceanDAO.

What’s rational: voting “yes” for projects that you think have a good chance to help grow Ocean (expect ROI > 1 over time).

What's rational: voting “no” for projects that you think don't have a good chance (ROI < 1 and will never get better).

What's less rational: voting “no” on all projects because you think burning will help drive the price of OCEAN. This is less rational because the amount burned is so low that it won't dent the price of OCEAN. Even more importantly, it means the ecosystem isn't getting funded — precisely where the growth needs to be!

Put another way, ask yourself: can I expect more returns to OCEAN from this project in the next 6–18 months, versus burning? If yes, then vote yes. If no, then no. Burning is the lower-bound measuring stick for assessing value add.

9. Conclusion

This article reviewed the Amazon flywheel and how Ocean’s token dynamics mirror them. It highlighted that value creation comes from funding for growth, which gives compound rewards over time. Contrast this to burning which is only a one-time payoff (and is money taken from growth).

On Voting In OceanDAO was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ontology

Third Time’s a Charm: Establishing Secure Data and Identities in Web3

By Gloria Wu , Chief of Ecosystem Partnerships at Ontology The internet is currently in a liminal phase. It is sitting on the precipice of changes that will shake up human life in ways that we could have only imagined ten years ago. Web3 is the internet’s third iteration. It is seeing the digital sphere become more open-source. A transparent community of developers is building this new
By Gloria Wu , Chief of Ecosystem Partnerships at Ontology

The internet is currently in a liminal phase. It is sitting on the precipice of changes that will shake up human life in ways that we could have only imagined ten years ago. Web3 is the internet’s third iteration. It is seeing the digital sphere become more open-source.

A transparent community of developers is building this new web. It means it is trustless — participants can interact without intermediaries harvesting their data. Finally, it means it is permissionless — that anyone can participate without the need for a governing body.

Under the correct guiding principles, Web 3.0 can offer an internet that is more decentralized, open, and secure. To make this a reality, individuals as opposed to corporations and privacy as opposed to profits must have sovereignty in Web 3.0.

Looking back to move forward

Web 2.0 was defined by innovations such as smartphone technology, social media, and cloud computing. In tandem with these developments, data processing and collection quickly grew legs. It became a source for capital gains and marketing success with little regulation in place on data harvesting and its uses.

Policymakers have begun to police this through laws such as GDPR and MiFID II in Europe, in addition to updated antitrust laws aimed at increasing consumer protection in the United States.

Ultimately, the underlying problem is that data is a valuable commodity, and data protection is not something that chimes with the interests of big corporations.

The privacy failure

As well as facilitating an internet economy that thrives through data harvesting, Web 2.0 has failed to protect consumers’ privacy. This is because most data is stored on centralized platforms such as computer files and databases.

This makes data extremely vulnerable to losses, alterations, and hacks. Last year saw more than 37 billion records exposed due to just under 4000 data breaches in the U.S.

It's not just Big Tech giants, like WhatsApp and Amazon, that are in hot water when it comes to security. National health services such as the Health Service Executive (HSE) in Ireland, as well as schools and restaurants around the world, have all been implicated in significant data breaches this year. This shows how centralized systems can no longer provide users with the data protection they need.

Centralized bodies are often required to exchange data with different companies to function. For instance, when a bank must seek information from a loan provider on a customer’s repayment history.

Moving this data creates a new copy. With that copy, an opportunity arises for loss or incorrect data to be replicated or altered.

Web3 is perfectly placed to remedy the existing failures of these centralized systems. In addition, it can place privacy at the forefront of all technological developments.

Using a decentralized data storage system can ensure that each entity involved in data exchange has access to a real-time, shared view of the data when permission is granted by the person in control, thereby reducing the risk of malpractice.

Web3 and decentralized identities

Another essential part of the decentralization of Web3 will be the creation of secure digital identities.

The current model of online identity verification on the web is old and fraught with vulnerabilities. To do almost anything interactive, we have to supply valuable personal information. This comes with little oversight into how it is shared and protected.

Decentralized identity solutions already exist in the form of applications built on blockchain. These allow users to maintain autonomous control over their private data. They can then decide if and when to share it by granting access through private keys.

In the next phase of internet development and communication, decentralized metaverses, spaces where virtual worlds (such as games and forums) and the real world collide, are also becoming a reality. Different games will be linked up to each other, as well as to other virtual arenas like online shops or virtual events platforms.

To ensure the seamless integration of these worlds, users have to be able to move between different parts of the metaverse with a consistent identity. All while ensuring their data and privacy is respected.

Decentralized identity solutions can allow users to create one decentralized username and password, which then lets them access multiple different platforms securely, making them a perfect solution in an integrated, decentralized metaverse world.

Priorities in Web3

Every blockchain or layer two network must wrestle with tradeoffs such as speed or block size. A given developer's goals influence these decisions.

As such, priorities for the ecosystem will have to be set when developing a blockchain for Web3. Open, trustless, decentralized networks that value privacy and security should be prioritized. The principle of interoperability is part of this too. Creating one blockchain to facilitate all users’ needs goes against the decentralized principles that underlie blockchain technology.

There will be many blockchains and layer two protocols coexisting and providing different services. The task is to ensure that they can communicate seamlessly with one another. Thus, creating a safe and secure Web3 that places users’ needs ahead of those of corporations.

Want more Ontology?

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Third Time’s a Charm: Establishing Secure Data and Identities in Web3 was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 10. October 2021

KuppingerCole

Analyst Chat #98: GAIN and Reusable Identities

Annie Bailey and Matthias take a deeper look at the emerging concept of the Global Assured Identities Network (GAIN) and also seek a broader perspective on the benefits and challenges of reusable identities in general.

Annie Bailey and Matthias take a deeper look at the emerging concept of the Global Assured Identities Network (GAIN) and also seek a broader perspective on the benefits and challenges of reusable identities in general.



Friday, 08. October 2021

Indicio

Indicio Expands Decentralized Identity Possibilities with Transaction Endorser Wizard

The post Indicio Expands Decentralized Identity Possibilities with Transaction Endorser Wizard appeared first on Indicio Tech.
Build your business and customer base fast by simplifying ledger writes with Indicio’s easy-to-use toolkit. Scale products and services in minutes

SEATTLE — October 08, 2021 — Indicio, the world's leading provider of decentralized identity development and hosting solutions, today announced the public availability of the Indicio Transaction Endorser Wizard, the newest offering in the company's range of products for building enterprise-grade Trusted Data Ecosystems. The Indicio Transaction Endorser Wizard takes the guesswork out of putting together the right components for a ledger write, which means you can get your customers to go live with their products and services in minutes, not hours or days.

Writing to any Hyperledger Indy-based network can be time consuming and complicated.  As more companies turn to trusted data ecosystems and verifiable credential technologies to share information in a secure, privacy-preserving way, IT organizations have to take on the challenge of writing to the ledger for their own products and services—as well as endorsing writes for their customers.

These processes, which involve creating decentralized identifiers (DIDs) and schema combinations for Hyperledger Indy networks, require a high level of technical skill and coordination. The Indicio Transaction Endorser Wizard solves this problem by making writing to the ledger simpler, quicker, and easier to scale. This helps you get to market quicker, expand your market, and grow your business as an endorser for your customers.

“Writing transactions and orchestrating multiple writes from multiple parties is a new core business activity,” said Heather Dahl, CEO, Indicio. “For decentralized identity to scale, for companies to get to market, we need to make these processes as frictionless as possible. For products and services involving multiple customers that all want to issue verifiable credentials, this can mean hundreds of ledger writes to a network. The Transaction Endorser Wizard makes this possible in minutes instead of hours for even the most knowledgeable teams. This is an essential tool for scaling decentralized identity solutions for everyone.”

Indicio Transaction Endorser Wizard subscribers get:

Onboarding and training: Onboard your team with the dedicated guidance of an Indicio network architect. Keep your team on top of technical developments with the latest training and workshops available for the Transaction Endorser Wizard.
Timely, dependable updates: Indicio is constantly working to make decentralized identity technology simpler and quicker to use. Receive the latest updates and ongoing engineering assistance, directly from our team of experts. Minimize downtimes, increase dependability, and keep up with the latest developments. Indicio doesn't just back our products, we have your back.
Continued support: As a Transaction Endorser Wizard subscriber, you'll have access to a monitored help line, with a trained Indicio engineer available to answer any questions that you may have. Have constant contact with the Indicio staff directly for continued support.
Discounted Customization: Request feature enhancements as part of your Transaction Wizard subscription, such as integrating the Wizard into your client portal. Currently available to use in a simple docker configuration, the script can be worked into a branded

The Wizard is available on a subscription basis and can be used by any Transaction Endorser on any Hyperledger Indy-based network. Companies using it to endorse transactions to the Indicio Network may also qualify for subscription discounts.

“As the TE Wizard's first subscriber, Liquid Avatar is using it to make ledger writing a scalable and user-friendly process,” said RJ Reiser, Chief Business Development Officer of Liquid Avatar Technologies, a global blockchain, digital identity and fintech solutions company and a member of the Indicio Network Consortium. “It will help us deploy the next wave of decentralized identity applications, and by making it accessible through a simple Docker Hub, we can extend our reach to meet an even broader set of developers and application leaders.”

###

The post Indicio Expands Decentralized Identity Possibilities with Transaction Endorser Wizard appeared first on Indicio Tech.


KuppingerCole

Complex Modern Business Needs Trusted IT Partners to Be Secure

by Paul Fisher In today’s business environment, companies have three major challenges – making a profit, finding great people, and staying ahead of the competition. That’s quite enough, but they also have major operational challenges with IT, cyber security, and compliance. For example, IBM Security Services reports that it’s not unusual for clients to have more than 15 different cloud providers

by Paul Fisher

In today’s business environment, companies have three major challenges – making a profit, finding great people, and staying ahead of the competition. That’s quite enough, but they also have major operational challenges with IT, cyber security, and compliance. For example, IBM Security Services reports that it’s not unusual for clients to have more than 15 different cloud providers – and probably more they don’t know about. How do business and IT leaders deal with that when they don’t even know how many clouds there are, or what data resides on those multiple clouds? They can appoint new people into new privacy and security positions to cope but is that enough? Given the cyber threats the business now faces, it most likely is not.

Hail to the chief security officer

Most recently, the number of companies falling victim to ransomware attacks has seen a steep increase. The fear that ransomware and other forms of cyber-attack will have a serious impact on the US economy led to President Biden’s Executive Order on radically improving cyber security within Federal Agencies. This seriously upped the game for those vendors that wish to supply into the Federal Sector, and the rigorous standards they will now have to meet.

The Executive Order is far reaching and will have an impact on the private sector where the cyber threat is no less acute. The order states: “The trust we place in our digital infrastructure should be proportional to how trustworthy and transparent that infrastructure is, and to the consequences we will incur if that trust is misplaced.” This almost perfectly describes the situation that many private companies now find themselves in when looking for partners to improve their own security posture against the worsening threat landscape – trust is all important as is the ability to provide modern scalable security platforms that are compatible with legacy platforms.

What IT buyers are looking for now, and why

Buyers are looking for an integrated approach to data security, privacy and governance – and looking for trusted partners to provide such solutions. The major challenge is that IT infrastructures for many organizations are of necessity more complex as they seek to make progress on digital transformation with clouds, infrastructure as code, third party and customer identity management, etc. – and this has made a consistent approach to security and governance harder. Too often, incompatible legacy solutions are preventing a fully joined-up, holistic approach to security and governance and creating pain points across the enterprise. With modality now the order of the day for larger enterprises, clients want to see vendors and service providers who understand the customer's pain points and how they are trying to address them, reaching further into relationships that scale and mature.

Finding the data and then protecting it is not easy

Understanding data is central to innovation, efficiency and productivity, and therefore its protection is paramount. To make use of data, teams must know where it is. Data on customer behaviour is hugely valuable for defining new services and products, but it's also sensitive and needs protecting. Thales, a global digital identity and security company, advises that encryption is a good solution, but it too must be managed and best in breed, so that it fully protects data without standing in the way of access for those who need it. Encrypted data is useless to hackers, but it is also no good to users if they cannot decrypt what they need with zero latency.

There is of course no silver bullet when it comes to protecting data and applications but for any security infrastructure to succeed following an internationally recognised framework such as NIST is advisable. It helps with purchase decisions, deployment and operation to follow accepted and recognised patterns and will also help with meeting compliance demands. The best IT services providers and integrators will be able to ensure that any security project or data protection platform follows recognised and suitable frameworks that also fit the culture, operations and market obligations of the client.

Clients do not want to rip out legacy technologies to conform to an individual platform – so solutions must have as many integrations as possible – a “perfectly baked cake” that covers the full spectrum of integrations and plays nicely with other applications in the stack. Of course, writing this is simple; doing it is much harder, and few organizations these days can achieve security solutions suitably hardened and stress tested against compliance standards and known vulnerabilities without expert help that understands business.

Poor security costs more than dollars

The cost of inadequate or badly configured IT security is not just measured in lost production or ransomware payments but also in loss of reputation and brand damage after a breach. And, since the emergence of consumer privacy regulations such as the EU GDPR, organizations have had to ensure that they protect PII or be subject to fines. This comes at the same time as enterprises are introducing more consumer-focused digital services, putting further emphasis on protecting customer data. If that data is not secured, any new digital relationship with customers (CIAM) will not work – the trust will not be there. In less regulated markets where privacy regulations are less severe, this may matter less in dollar terms, but a reputation for good security and looking after customer data will do brands no harm at all in any market.

Increasingly, businesses should see spend on cybersecurity not as a cost but as an investment – in the company, its people and its future. Further, in the joined-up, interconnected supply chains we are accustomed to, companies increasingly wish to do business with other companies that proactively think about security – and can be seen to be taking tangible steps to be as secure as possible.

It is about reputation with customers and partners in supply chain. It is about competitive positioning. For example, legacy financial institutions need to compete with more agile cloud native FinTechs who may have a more baked in security layer. To compete they will demand a security and privacy fabric entwined with their clouds and, importantly, legacy infrastructures – and they will look to trusted IT partners to provide this.


Coinfirm

Jobs: Customer Engagement Manager

Coinfirm is a global leader in AML & RegTech for blockchain & cryptocurrencies. Coinfirm is full of professionals with experience in litigation, finance and IT powering the mass adoption of blockchain. Offering the industry’s largest blockchain coverage – over 1,500 cryptocurrencies and protocols supported – Coinfirm’s solutions are used by market leaders including industry heavyweights...
Coinfirm is a global leader in AML & RegTech for blockchain & cryptocurrencies. Coinfirm is full of professionals with experience in litigation, finance and IT powering the mass adoption of blockchain. Offering the industry’s largest blockchain coverage – over 1,500 cryptocurrencies and protocols supported – Coinfirm’s solutions are used by market leaders including industry heavyweights...

Infocert (IT)

AdER's Online Desk: with InfoCert, the Public Administration Heads Toward the Future

With the Agenzia delle Entrate-Riscossione online desk, InfoCert confirms its place at the center of the digitalization of the Public Administration. The virtual desk, accessible with SPID, has been available for a few days and allows citizens to interact with Agenzia delle Entrate-Riscossione without having to physically visit the agency's local offices […] The

With the Agenzia delle Entrate-Riscossione online desk, InfoCert confirms its place at the center of the digitalization process of the Italian Public Administration.

The virtual desk, which can be accessed with SPID, has been available for a few days and allows citizens to interact with Agenzia delle Entrate-Riscossione through a video-call session, without having to physically visit the agency’s local offices.

Besides making the relationship with the Public Administration smarter and more flexible, this will bring enormous savings in terms of time, queues, pollution and stress, both for citizens and for the agency’s staff.

AGENZIA ENTRATE-RISCOSSIONE ONLINE DESK – TG1, 7 October 2021

InfoCert supported this project with its TOP platform, the Trusted Onboarding Platform already used by many banks and financial-sector operators across Europe for the online onboarding of their customers.

Thanks to these new services, Italy will have innovative tools that, at the European level, are currently available only in Scandinavia. Want to know more? Contact us.

The post AdER’s Online Desk: with InfoCert, the Public Administration Travels Toward the Future appeared first on InfoCert.


Coinfirm

Jobs: Crypto AML Expert and Training Lead

Coinfirm is a global leader in AML & RegTech for blockchain & cryptocurrencies. Coinfirm is full of professionals with experience in litigation, finance and IT powering the mass adoption of blockchain. Offering the industry’s largest blockchain coverage – over 1,500 cryptocurrencies and protocols supported – Coinfirm’s solutions are used by market leaders including industry heavyweights...

Ocean Protocol

Ocean’s on Energy Web Chain

Ocean Protocol smart contracts and Ocean Market are now running on Energy Web Foundation’s dedicated network

As part of Ocean’s multi-chain strategy, we have deployed Ocean’s technology to Energy Web Chain (EWC), the dedicated network for Energy Web (EW). This collaboration marks a major milestone in the energy vertical of the Ocean data ecosystem.

Energy Web is the leading decentralized energy ecosystem, trusted by influential energy companies including GE, Tepco and national operators the world over. The Energy Web consortium has more than 100 organizations. EW technology enables any energy asset owned by any customer to participate in any energy market.

The Energy Web Chain (EWC) “is a public, enterprise-grade blockchain platform designed for the energy sector’s regulatory, operational, and market needs. Launched in mid-2019, it has become the industry’s leading choice as the foundational digital infrastructure on which to build and run blockchain-based decentralized applications (dApps)” [link].

The teams of Ocean and Energy Web have been quietly collaborating since 2017, when both projects were at inception stages. Ocean has contributed to EW’s annual Event Horizon conference, for example with this 2019 talk “Energy Data and Access Management” [slides][video]. Moreover, both Energy Web and Ocean have long-standing collaborations in the Kusama / Polkadot ecosystems. Ocean and Energy Web have grown up together.

EWC is EVM-compatible, which allows it to run Ethereum smart contracts and use Ethereum-style ERC20 tokens. It’s currently a Proof-of-Authority chain with a modern consensus algorithm built on Parity Substrate technology. This allows it to have low gas fees and low latency compared to Ethereum mainnet.
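
For a quick sanity check of that EVM compatibility from Python, a minimal sketch with a recent web3.py release might look like the following; the public RPC endpoint and the expected chain ID (246) are assumptions based on Energy Web’s published network details, so verify them against the EWC documentation before relying on them.

from web3 import Web3

# Connect to the (assumed) public Energy Web Chain RPC endpoint.
w3 = Web3(Web3.HTTPProvider("https://rpc.energyweb.org"))

print(w3.eth.chain_id)      # expected: 246 for Energy Web Chain (assumption)
print(w3.eth.block_number)  # latest block number, confirming the node responds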

The first dataset published on EWC.

With Ocean deployed to EWC, this means:

Users of Ocean Market can publish, swap, stake, and consume data assets in EWC.
Data marketplace builders and other dApp developers can use Ocean libraries (ocean.js, ocean.py) and frontend components (market) with EWC.
Token holders can move their OCEAN on Ethereum mainnet to OCEAN on EWC, and back, using the Energy Web Token Bridge or the CarbonSwap OmniBridge.

Ocean components supported in EWC include: Ocean smart contracts, Ocean Market, Provider (for data services), Aquarius (metadata cache), and Ocean Subgraph (The Graph queries). For more details, visit our Supported Networks page.

Cross-chain Mental Model

Ocean is now deployed to five production networks: Ethereum mainnet, Polygon mainnet (Matic), Binance Smart Chain (BSC), Moonriver, and (new) Energy Web Chain.

An Ocean Market user can click on the network preference in the top right, and toggle which chains they would like to see the assets for. EWC is now in this list and selected by default for all new users.

Default selected networks now include Energy Web Chain.

In each of these networks, the tokens on the network carry real value, so there need to be token bridges across networks. For EWC this is:

OCEAN on Ethereum mainnet gets bridged across as OCEAN on EWC.
ETH for gas fees on Ethereum mainnet corresponds to EWT for gas fees on EWC.
EWT is the native token of EWC, used for gas fees.

For dApp developers, as with other networks, in ocean.js the Energy Web Chain chainId is passed to the ConfigHelper to get all required endpoints for initializing Ocean, while in ocean.py the respective RPC URL needs to be set as the network config parameter.
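
As a rough illustration of that ocean.py path (not taken from the official docs), the sketch below initializes Ocean against EWC by passing a config whose network entry is the chain’s RPC URL, as described above; the RPC endpoint, the remaining config keys and the Aquarius/Provider URLs are placeholders and may differ between ocean.py releases, so check the library’s README for the version you install.

from ocean_lib.ocean.ocean import Ocean

# Placeholder config: the "network" entry is the EWC RPC URL (assumed endpoint).
config = {
    "network": "https://rpc.energyweb.org",      # assumed EWC RPC endpoint
    "metadataCacheUri": "<your-aquarius-url>",   # placeholder
    "providerUri": "<your-provider-url>",        # placeholder
}

ocean = Ocean(config)  # subsequent ocean.py calls go through this handle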

Resources on Ocean + EWC

Docs: Ocean-related tokens and services in EWC
OCEAN contract in EWC blockchain explorer
Energy Web Token Bridge
CarbonSwap OmniBridge

Announcement Statements
With the decentralization of energy assets, energy market participants need your data to operate the markets. With Ocean marketplace, you can monetize your data whilst not giving up ownership. Welcome, Ocean, to Energy Web!

–Walter Kok, Managing Director of The Lab (EW’s innovation hub), former CEO of Energy Web

I’m very happy to see this integration of Ocean and Energy Web Foundation. It’s something that both Ocean and EW have been dreaming of for years. The world of energy has tons of data; let’s unlock that data for the benefit of all.

–Trent McConaghy, Founder at Ocean Protocol

About Ocean Protocol

Ocean Protocol’s mission is to kickstart a Web3 Data Economy that reaches the world, giving power back to data owners and enabling people to capture value from data to better our world.

Visit oceanprotocol.com to find out more.

Twitter | LinkedIn | Blockfolio | Blog | YouTube | Reddit | Telegram | Discord

Ocean’s on Energy Web Chain was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Infocert (IT)

InfoCert named among the Leaders in the IDC MarketScape: Worldwide eSignature Software 2021

September 2021: InfoCert has been recognized as a worldwide Leader in the eSignature market in the IDC MarketScape: Worldwide eSignature Software 2021 Vendor Assessment (doc #US46742320, September 2021).

In the study, which considered 15 of the most important providers of electronic signature solutions worldwide, InfoCert is positioned in the “Leaders category” on the strength of its capabilities and the strategies connected to its offering.

The IDC study examines various electronic signature software vendors around the world. Electronic signature solutions are an important element in improving both the efficiency and the experience of content-centric workflows between businesses, and between businesses and consumers.

At a historic moment in which a growing number of documents are “born digital” and organizations continue to digitize content-centric workflows, electronic signatures have allowed these workflows to remain digital and deliver benefits, including reduced transaction times and costs, greater security and a better user experience for employees, suppliers, partners and customers.

The contribution of electronic signature technology to business continuity became glaringly evident during the COVID-19 pandemic. Gaps were exposed in organizations lacking fully digitized business processes, and remote workers had to look for alternative ways to keep work moving forward.

Holly Muscolino – Research vice president, Content Strategies and the Future of Work
InfoCert and its eSignature solution: GoSign

GoSign, InfoCert’s signature solution, is now a benchmark in the market thanks to its versatility. The ability to sign on any device, use different types of signatures and customize the solution with vertical processes allows companies to speed up approval and signature processes with customers and suppliers.

Being positioned by IDC among the Leaders in the IDC MarketScape for the digital signature solutions market confirms the extraordinary work InfoCert has done in recent years in the field of Digital Trust.

Our mission is to support our customers with end-to-end solutions that enable a secure digital transformation, with a level of Trust and Governance that scales according to the needs of the business. Thanks also to the recent entry into the Tinexta group of three new companies focused on CyberSecurity, we will further expand the range of our offering.

Our goal is to always keep pace with the technological and regulatory developments that continuously change our market. An aptitude for improvement is certainly a distinctive trait that has allowed us to establish ourselves as a reference point for the many companies and professionals who have relied on our services for years.

Danilo Cattaneo – InfoCert CEO

In 2020, no fewer than 650 million signatures were applied through GoSign, which has 3.8 million users worldwide, more than 1.5 million of whom use the platform every day to sign and manage approval workflows digitally.

GoSign has already been successfully adopted by major companies in various market sectors – including Automotive, Utilities, Healthcare, Insurance and Financial Services – helping organizations digitize sales and rental services and the signing and management of internal approvals or contracts with customers and suppliers, considerably reducing the average time of each signature and approval.

THE TOTAL ECONOMIC IMPACT OF INFOCERT GOSIGN

An independent study, conducted by Forrester Research, analyzes all the quantitative and qualitative benefits that a major InfoCert customer in the energy sector was able to achieve by adopting GoSign. These include:

177% growth in ROI (Return On Investment);
an 85% reduction in document management time;
a 55% increase in efficiency.

Learn more and download the Forrester study for free

About IDC MarketScape:

The IDC MarketScape vendor assessment model is designed to provide an overview of the competitive fitness of ICT (information and communications technology) suppliers in a given market. The research methodology utilizes a rigorous scoring methodology based on both qualitative and quantitative criteria that results in a single graphical illustration of each vendor’s position within a given market. IDC MarketScape provides a clear framework in which the product and service offerings, capabilities and strategies, and current and future market success factors of IT and telecommunications vendors can be meaningfully compared. The framework also provides technology buyers with a 360-degree assessment of the strengths and weaknesses of current and prospective vendors.

***

Get ahead of your competitors: explore the latest innovations across various areas and market sectors, and discover the upcoming technology trends and solutions that are changing the approach to business.

Anticipate change with our Analyst Reports

The post InfoCert named among the Leaders in the IDC MarketScape: Worldwide eSignature Software 2021 appeared first on InfoCert.


Infocert

InfoCert named as a Leader in IDC MarketScape: Worldwide eSignature Software 2021

September 2021: InfoCert has been recognized as a Leader in the eSignature market in the IDC MarketScape: Worldwide eSignature Software 2021 Vendor Assessment (doc #US46742320, September 2021).

The report evaluated 15 e-signature solution providers on the basis of their capabilities and the strategies related to their offerings; InfoCert was placed in the “Leaders Category“.

The IDC MarketScape study examines various esignature software vendors worldwide. eSignature solutions are an important element in improving both the efficiency and the experience surrounding business-to-business and business-to-consumer content-centric workflows.

According to the IDC website, as an increasing number of documents are “born digital” and organizations continue to digitally transform content-centric workflows, esignature technology has enabled those workflows to remain digital and drive benefits including reduction in transaction time and cost, increased security, and improved employee, supplier, partner, and customer experiences.

The contribution of esignature technology to business continuity became glaringly evident during the COVID-19 pandemic. Gaps were exposed in organizations without end-to-end digitally transformed business processes, and remote workers had to seek alternative methods to move the process forward.

Holly Muscolino – Research vice president, Content Strategies and the Future of Work
InfoCert and its eSignature Solution: GoSign

GoSign, InfoCert’s eSignature solution, is now a benchmark in the market thanks to its versatility. The ability to sign on any device, use different types of signatures and customize the solution with vertical processes allows companies to speed up approval processes and signature processes with customers and suppliers.

Being positioned as a Leader by the IDC MarketScape in the market for digital signature solutions confirms the extraordinary work InfoCert has done in recent years in the field of Digital Trust.

Our mission is to support our customers with end-to-end solutions that enable them to have a secure digital transformation with a scalable level of Trust and Governance, according to the needs of their business. Thanks also to the recent entry into the Tinexta group of three new companies focused on CyberSecurity, we will further expand the range of our offerings.

We aim to be always in step with the technological and regulatory scenario evolutions that continuously change our market. The attitude to improvement is certainly a distinctive feature that has allowed us to establish ourselves as a reference for many companies and professionals who for years have relied on our services.

Danilo Cattaneo – InfoCert CEO

In 2020, 650 million signatures were performed through GoSign, which has 3.8 million users worldwide, of which more than 1.5 million use the platform every day to digitally sign and manage approval processes.

GoSign has already been successfully adopted by leading companies in various market sectors, including Automotive, Utilities, Healthcare, Insurance and Financial Services. GoSign helps these organizations to digitize sales and rental services and to sign and manage internal approvals or contracts with customers and suppliers every day, considerably reducing the average time of each signature and approval.

THE TOTAL ECONOMIC IMPACT of INFOCERT GOSIGN

An independent study, conducted by Forrester Research, analyzes all of the quantitative and qualitative benefits that a major InfoCert customer in the energy industry was able to achieve by adopting GoSign. These benefits include:

177% increase in ROI (Return On Investment);
85% reduction in document management time;
55% increase in efficiency.

Learn more and download the Forrester study for free

About IDC MarketScape:

The IDC MarketScape vendor assessment model is designed to provide an overview of the competitive fitness of ICT (information and communications technology) suppliers in a given market. The research methodology utilizes a rigorous scoring methodology based on both qualitative and quantitative criteria that results in a single graphical illustration of each vendor’s position within a given market. IDC MarketScape provides a clear framework in which the product and service offerings, capabilities and strategies, and current and future market success factors of IT and telecommunications vendors can be meaningfully compared. The framework also provides technology buyers with a 360-degree assessment of the strengths and weaknesses of current and prospective vendors.

***

Get ahead of the competition:

explore the latest innovations in various areas and market sectors, discover the next technology trends and solutions that are changing the approach to business.

Anticipate change with our Analyst Reports

The post InfoCert named as a Leader in IDC MarketScape: Worldwide eSignature Software 2021 appeared first on InfoCert.


IDnow

How Identity Verification can solve esports’ cheating problem

Cheating in esports is a pressing issue as it poses a threat to all involved stakeholders. From players to developers, tournament hosts, and sponsors, if a tournament is damaged by the presence of cheaters, the fallout is never good. 

Especially with esports being such a fast paced and modern environment, negative news spreads like wildfire and just one incident could have lasting repercussions, sometimes even to a point of no return. 

This is why it’s vital to detect and prevent online gaming fraud and one of the best ways to do this is through the use of AI-powered identity verification systems. But what exactly does this mean and how does it all work? Let’s break it all down step by step. 

What is AI-Powered Identity Verification?

AI-powered identity verification is a system that combines the best of human instinct with modern machine learning technology to verify the identity of users. It allows for a streamlined user experience and, with tools like IDnow’s Autoident, it can be as simple as a user uploading images of their ID documents and following on-screen prompts.

The biggest advantages of using an AI-powered system are the speed and accuracy of the results. It eliminates both time-consuming manual inputs and the risk of human error. Overall, it is a quick and secure method of verifying a user’s identity and can be used as both a cheater-prevention and an age-verification tool.

How could AI-Powered Identity Verification be used for esports?

There are many ways that an online identification system could be used in esports. It could come as a solution to a key problem of ‘ghost accounts’ – these are new accounts created and used by cheaters when they have been caught and banned on other accounts, allowing them to cheat in a game all over again. This is one of the biggest issues, as there’s currently no real way to stop users from making new accounts and it’s especially problematic in free-to-play games, where there’s no cost to making a new account.

By using an AI-powered identity verification tool, accounts could only be created if they are tied to unique users, which would require official documents and proof of identity, making it much more difficult to make and obtain ghost accounts. 

This would also come as a deterrent to cheating and other illicit activities in the first place, as players would be held accountable for their accounts and wouldn’t be able to play the game again if they were banned. As the tools require ID verification, it would also aid in preventing fraud and under-age gaming. 

At the same time, it also provides an extra layer of security, as these verification tools can also be used as a multi-factor authentication system. 

The best news is that these AI-powered identity verification systems already exist and successfully function in other markets. Let’s take a look at one example, IDnow. 

How IDnow’s Autoident can solve esports’ cheating problem

IDnow’s Autoident can be the perfect solution to the outlined problems. It’s an AI-powered identity verification tool that makes it quick and painless for users and is thus a perfect fit for the fast paced esports world. 

Verification can be as simple as downloading an ID verification app, then scanning an ID document and taking a selfie to confirm identity. This is then automatically verified through AI and users can expect results within minutes, all while following simple on screen instructions. Even if there is an issue, IDnow specialists are always available and still able to deliver speedy results. 

The platform also boasts great scalability and potential for global deployment, as it’s able to identify people anywhere, anytime. In fact, right now there is potential to cover 7 billion customers from 195 different countries, and considering what a global phenomenon gaming and esports are, this will be vital for large-scale, triple-A titles.

Best of all, IDnow boasts impressive proof of work, having already deployed its products with over 250 customers, including leading international companies like Lottoland, Bank of Scotland, and Western Union, during its seven years in business.

All this means that IDnow’s Autoident not only fills the gaps in esports’ current anti-cheat systems, but is also backed by a reliable and experienced team that can provably deliver an exceptional product and service.

Interested in learning how to keep your esports gaming experience secure and fun? Check out our guide: How to prevent cheaters with AI-powered identity verification.

By

Jonathan Bluemel
Senior Content & SEO Manager at IDnow
Connect with Jonathan on LinkedIn


Okta

Secure Access to AWS EKS Clusters for Admins

In this tutorial, we will leverage OpenID Connect (OIDC) to allow our DevOps team to securely access their EKS clusters on AWS. We use Role Based Access Control (RBAC) to enforce the least privilege required, without the need to configure AWS IAM roles. 😎

We’ll highlight the steps to manually enable an OIDC provider on your EKS clusters. At the end of this tutorial, we’ll point to resources you can leverage to automate all those steps.

Below is the target architecture you’ll be deploying:

Table of Contents

Who This Quick-Start Guide Is For
What You’ll Need to Get Started
What is Okta?
What is Kubernetes?
What is AWS EKS?
Okta + EKS: How Do They Work Together?
Configuration
Create a New Cluster Service Role
Create a New EKS Cluster
Configure Your Okta Org
Add Okta as an OIDC Provider on Your EKS Cluster
Some Extra Checks
Automation for Your AWS EKS Workflow
Learn More About Identity Security

Who This Quick-Start Guide Is For

This tutorial is intended to show AWS DevOps and Identity Security administrator teams how to securely access Amazon Elastic Kubernetes Service (EKS) clusters. Anyone with an interest in Identity Security best practices can learn from this guide, but it assumes at least some knowledge of:

Kubernetes (k8s), the k8s API Server, k8s RBAC Authorization, and k8s role binding
AWS Console, EKS, AWS CloudShell
Terminal on an end-user workstation (e.g. macOS, Windows, Linux)

What You’ll Need to Get Started

The prerequisites to complete this tutorial are:

The tutorial assumes that you’re already using Okta as your identity and authorization solution. However, if you don’t have an existing Okta tenant, you can create a new one here and follow along.

One or more Okta administrative user(s)
One or more Okta test user(s)
Okta administrator rights
Workstation(s) running a supported version of macOS, Windows, or Linux
Installation permissions
SSH terminal application
HTTPS web browser (recommended)

What is Okta?

Okta, Inc. is an identity and access management company, providing cloud software that helps companies manage and secure user authentication into applications, and for developers to build identity controls into applications, websites, web services, and devices. You get scalable authentication built right into your application without the development overhead, security risks, and maintenance that come from building it yourself.

What is Kubernetes?

Kubernetes, also known by the abbreviation k8s, is an open-source container orchestration platform for automating deployment, scaling, and management of containerized applications. See https://kubernetes.io/.

What is AWS EKS?

Amazon Elastic Kubernetes Service (Amazon EKS) is a managed container service to run and scale Kubernetes applications in the cloud or on-premises. See https://aws.amazon.com/eks/.

To deploy k8s clusters on your own infrastructure, you can use EKS Anywhere. See https://aws.amazon.com/eks/eks-anywhere/

Okta + EKS: How Do They Work Together?

Let’s take an EKS cluster deployed in AWS. We’ll perform the following steps:

add Okta as an OIDC provider to the EKS cluster
configure the k8s API server so it prompts the user for Authentication (AuthN)
configure RBAC Authorization (AuthZ), mapping Okta groups with given k8s roles
leverage an OIDC plugin that 1) prompts the user for AuthN in the web browser and 2) retrieves the JSON Web Token (JWT) id_token from Okta and passes it to our kubectl (Kubernetes command-line tool) commands

Ready? Let’s get started!

Configuration

Let’s first deploy a brand new EKS cluster. We’ll do it manually from the AWS Console.

Note: We recommend configuring access to the AWS Console using Okta SSO+MFA.

Create a New Cluster Service Role

Go to https://console.aws.amazon.com/iamv2/home?#/roles > Create role

Select trusted entity = AWS service, and click on the EKS service

Click on use case = EKS - Cluster, then Next: Permission.

Verify that AmazonEKSClusterPolicy is included in the attached permissions policies. Click on Next: Tags.

Click on Next: Review.

Enter EKSCluster as Role name.

Once your role is created, go back to the list of roles and open EKSCluster to double-check it’s properly configured:

Create a New EKS Cluster

Let’s create a brand new EKS cluster.

Go to EKS

Click on Clusters > Add cluster > Create.

Enter a name: eks-cluster. Select the Cluster Service Role created in the previous section EKSCluster. Then click on Next.

On the next Networking screen, keep the default options and click on Next:

On the Logging screen, keep the default options and click on Next:

On the Review and create screen, click on Create:

Your EKS cluster will take a couple of minutes to start. In the meantime, let’s do the configuration on the Okta side. Then we’ll come back to the AWS Console to configure Okta as the OIDC provider for the EKS cluster.

Configure Your Okta Org

In the Okta admin console, we’ll create a group of users that we’ll assign to an OIDC client, and we’ll configure the AuthZ Server to inject the list of groups into the id_token.

Go to your Okta admin console Let’s create a group. Go to the sidebar menu and select Directory > Groups > Add Group.

Then enter eks-admins in the Name field, and in the Description field enter Admins who can administer the EKS cluster:

Click Save. Then assign yourself to this group. From the Group screen, go to the People tab and click on Assign people:

Search for your user and click on the +:

You should see that your user is now assigned to the eks-admins group:

Now we’ll create a new OIDC client. We’ll leverage the AuthCode + PKCE grant type since the terminal to access EKS clusters will be running on the laptops of DevOps team members, and like any native app, it can’t host any secrets.

While still in your Okta admin console, go to the sidebar menu and select Applications > Applications. On the Applications screen select Create App Integration:

Select Sign-in method OIDC - OpenID Connect, Application type Native Application, then click Next:

Enter the following settings:

App integration name: EKS
Grant type: Authorization Code only
Sign-in redirect URIs: http://localhost:8000 (Eventually, we’ll run the kubectl commands from our laptop.)
Controlled access: Allow everyone in your organization to access

Then Save. In the General tab, be sure to select Use PKCE. Then copy the Client ID, we’ll need it later:

Now let’s create an Authorization Server. Go to the sidebar menu, select Security > API. Then go to Authorization Servers tab and select Add Authorization Server.

Enter the following settings:

Name: EKS
Audience: http://localhost:8000

Click on Save.

On the next screen, copy the Issuer URL from the Settings tab. We’ll need it later:

Now let’s add a custom claim “groups” in the id_token that Okta will generate, to list the groups of the connected user.

Go to the Claims tab and select Add Claim.

Use the following settings to add the groups claim in the id_token:

Name: groups
Include in token type: ID Token - Always
Value type: Groups
Filter: Starts with eks- (This means we’ll only list the connected user’s groups whose names start with “eks-“.)
Include in: Any scope

Now let’s create an access policy on this AuthZ Server to drive when the AuthZ Server should mint the id_token.

Go to the Access Policies tab and select Add Policy

Enter the following Policy settings:

Policy name: EKS
Description: EKS
Assign to: The following clients
Clients: EKS (Look for the OIDC client you created earlier.)

Once you Create Policy, add a rule. Click on Add Rule:

Enter the following Rule settings:

Rule Name: AuthCode + PKCE
Grant type is: Authorization Code
User is: Any user assigned the app
Scopes requested: Any scopes

Then click on Create Rule.

You should see this view:

Now let’s run a test to see what our id_token will look like when the Okta AuthZ Server mints it.

Go to the Token Preview tab and enter the following Request Properties:

OAuth/OIDC client: EKS
Grant type: Authorization Code
User: your user
Under Scopes, enter openid, email, profile, offline_access

Then click on Token Preview. On the right side of the screen, you’ll see a preview of your id_token. So far it has all the claims we’re looking for, including:

“email”: typically matches your Okta username
“groups”: contains an array of groups the user is a member of, including “eks-admins”
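
For illustration only, a decoded id_token payload along those lines could look like the following (shown here as a Python dict; every value is a placeholder rather than output from a real Okta org):

id_token_claims = {
    "sub": "00u<user-id>",                # Okta user ID (placeholder)
    "email": "jane.doe@example.com",      # typically matches the Okta username
    "groups": ["eks-admins"],             # only groups whose names start with "eks-"
    "iss": "https://<your-org>.okta.com/oauth2/<authz-server-id>",
    "aud": "http://localhost:8000",
    "exp": 1635170000,                    # expiry timestamp (placeholder)
}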

The configuration on the Okta side is complete.

Add Okta as an OIDC Provider on Your EKS Cluster

Now let’s get back to the AWS Console:

Open the eks-cluster view. Go to the Configuration tab, then select Authentication, and click on Associate Identity Provider.

Enter the following parameters:

Name: Okta
Issuer URL: This is the URL you copied earlier from your Okta AuthZ Server.
Client ID: This is the value you copied earlier from your Okta OIDC client.
Username claim: email
Groups claim: groups

Then Save.
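
If you’d rather script this association than click through the console, a rough sketch using the boto3 EKS client is shown below; the cluster name, issuer URL and client ID are placeholders, and you should double-check the call and its parameters against the boto3 documentation for the SDK version you use.

import boto3

eks = boto3.client("eks", region_name="us-west-1")

# Associate Okta as an OIDC identity provider on the cluster (all values are placeholders).
response = eks.associate_identity_provider_config(
    clusterName="eks-cluster",
    oidc={
        "identityProviderConfigName": "Okta",
        "issuerUrl": "https://<your-org>.okta.com/oauth2/<authz-server-id>",
        "clientId": "<okta-oidc-client-id>",
        "usernameClaim": "email",
        "groupsClaim": "groups",
    },
)
print(response["update"]["status"])  # the association is processed as an EKS update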

Note: Your EKS cluster configuration may take 5-10 minutes to update after you add the OIDC provider.

Now let’s update the kubeconfig of the EKS cluster so the API server prompts for authentication whenever there’s an inbound kubectl request. We’ll also add an RBAC control stating that a user who is part of the eks-admins Okta group gets the k8s ClusterRole cluster-admin.

To update the EKS kubeconfig we’ll use AWS CloudShell. It’s particularly convenient to make quick updates if you access the AWS Console with Okta SSO and assume a given role, as we did earlier in this tutorial:

Go to AWS CloudShell using the search field in the AWS Console.

You should land on this view:

Let’s install kubectl (source):

Download the latest release:
curl -LO "https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl"

Download the kubectl checksum file:
curl -LO "https://dl.k8s.io/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl.sha256"

Validate the kubectl binary against the checksum file:
echo "$(<kubectl.sha256) kubectl" | sha256sum --check

If valid, the output should be: kubectl: OK

Install kubectl:
sudo install -o root -g root -m 0755 kubectl /usr/local/bin/kubectl

Test to ensure that we installed an up-to-date version:
kubectl version --client

Now, let’s retrieve the list of EKS clusters in the specified region (us-west-1):
aws eks --region us-west-1 list-clusters

Add a new context for the eks-cluster in the kubeconfig file:
aws eks --region us-west-1 update-kubeconfig --name eks-cluster

Let’s see what our kubeconfig file looks like:
kubectl config view

Let’s double-check the current context:
kubectl config current-context

Install nano (text editor):
sudo yum install -y nano

Create a cluster role binding:

Create a new yaml file:
nano oidc-cluster-admin-by-group.yaml

Paste the following content:

kind: ClusterRoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: oidc-cluster-admin
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: cluster-admin
subjects:
- kind: Group
  name: eks-admins

Enter CTRL-O to save the file, then CTRL-X to close it.

Create the cluster role binding from the yaml file:
kubectl create -f oidc-cluster-admin-by-group.yaml

Apply the cluster role binding:
kubectl apply -f oidc-cluster-admin-by-group.yaml

Edit the local kubeconfig file and add the OIDC config:
nano $HOME/.kube/config

Insert the section shown below:

Below is the text to include:

- name: oidc
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      args:
      - oidc-login
      - get-token
      - --oidc-issuer-url=https://nico.okta.com/oauth2/auscierlvzfBoWkKC2p7
      - --oidc-client-id=0oacieu408ExEjXwu2p7
      - --oidc-extra-scope=email
      - --oidc-extra-scope=offline_access
      - --oidc-extra-scope=profile
      - --oidc-extra-scope=openid
      - --oidc-extra-scope=groups
      command: kubectl

This basically specifies the config of the OIDC provider. Note: Replace the oidc-issuer-url and oidc-client-id with the Issuer URL and Client ID we copied earlier.

Once you’re done editing the file:

Enter CTRL-O to save the file, then enter CTRL-X to close the file.

At this point the EKS cluster is properly configured to use Okta as an OIDC provider.

From CloudShell, we can retrieve the list of pods in our cluster with our current assumed role.

kubectl get pods --all-namespaces

Let’s now look into how we can run a similar command from the terminal on our local machine.

There are two things we need to configure:

Export the kubeconfig file and import it to your laptop.

Configure a kubectl OIDC plugin to prompt the user for AuthN and request an id_token. We’ll use kubelogin.

From your CloudShell, enter the command:

echo $HOME

The path to our kubeconfig file is /home/cloudshell-user/.kube/config, as shown above.

Let’s download that file. Click on Actions at the top right of CloudShell, then click on Download file.

Paste the path to your kubeconfig file and click on Download.

You should now have a copy of the config file in your Downloads folder.

Install kubectl on your local machine. (See the instructions to install kubectl on macOS).

Double-check that kubectl is properly installed.

kubectl version --client

On your Mac, replace the existing kubeconfig file with the one you downloaded from CloudShell:

Open your Finder on your Mac, then CMD-SHIFT-G. Enter ~/.kube/config then click Go.

Rename any existing config files as a backup and add the one from CloudShell:

Open a terminal on your laptop and run: kubectl config get-contexts

If you don’t see a “*” in front of the desired context, run: kubectl config use-context arn:aws:eks:us-west-1:013353681016:cluster/eks-cluster

You may want to adjust the context name in the above command based on the context name in your own config file.


Double-check that the current context is properly set. kubectl config get-contexts

Install kubelogin (the OIDC helper for kubectl). Run this for macOS/Linux:
brew install int128/kubelogin/kubelogin

Now let’s test the first part of the AuthN flow. Run the following command. (Be sure to replace the oidc-issuer-url and oidc-client-id with your own values.):
kubectl oidc-login setup --oidc-issuer-url=https://nico.okta.com/oauth2/auscierlvzfBoWkKC2p7 --oidc-client-id=0oacieu408ExEjXwu2p7

You should be prompted to authenticate in your web browser against your Okta org.

After authenticating, you should be redirected to localhost:8000 in your web browser, with an OK response.

You can close this tab.

Check your terminal. You should see a confirmation that you’ve received an id_token from Okta:

We’re ready to make a final test. Run: kubectl --user=oidc get pods --all-namespaces

If you’re not already Okta-authenticated you’ll be prompted for AuthN. You should be able to see your list of pods:

Congratulations! You’ve successfully configured Okta as an OIDC provider to access your EKS cluster! 🎉

Some Extra Checks

Let’s double-check that our RBAC controls are working as expected. Currently we’re a member of the eks-admins Okta group in the Universal Directory.

Let’s remove ourselves from the eks-admins group in the Okta admin console.

Then, let’s delete the cached id_token on our laptop.

In your terminal, run:
cd ~/.kube/cache/oidc-login

List the files in your cache folder:
ls

That first file contains the id_token Okta minted. Let’s delete it:
rm 8ead66f63afa81d7300257989c391d035f386b80758a2847c99d37ecdd5610e0

Double-check that your cache folder is empty:
ls

Ok, now let’s try again to retrieve the list of pods. kubectl --user=oidc get pods --all-namespaces

As expected, we’re not authorized. Since we’re no longer a member of the eks-admins Okta group, the group is no longer injected in the id_token, and the Kubernetes API Server no longer applies the cluster-admin role.

Once again, let’s delete the cached id_token in the ~/.kube/cache/oidc-login folder. Let’s add ourselves again to the eks-admins Okta group.

We should now be able to access the list of pods as before:

Pretty cool right? 😎

Automation for Your AWS EKS Workflow

All the manual steps in this tutorial can be automated:

AWS exposes REST APIs and a Terraform Provider. Okta exposes REST APIs and a Terraform Provider.

Learn More About Identity Security

You successfully configured Okta as a third party OIDC provider on your EKS cluster, and applied RBAC to enforce least privilege without the need to configure AWS IAM roles. This allows you to have a very generic AuthN/AuthZ framework, for all your Kubernetes (k8s) clusters, regardless of where they run (public cloud, private cloud, or on-prem).

To learn more about OAuth 2.0 and OIDC, check out these blog posts

Easy Single Sign-On with Spring Boot and OAuth 2.0
Add Social Login to Your Spring Boot 2.0 App
Build a CRUD App with Vue.js, Spring Boot, and Kotlin
Use PKCE with OAuth 2.0 and Spring Boot for Better Security
Migrate Your Spring Boot App to the Latest and Greatest Spring Security and OAuth 2.0

Follow Okta Developers for more great content and updates from the team! You can find us on Twitter, Facebook, subscribe to our YouTube Channel, or start the conversation below.

Thursday, 07. October 2021

KuppingerCole

Uncovering the Truth About SAP Identity & Access Management

Ensuring everyone has access to the right systems and data is critical for security and compliance, but often the management of identity and access in SAP is siloed. A survey by SailPoint Technologies and Turnkey Consulting uncovers the extent to which this is true and points to potential solutions.





Dark Matter Labs

DM Note #6 — Building the Spatial Justice Mission

DM Note #6 — Building the Spatial Justice Mission

This is the sixth in a series of DM notes, in which we write about insights from our work on the ground, following internal learning sessions called DM Downloads that are organized every two weeks or so. The aim is to make our practice more legible, for us as well as for you.

As some of you might know, Dark Matter Labs emerged from an interest in revealing and transforming the forces that dictate our built environment. The Spatial Justice mission formalises the intention that still drives us: understanding institutional innovation, systemic challenges, and social justice through built form and lived spaces. In partnership with communities and governments around the world, we aim to develop open systems for tech-enabled, democratic and sustainable urban futures.

Why Spatial Justice?

Spatial justice is not a new term. It references an established body of academic work and activism that frames social justice not just as a matter of distributing wealth and power amongst individuals or groups, but as something shaped and driven by the geography and place in which it plays out. How we organise and experience space can lock in injustice and ecological degradation, or it can accelerate equity and agency, support a good economic recovery and underpin the climate transition.

One of the first steps in understanding this area of work is to delineate a series of scales of spatial justice: from the discrete materials that make up our buildings, through to scales that are in themselves complex systems, like cities or bio-regions.

Over the past few years and across many initiatives and geographies, we have been working on building a mission around spatial justice that recognizes that the intimate spaces of our homes and the community spaces of our neighbourhoods have become the minimum viable units for systemic change.

The second step is to focus heavily on the invisible layers of design that shape those physical scales; centring heavily on contemporary bureaucracy, financial incentives, and governance structures (eg. planning policy, urban use licenses, property laws, etc.) — and investigating these as active sites for creating spatial justice.

Focus areas:

Out of this framing we have landed on four focus areas of work, and seek projects, initiatives, and willing partners for advancing this:

Socializing the value of land

Equity in access and affordability
Civic ownership & governance
Linking land value to civic and social infrastructure
Linking land value to civic outcomes (eg. air quality)

Locally-led Just Climate Transition

Building community capacity, collective ownership and resilience
Developing models that support additional long-term investment
Changing the dynamics of finance and local places

Material circularity and self-sovereignty

Beyond human asset ownership and extraction
Indigenous law and legal sovereignty of materials
National material passports and performance registry

(performance based) Contract innovation

Focus on performance rather than objects
Live energy use intensity contracts

Projects

Supporting greater equality and transitioning our urban economies beyond carbon will require unprecedented levels of intervention and investment, and in turn, a new model of democratic and collective ownership and management. The following projects, being developed across North America and Europe, each explore various possibilities for rethinking our relationship to the land and how we use it.

Smart Covenant / Smart Commons

Real estate value impact of New York’s Highline

Public investment in shared infrastructure results in disproportionate private profit through land value uplift. This problem underpins our current model of urban development, fuelling land speculation and creating unequal and unsustainable neighbourhoods. We need to find ways to redesign our land economy.

In collaboration with EIT Climate-KIC and Centre for Spatial Technologies, Dark Matter Labs are developing a platform that brings together smart contracts, property value data and distributed ledgers to more fairly distribute this public value, by creating a mechanism to invest in building more sustainable communities.

Since the publication of Smart Commons on Medium, we built on the feedback we received to better understand the technical and cultural barriers in the realization of that vision. We then further adapted these proposals to other areas we are working on to further develop them, such as micro-contracting for the retrofit of buildings.

Partners: EIT Climate-KIC and Centre for Spatial Technologies

Open Whole Community Retrofit Ecosystem

To meet our current climate targets almost every home and neighbourhood in the UK will need to be retrofitted to reduce energy and resource demand, reduce flood risk, urban heat island effects and manage and reduce associated liabilities. This is also a massive opportunity to create better and healthier homes and neighbourhoods as well as a whole new sector of the economy.

Previous approaches to retrofit have failed dismally. This is often because government has assumed that the best way forward is to create subsidies for individual measures for individual households.

Furthermore the UK remains a deeply unequal and highly centralized society and current proposals around ‘levelling up’ remain poorly articulated, governed and resourced. More than ten years of austerity have significantly damaged local capacity and resilience.

This project seeks to use new investment logic (transformation capital), to create positive change and support the tangible and intangible assets in our communities (community capital) by using new forms of contracting and better understanding of spillover value flows (civic value) and changing the dynamics between financial institutions and local places. Rather than an atomized approach, we are exploring how locally accountable climate transition organisations can support collective change, and how this might be replicated.

Transformation Capital is the deployment of capital to fundamentally change the dynamics of the (real-economy) system, setting it on an environmentally and socially sustainable footing, as well as enabling the continued multiplication of capital in the long run. This goes far beyond the small shifts currently assumed under the ESG investment paradigm.

When this is aligned with places and communities in a way that, rather than extracting their value, generates compound, multiplying value (for both private and public actors) and invests in networks of trust, deliberation and agency at the local level alongside the climate transition projects themselves, this marrying of transformation capital and community capital can establish a new model for systemic, non-extractive investment logic.

Partners: Working with Thirty Percy, Lankelly Chase, EIT-Climate KIC, TransCap Initiative, and Civic Square, the project is currently engaging residents on a street in Birmingham, to help the residents envision potential street-scale propositions, such as street retrofit, and rewiring and integrating their value flows.

Healthy, Clean Cities EU CINCO

Healthy, Clean Cities EU CINCO is a two-year initiative, supported by Laudes Foundation, which focusses on accelerating efforts to minimise embodied carbon in new buildings, supporting circular practices and increasing uptake of bio-based materials. Acting in the cities of Madrid and Milan as the two initial testbeds, Dark Matter Labs is working with a consortium of European design partners, coordinated by EIT Climate-KIC, alongside implementation partners who include: City of Madrid, Commune di Milano, AMAT, Redo, Distrito Castellana Norte, Arup, Politecnico di Milano and itdUPM. The testbeds are two of Europe’s largest urban regeneration projects: Madrid Nuevo Norte and Milan’s L’Innesto.

The consortium is working closely with local partners to identify and develop innovations which could support greenhouse gas neutrality of a development over its full lifecycle, considering opportunities throughout the value chain, including: policy and regulation; financing; citizen engagement; digital tools and data; and materials and processes. Dark Matter Labs is specifically focussing on the areas of policy and regulation, contractual innovation and digital and data infrastructure.

Partners: Design partners — EIT Climate-KIC, Bankers Without Boundaries, Democratic Society and Material Economics + Implementation partners — City of Madrid, Commune di Milano, AMAT, Redo, Distrito Castellana Norte, Arup, Politecnico di Milano and itdUPM

Material Registry / Performance-Based Contracting

Mapping the life cycle impacts of an electric hire bicycle

Material Registry is a concept for an open registry of embodied impact data and in-use performance data for buildings and their constituent components, that could enable performance-based contracting and regulation in the built environment. If implemented it would enable better decisions on building retrofit, material circularity and minimizing environmental impacts.

The registry would provide open access to standardized and reliable data on embodied carbon, materials, processes and labour; component condition, location and performance such as air quality, temperature and humidity.

In collaboration with the Centre for Spatial Technologies, Dark Matter Labs has been developing an animation to illustrate the lifecycle of a self-sovereign window that describes the matter, processes and the embodied impacts it contains as well as the possible futures it could become part of.

Self-sovereign objects are self-executing and self-owning; they are capable of determining their own lifecycle to maximise material utility and performance whilst minimizing negative environmental impacts. Augmented by sensing capabilities and the ability to autonomously execute smart contracts, material objects could continuously monitor their own performance and contract for their maintenance, repair or recycling.

Partners: Centre for Spatial Technologies, EIT Climate-KIC

Community Wealth / Perpetual Affordability

In Canada close to 79% of wealth is invested in real estate both directly through home ownership but also through our investments, pension plans and education funds for our children. There is a similar situation in many other countries.

This leads to our economy, our money and our political systems being locked into unsustainable policies and paradigms of ever-increasing land values (and the attendant housing affordability crises) and new construction (with the challenges around material use set out above, as well as possible social costs).

Community Wealth focuses on systematic change in the ways we view wealth and its purpose by transitioning away from a dichotomy of private and public property ownership towards portfolios of community assets, jointly owned and governed. We believe this would better allow us to address the affordability gap that is breaking apart our social fabric, as well as create solutions for climate crisis.

One of the pathways towards this long-term horizon goal is through creation of a parallel housing market that would focus on the ‘missing middle’ of the Canadian residential real estate market. By aggregating capital at a national level through a fund and deploying it locally to participatory districts, first-time home buyers as well as existing home owners would be able to transition towards a new model of “perpetual stewardship”.

We are focusing on specific components of the long-term horizon and prototyping the following connected proposals:

“Free House”, where a smart perpetual bond meets a self-sovereign house meets a stewardship contract;
“Community wealth portfolio”, which consists of: affordable homes (via Free House and other models); renewable energy; shared transportation; local small businesses (via retail real estate); natural assets; and co-sharing working space (via office real estate);
“Smart Perpetual Bond” — a long-term instrument for capitalizing the community wealth portfolio;
“Participatory District” — a new type of intermediary that holds all above-land assets, jointly owned and governed by residents, community foundations, community groups (social enterprises and not-for-profits working to meet community needs) and government;
“Smart Community Trust” — asset holder of all land in perpetuity.

Partners: United Properties Resource Corporation, CMHC, Watershed Partners, Third Space Planning and Generation Squeeze

Key Questions, Lessons and Challenges Moving Forward Getting to the right mission framing: Spatial Justice encompasses a very broad range of topics. What are we missing or overstepping the line with the framing to better organise internally around the key platform components that we want to build for the just climate transition of the built environment? How do our projects interact? Understanding real estate as a system: How do we understand ‘real estate’ as a system for spatial justice? How can we address materials (build quality, lifespan, carbon, maintenance), supply chain (labour rights, skill level, regenerative), ownership (power, legal setup, governance), finance (liquidity, value) and data (behaviour, technology, openness) in that system? Lack of quality information on materials: Currently, we lack of good quality information on buildings. We lack open, verified data on: Supply chain (Materials; Processes; Carbon; Labour practices..), Performance of components (Air tightness; Thermal insulation; Condition; Reuse potential…) and Performance of spaces (Indoor air quality; Temperature; Humidity; Light…). What are the best ways to change that? Various horizons are important: At least 3 horizons are key to the mission’ advancement; Horizon 1 teaches the sequencing in the pathway, Horizon 2 constructs a macro solution within existing system, and Horizon 3 illustrates the possibility of the mission and creates an alternative system. We needed to advance on all three horizons simultaneously in order to move towards a possibility of the solution. Moving towards longer term mission-wide funding: Carefully crafting partnerships and prototypes takes time and resources; we needed to strike a balance between allocating our capacity and ensuring that we are eventually getting paid for the work we are developing, or better developing mission-wide funding support. Building financial models can be fun and visionary: Building a financial model is extremely important and fun; it forces us to look carefully at numbers, decide what connections and correlations made sense, and establish where are the gaps. Once that work is completed, we could replicate the model to other prototypes in other initiatives across our portfolio. Recognizing the tensions behind the numbers: Much of this work hypothesises that monetizing intangible co-benefits — such as the value of heritage, wellbeing, access to nature, and so on — will drive different investment flows and result in better outcomes. Most current investment decisions are justified through quantitative analysis and therefore developing better systems for this is of significant benefit. But there are dangers in this approach: Even an evidence-based process of quantification is compromised with assumptions and can be corrupted and opaque. The numbers should support open, transparent and democratic conversations about values and trade offs — for example, do we want to give more weight to heritage or nature? There are some values which are beyond quantification e.g. The spiritual value of a tree. Furthermore financialization should be complemented by Effective and genuine participation, better regulation and reformed political systems for example. Addressing uneven power dynamics between capital and community: One common thread was about here seems to be unequal power dynamics between finance and the places where some of the projects operate, and how local actors with what feels like enormous scale systems. 
Existing systems create certain rules and paradigms that have to shift to make things more equitable. How can we ensure that the terms of engagement are fairer (security, returns, scale, etc.)?

Imagining land trusts beyond discounted capital: Land trusts are a powerful tool that has shown immense potential; however, they generally work on the basis that somebody has to write off some or all of the price of the land at the outset. In other words, the model requires discounted capital to operationalize itself, which can in part limit its scaling. What we are trying to achieve in some of the emergent models we are working on is to acknowledge the current value, and then create a deviation from it, in a way that can work as a more universal mechanism. How can we build that not in opposition to, but with or in support of, the existing land trust movement around the world?

Get in touch

If you enjoyed this 6th DM Note, you can read all of our previous notes here, follow us on Medium for more to come and clap the article to show appreciation. And please feel free to reach out and share your thoughts on this as we continue to grow a community of interest/ practice/ impact around the world.

Jack Minchella — Spatial Justice Mission+ Smart Covenant
jack@darkmatterlabs.org
Dark Matter Laboratories UK

Calvin Po — TransCap/CommCap
calvin@darkmatterlabs.org
Dark Matter Laboratories UK

Oliver Burgess — Material Registry+ Healthy, Clean Cities EU CINCO
oliver@darkmatterlabs.org
Dark Matter Laboratories UK

Anastasia Mourogova — Community Wealth
Anastasia@darkmatterlabs.org
Dark Matter Laboratories / Laboratoires de Matière sombre — Canada

DM Note #6 — Building the Spatial Justice Mission was originally published in Dark Matter Laboratories on Medium, where people are continuing the conversation by highlighting and responding to this story.


OWI - State of Identity

Deep Tech Revolution


Will technology help save humanity or accelerate our demise? Listen in as host, Cameron D'Ambrosi, discusses with Guy Perelmuter, the Author of Present Future: Business, Science, and the Deep Tech Revolution and Founder of GRIDS Capital, to unwrap the evolution of cryptocurrencies and what that means for the future of money, how global supply chains change with AI, 3D printing, robotics, and blockchain technologies, and how a new “social score” may come into existence.


Coinfirm

Evolved Apes: Coinfirm Demonstrates NFT Compliance with Scam Case Study

‘Evolved Apes’ was a much-hyped new PFP (profile picture) NFT project, with the 10,000 unique tokens selling out in 10 minutes. The minting of the tokens and some secondary sales generated millions of USD. But on September 24, 2021 the main developer of the project, ‘Evil Ape’, and the project’s creators then rug pulled the...

Elliptic

Elliptic Supports Polkadot and Algorand

We are delighted to announce that Elliptic now supports the native cryptoassets of the Polkadot (DOT) and Algorand (ALGO) blockchains. This development makes us the first blockchain analytics provider to offer comprehensive coverage of the Polkadot blockchain globally.



KuppingerCole

CSLS Speaker Spotlight: KC Analyst Alexei Balaganski on the Human Factor in Cybersecurity


by Fabian Süß

Alexei Balaganski, Lead Analyst and Chief Technology Officer at KuppingerCole, will discuss the Human Factor in Cybersecurity on Wednesday, November 10 from 11:00 to 13:00 in the first track at Cybersecurity Leadership Summit 2021.

To give you a sneak preview of what to expect, we asked Alexei some questions about the track.

What exactly is the human factor in cybersecurity after all?

Well, here we are starting with the right questions. Yeah, cybersecurity, in general, is a field where everything depends on the right definition because more often than not, we hear people talking about things which they call a certain title or name or definition or buzzword. But they talk about totally different things, right? Whether it's cloud or Zero Trust or even blockchain. It may mean different things. And the same applies to the "human factor". I would say, first of all, it's more than one thing with regard to cybersecurity. Perhaps I would try to identify at least three.

The most obvious one is, to err is human, everyone makes mistakes, and especially it applies to users of information technology. So it's the normal business people, not the experts. They are doing their job, they're trying to do it as securely as possible, of course, but they just have no idea how to do it properly, how to deal with all the technology. So they make mistakes. And sometimes this can lead to a massive data breach or a compliance violation. Not very often, but it can. Of course, there is a lot of bad actors out there who try to exploit this. We have a multitude of different attack vectors. The most obvious ones are social engineering, where you just get an email, which looks normal, it might even come from your CEO and you click a link and you get a malware download and you have a ransomware attack. Or it might lead to a discussion with a person that pretends to be CEO and you end up wiring all your corporate money to a hacker's account. There is a lot of possibilities to exploit. And everything depends on this human factor.

However, this is not the only thing we have to worry about. One other major problem is the so-called skills gap in cybersecurity. Here we are talking about experts, actual security analysts, technicians, or people who are supposed to protect those, quote-unquote, innocent businesspeople. But you just don't have enough of them. And even those who are employed, are always overworked. They have too many things to deal with, but too little time. So, yes, that's skills gap. It's been, it's existed for a decade at least, and it's only growing. And there is no simple solution on how to address it.

And finally, last but not least, there are people in the company who make executive decisions. These are usually not the technical people. But again, might be your CEO or some other person who decides, OK, we should spend some money and deploy a new security solution. Or perhaps we shouldn't. Or we want to switch to a totally different business platform. Maybe it's Salesforce, maybe something else. But every new business platform comes with a set of new security risks. And if those executive people make those decisions without consulting with proper security specialists like us, for example, at KuppingerCole, this might lead to massive security and compliance problems afterward as well.

So I would say these are three major human factor problems we have to deal with.

In thinking about the human factor in cybersecurity as you described it right now, we also have to talk about the insider threat. What strategies, and perhaps technologies, can help organizations reduce the risk from such threats?

Again, when we talk about the insider threat, we have to think about multiple different kinds of it. The most obvious one is yes, the most common one is just negligence. People make mistakes. They click the wrong button, they delete a database. Or they click a wrong link and they end up firing off a ransomware attack. Theoretically, any employee can do that. They don't have to be an expert. They don't have to have special access rights, anything can happen. How do you deal with those? Well, it's everything you do, everything you plan and deploy and operate with regards to cybersecurity that has to somehow address all those challenges.

The other kind of insider threat is those disgruntled and privileged employees. Maybe they just want more money. Maybe they want to work for a different company and got to steal your trade secrets. Maybe they are colluding with a hacker and they might be even being blackmailed and forced to deploy malware just to let those hackers start their attack on your company. Again, this is very difficult to deal with because more often than not, you just cannot identify what is the difference between, quote-unquote, normal people acting normally. And again, those, the same people acting maliciously. Perhaps one of the most common ways to at least detect those attacks would be something behavior-based. But you have to observe how your workers act normally, how they work, how they do their daily jobs, and then identify when something uncommon happens, an anomaly. Unfortunately, most of the tools you can get for this purpose are pretty basic in terms of yes, they can identify that something unnatural is happening, but they will struggle to explain why exactly it is unnatural. What exactly is this anomaly about and why you should kind of put all other things aside and investigate it. Solutions that can explain it, I would say they belong to the quote-unquote, next-generation security analytic solutions.

If I understand you correctly, nowadays, when we no longer recognize the classical divide between inside and outside, we should not distinguish so sharply between threats from the inside and those from the outside, is that right?

Absolutely, not just should not, you cannot anymore, because well, first of all, for many companies, there is no inside anymore. And we are now still working from home for such a long time and will probably continue doing so. So if I am talking to you now from my own home, am I inside the company or not? And of course, a firewall would never detect me doing something malicious because there is no firewall at my home. And even if there was one, it's not connected to our corporate security. So absolutely, there is no more any difference between inside and outside. You have to treat every action, every activity, every potential threat, regardless of where it's coming from.

Is this where Zero Trust as a concept or strategy comes into play?

Exactly, Zero Trust, I would say, is a theoretical concept which attempts to address all of those challenges at once. Basically, Zero Trust is the philosophy, it's the idea of how you should completely redesign your corporate network and your corporate security architecture to make sure that you can deal with any of those, quote-unquote, insider threats, whether they happen inside the company or somewhere outside. For example, at home.

So yes, Zero Trust is exactly the solution, but unfortunately, Zero Trust is not a magic pill, not a panacea. It's not something that you can just buy, deploy and forget about. Zero Trust is a new way to operate your business if you will. And the tools are secondary.

Looking into the future: How do you think the threats will develop in the next three to five years? Will it become more serious or will the "good side" be able to develop faster than the "bad side"?

Well, I mean, here, in the field of security, we are always on the wrong side, if you will. We will never prevail because a hacker only needs one vulnerability to break your entire system, and you have to deal with all of them at the same time. If you only missed that one, oh, you lost already. So, unfortunately, I would not be optimistic. Cyber attacks will definitely grow in scope and number and power if you will. It's becoming even cheaper than ever now to stage a cyberattack. You can even get one as a service. You only need your credit card and maybe your blockchain or Bitcoin wallet to start with one. So in that regard, no, there is no light at the end of the tunnel, if you will. But there is some hope in the new technologies and in the new approaches to design in your corporate networks.

You've already mentioned Zero Trust. If there is a company that starts with Zero Trust, it doesn't have any legacy network at all. I would say it will be automatically protected from perhaps 95 percent of all the common cyberattacks. So it will never be completely secure, but it will make the job of their IT teams and security teams much, much easier. If you add strong authentication, for example, into the mix, just that alone is a major booster of cybersecurity. And of course, things like security automation. So if you have this combination of modern innovative security technologies, you will be much safer than before. You will not be safe, but at least you will be protected from a lot of drive-by attacks if you will, which are not specifically targeted at you but you will never be the collateral damage in such an attack.

Do I understand you correctly? Do you think that something like Autonomous Security, which makes security smarter, for example through machine learning, will help companies rule out the human factor in some future time?

Well, it's a difficult question because, again, we are talking about a thing which many people tend to misunderstand completely. Autonomous security does not equal automated security. Autonomous does not necessarily mean that it's operated by a robot or artificial intelligence. Yes, we know that AI is a huge buzzword nowadays. And we've observed a lot of interesting developments. But we know that even now in cybersecurity, nobody trusts AI to make security decisions. Simply because there is a lot at stake and a wrong decision can ruin your business completely, especially if it's something like manufacturing or high-speed financial trading. So people tend to mistrust AI.

And when we're talking about properly autonomous security, it's probably still managed by a team of security experts. But the team doesn't have to work for you. It might be just the managed service, a multi-tenant managed service even, so someone who's operating your security for you as a service from a remote location. But they are still kind of much closer to your infrastructure and your pain points and bottlenecks to quickly identify any attack. But again, there will still be a human, in the end, making a decision. But your goal is to make sure that that human is the best one, the smartest one, and the one supported by the best tools. So a robot will never replace a human in that regard. But they will help.

Do you think "never" is a word to be used in that context?

Well, I dread the times where robots will make all the decisions for us. I think they will be more like Skynet than benevolent...


nu_i.d.

My NuID: It Started With Authentication

NuID's new COO reflects on his journey discovering the company and diving into the deep end of sovereign digital identity

Last week I wrapped up my first quarter here at NuID, which is also my first 90 days as COO of the company since I started on July 1st.

Prior to July 1st it had been 20 years since the last time I formally “resigned” from one company and “joined” a completely different one.  I have certainly started many new jobs, projects, roles, and relationships along the way, but this one has been a bit different.  Reflecting on it now, the difference is that in the SAP/Ariba/Procurement technology world, my reputation—my identity—was always known even though the responsibility and team may change.

1998 was my first year as a professional consultant, and during an early performance feedback session my manager said to me something along the lines of, “you can only see into the future twice as far as you’ve come... you’ve only been on the job for 6 months; what do you want to be doing in a year?” It stuck with me as a truth; I believe it was David Barron who told me that.

Sometime between 2013 and 2016 I had come far enough in my career and had enough Gladwell-esque sampling under my belt that certain other truths about my values had become quite clear to me. I realized that the teams I had been building, and that had built me, held these common truths as well.

Things like trust and loyalty are important to me, and I say that because I feel physically uncomfortable if lost, not because they are abstract concepts I’ve merely decided to abide by; they are things that I just am, just feel.  Things also like accountability, follow-through, challenging the status-quo.

Things like learning to not mistake kindness for weakness and developing the experience to teach it as a lesson if I’m mistaken for weak.  Things like at the end of the day, I can only work with people that I genuinely like and enjoy, because I can’t be successful if I’m not having a lot of fun while doing it.

I’ve also shared many times in coaching sessions, and have illustrated in my career that “the best way to get your next job is to be great at the job you have,” not to always be looking for your next move.  The process of how I got to the decision to leave SAP-Ariba—and my family of 2,000 of the most amazing people to be found in eProcurement—is one that I’ve been very thoughtful about.

Starting sometime around the holidays of last year, my mind started to wander emotionally toward “what’s really next?” Instincts I trust began telling me to think about the next 20 years of my career.

As I pondered that question over several months, a few things I was able to determine were that:

I felt the end of the first half of my career was coming

I was not going to leave SAP to do SAP, or Procurement software elsewhere

Application verticals are not where I want to be

Platforms, hyperscalers, low-code/no-code, P2P, and distributed Web3 technologies are where the tip of the spear is

Identity management is a big internet problem; it’s a separate and common problem across all of these enablers for the Web of the future. And THAT is what I’m most interested in, indeed!

I was feeling a draw towards what I was calling Identity Management at the time and began looking into Enterprise Identity SaaS providers

Side note – it’s probably not a coincidence that Jason Wolf resigned less than a month before I did to become Ping Identity’s Chief Revenue Officer. Jason and I “grew up” together at Ariba and SAP after both graduating from Texas A&M. We spoke often, including about careers, but neither of us told the other where we were going, or asked. I learned from Jason’s LinkedIn announcement, chuckled, and was not surprised.

In December and January, I had paid $1,700 to have a professional writer begin compiling some form of a resume for me.  I truly did not have one for over 15 years; I needed only to rely on my identity and my reputation.  Since I sensed a different kind of change was coming for me, I felt I needed serious resume help, and yet that resume was never used for the same reason I hadn't needed one in the 15 years prior.

In late March I was walking back from my mailbox while simultaneously texting with my aunt, Sarah.  Sarah randomly told me of a company that she had invested in that I should look into called NuID (“(New Id)” she included, so I didn’t think it was pronounced “Noooooid”).

It made me stop walking, stand still, and google NuID.  A few minutes later, still standing there with mail tucked under my arm, I replied “Well Sarah, personally I am looking to invest in and join a group in identity management specifically, or platforms in general.  I like what NuID is doing, that is the type of group I want to join and lead to grow, good stuff right there!”

Within 48 hours I had spoken to my cousin and arranged for him to introduce me to Locke.  My entire family made a weekend trip to Birmingham to visit family, and to meet with Locke for 6 hours straight on Saturday and again at brunch on Sunday.  For the next two months, I, and NuID, did as much due-diligence as we could, and then I joined the team.

That much we know, but I’ve been thinking about it in terms of Relationships, Identity, Credentials, Verification, and Reputation – a big Web 2.0 problem without a real solution prior to NuID. So I want to play back the very natural, organic vetting and transacting that occurred between NuID and me with a bit more simplicity, and specificity, to ultimately help illuminate the mechanics of identity as they play out in the real world.

SAP ID #835427

I was an SAP employee from 2012 to 2021 and as such, I possessed a Verified Credential with SAP ID # 835427.

Over those years I developed a strong Reputation within the Ariba and SAP community and progressed through my career, which is easily verified on LinkedIn, or by talking to my team.

In late March I was walking back from the mailbox when I got a random text from my aunt Sarah that “she had a thought”. Sarah had invested in a company called NuID that she thought I would be very interested in.

Sarah learned of NuID because she saw Locke Brown grow up; since Locke and her son went to junior high and high school together, she knew Locke personally.

Sarah isn’t a technologist, far from it actually, but she is a successful doctor and businesswoman. So it wasn’t a keen eye toward SSID; it was her son’s attestation of NuID’s Credentials, and her personal experience with Locke’s reputation, that gave her the confidence to invest.

Similarly, Sarah verifying Locke’s Credentials with me gave me the attestation I needed to begin digging into NuID, intensely.

Locke, as is human nature, essentially got attestations from my cousin and aunt that I am indeed Jason Jablecki and that I have the necessary Credentials. So we met via a personal email introduction from my cousin.

As June 30th turned to July 1st, I turned in my SAP employee credential and was granted my NuID employee credential.

You see… Identity is the fact of being who or what a person or thing is, and Authentication is how we know that the Identity is indeed, The Fact.  Once we have authenticated the factual-identity of a person, place, thing, device, anything, and we trust its verified credentials, it can exchange digital assets or participate trustlessly in any peer-to-peer digital, and digitally enabled physical interaction.

After Authentication, it is really the composite of an identity’s qualifications, achievements, personal qualities, or aspects of a person’s background that makes up how we tend to think of a person’s identity, or reputation, day to day.

Credentials are elemental to Identity, and we know this instinctively as humans.

Go shopping and present a debit card as a credential that authorizes financial settlement for the exchange of assets.

Travel using CLEAR and your biometric credential authorizes an agent to escort you more quickly through TSA.

Our verifiable identities are elemental to our interactions, our transactions, and our possessions; therefore, authentication is elemental to our identities.

So once again, let’s play my story out, this time through the self-sovereign identity layer of Web3.0, by using the backwards compatible Web2.0-bridging NuID Authentication Protocol.

Safe harbor: this is an imaginary, futuristic scenario that did not take place, and even though all of it is possible with the NuID elements available today, none of the parties are real.

I was a NuSAP employee through June of this year. NuSAP had implemented the NuID Authentication Protocol element in Q1 and deleted all employee passwords, so my verified NuID was created at that time and NuSAP attested my Employment Credential to it.

During my time with NuSAP I developed a strong reputation, held many different positions, and earned certifications and titles, all of which were attested as credentials to my Identity by NuSAP and other third parties.

In late March of this year I was walking back from the mailbox, simultaneously texting with my aunt, who told me of NuID, and that NuID is looking for a COO.

My aunt connected me to Locke Brown, with whom I shared relevant aspects of my Nu Identity, which now has quite a reputation given all of the “credentials” I’ve acquired.

NuSAP’s automated offboarding process ran on June 30th, removing my employee credential from my NuID, and NuID Inc. attested my Nu Employee Credential.

Thinking through identity processes and applications of the NuID protocol is something I’ve been doing a lot of lately when we’re not working on the Kii launch, our Q2 consumer product release, and talking to enterprise and partner prospects. I’ll close this, my first NuID blog, by sharing a whiteboard that I did last week with the team, which is referred to as the comic book:

Look out for more blogs, comics, and content from the NuID team and for exciting news to come as we execute on our roadmap toward a sovereign identity ecosystem.

—Jason


Ontology

Ontology Harbinger Interview Series: SoloTürk

This is an ongoing series featuring Ontology’s Harbingers. Harbingers are exceptional leaders that help Ontology grow its community and ecosystem, making it an ever-stronger blockchain, and decentralized identity and data leader.

In the ninth interview in this series, Ontology speaks with SoloTürk, administrator of our Turkish Community.

1. How did you hear about Ontology? What drew your attention to Ontology?

I discovered Ontology for the first time on CoinMarketCap in January, 2018. I liked the name Ontology, so I did some research.

2. Why did you become a supporter of Ontology and a champion of its brand?

I became a supporter as I saw very little activity in the Ontology Turkish group. So I asked if I could be a part-time supporter and help strengthen the group, and now I am a big supporter. My group is much more active, and we also have a Free Technical Analysis Signals group.

3. What’s the key role of a Harbinger? What’s your favorite thing about being a Harbinger?

I want to strengthen the team and grow the community.

4. Why is it important to you to be a part of the Ontology community?

I want to follow up on the latest news from Ontology and share the news and ideas within our Ontology community.

5. How is the Ontology community different from other blockchain communities, is there anything that makes it stand out?

The Harbinger program is not available in other blockchains or other communities. I have not seen it or heard of anyone else doing a similar program.

6. What do you use as your main channels for interacting with the Ontology community and why? Do you want to see more?

Twitter and Telegram for now, but maybe Instagram in the future.

7. Can you share an unforgettable experience or something you learned from being a Harbinger? What advice would you give to someone who wants to be a Harbinger?

My advice would be to stay active in the project: support it as much as you can, help the team build the future, and share your knowledge with others.

8. How would you describe the Ontology community in three words?

Active, Quality, Technology.

9. How do you think Ontology can expand its community going forward? What would you like to see more/less? What kinds of things do you see community members doing that you think have helped our community grow?

Hosting AMAs each week in other Telegram groups with other public chains, or by inviting them and mingling with them and organizing activities together.

10. What do you see as key milestones for Ontology and how can the community help achieve them?

Now Ontology is 3 years old and I hope to see more collaborations with other companies or public chains, to let the world know how wonderful Ontology is.

To learn more about Ontology’s Harbinger Program and how you can get involved, check out our updated GUIDE.

Want more Ontology?

Learn more about our decentralized data and identity solutions on our website and official Twitter account. You can also chat with us on Telegram and keep up-to-date with the latest news via our Telegram Announcement account.

Ontology Harbinger Interview Series: SoloTürk was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 06. October 2021

KuppingerCole

Securing your IaaS Cloud


by Mike Small

While the major CSPs (Cloud Service Providers) go to great lengths to secure the services that they provide, it is up to the client of the Cloud service provider to secure their use of these services. The responsibility for security and compliance is shared. This report describes the approach that clients (or, in other words, tenants) of Cloud infrastructure need to take to ensure that they use IaaS services in a way that is secure and compliant, including examples of how to realize this with Amazon Web Services. In our recent independent 2021 "KuppingerCole Market Compass report on Global IaaS Providers Tenant Security Controls", AWS was evaluated as an outstanding provider. This report also contains an extract of our evaluation of AWS from the Market Compass report.

Indicio

Indicio-SITA Pilot Named 2021 Enterprise Blockchain Award Finalist

Decentralized identity solution developer and air transport technology provider nominated for Blockchain Services Award in the Tools & Middleware category

SEATTLE — October 06, 2021 — Indicio, the world’s leading provider of decentralized identity development and hosting solutions and SITA, the leading provider of IT to the air transport industry, today announced they were finalists in the Blockchain Services Award: Tools & Middleware category in the Enterprise Blockchain Awards (EBAs) presented by the Blockchain Research Institute. The partnership was recognized for their work on the Aruba Secure Health Card, a successful pilot  for sharing health information in a privacy-preserving way using distributed ledger technology that debuted earlier in the year with the assistance of the Aruba Health Department.

“Indicio is proud to be a finalist for this award, which recognizes the power of blockchain-based technology to create a world in which the ability to trust data is possible — thank you BRI and EBA,” said Heather Dahl, CEO of Indicio. “I also want to thank our partner in this project, SITA, for its vision and leadership in the air travel industry, and for seeing how this technology can change the way we travel and deliver so many benefits to passengers in the future.”

A distributed blockchain-based ledger is at the foundation of Indicio and SITA’s project. It serves as an anchor for verifiable digital credentials that enable people to hold, share and prove important information in a way that preserves privacy and improves security.

“For the Aruba Secure Health Card, we wanted travelers to Aruba to share proof that they had tested negative for COVID as they visited hotels and restaurants on the island,” said Dahl. “Our goal was to have travelers share their status in a way that didn’t require sharing any personal identifying information with the businesses they visited—or require those businesses to record any personal information. We call this a Trusted Data Ecosystem. It’s where information can be shared and verified in a secure, privacy-preserving way and without the need for third parties to be involved in the flow, management, or storage of that information. Removing third parties is vital to both data privacy compliance and to implementing a Zero-Trust approach to security. Trusted Data Ecosystems are our solution to the problem of trust online.”

New this year, the Blockchain Service Award category celebrates providers of blockchain technologies and services. It recognizes companies, government agencies, or nongovernmental entities delivering blockchain solutions to transform their clients’ operations, supply chains, or ecosystems. This category offers an opportunity to showcase the tangible and long-term change that blockchain service providers are working to support.

###

The post Indicio-SITA Pilot Named 2021 Enterprise Blockchain Award Finalist appeared first on Indicio Tech.


Anonym

Consumers Could Gain from FTC’s Stronger Focus on Data Protection


The Federal Trade Commission could go harder on consumer privacy protection and cybersecurity with President Biden’s recent nomination of digital “privacy hawk” and law professor Alvaro Bedoya and House Democrats’ proposal to allocate $1 billion for a new privacy and data security bureau.

If confirmed, Bedoya will join Biden’s other pick, antitrust expert and now FTC chair, Lina Khan. Both are prominent critics of Big Tech data collection and surveillance and are seen as a potentially “formidable regulatory presence” in reining in the tech industry.

As founding director of the Center on Privacy & Technology think tank at Georgetown University and former chief counsel to the Senate Judiciary Subcommittee on Privacy, Technology and the Law, Bedoya has worked on major privacy tech issues including mobile location data, stalkerware and facial recognition.

Bedoya is known for his bipartisan approach to privacy as a human right. We look forward to seeing what comes next for data protection.

Photo By Mark Van Scyoc

The post Consumers Could Gain from FTC’s Stronger Focus on Data Protection appeared first on Anonyome Labs.


digi.me

Talking tech and discussing data on the ‘Tech-Entrepreneur-on-a-Mission’ Podcast


Digi.me Executive Chairman and Founder Julian Ranger was the most recent guest to feature on Ton Dobbe’s ‘Tech-Entrepreneur-on-a-Mission’ Podcast.

Ton is the Chief Inspiration Officer and Founder of growth consultancy firm Value Inspiration and hosts a different entrepreneur each week to chat about their innovative ideas.

Recently, he explored the work digi.me is doing to put people back in control over their personal data, while respecting individual privacy. By enabling the ‘Internet of Me’, which puts individuals at the heart of their digital life, brand new opportunities are created to bring about the innovation and breakthroughs needed to change society for the better.

Continue reading Talking tech and discussing data on the ‘Tech-Entrepreneur-on-a-Mission’ Podcast at Digi.me.


Holochain

RedGrid: Carbon Neutral By 2030

Tackling the toughest climate challenge — coordinating our efforts

Forty-nine degrees Celsius outside. Suddenly, the climate crisis felt pretty real.

As I started writing this article back in June, sweltering in my home office, western North America was in the grip of a bizarre and unusual heat wave. One town a few hours west of me broke the all-time Canadian record at 49.6°C (121°F), and was incinerated in a wildfire later that week.

The last decade has seen more than its share of wildfires up and down the west coast; some years I’ve forgotten what it was like to breathe clear air and see the stars. Meanwhile, Germany, Belgium, and the southwest US were drenched in catastrophic floodwaters.

This isn’t normal.

I usually have to experience something myself in order to really appreciate it. It takes me from “of course I care, in an abstract way” to “holy cow, we’ve got to do something about this.” I expect a lot of us work that way.

So what can we do?

I’m not sure. But one thing I do know is that we have enough science to diagnose the problem and solve it. On the front end, we need to stop releasing greenhouse gases into the atmosphere; on the back end, we need to stop damaging our planet’s lungs. We can rapidly repair and even enhance our planet’s capacity to support life with the tools we already have available. And if we do it right, we could even rebalance inequity and improve the quality of our food in the process.

What we don’t know is how to coordinate our efforts — all 7.9 billion of us, and counting — to reach these goals. In a recent interview, thinker and futurist Daniel Schmachtenberger argues that if we can solve this problem, all the other problems will get solved automatically.

Energy, one piece of the pie

All the brightest and darkest things in our modern civilisation come from our capacity to harness the energies of the universe. And right now most of those energies come as fossil fuels — coal, petroleum, and natural gas. Almost three quarters of the greenhouse gases we emit come from energy production. That’s because 82% of that energy comes from fossil fuels. And we’re using more and more energy each year.

Of humanity's greenhouse gas emissions, 73% comes from energy consumption – electricity, heating, transportation, and industry. The energy we consume comes primarily from fossil fuels, while only 18% comes from renewables and nuclear power. (Note: because of rounding errors, the percentages in this graphic don't quite add up to 100%.)

This tells me two things: we need to learn how to use less energy, and the energy we do use must come from low-carbon sources. As far as we know, that means we need to electrify everything.

But low-carbon electricity carries a new set of coordination problems.

First, most renewables are not well-matched with demand. Let’s take a look at solar electricity. When we start brewing our morning tea, the sun is still rising. Then there’s a sort of peak production — too much power in the grid — in the middle of the day when we’re off at work. In the evening, just when we’re cooking supper, the sun is already going to bed. It won’t do us any good to switch to renewable electricity if it’s not actually there when we need it.

A graph taken from a typical October day in a California power utility shows a steep rise in solar electricity generation from 08:00, which begins to fall at 15:00 and is virtually non-existent at 18:00. Combined with high demand in the morning and evening, this creates a 'duck curve'.

Second, power producers and distributors build their infrastructure to meet peak demand, not average demand. If we electrify everything, that would mean more drastic peaks, which would mean more power plants and fatter wires. And every tonne of concrete and steel, every kilometre of copper wire, every wind turbine, every hectare of solar panels, is the product of carbon-intensive manufacturing processes.

Third, consumers are also becoming producers, and it’s happening quickly. Over a fifth of Australian homes have rooftop solar panels, and minigrids and community solar gardens are on the rise. Distributed energy production is already here. That’s wonderful, but the grid is designed for one-way delivery of electricity generated by large plants whose output is easy to predict.

Our infrastructure isn’t prepared to handle these challenges. It’s big and unwieldy. Teaching it to respond to these new demands would be like teaching a team of oxen to fly like a flock of starlings.

But starlings are exactly what we need. Many small energy producers and consumers, responding nimbly to each other, moment by moment. Analysts tell us that the most reasonable way to do this is to decentralise and digitalise the grid as we decarbonise.

Coordination is necessary, but it appears exceedingly difficult at the same time.

Given the risks we face, this is sobering.

Fortunately, there are people working on these challenges already. I’d like to introduce you to one such group of visionaries.

The Internet of Energy Network: teaching toasters to fly

In 2018, Dr Adam Bumpus, Alex Evans, and Sim Wilson came together with a desire to tackle the challenges of energy coordination. If they could figure this out, they knew they could help the world move toward a brighter future.

One of their first questions was, “what if we could build coordination capacities into the things that already exist?” I appreciate their choice to ground high ideals in reality. Instead of pushing the grid to change, they started at the bottom with individual devices in homes and businesses.

The trio called themselves RedGrid, assembling a team to help them imagine ways to get all these devices talking to each other. They asked what would happen if they were to connect together into ‘virtual microgrids’ and eventually into a global ‘software-defined grid’ giving intelligence to the existing physical grid. Production and storage devices, side by side with our air conditioners and fridges and ovens, making decisions together in response to changes in supply and demand. Just like a flock of starlings.

Mike Gamble, the company’s lead architect, describes this approach as “grassroots, bottom-up, organic.” Anyone who wants to join the flock is welcome.

As they built out this vision, they called it the Internet of Energy Network (I’ll call it IOEN, pronounced “ion”, for short). Given what I learned about the inflexibility of the current energy grid, this approach makes a lot of sense. It won’t require everything to change at once, which means that it has a chance of actually working. Regardless of how many people and organisations end up getting involved, something meaningful can happen at every scale. As more layers are added, the benefits start compounding.

And it happens to mesh nicely with the greenest, least complicated carbon reduction technique: just use less energy.

Even without taking advantage of the IOEN, RedGrid has an app that talks to smart appliances (and not-so-smart appliances using smart plugs). At its most basic, it helps people monitor their energy consumption; this alone has been shown to reduce people’s consumption patterns.

It also allows people to schedule energy-hungry appliances, like their dishwasher, in the middle of the day when green energy is plentiful. This doesn’t reduce consumption by itself, but it can help them green the energy they do consume, or get better rates and even rebates by taking advantage of their energy company’s demand response incentives. This doesn’t just save people money and make them feel good; it also prevents more infrastructure from being built. Remember that power companies build to meet peak demand — if we can flatten the peaks, the grid can be leaner.

Their software can also use machine learning to help long-running machines, such as air conditioners and pool pumps, use less power without sacrificing comfort.

If everyone added this sort of smarts to their individual energy use, we’d likely see both consumption and peak demand start to drop across the world. It’d help — to an extent. But the RedGrid team thinks that the real reductions will happen when these devices can break through the four walls around them and start talking to their neighbours.

Devices in the IOEN will be able to emit signals for other devices to respond to. At noon the community solar farm could say “we’re flooded with energy right now; can anyone use it?” Hundreds of dishwashers and car chargers, waiting for just such a signal, would turn on in response. Or in the evening the utility could say “we’re getting stressed; could everyone go easy for a while?” and air conditioners, pool pumps, and water heaters would scale back.
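To make the signalling idea a little more concrete, here is a purely hypothetical Python sketch of deferrable appliances reacting to “surplus” and “stress” signals on a local grid bus. Every name in it (GridSignal, LocalGridBus, Dishwasher) is invented for illustration; this is not RedGrid’s or Holochain’s actual API, only the shape of the coordination pattern described above.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical grid signal: kind is "surplus" or "stress", kw is the amount involved.
@dataclass
class GridSignal:
    kind: str   # "surplus" (too much generation) or "stress" (grid under load)
    kw: float   # approximate kilowatts of surplus, or of requested reduction

class LocalGridBus:
    """Toy in-memory publish/subscribe bus standing in for the IOEN communication layer."""
    def __init__(self) -> None:
        self.handlers: List[Callable[[GridSignal], None]] = []

    def subscribe(self, handler: Callable[[GridSignal], None]) -> None:
        self.handlers.append(handler)

    def publish(self, signal: GridSignal) -> None:
        for handler in self.handlers:
            handler(signal)

class Dishwasher:
    """A deferrable appliance that waits for a surplus signal before running."""
    def __init__(self, name: str, load_kw: float) -> None:
        self.name = name
        self.load_kw = load_kw
        self.running = False

    def on_signal(self, signal: GridSignal) -> None:
        if signal.kind == "surplus" and not self.running and signal.kw >= self.load_kw:
            self.running = True
            print(f"{self.name}: surplus on the local grid, starting cycle ({self.load_kw} kW)")
        elif signal.kind == "stress" and self.running:
            self.running = False
            print(f"{self.name}: grid stressed, pausing cycle")

# A community solar farm announces a midday surplus; the utility signals evening stress.
bus = LocalGridBus()
for i in range(3):
    bus.subscribe(Dishwasher(f"dishwasher-{i}", load_kw=1.2).on_signal)

bus.publish(GridSignal(kind="surplus", kw=50.0))  # noon: deferrable loads switch on
bus.publish(GridSignal(kind="stress", kw=10.0))   # evening: deferrable loads back off
```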

It gets more interesting when devices, and the people who own them, can talk directly to each other. Rather than overwhelming the community solar farm all at once, all those dishwashers could arrange to take turns. Or, if someone knows they’re going to be using a lot more electricity than usual — maybe they’re cooking for a big birthday party — they could let their neighbours know. People like being neighbourly, so they’re likely to make accommodations for that. Mike suggested that it could even spur a bit of friendly competition among virtual microgrids, an energy-saving drive: “You may have a group that is your sports team, and you are aiming to do better energy-wise against another team.”

And with the increase in local generation and storage — rooftop solar and electric car batteries — energy consumers could also be providers, selling their surplus to their neighbours. Local generation reduces the burden on infrastructure even further and makes everyone more resilient to power outages.

The IOEN is simply the communication layer that makes all this possible; the real magic happens in the applications built on top of it. RedGrid sees more opportunities on the horizon.

Virtual microgrids can aggressively optimise their energy use, offer their combined optimisations as a package to energy companies, and distribute the rebates to members.

Physical microgrids such as Monash University’s Net Zero project (partnering with RedGrid) can build smaller systems if they can rely on the IOEN to help smooth out demand peaks.

If grid operators and power producers choose to use aggregated, anonymised data from individuals and virtual microgrids, they can do an even better job of planning, building out, and maintaining their infrastructure.

As the network grows, third-party service providers can pop up, offering intelligence and insights to help everyone reduce their energy use even further.

Equipped with IOEN-based management software, new microgrid projects could receive investor money in exchange for a stake in revenues, accelerating the adoption of renewables.

The Rocky Mountain Institute calls this demand flexibility, predicting that it can reduce energy-related emissions by up to 40% as energy electrifies and decarbonises. But they see electricity pricing as the main coordination tool. RedGrid’s vision for the IOEN embraces pricing but goes beyond it, giving us a way to coordinate directly and explicitly with each other.

Why decentralisation?

Good question. Decentralised coordination is messy and awkward. If you’ve ever tried to organise a multi-family camping trip, you know what I’m talking about.

First of all, when we’re dependent on faraway, centralised infrastructure and all the stuff that connects us to it, we’re in a vulnerable position. This goes for both electricity generation and the digital tech needed to make it ‘smart’. What if the power plant, the machines in the data centre, or any of the miles of wiring between them and our town, go offline (or even burn down)? Suddenly we’re stranded. With rooftop and community solar, home batteries, and edge computing, we have the means to stay online, right in our home town.

Second, the further away the power source is from your plug, the more energy is lost. It’s not a huge deal, but local generation does come out a little bit ahead.

Lastly, the rising complexity of the modern grid, along with the decisions we need to make, will virtually demand it. Zillions of new internet-connected plugs, toasters, solar panels, and car chargers will be coming online, producing lots of data and expecting to be able to make split-second decisions based on current local conditions. This is a logistical challenge for cloud computing, which (despite its fluffy name) happens in faraway data centres. It’s going to be hard for the cloud to keep up. (And I ought to point out that it has its own ecological problems too).

Where Holochain comes in

This is the Holochain Blog, after all, so I ought to mention it. Why did RedGrid build the IOEN on Holochain? The way Mike explained to me, it’s because Holochain is built for decentralisation.

And I’m not talking about the sort of decentralisation you see in blockchain. It’s a marvellous feat of engineering with some compelling use cases, but it’s still logically centralised, according to Ethereum’s creator Vitalik Buterin.

In practice, that means that your smart devices need to be connected to the global blockchain network all the time. This demands an always-on internet connection with zero downtime — which is fine if that’s what you have, but it’s problematic if you’re part of the world’s majority who live with patchy power and internet.

First-generation blockchains are also unfit for tiny smart devices, and they consume a massive amount of electricity for very little work (although newer DLTs are addressing those problems). The solution shouldn’t contribute to the problem. Mike tells me that “we turned our backs on Ethereum years ago”.

Holochain was designed to tolerate messy, unreliable connectivity. Far from being an anomaly, this may become increasingly common. During our local wildfire season, one fire nearly knocked out power and internet for 60,000 people. This would not have been a problem for an IOEN-powered local grid.

Holochain’s creators built it for edge computing in the extreme. Every device carries its own weight, according to its ability. It stores its own data, makes its own decisions, connects directly to neighbouring devices, and helps support the health of the local network. This lets the computing capacity scale linearly with the number of participants. And less powerful devices will soon be able to offload their work to a decentralised cloud if needed.

Mike and his colleagues also appreciated that Holochain already had a small but strong developer community. In preparation for an IOEN demo for their partner Monash University, Holochain developers from around the world rallied around RedGrid to make it happen. After the demo, Mike said:

“I should point out that what attracted me to Holochain was the support, not so much the tech. And it’s just in the last month that this has proven out, with Guillem and Connor [and Alfredo, Jesús, and Eduardo] helping us… You can see in GitHub and Discord their discussions with David Meister and Tom Gowan [Holochain core devs], to the point where a community dev actually found things and updated the core. It’s really pretty impressive. And Monash recognised that.”

Adam agreed, saying,

“And that was really cool, because you’re building together. For us as a company, we’ve had lots of good support from Art [Brock] and Eric Harris-Braun [Holochain co-founders] from the beginning, and obviously other people Mike mentioned... It makes us feel good—about the community, about the whole mission, and also there are other people out there who are not building on Holochain but believe in what we’re doing, the Holochain/RedGrid combination.”

Where you come in

When we’re faced with the stark challenge of climate change, and the apparent inaction of the people who make decisions, it’s easy to fall into discouragement. Many of us are desperate to see change, but feel powerless to actually make the change happen. We would seize any opportunity to get our agency back, to help guide our world towards a happy future.

And we technologists have often found ourselves part of the problem rather than the solution. But projects like this remind me that there is a place for appropriate applications of technology, there is a role for people who can design, program, build, tinker, and tweak. We can “create options for policy-makers”, in the words of Saul Griffith. And when they ignore those options, we can step around them and build tools for everyone — tools for coordination, tools to make the invisible visible. Tools to help us make sense of our world and take action together.


Coinfirm

Jobs: Technical Writer

Coinfirm is a global leader in AML & RegTech for blockchain & cryptocurrencies. Coinfirm is full of professionals with experience in litigation, finance and IT powering the mass adoption of blockchain. Offering the industry’s largest blockchain coverage – over 1,500 cryptocurrencies and protocols supported – Coinfirm’s solutions are used by market leaders including industry heavyweights...

Ocean Protocol

Ocean’s on Moonriver


Ocean Protocol smart contracts and Ocean Market are on the leading EVM Kusama Parachain, to unlock the Web3 Data Economy for the Polkadot ecosystem

As part of Ocean’s multi-chain strategy, we have deployed Ocean’s technology to Moonriver.

Moonriver is the leading EVM-compatible chain in the Kusama / Polkadot ecosystem. This gives it excellent security and cross-chain interoperability that will continue to improve.

Moonriver is a parachain (sub-chain) that leverages Kusama relay chain (parent chain) security. It’s a sister to the upcoming Moonbeam chain, a parachain to Polkadot relay chain. Kusama targets rapid testing and long-tail applications, whereas Polkadot targets slower large-enterprise applications.

Moonriver went live on Aug 26, 2021. Its traction is already impressive: within three weeks, it hit 1 million transactions and 100K wallet addresses. A week and a half later, the transactions had doubled to 2 million. It has more than $200M staked in DeFi applications and another $270M bonded in collator staking, excellent for any network, let alone such a young one.

The Ocean core team and Moonriver / Moonbeam core team (PureStake) have been collaborating since January 2021. In April, the Ocean core team deployed Ocean backend components and Ocean Market frontend to Moonbase Alpha testnet, then shared that work in the Moonbeam #Illuminate21 community event. The teams worked closely to iron out small kinks. The Ocean deployment to Moonriver is the first major technology milestone in this collaboration, and as part of the broader Ocean — Polkadot ecosystem.

Ocean and Polkadot have grown up together. The creators of Ocean (and before that BigchainDB and ascribe) and creators of Kusama / Polkadot have been collaborating since 2014-era Ethereum days through the founding of Parity (née Ethcore) and Web3 Foundation, from security audits to go-to-market strategy, and always with an eye towards technology integration.

In 2017, Ocean founder Trent McConaghy presented “The Web3 Data Economy” vision at the inaugural Web3 Summit conference, hosted by Web3 Foundation [slides][video][blog post]. The Ocean Moonriver deployment makes this vision real in the Web3 Foundation ecosystem.

Moonriver is fully Ethereum-compatible, which allows it to run Ethereum smart contracts and use Ethereum-style ERC20 tokens. It’s a Proof-of-Stake chain with a modern consensus algorithm built on Parity Substrate technology. This allows it to have low gas fees and low latency compared to Ethereum mainnet.

The first dataset published in Moonriver.

With Ocean deployed to Moonriver, this means:

Users of Ocean Market can publish, swap, stake, and consume data assets in Moonriver. Data marketplace builders and other dApp developers can use Ocean libraries (ocean.js, ocean.py) and frontend components (market) with Moonriver. Token holders can move their OCEAN on Ethereum mainnet to OCEAN on Moonriver, and back using the AnySwap bridge.

Ocean components supported in Moonriver include: Ocean smart contracts, Ocean Market, Provider (for data services), Aquarius (metadata cache), and Ocean Subgraph (The Graph queries). For more details, visit our Supported Networks page.

Cross-chain Mental Model

Ocean is now deployed to four production networks: Ethereum mainnet, Polygon (Matic), Binance Smart Chain (BSC), and (new) Moonriver.

An Ocean Market user can click on the network preference in the top right, and toggle which chains they would like to see the assets for. Moonriver is now in this list and selected by default for all new users.

Default selected networks now include Moonriver.

In each of these networks, tokens have real value associated with them, so there need to be token bridges across networks. For Moonriver this means:

OCEAN on Ethereum mainnet is bridged across as OCEAN on Moonriver. For gas fees, MOVR, Moonriver's native token, takes the place of ETH on Ethereum mainnet.

For dApp developers, Moonriver works like the other supported networks: in ocean.js, the Moonriver chainId is passed to the ConfigHelper to get all the endpoints required to initialize Ocean, while in ocean.py the corresponding RPC URL is set as the network config parameter.
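
The exact calls differ between ocean.js and ocean.py, so rather than reproduce library signatures from memory, here is a minimal, hedged sketch in Python (using web3.py) of the underlying configuration idea: pair the Moonriver RPC URL with its chain ID and confirm that the node you are pointed at matches, before handing that configuration to the Ocean libraries. The RPC URL and chain ID below are assumptions based on Moonriver's published defaults; confirm current values on the Supported Networks page.

```python
# Minimal sketch of the network-config idea described above (not the official
# Ocean quickstart). Assumptions: the public RPC endpoint and chain ID below
# are Moonriver's published defaults at the time of writing and may change.
from web3 import Web3

MOONRIVER_RPC = "https://rpc.moonriver.moonbeam.network"  # assumed public RPC URL
MOONRIVER_CHAIN_ID = 1285                                 # assumed Moonriver chain ID

w3 = Web3(Web3.HTTPProvider(MOONRIVER_RPC))

# Verify we are connected to the chain we expect; this is the same
# chainId / RPC-URL pairing that ocean.js (ConfigHelper) and ocean.py
# (network config parameter) rely on.
chain_id = w3.eth.chain_id
if chain_id != MOONRIVER_CHAIN_ID:
    raise RuntimeError(f"Expected chain ID {MOONRIVER_CHAIN_ID}, got {chain_id}")

print(f"Connected to Moonriver (chain ID {chain_id})")
```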

Resources on Ocean * Moonriver

Docs: Ocean-related tokens and services in Moonriver
OCEAN contract in Moonriver blockchain explorer
AnySwap bridge
“There is an increasing recognition that the web2 model is broken, with huge concentrations of data and power in the hands of a very small number of centralized entities. Ocean’s unique application of web3 technology to create open and decentralized data markets is leading the path forward on how to avoid these abuses. We are very happy to partner with Ocean to democratize access to data and help create a fairer and less manipulative online environment for everyone.”

–Derek Yoo, Moonbeam Founder.

“It’s a wonderful feeling to see something manifest that has been years in the making. This is one of those days. The PureStake team has been excellent collaborators to bring Ocean to Moonriver and the broader Kusama / Polkadot ecosystem.”

–Trent McConaghy, Founder at Ocean Protocol

About Ocean Protocol

Ocean Protocol’s mission is to kickstart a Web3 Data Economy that reaches the world, giving power back to data owners and enabling people to capture value from data to better our world.

Visit oceanprotocol.com to find out more.

Twitter | LinkedIn | Blockfolio | Blog | YouTube | Reddit | Telegram | Discord

Ocean’s on Moonriver was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Let's Talk about Digital Identity

Launching the Global Assured Identity Network (GAIN) with Elizabeth Garber – Podcast Episode 52

Let’s talk about digital identity with Elizabeth Garber, Editor of GAIN.

In episode 52, Elizabeth explores the recently announced Global Assured Identity Network (GAIN) initiative. She fills us in on what the GAIN project is, explaining how it’s different from other trust networks and why GAIN is good for financial institutions. She also discusses the role of the Global Legal Entity Identifier Foundation (GLEIF) in the project, and what’s next for GAIN.

“This is really going to unleash creativity and expand access to individuals and communities and sellers all around the world.”

Elizabeth Garber is a customer and product strategist who started her career in telecommunications and honed her craft in six different industries before joining one of the world’s largest retail banks. She is an expert in designing experiences and delivering transformational change based on a deep understanding of people. This interest has underpinned her graduate studies of the psychology of cross functional teams as well as how customers define value in relation to services they use.

In 2015, she was named one of the top 3 marketers under 30 by the UK Marketing Society and was recognised by Energy UK and EY for her work building Trust across the UK energy industry. In 2017 she won the Financial Times/30% club ‘Women in Leadership’ award.

Find Elizabeth on LinkedIn.

Elizabeth recently played a leading role editing the paper published by more than 150 Identity experts – GAIN: How Financial Institutions are taking a leadership role in the Digital Economy by establishing a Global Assured Identity Network. It was announced at the European Identity and Cloud Conference on 13 September by Nat Sakimura, chairman of the OpenID Foundation, and Gottfried Leibbrandt, former CEO of Swift, and then published by, among others, the Institute of International Finance.

To get involved, email digitaltrust@iif.com or join the LinkedIn group.

We’ll be continuing this conversation on Twitter using #LTADI – join us @ubisecure!

 

 

Podcast transcript

Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla.

Oscar Santolalla: Hello, and thanks for joining. Our guest today played a leading role editing a paper published by more than 150 identity experts. The paper is called GAIN: How Financial Institutions are taking a leadership role in the Digital Economy by establishing a Global Assured Identity Network. It was announced at the European Identity and Cloud Conference last 13th of September by Nat Sakimura, who is the Chairman of the OpenID Foundation, and Gottfried Leibbrandt, former CEO of Swift, and then was published by, among others, the Institute of International Finance.

Our guest today is Elizabeth Garber. She is a customer and product strategist who started her career in telecommunications and honed her craft in six different industries before joining one of the world’s largest retail banks. She is an expert in designing experiences and delivering transformational change based on a deep understanding of people. This interest has underpinned her graduate studies of the psychology of cross functional teams, as well as how customers define value in relation to the services they use.

In 2015, she was named one of the top three marketers under 30 by the UK Marketing Society, and was recognised by Energy UK and EY for her work building trust across the UK energy industry. In 2017, she won the Financial Times 30% club Women in Leadership Award.

Hello, Elizabeth.

Elizabeth Garber: Hello, thanks for having me.

Oscar: It’s a pleasure. Welcome to our show. And let’s talk about digital identity. And certainly, we always like to start hearing a little bit more about our guest, especially how was your journey into this world of digital identity. Please tell us a bit about yourself.

Elizabeth: Sure. So my name is Elizabeth Garber. As you said, I’m a customer strategist, a product owner and service innovator. Now, I’m also a digital identity evangelist, I suppose. My passion is for understanding people, what drives their behaviour, how does any kind of service really add value in their lives. Then I help organisations to build services and communicate those benefits to real people. So these skills, I think, are critical in the digital identity space, because the solutions that we design really need to reflect the people who use them, how they move through their worlds, globally.

And not only that, but really great customer experience designers, product owners, they’ll get to know what’s really on people’s minds that might prevent adoption of any one solution. And how different solutions kind of destroy value, maybe in sometimes insidious ways, like a really convenient identity solution could come with some real trade-offs that are invisible to users at first. Maybe their data isn’t secure, maybe it’s sold off, maybe they’re giving it willingly, but they’re blind to some of the applications that will follow.

As someone who hasn’t been involved in identity for all that long, I can really relate to the users who don’t understand the difference between the offers that are currently on the market, and what might be there in future. So I’m here to get underneath how our identity solutions are going to create and potentially undermine benefits for real people. And then I want to promote the adoption of good ethical solutions.

Oscar: Excellent. And what was your exact role in this project in this paper, GAIN?

Elizabeth: In the paper?

Oscar: Mm-hmm.

Elizabeth: Yeah, as it turned out, a few other people thought that I could be useful. So one of the early co-authors Rod Boothby, who chairs the Open Digital Trust Initiative at the Institute of International Finance, reached out to me about his work. I was really floored when he took me through it and I knew that I wanted to get involved. It was really exciting. So, after I read through some of the documentation, I sent him some thoughts about the value to users, to banks, to relying parties.

And then within a few days, I was facilitating a virtual whiteboard session with a lot of the other co-authors all over the world, trying to help them kind of coalesce around who the paper was for, who is the audience, what are the messages that are going to resonate with that audience and then the structure of the paper. At that point, I wondered if someone might realise that I didn’t have a huge background in identity, maybe kick me out, but they are an inclusive group of people, and they really valued the approach. So ultimately, I played a pretty big role in pulling the paper that we published together.

I should say who else was involved. It was really an experiment in radical democracy as Don Thibeau of the OpenID Foundation likes to say. We brought more than 150 people together, experts in the space. I might mention some of their names in this interview, but it was a pro bono, no logos collaboration. But we did have the support of some major organisations in the space. The ones that participated and are now publishing and promoting the paper include the OpenID Foundation, the Institute of International Finance, the Open Identity Exchange, the Global Legal Entity Identifier Foundation, and the Cloud Signature Consortium.

Oscar: Excellent. Tell us a bit more about the…

Elizabeth: The paper itself?

Oscar: Yes, please.

Elizabeth: OK. Yeah, sure. So the paper invites financial institutions. It’s a call to action for financial institutions to join us in solving the biggest problem on the internet – a lack of trust, specifically a lack of trust due to the absence of verified identities. How do you know the person you’re dealing with is real? How do you know that they are who they claim to be? How do you know that your money will end up in the right hands? Fraud and cybercrime increases every year. Some estimates say it’s 5% of GDP, trillions of dollars. Criminals are thriving in anonymous digital spaces.

And at the same time, you and I and our friends are pervasively tracked. We enter our details into countless sites, service providers follow us around, they trade our information. Even our biometrics are seeping around into more and more places, our faces, our fingertips, more and more people or parties have access to really private, really personal information that could be used in any number of ways – steal our identities, for example. And that needs to stop.

The paper argues for the internet’s missing trust layer. That’s a phrase coined by one of the co-authors, Kim Cameron, who led identity at Microsoft for many years, and he wrote the Laws of Identity. Then this concept was really beautifully explained by Nat Sakimura, Chairman of the OpenID Foundation at the European Identity and Cloud Conference in September. In this new paradigm, highly trustworthy identity information is passed from a regulated or otherwise highly trusted institution to an organisation that needs it, with end user knowledge and consent every time.

So, if I’m buying another bottle of Malbec, I shouldn’t have to prove my age by uploading a driver’s license to another site or sharing a photo of my face. You don’t need my name, my face, my address. No one else needs a copy of my credentials or biometric information. You just need to know that I am the person I say I am and that I’m old enough. So my online wine retailer can send a message to my trusted identity information provider. I would choose my bank and they will use their app to authenticate me and say, “Hey, the International House of Malbec wants to know if you’re over 21, should I tell them?” And of course, it’s a yes, because I need to celebrate recording this podcast.

Oscar: Tell us a bit the solution itself because there are some solutions, which one way or another address the problem you described?

Elizabeth: Yeah, so none of this is really new. People have called for this for years. One of my co-authors actually from INNOPAY, Douwe Lycklama, shared a YouTube video arguing for the same thing, and it was dated in 2007. Of course, it does exist in some places already, Norway, Sweden, Belgium, Canada, Finland, in one form or another lots of jurisdictions around the world have a solution like this. And as another GAIN co-author Dave Birch, who also wrote Identity is the New Money, pointed out in Forbes last week – We’re starting to see evidence that these networks really facilitated the rollout of aid during the pandemic, and mitigated the risk of fraud.

In particular, another co-author from Vipps in Norway helped us to compare some early information coming out against the US, UK data. So no, none of its new but we do argue in the paper that now is the time to think about global interoperability of these networks.

Oscar: Yeah, and I think as the model of doing the solving these problems goes towards how the countries you have mentioned, most of them have created a system based on banks, others combined with all of the mobile operators, right, but those are like the main types. And the states, of course, the states, some of the states has also provided that. So these are the three types of identity provider, the one who provision these systems, this identification service, but obviously, it’s not – it’s not commercial big tech, no, that’s out of these group of entities who provide these systems.

And of course, you mentioned different countries. We talk about few countries that have sufficient solutions. But I think the next question would come, how to make it available for the rest of the world, how we ensure some type of global interoperability?

Elizabeth: Absolutely, absolutely. So we think global interoperability is really important for a lot of different reasons. Three in particular spring to my mind. The number one, end users want to live globally, maybe more importantly, most online companies are global or need to be global. They have suppliers and customers across borders. So the benefits increase dramatically when a trust network is global, rather than local – fewer contracts, fewer integrations, and the benefits extend throughout their supply chains all the way to the end users who benefit from simpler, more convenient services all across the internet.

Because my second point is, it’s an extension of that first one that those benefits actually extend to global society. A global network allows a specialist artisan in one part of the world to reach a global audience without relying on an intermediary. So Oscar, let’s say you have something to celebrate, and you’ve decided to buy yourself a handmade quilt from India, a Kantha. You will be able to find sellers that you can trust and they’ll be able to sell to you, charge you the price that someone in Finland expects to pay, and they will keep a greater percentage of it for themselves, potentially. So this is really going to unleash creativity and expand access to individuals and communities and sellers all around the world.

Finally, there’s a practical benefit from promoting this vision of interoperability. And I believe it’s the key factor that will move the needle towards actually setting up this trust layer in places where it does not exist today. Major relying parties, and I’m talking now about big companies that operate worldwide with the heft to influence a global movement, they buy into this vision only when it has a global reach. They don’t want to integrate with a different provider in every jurisdiction on the planet. So with that in mind, financial service institutions around the world are far more likely to collaborate and catalyse this movement, if they see that the vision is expansive enough to meet the demand for global reach.

Oscar: Okay, so definitely interoperability is something that we aim – not only the ones who are like me, I’m in Finland would like to have the same for the rest, but the ones who don’t have a sufficient identification system like those. But then something caught my attention is in the white paper. It says GAIN Digital Trust: How financial institutions are taking a leadership, et cetera, et cetera. So that means that the 100 persons who have been involved in this are targeting financial institutions. So why is targeting specifically financial institutions?

Elizabeth: Great question. So the main body of the paper is targeting, it’s directed at the world’s large financial institutions. It’s a call to action for them to catalyse the creation of a globally interoperable network. To be really, really clear with everybody though, we don’t think the paper or the global assured identity network is only for banks or financial institutions. The network itself must be inclusive, and in some parts of the world energy companies, telecommunications providers etc., they may be better placed than banks to be trusted identity information providers.

However, we argue that financial institutions are really well positioned to spark this change, to bring it to life, and to benefit from it. And so that’s what the paper really gets into. It runs through the reasons why financial institutions are positioned to do it – a point we repeat a few times – because we’ve seen them do it before. They built the rails for global payments, for cards, for securities, etc. They built Swift, Mastercard, Visa.

There are three main strengths that they have that makes them the perfect catalyst. Number one is trust. It’s a bank’s core offer. I know some people might laugh at that, because you don’t always trust a bank to give you the best rates, or brilliant customer service every time, but you do trust them. You trust them to keep your money and your data safe. That’s why they exist. Some companies monetise your data. Banks, financial institutions are in the business of monetising your security and privacy. And they have been since the very first bank – the Medici bank in 1397. Banks are trusted.

Number two, the second point, is a build off the first. They are regulated. A lot of that trust is underpinned by regulation. They need to be worthy of our trust and meet certain standards in a way that other businesses do not – and there are governance frameworks to build upon as these new services are created.

The third point that makes them a great catalyst, the third strength, is how well they know their customers. Because of those first two things, banks invest significant amounts of money in making sure that they know their customers. You might hear me say Know Your Customer, or KYC, is the process – and it’s regulated – that they go through to validate your identity when you open an account. And they also build the technical infrastructure to know that it’s you each and every time someone tries to get into your account. They have built best-in-class tools to identify and authenticate you, while keeping your account secure and your information private. And all that’s why banks are best placed to vouch for your identity and to use their authentication methods to confirm on behalf of others that it’s really you.

Oscar: What would make possible that there is such interoperability in more practical, more technical terms?

Elizabeth: Good question. So we’re going to try to apply interoperability standards that many different types of systems can plug into. There will be direct participants, banks or other identity information providers who use the network to verify information for the companies who need to consume it. And, of course, there will need to be intermediaries and translation layers to ensure that a BankID Sweden customer can verify their identity with a seller in Mexico, for example. That BankID customer will continue to verify using BankID. But we will have created a way for BankID to communicate to relying parties all over the world, including Mexico.

We can also create servers that tap into self-sovereign identity networks. It’s important to know that we’re designing this system so that identity information passes from the provider, the identity information provider, directly to the relying party without passing through any centralised GAIN entity. That’s really important, at least as we understand it. That’s really important when we start to talk about interoperability from a legal standpoint.

Technically speaking, though, these direct connections are enabled by common API specifications. They’re based on OpenID standards, such as FAPI. With all that said, this field is evolving and technical proof-of-concepts are underway. So I definitely need to reserve the right to build upon my answer later on.

Oscar: Yeah, definitely. It’s very convincing your point, your three points. And I can feel that, absolutely. And the banks have already proved that in some of the countries that we have mentioned that they are leaders in this identification.

Elizabeth: Absolutely. Yep.

Oscar: And now my guess that among the 150 authors, there are some people who are directly involved in banks.

Elizabeth: Mm-hmm.

Oscar: OK, excellent.

Elizabeth: Yes. Like I said, it was a no logo collaboration so I won’t be dropping any specific names.

Oscar: Yes.

Elizabeth: But we did have the partnership of the Institute of International Finance, who is pulling together a proof of concept with the Open Digital Trust Initiative. And we do have banks involved in that.

Oscar: So why it is really good for banks from the bank’s point of view, why it’s good to join GAIN?

Elizabeth: Yeah, so importantly, it’s not just that they can do this, it’s also that they will benefit from doing it. The first benefit that’s easiest to explain is simple, it’s revenue. Identity verification services will have as they do today, a small price attached. Transactions will result in a small amount of money flowing into the ecosystem. And a percentage will go to the identity information provider, in this case, the bank. That turns the assets that I was just talking about – what they’ve spent to build up KYC, authentication, security, all those infrastructures – that turns them from a major cost centre, into a profit centre.

Second, there will be massive efficiencies for banks, password resets, document signing, mortgage processing, etc. There will be more efficiency inside a bank. They will also see less fraud. In Norway, BankID saw fraud reduced from something like 1% to 0.00042% of transactions.

But the biggest benefits are strategic. New competitors are coming in between banks and their customers, are diminishing the role that banks play in providing access to capital markets. Providing identity information services to their customers, under their brand name, cements a really critical role for banks. They will provide their customers with access to the digital economy. And they will keep their customers safer, more secure in their privacy than they are today. And that’s a real value add for their customers. So strategically, banks really must do this, or they risk getting cut out of a lot of transactions.

Oscar: Indeed, banks have to hear it. When I started reading more about this paper, and you have mentioned at the beginning also, that the Global LEI Foundation is also involved, it’s one of the main supporters of this initiative. So tell me how that relates to that. They are not the banks. Of course, they have some business with the banks. But please tell us what is the connection?

Elizabeth: Yeah, I listened to your podcast that you did with GLEIF, Global Legal Entity Identifier Foundation, while I was preparing to talk to you. And yeah, I think the connections are really strong. So they were a critical partner in pulling the paper together. They’re doing incredibly important work, and will absolutely be part of the next steps as we figure out how to realise a global assured identity network.

There are three types of identity questions that we all really have as we transact online. Who is this person? Can I trust them? Who is this company? Are they trustworthy? And then connecting the two. Is this person related to a company? What’s their role? Are they entitled to be doing this thing, signing this document logging into my account on the company’s behalf? If we start getting key information about the individual, that’s great, that adds a ton of value. GLEIF is answering that second question, who is the company? And once we can get to the third point of connecting the dots, that’s going to be really powerful. So yeah, GLEIF is a critical partner and will remain so throughout the rest of the journey.

Oscar: Yeah, yeah, now the way you have explained also is pretty good. And we say that verify identity of a person, of a company or organisation and how to link these two in this. And that’s pretty critical. That’s something that has not been explored enough, I would say. So that’s, I’m really intrigued to hear how GAIN is addressing doubts about this because it’s essential these days.

So, Elizabeth, I’d like to hear more what comes next. So just a couple of weeks ago was the launch of the paper and I also saw some interest on the media. That’s excellent to hear. But what come next now for GAIN?

Elizabeth: Well, we already have planning underway for GAIN technical proof of concept. And we’re looking for more people to get involved, more companies to participate. So we have big companies who need to de-duplicate their customers, companies that need to collect verified signatures, the number of use cases is seemingly endless. We’re also looking for partners, aggregators, or service providers who can help us – help bring these relying parties on board or technical service providers who can help us to envision and build services on top.

There are so many ways to be involved. So if you’re thinking about getting involved, it might mean that you’re interested in identity. So you’re listening to this podcast after all, that probably means that you’re only one or two steps away from one of the co-authors, so it’d be really easy for you to reach out to them. You can also reach out to me or the Global Assured Identity Network LinkedIn group that we have. And if you’re really interested in the POC, we’ve got an email address for you, it would be digitaltrust@IIF.com. So even if none of that is true for you, there’s something you can do. You could call up your bank and tell them to offer this service.

Oscar: Exactly. Even as a user, you can ask your customer for the bank, you can ask the bank, have you heard of GAIN? Yeah, absolutely. And I guess – assume that you already have a big list of the most potential use cases right that some of these customers, companies, service provider that you are now inviting. They could fit into this use cases. Excellent. Elizabeth, finally, tell us – for all business leaders that are listening to us in this interview. So what would you say is the one actionable idea that they should write on their agendas today?

Elizabeth: So these ideas that we’ve been talking about are going to mark a step change in the digital economy. A third wave in identity as my colleague Rod Boothby pointed out on LinkedIn last week. First, it was all about companies providing IDs and passwords to employees to access work systems, and businesses gave customers IDs and passwords to access services on the internet. This third wave is all about us bringing our own trusted digital identity wherever we go. With a really inclusive approach and active global collaboration, this will open up our digital economy. It’s going to expand access, and make life so much simpler and safer online.

So for those who are listening, I would urge business leaders to figure out what role your business can and will play in a globally interoperable assured identity network. Are you an identity information provider? Can you be? Will you be a relying party who consumes these services so that you can verify that the signature on a contract is valid? Can you help us onboard relying parties or integrate these services?

On your agenda, I’d say write down, figure it out, figure out how you’re going to get involved and then get in touch. Again, join our Global Assured Identity Network on LinkedIn, or email digitaltrust@IIF.com if you know that you want to join the POC.

Oscar: Yeah, indeed. This might still be relatively new concept for companies who are not so exposed to this type of identity services. So yeah, it’s a good idea, as you said, to decide, think about what is going to be that new role of your company from the many choices you have mentioned.

Thanks a lot, Elizabeth. It was super interesting to hear about this extraordinary effort that had been done by GAIN project, very recently launched. And as you said, now, the Proof-of-Concept are starting to keep going. Please let us know how people could get in touch with you if they like to follow this conversation with you.

Elizabeth: Yeah, so you can find me on LinkedIn really easily. My name is Elizabeth Garber. Yeah, I think that’s the easiest way.

Oscar: OK. LinkedIn is the easiest way. Thanks a lot Elizabeth. It was a pleasure talking with you and all the best.

Elizabeth: Thank you. Thanks for having me.

Thanks for listening to this episode of Let’s Talk About Digital Identity produced by Ubisecure. Stay up to date with episodes at ubisecure.com/podcast or join us on Twitter @ubisecure and use the #LTADI. Until next time.

[End of transcript]


PingTalk

These Trailblazers are Pushing the Identity Security Industry Forward

The Identity Excellence Awards are always one of the highlights of my year. Reviewing all the nominations and seeing all the ways our customers continue to innovate in the field is both exciting and inspiring.

But when it comes to actually choosing a winner? That’s not so easy. Each and every nomination reflects truly impactful work. That’s why it takes hours of review and deliberation amongst our judging committee to whittle down the field into the eight winners we decided to award this year.

This year we celebrated the award recipients at our annual user conference, PingYOUniverse. There, I was happy to again roll out the digital red carpet for all these deserving companies alongside our very own Chief Identity Champion. And now, it’s my privilege to announce the 2021 Identity Excellence Awards winners as follows.

Tuesday, 05. October 2021

Indicio

Indicio CTO Named 2021 Enterprise Blockchain Award Finalist

Ken Ebert nominated in Blockchain Leadership Award category for vision and leadership in developing interoperable blockchain-based Trusted Data Ecosystems

SEATTLE — October 05, 2021 — Indicio, the world’s leading provider of decentralized identity and hosting solutions, today announced that company Chief Technology Officer Ken Ebert has been named as a finalist in the Blockchain Leadership Award category in the Enterprise Blockchain Awards (EBAs) presented by the Blockchain Research Institute. Ebert was recognized for his long-standing vision for open-source identity technology anchored on a blockchain network to solve some of our most pressing online problems—verification, privacy, security, and user experience.

Ebert was one of the co-founders of Indicio.tech, and he has been pivotal in nurturing it from its launch at the start of the COVID pandemic to its position today as the global leader in developing decentralized identity infrastructure and software for building Trusted Data Ecosystems. He built a virtual team of highly-skilled developers that is diverse in gender and nationality, and he developed an intern and mentoring program to create the needed skills in new recruits as the company took on more and more customers’ projects.

With Ebert’s team-focused leadership, Indicio was able to successfully create and deploy a complete decentralized identity-based data ecosystem, weaving together health agencies, global technology suppliers in the travel industry, and national governments. The company’s continuing success reflects Ebert’s extraordinary ability to bring parties together and drive ideas and projects forward into workable technology that solves real-world problems. And, as a passionate advocate for interoperability in the open-source identity communities, Ebert isn’t just driving Indicio’s success, he’s helping to drive the adoption and evolution of blockchain technologies that promise transformative change and opportunity for everyone.

“Indicio’s success and its position as a global leader in the development of Trusted Data Ecosystems is a testimony to Ken’s seemingly boundless skill and wisdom. It’s not just his ability to analyze and clarify problems and guide our own team, it’s his tireless capacity to do so for all the companies who support the Indicio Network as well as our customers,” said Heather Dahl, CEO, Indicio. “That’s what makes him an outstanding leader. Part of that comes from character; part from experience; but a part of it is also a philosophy of ‘build now, build more, build better.’  When you think about the problems we’re trying  to solve, they’re huge. Ken is always the one to remind us—and our collaborators and customers—that we build solutions step by step. Let’s solve this bit first. And when you do this, and do it relentlessly, you’ll feel like you’re running and you can achieve anything.”

“I want to thank BRI for its dedication to helping enterprise leaders understand and implement blockchain within their industries. This is a technology that has  enormous potential to rewrite the way we do so many things in business and society,” said Ebert. “It’s exciting to be the CTO of a company that is driving that transformation through the Indicio Network—a distributed blockchain for identity—and through designing solutions that give people the ability to prove identity and share data in a way that preserves their privacy, in  a way that improves their security, and, above all, in a way that can be trusted. All this would be impossible without blockchain. I’m honored to be a finalist for this award;  but I’m more excited by what we—Indicio, our clients, the identity community—can do for the world with this technology. That’s the excitement that greets me every day—and I want to share it with everyone.”

The Blockchain Leadership Awards honor people who have shown exceptional leadership in a blockchain collaboration or implementation within an enterprise, an industry, a government, or a multi-stakeholder organization. The nominees are those that have gone above and beyond the job description to spearhead blockchain initiatives within an enterprise or ecosystem.

###

The post Indicio CTO Named 2021 Enterprise Blockchain Award Finalist appeared first on Indicio Tech.


Node Operator Spotlight: IdRamp

A distributed ledger is a database that has copies distributed across a network of servers (nodes), all of which are updated simultaneously. A network like this is the foundation of decentralized identity, a way of generating robust trust and collaboration free of the security risks of centralized databases. We call the companies and organizations that support an Indicio Network node on a server that is under their control “Node Operators.” 

Recently we caught up with Karl Kneis, COO of IdRamp, and Eric Vinton, Chief Business Officer of IdRamp, one of the first companies to become an Indicio Node Operator, to discuss their current projects, some goals for the future, and where they think decentralized identity is heading.

Tell us about IdRamp: how did it start, where did it start, and who makes up your team?

IdRamp was born from years of frontline experience in enterprise identity management and service delivery. With IdRamp we wanted to reduce the pain and vulnerabilities that surround digital identity passwords, platform migration, operation, and service delivery.

The cost and resource requirements of managing and replacing identity platforms can be astronomical. Operation requires special skills and complex customization. Migrations can take years to complete and often fail. Service delivery can be slow and require premium resources. Our experience found that adopting decentralized, Zero-Trust identity principles will reduce cost while increasing security and accelerating the speed of service delivery.

We founded IdRamp to help remove passwords, automate expensive tasks, reduce the need for advanced skills, and simplify the adoption of new solutions, all while improving overall security through decentralized Zero Trust. Instead of reinventing identity management platforms every few years with mammoth projects, organizations can use IdRamp to enjoy continuous adoption of new services and solutions at the speed of business.

Decentralized verifiable credentials can easily be adapted to any service or system for advanced Zero-Trust protection and password elimination. No coding or long term platform projects are required. People appreciate the improved privacy and simplified experience of passwordless ecosystems. Security authorities appreciate the reduced data liability and the stronger protection of Zero Trust credentials.

Our team’s deep experience working through generations of multinational digital identity projects gives IdRamp a unique perspective. We excel at solving complex problems with simple effective solutions that improve the bottom line.

What are some of the products/services (Self Sovereign Identity or not) that you currently offer? Who are your target customers? What sets you apart from the competition?

Our premier product is the IdRamp platform. It caters to public sector, enterprise, and SMB customers across all industries. It provides service orchestration with zero-trust decentralized identity and password elimination.

While IdRamp is a zero-code solution, we also provide robust APIs that can be used to extend capabilities into any custom application or ecosystem experience. The APIs offer a limitless palette of design opportunities for application development.

We also provide a free digital identity wallet to securely share personal information, such as education certifications, health data, or employment credentials. The wallet provides multi-wallet stewardship capabilities that allow people to manage credentials for other people or things. This feature can be used to manage family credentials or eldercare use cases, for example.

IdRamp is built on open standards for interoperability. It operates automatically across any standards-based digital identity network. While the IdRamp wallet offers robust capabilities, any standards based identity wallet can be used with the IdRamp suite of tools.

Recently, we co-developed a series of groundbreaking IdRamp-based apps with security software provider Bak2.life. These apps include:

Bouncer Zoom Event Attendee Security — extends Zoom meeting security with email 2FA or verifiable credentials for all participants.

Return to Life — provides a simple way for organizations to offer safe access to events and facilities based on verifiable health credentials, digital ticketing or custom credentials tailored to business needs.

Webcast Security Portal — provides end-to-end protection and access control for multiple webcast providers, including Zero-Trust, passwordless verifiable credentials.

What motivated your work in decentralized identity? Why did you become a node operator? 

Decentralized identity reduces data liability, increases privacy, improves security, and improves the human experience. It is a natural complement to our suite of Zero-Trust passwordless solutions. Decentralized design has always been core to the IdRamp strategy. Adopting new standards in decentralized identity helps our customers achieve the best possible protection across their ecosystems.

The problems and challenges of enterprise security have been getting worse and worse over the past decade—Zero Trust identity provides much needed relief. However, the next iteration of Zero Trust will require a decentralized network to remove the need for centralized databases that carry inherent risks and increased costs. Being a Node Operator helps IdRamp provide a more comprehensive Zero Trust service to our customers.

Where do you see the future of Self Sovereign Identity/Decentralized Identity?

The need for secure identity is a high priority because the cost of a mistake with personal data can be very high. Terms like “SSI” and “decentralized” will eventually fade into globally accepted standard terms for digital identity. As decentralized identity becomes the preferred security standard, new threats and attacks will be developed and new Zero-Trust solutions will be required. With IdRamp, organizations can stay ahead of the rapidly changing digital identity security landscape and avoid expensive technical detours that slow business and leak revenue.

For more information about the IdRamp platform or any of their other products, go to IdRamp.com.

The post Node Operator Spotlight: IdRamp appeared first on Indicio Tech.


Tokeny Solutions

Enegra’s tokenized equity improves the liquidity of its $28Bn balance sheet via T-REX Platform

Enegra sought to provide liquidity and great UX for its investors while remaining compliant on the public blockchain. Enegra turned to Tokeny’s innovative digital asset compliance infrastructure to issue, manage and transfer its equity on the blockchain. Tokeny’s solution allows Enegra to achieve a seamless outcome without the worry of technical barriers. The highlighted results are:

Improved liquidity

Global and automated compliance

Zero transaction fees

“We tokenized our equity to improve liquidity. And, now that the technology is available for faster, cheaper, and compliant transactions on the blockchain, we wanted our investors to take advantage of it. Polygon and Tokeny provided the complete infrastructure we needed to do so, and we are extremely pleased with the results.”

–Matthew Averay, Managing Director & CEO at Enegra

Introduction

Enegra was established in Malaysia in 2011 with a mission to enable mid-tier commodity miners in emerging markets to compete globally. Enegra achieves this through its world-class trading expertise, risk management, logistics, and governance – thereby delivering economic security to mid-tier mining communities whose lack of trading expertise often leaves them exploited on price.

Why Enegra tokenized its equity

In early 2019, Enegra’s management team explored asset tokenization. The team was fascinated by the concept and the advantages it brings to the company as an issuer, along with the liquidity benefits to its investors. The following benefits match exactly what Enegra sought to achieve for its new equity offering:

Improved liquidity

Automated compliance

Real-time cap table

Low administration and asset transfer costs

Control securities

Directly connect with investors

Compliantly tokenizing assets since 2019

Later in 2019, Enegra embarked on a search for a suitable tokenization platform to issue and manage its equity-backed security tokens on a public blockchain in a compliant manner, along with the necessary controls in place. Tokeny’s T-REX Platform caught Enegra’s attention for two main reasons:

First, it leverages the public blockchain while bringing compliance and control. It achieves this through a unique onchain compliance approach that whitelists the digital identity rather than the wallet for maximum security; for example, when investors lose their wallets, issuers can recover the tokens for them.

Second, it provides the most advanced platform (120+ functions) with the best user experience in the market.
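
To make the “whitelist the identity, not the wallet” idea concrete, here is a simplified, purely illustrative sketch in Python. It is not Tokeny’s T-REX implementation and every name in it is hypothetical; it only shows why binding eligibility to a verified identity rather than to a wallet address lets an issuer replace a lost wallet without changing who is allowed to hold the token.

```python
# Conceptual sketch of identity-based (rather than wallet-based) whitelisting.
# NOT Tokeny's T-REX implementation; all names and structures are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Identity:
    identity_id: str                               # verified investor identity (e.g. post-KYC)
    wallets: set = field(default_factory=set)      # wallet addresses linked to this identity


class IdentityRegistry:
    """Tracks which identities may hold the token and which wallets belong to them."""

    def __init__(self):
        self._by_wallet = {}     # wallet address -> Identity
        self._whitelist = set()  # identity_ids eligible to hold the token

    def register(self, identity, eligible=True):
        for wallet in identity.wallets:
            self._by_wallet[wallet] = identity
        if eligible:
            self._whitelist.add(identity.identity_id)

    def can_receive(self, wallet):
        # Transfers are checked against the identity behind the wallet, not the wallet itself.
        identity = self._by_wallet.get(wallet)
        return identity is not None and identity.identity_id in self._whitelist

    def replace_wallet(self, identity, lost, new):
        # Recovery path: the identity stays whitelisted; only its key changes.
        identity.wallets.discard(lost)
        identity.wallets.add(new)
        self._by_wallet.pop(lost, None)
        self._by_wallet[new] = identity


# Usage sketch
registry = IdentityRegistry()
alice = Identity("investor-001", {"0xAAA"})
registry.register(alice)

assert registry.can_receive("0xAAA")        # wallet linked to a whitelisted identity
registry.replace_wallet(alice, "0xAAA", "0xBBB")
assert registry.can_receive("0xBBB")        # recovered: new wallet, same identity
assert not registry.can_receive("0xAAA")    # old wallet no longer resolves to an identity
```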

In September 2019, Enegra issued its equity-backed EGX security tokens via Tokeny’s T-REX Platform on Ethereum. From then on, Enegra has been managing its security tokens and investors through the same platform.

Improving the UX further: say goodbye to gas fees

Earlier this year, Tokeny enhanced its blockchain layer to start supporting Polygon, a scalable layer 2 solution, to provide a faster, cheaper and compliant infrastructure for its customers. In addition, the Gas Tank solution was introduced to make transactions free of fees for platform users, eliminating infrastructure costs and troublesome processes.

To enhance the user experience of its security token holders, Enegra upgraded its services with Tokeny last month:

Migrated security tokens to Polygon

Implemented the Gas Tank solution

Enegra and its security token holders can now take advantage of fast, free and compliant transactions which ultimately improves liquidity and accessibility of the assets. 

Ready to tokenize your assets?

Digitally issue, manage and transfer security tokens compliantly in a few weeks.

Contact Us

The post Enegra’s tokenized equity improves the liquidity of its $28Bn balance sheet via T-REX Platform appeared first on Tokeny Solutions.


Global ID

GiD Report#180 — The Facebook problem

Welcome to The GiD Report, a weekly newsletter that covers GlobaliD team and partner news, market perspectives, and industry analysis. You can check out last week’s report here.

ICYMI: How GlobaliD enhances trust for XRPL Labs’ XUMM Wallet

EPISODE 12 — The future of GlobaliD Groups

This week:

The Facebook problem
Facebook shelves kids’ initiative
Ripple gets serious about NFTs
Stuff happens

1. The Facebook problem

More like Facebook First. Photo: Maurizio Pesce

It hasn’t been the greatest month for Facebook. The WSJ published a series of scathing exposés highlighting the kinds of choices the company’s leadership regularly makes when faced with the option to expand profits versus “do the right thing.” (Which then doesn’t match their public proclamations of “doing better.”)

That in and of itself isn’t going to be much of a surprise to most people. The Facebook brand doesn’t have a ton of credibility these days. But seeing the nitty gritty details is still jarring — especially when we’re talking about children’s mental health.

We now know who we owe these nitty gritty details to, as well — a company whistleblower known as “Sean.” Her actual name is Frances Haugen, according to a New York Times report.

In any case, Matt Levine has a straightforward breakdown of the situation:

Okay fine. Facebook is a for-profit company with obligations to its shareholders, and at various points it has a choice between doing pro-social things that are expensive or doing anti-social things that maximize its profits. Sometimes (she asserts, and I don’t particularly disagree) it chooses profit over public good. This is not a huge surprise, incentives being what they are, but it is good for policymakers and the public to know about the particular cases. It is good because, if we know about these choices, we can make informed decisions about regulation and enforcement. If Facebook sometimes chose to break the law to maximize profit, then regulators should punish it. If everything that Facebook does to maximize profits is legal, but some of it is horrible, then we should write new laws to make the horrible stuff illegal. If Facebook does things that are bad for the world then policymakers should try to make it stop. Basic stuff.

That’s the inherent issue with a top-down platform that’s incentivized to maximize engagement and sharing in order to leverage its users’ data by serving them ads. Sometimes you need regulators to come in to whip them into shape.

Here comes the shameless plug. With Facebook, Messenger, WhatsApp, and Instagram all down today, why not try a bottom-up messenger that preserves your digital privacy and that is end-to-end encrypted by default? (Shameless plug idea courtesy of GlobaliD Developer Support Engineer Paul Keen.)

2. Facebook shelves Instagram Kids initiative

But sometimes, all you need is a bright enough light. After the WSJ published Facebook’s internal documents about Instagram and teens, the company has dutifully shelved the project.

Here’s Axios:

Instagram announced Monday that it is pausing its plans to develop a version of its platform for children under 13.
Why it matters: Facebook has received backlash since the Wall Street Journal published a report that showed the company kne