Last Update 1:01 PM July 26, 2024 (UTC)

Company Feeds | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Friday, 26. July 2024

Elliptic

Navigating the APAC stablecoin regulatory landscape with Ecosystem Monitoring

Over the past year, the Asia-Pacific region has seen important developments taking place around the regulation of stablecoins. From Singapore to Hong Kong to the Philippines and beyond, regulators have been crafting rules that will require stablecoin issuers to meet extensive and rigorous standards designed to ensure financial stability, protect consumers, and mitigate risks related to financial crime.


uqudo

Web SDK 3.5.0 Updates

The post Web SDK 3.5.0 Updates appeared first on uqudo.



Verida

True Web3 Ownership Starts With Verifiable Credentials

Verida’s credentials library helps developers future proof their applications and master digital identity True Web3 Ownership Starts With Verifiable Credentials

The world is moving online at a rapid pace, and with this transition comes the need for new processes and standards to identify ourselves. Verida harnesses Web3 infrastructure to empower people with verifiable, self-sovereign identities, removing the reliance on centralized organizations.

To understand how Verida achieves this, we first need to know what credentials are and how they’re managed.

Society and credentials

From ancient civilizations to the digital worlds of Web2 and Web3, it’s impossible to navigate the world without trustworthy credentials. Generally speaking, you need physical documents such as diplomas, certificates, and identity cards to access products and services in modern society. Venture back a little further and people used stone tablets, wax seals, and papyrus scrolls to prove identities and convey authority.

In order for a reliable credential system to work, there are multiple elements that need to be in place.

Issuers are the recognised authorities that can provide such documents, be it a government authority in the case of an ID card, or an educational institution for degrees and diplomas. Not only are these recognised by law, their concepts are well-known and understood in all societies around the world.

Holders are the people, businesses, organizations, and institutions, who receive and use credentials to access products and services or to prove specific facts and achievements. Holders typically have little say over their choice of credential method and how the underlying system works.

Verification occurs when a credential is checked. For most of human history, this has been a manual process, one that’s only as reliable as the person or system doing the checking.


As computers came on the scene, and the internet gained adoption, a digital transformation began to occur. Trust began to be built in the online world. Issuers moved online, bringing a new world of digital certificates, signatures, and identification.

From the stone tablets of ancient civilization to single sign-on with Google, one factor has remained fairly constant: the drive for faster and easier ways to issue and verify credentials. That drive brings with it a growing number of tradeoffs in security and privacy. Falsification has gotten easier, while centralized platforms continue to become targets for corruption, data breaches, and internal misuse.

Enter Web3 and verifiable credentials

As the internet moves from Web2 to Web3, so too does it enter the “Own” era of “Read-Write-Own”. In order to allow the Web3 ecosystem to truly flourish, self-sovereign ownership of money and digital assets must be accompanied by self-sovereign ownership of personal credentials.

To tie down decentralized blockchains to physical documents would be counterproductive to their progress, while continuing to rely exclusively upon centralized systems defeats the purpose of self-sovereignty.

When implemented correctly, Web3 can be complemented by verifiable credentials: A suite of decentralized solutions to help us navigate the increasingly blurred lines of the physical and the virtual worlds, in a safe and secure manner.

Key use cases for Web3 verifiable credentials

Verifiable credentials make it possible to generate undeniable proof that you, the person signing a document, requesting a service, or purchasing a product, have at some point in time proven your identity or your ownership of a particular asset, and that this proof is still valid.

In practice, the act of providing this proof without granting access to sensitive information should be more than enough to allow for secure digital interactions to take place.

VC triangle of Trust by Daniel H Hardman, licensed under CC BY-SA 4.0

There are various use cases for verifiable credentials; the following are complementary to an empowered self-sovereign Web3 user experience.

Reusable KYC and KYB for crypto projects

How many companies have you entrusted with your sensitive personal data? Doesn’t it leave a weird feeling in your stomach once you start counting them all up?

Most of these companies don’t need to know your date of birth, your personal ID number, or your home address. But they are required to follow protocol and ensure they’re not dealing with people linked to illicit activity. There’s a lot that can be improved here simply by abstracting the sensitive data away from the fact that someone’s credentials are legitimate.

As regulations weigh down on the crypto industry, a single, reusable zero-knowledge Know Your Customer (KYC) or Know Your Business (KYB) proof would enable traders to hop between exchanges at ease, streamlining their user experiences and downscaling the risk of sensitive data being mishandled.

Proof of real world assets

How many people have unlawfully lost their properties and possessions during times of war? And what assurances did they have of regaining their land, or the value of their assets, once new borders were drawn up? Under a centralized structure, the new power in control can easily neglect or deny the contents of previous property records.

An open, decentralized blockchain-based ledger could fix this as it would provide an undeniable, timestamped proof of ownership. It’s the perfect platform to store and access proofs of verifiable credentials.

The use case of proving ownership of real world assets is of course not limited to times of war. Many institutions are beginning to navigate the world of asset tokenization, and verifiable credentials provide the ideal digital structure to safeguard their integrity.

Proof of personal data and reputation

Misinformation and impersonation are age-old human problems. More recently, these issues have been amplified by the rise of social media platforms, with far-reaching impact on political scenes around the world. Add to this the growing threat of AI-generated deepfake content and the line between what’s real and what’s fake becomes impossible to discern.

Human beings linking their accounts to verifiable credentials is potentially the only way to move forward in a digital world increasingly shared with bots, scammers and manipulative organizations.

How Verida empowers builders with verifiable credentials

Think of Verida as a base layer upon which a variety of other networks, blockchains and applications can tap into for solutions linked to identity and verification.

The advantage of this modular flexibility is that users can navigate both Web2 and Web3 applications and services while receiving, storing and even sharing credentials all in one place: their Verida Wallet. Just as the leather wallet in your pocket might hold your cards, tickets, photos, and receipts, as well as your cash, the Verida wallet aims to bring far more personal utility to your digital experiences than what a typical crypto wallet might offer.

The ability to access multiple identity networks and even bridge credentials between applications and blockchains is made possible thanks to Verida’s Verifiable Credentials Developer SDK. Verida’s open platform is purpose-built to help developers choose the right credential standard to suit their applications. Depending on the standard and underlying technology, credentials can be verified either off-chain or on-chain.

Verida enables the issuance of verifiable credentials directly on the Verida network, while developers can choose from multiple credential libraries to meet the requirements of their applications.

Verida supports multiple integrations but in theory every issuer or credential provider can work with Verida. The following are some of the key credential standards supported by Verida’s growing ecosystem.

W3C Standard DID-JWT-VC

Verida’s credentials are built to meet the specifications of W3C’s DID-JWT-VC compliance standard. Verida provides an SDK to create and issue these types of credentials and supports the storage and sharing of these credentials with decentralized applications (dApps).
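As a rough sketch of what such a credential looks like on the wire, here is the generic DID-JWT-VC payload layout; the DIDs and claim values below are hypothetical, and this illustrates the W3C data model shape rather than Verida's actual SDK output:

```python
import json

# Illustrative only: a minimal W3C Verifiable Credential as carried in the
# claims of a JWT (the DID-JWT-VC encoding). All identifiers are hypothetical.
jwt_payload = {
    "iss": "did:example:issuer123",   # the issuer's DID
    "sub": "did:example:holder456",   # the holder's DID
    "nbf": 1721952000,                # not-before timestamp
    "vc": {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential", "DegreeCredential"],
        "credentialSubject": {"degree": "BSc Computer Science"},
    },
}
print(json.dumps(jwt_payload, indent=2))
```

The whole payload is then signed with a key resolvable from the issuer's DID document, which is what lets any verifier check authenticity without contacting the issuer.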

The W3C’s DID standard is an extremely important protocol in the identity developer space and is likely to form the basis for verification in regions like the European Union.

Privado ID (formerly Polygon ID)

The Verida Wallet enables users to interact with Privado ID’s impressive zero-knowledge tech stack. Privado ID has grown a large ecosystem of applications focused on improving user experience and security in the world of decentralized identity.

Source: Verida on Medium

Through its modularity and flexible platform, Verida sees itself not as a competitor, but as a partner and valuable addition to the Privado ID ecosystem. This demo video shows the seamless user experience when connecting a Verida Wallet with Privado ID to issue, store and verify zero-knowledge credentials.

zkPass

The zkPass protocol enables the conversion of private data from standard https websites into zero-knowledge proofs, without the need for additional API integrations. This enables users to create a proof of any content located on a secure website.

Examples include logging into a bank and proving you have over $10k in your account, without disclosing any other financial information. Similarly, you can login to a crypto exchange and prove you have completed a KYC check, without disclosing any personal information.

Source: Verida on Medium

Verida recently announced a Verida Missions partnership with zkPass. Such a partnership demonstrates the importance of interoperability and modularity as it bridges the gap between Web2 and Web3 user data in a privacy preserving manner.

To learn more about this integration, visit the zkPass section of Verida’s credentials documents.

Reclaim Protocol

Reclaim Protocol’s verifiable credentials can be received and stored on the Verida network. Thanks to the Verida proof connector and Reclaim’s zero-knowledge technology, users can reply to a verifier’s proof request in a privacy preserving manner.

Reclaim Protocol has a large library of JSON credential schemas that meet the identity requirements of various services within Web2 and Web3.

Source: Verida Developer Docs

Screenshots of various applications which can be accessed through Reclaim Protocol’s verifiable credentials service

To learn more about this integration, visit the Reclaim Protocol section of Verida’s credentials documents.

cheqd

Verida Wallet users can experience full support for verifiable credentials issued through cheqd’s enterprise-grade Credentials as a Service product. The combination of Verida’s user-friendly wallet, and decentralized storage and backup solution, together with cheqd’s infrastructure for DIDs and DID-linked resources makes for a robust tech stack in the decentralized identity space.

Source: Verida Developer Docs

The partnership with cheqd opens the door to additional collaborations such as FinClusive’s reusable KYC/KYB credential solution. FinClusive’s end-to-end integration with cheqd and Verida not only provides users with reusability, it also ensures client privacy, portability, and embedded compliance controls.

Source: cheqd.io blog

The FinClusive use case is but one example of a streamlined off-the-shelf identity solution that developers can expect to access once they explore the Verida network.

An open building site for dApp developers

Having Verida as the base layer for your application can future proof your implementation. Not only does it grant instant access to the Verida SDK integration, it also provides developers with the flexibility to mix and match multiple credential standards for different use cases.

Verida is on the forefront of decentralized digital identity, offering an open and collaborative ecosystem, not a siloed platform. Numerous plug-and-play KYC and KYB solutions are readily available to dApp builders, which can expedite the process from proof of concept to legitimate product or service.

To learn more about Verida’s framework for verifiable credentials, head over to the Credentials page on our Developer Docs.

And if you’re a developer looking to future proof your application with a seamless identity solution, get in touch with our experts in our Discord server or register your project for the Verida Ecosystem here.

We’re excited to see what you’re building!

About Verida

Verida is a pioneering decentralized data network and self-custody wallet that empowers users with control over their digital identity and data. With cutting-edge technology such as zero-knowledge proofs and verifiable credentials, Verida offers secure, self-sovereign storage solutions and innovative applications for various industries. Verida’s ecosystem of KYC partners and technologies is ideally suited to help Kima expand into new markets, streamlining processes and efficiency for compliant transactions. For more information, visit Verida.

Verida Missions | X/Twitter | Discord | Telegram | LinkedIn | LinkTree

True Web3 Ownership Starts With Verifiable Credentials was originally published in Verida on Medium, where people are continuing the conversation by highlighting and responding to this story.


Metadium

Metadium Explorer Update


Dear Community,

We would like to announce updates to Metadium Explorer. This update was made to reduce site load and display data more efficiently. Details are as follows:

Internal Transaction — Detail Mode applied

Update details

Modified to display only cases where Create and Value exist in the Internal Transactions list. You can check all types of information by clicking the Detail Mode button located at the top right of the list.

Reason for update

To reduce the load caused by retrieving all types of information, which had grown with the increase in Metadium’s internal transaction data.

- The Metadium Team

Website | https://metadium.com

Discord | https://discord.gg/ZnaCfYbXw2

Telegram(EN) | http://t.me/metadiumofficial

Twitter | https://twitter.com/MetadiumK

Medium | https://medium.com/metadium

Metadium Explorer Update was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


SC Media - Identity and Access

How companies can secure their data as the Summer Olympics begin -- and the threat landscape amps up

Authorities expect even more API attacks on businesses working the Summer Olympics in Paris this year – here’s how to mitigate the impact.



Indicio

Deepfake phishing: How to use verifiable credentials to defend against generative AI attack

Bank and financial institution call centers are on the frontline in technology’s latest battle over identity fraud. Decentralized identity provides a low cost, low friction, high benefit solution.

By Trevor Butterworth

Biometrics — using facial, voice, and other physiological characteristics to authenticate a person — was supposed to save customers from the risk of their passwords being stolen. Our biometric characteristics provide a much tougher security profile to crack and can be used to manage mobile and online banking services, payments, ATMs, KYC and AML requirements, and even to remotely onboard customers.

Now, the promise of seamless and secure authentication has been hit by a shapeshifting wrecking ball: deepfake phishing.

In January 2024, an employee of a multinational company, Arup, was duped into sending $25 million to fraudsters after she participated in a video conference call with what she thought was a real, senior executive. The executive who told her to make the payment turned out to be an AI-generated “deepfake.”

Deloitte’s Center for Financial Services estimates that AI-driven fraud could cost the United States alone between $20 billion and $40 billion by 2027.

While Arup grabbed global headlines with the scale of the loss, the reality is that attacks using AI are more often focused on defrauding customers with bank and financial call centers on the frontline.

Using AI to mimic the voice of a person, a hacker can generate hundreds of robo calls to different bank call centers, each attempting to reset the passwords of different accounts so as to gain access. 

The burden on call center staff and interactive voice response (IVR) authentication systems to identify a virtual fake from a real customer is enormous. But customers are also at risk of being phished by a deepfake of their financial institution’s IVR system.

Proposed solutions such as abandoning voice recognition or using more sophisticated AI to detect deepfakes are either extreme or likely to trigger an endless AI arms race. 

Decentralized identity offers a much simpler approach to mitigating the problem. We have already shown how a verifiable account credential radically simplifies authenticating a customer (and a customer authenticating their bank) before access is given. And it provides passwordless login to deter regular phishing. 

But a bank can also use the communication features of a verifiable credential to rapidly double check identity. This is because a communication channel with the customer can be created during credential issuance. 

This trusted channel can’t be accessed by another party or faked. And it allows the bank call center to automatically verify a caller by sending a message over the channel asking the customer to “please confirm we are on a call right now.” 

The advantage over multifactor authentication is that using a verifiable credential is faster, more secure, and it works in both directions. It’s not just a simple way for banks to authenticate customers, it’s a simple way for customers to authenticate their banks.

Curious as to how you could protect your call center with verifiable credentials? 

We provide a free, no-obligation workshop where we evaluate your use case in light of this powerful, emerging technology. Contact us now!


Sign up to our newsletter to stay up to date with the latest from Indicio and the decentralized identity community

The post Deepfake phishing: How to use verifiable credentials to defend against generative AI attack appeared first on Indicio.


FindBiometrics

OIX Urges UK to Let Private Sector Handle Its Own Digital IDs

The Open Identity Exchange (OIX) has strongly advised the UK Government against extending the One Login service to the private sector for digital identity purposes. In a letter to Peter […]

auth0

Secure Node.js Applications from Supply Chain Attacks

Guidelines and security best practices to protect from third-party threats

FindBiometrics

Identity News Digest – July 25 2024

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: NIST Updates PIV Standard, Seeking FIPS 201 […]

Judge Halts Kenya’s Digital ID Effort, Citing Constitutional Concerns

The rollout of Kenya’s new digital identity system, Maisha Namba, has been temporarily halted by the country’s High Court following a legal challenge. Justice Lawrence Mugambi issued the suspension order, […]

Ocean Protocol

Predictoor Benchmarking: Linear SVM Classifier with Calibration

Comparing Linear SVM vs Lasso (L1) vs Ridge Regression (L2) vs ElasticNet (L1-L2), and Calibration of Classifier Probabilities

Summary

This post describes benchmarks on Ocean Predictoor simulations across various approaches: Linear SVM (ClassifLinearSVM) and Linear Logistic Regression models (ClassifLinearLasso, ClassifLinearRidge, ClassifLinearElasticNet) to determine the effects of (a) linear regularization, and to (b) calibrating the models’ output probabilities.

It then proceeds to do a walk-through of each of the benchmarks for predictoor/trader profit, and comparison plots for the models & their calibrations.

1. Introduction

1.1 What is Classification?

Classification models (“classifiers”) predict the category or class label of an input. For example, in email spam detection, a classifier might predict whether an email is Spam or Not Spam. Since it has just two possible categories, this makes it a binary-valued classification model (True or False).

To compare, regression models (“regressors”) predict a continuous-valued output. In an email spam application, a regressor might predict the number of spam messages per day.
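A minimal sketch of the distinction in scikit-learn, using a hypothetical suspicious-word count as the single feature:

```python
from sklearn.linear_model import LogisticRegression, LinearRegression

# Toy feature: count of suspicious words in an email (values are made up)
X = [[0], [1], [2], [8], [9], [10]]

# Classifier: predicts a class label (1 = Spam, 0 = Not Spam)
y_class = [0, 0, 0, 1, 1, 1]
clf = LogisticRegression().fit(X, y_class)
print(clf.predict([[9]]))   # a discrete class label

# Regressor: predicts a continuous value (e.g. spam messages per day)
y_cont = [0.1, 0.5, 1.0, 7.5, 9.0, 11.0]
reg = LinearRegression().fit(X, y_cont)
print(reg.predict([[9]]))   # a real number, not a label
```

Same input, two different kinds of output: the classifier returns one of a fixed set of categories, the regressor a point on a continuous scale.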

1.2 What is Regularization?

“Regularization” is a technique used when fitting models that aims to reduce the uncertainty of future predictions. (That is, it minimizes the volume of the confidence ellipsoid of future predictions.) There are many approaches to regularization. When learning linear models, they occupy a spectrum from:

L1 / Lasso: Try to get sparse models with few parameters.

L2 / Ridge Regression: Allow each variable to have a weight, however small.

ElasticNet: Shades of gray between L1 & L2.

1.3 What’s Calibration of Classifier Probabilities?

Beyond just predicting True or False, classification models can also output their estimated probability of True or False (or more general category).

Probabilities fall out naturally with some classifier formulations, such as logistic regression. However, SVMs, including linear SVMs, do not produce probability estimates. Instead, they produce a decision boundary and classify data points based on which side of the boundary they fall on. Calibration of a linear SVM’s classification involves converting the raw decision scores (which indicate the distance from the decision boundary) into calibrated probabilities that reflect the true likelihood of the classification.

We can “calibrate” output probabilities by fitting a mapping from the model’s raw scores to outcomes observed on a training set. This makes its probability estimates more trustworthy. Some techniques include: None, Isotonic, and Sigmoid.
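In scikit-learn, such calibration can be sketched with CalibratedClassifierCV wrapping an uncalibrated classifier; the dataset and parameters below are illustrative:

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=500, random_state=0)

# LinearSVC outputs raw decision scores, not probabilities ("None" calibration).
base = LinearSVC()
for method in ("isotonic", "sigmoid"):
    calibrated = CalibratedClassifierCV(base, method=method, cv=5).fit(X, y)
    proba = calibrated.predict_proba(X[:1])   # now valid probabilities in [0, 1]
    print(method, proba)
```

The wrapper fits the base classifier on cross-validation folds and learns the isotonic or sigmoid mapping on the held-out scores, which is what turns raw distances from the decision boundary into usable probabilities.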

None calibration: Uses the model’s raw, uncalibrated probabilities.

Isotonic calibration: Maps predicted probabilities to calibrated probabilities using a fitted piecewise-constant, non-decreasing function.

Sigmoid calibration: Maps predicted probabilities to calibrated probabilities by transforming the raw scores with a sigmoid function, which has an S-shaped curve, squeezing the input values into the range [0, 1].

1.4 Ocean Predictoor, Classification, and Benchmarks

In Ocean Predictoor, people run AI-powered prediction bots or trading bots on crypto price feeds to earn $.

Classification models are used to predict whether the next BTC/USDT close value will go UP or DOWN. (Or ETH/USDT, BNB/USDT, etc.)

We developed a simulation tool (“pdr sim”) for data scientists to run simulations of their predictoor & trader bots, benchmarking how modeling strategy, trading strategy, parameters, etc., affect the $ made by predicting or trading. The simulator also outputs classifier accuracy, f1 / recall / precision, and other classifier performance metrics.

1.5 Benchmarks Outline

To help users in setting parameters, we ran benchmarks using the “pdr multisim” tool, which invokes the simulator in a loop across various parameter settings. This blog post describes the results of those benchmarks.

We run benchmarks on each of the approaches:

ClassifLinearLasso — L1 regularization

ClassifLinearRidge — L2 regularization

ClassifLinearElasticNet — spectrum between L1 & L2

ClassifLinearSVM — L2 regularization

And for each of those approaches, we run the three calibration approaches (None, Isotonic, Sigmoid).

All the plots for the ClassifLinearLasso, ClassifLinearRidge, and ClassifLinearElasticNet models are shown in the first blog post of the Predictoor benchmarks series, “Predictoor Benchmarking On Regularized Linear Classifiers With Calibration”.

1.6 Experimental Setup

These parameters were defined in our my_ppss.yaml file, a customized version of the ppss.yaml file of the pdr-backend repo:

ClassifLinearRidge, ClassifLinearLasso, ClassifLinearElasticNet, and ClassifLinearSVM ML models were tested with None, Isotonic, and Sigmoid calibrations.

Models predicted 5min candle UP/DOWN feeds for BTC-USDT and ETH-USDT for 5000 epochs.

Models were trained on historical Binance 5min close candle data using either a BTC-USDT training set or both BTC-USDT and ETH-USDT training sets from January 1, 2024, to June 30, 2024.

The # of training samples (max_n_train) tested were 1000, 2000, 5000, 10000, 15000.

Autoregressive n = 1 and 2 were tested. (Number of candles in the past to build models from.)

Trading fees were set to 0%.

The predictoor bot stake was set to 100 OCEAN per epoch.

Other predictoors’ accuracy was set to 50.001% (barely better than random).

2. ClassifLinearSVM

Ocean Predictoor’s ClassifLinearSVM model is a Python scikit-learn support vector classifier with a linear kernel, used for binary classification with L2 regularization.
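A rough sketch of this kind of setup outside the actual pdr stack; the random-walk price series, feature layout, and parameter values below are illustrative, not Predictoor's real pipeline:

```python
import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.svm import LinearSVC

# Hypothetical stand-in for a close-price feed: a random walk around 100.
closes = np.cumsum(np.random.default_rng(1).normal(size=300)) + 100

# Features: the last `ar_n` closes (cf. autoregressive_n); target: next close UP?
ar_n = 2
X = np.column_stack([closes[i:len(closes) - ar_n + i] for i in range(ar_n)])
y = (closes[ar_n:] > closes[ar_n - 1:-1]).astype(int)

# Linear SVC (L2-regularized by default), wrapped to emit probabilities.
svc = LinearSVC()
model = CalibratedClassifierCV(svc, method="sigmoid", cv=3).fit(X[:-50], y[:-50])
prob_up = model.predict_proba(X[-50:])[:, 1]   # P(UP) for the last 50 candles
print("mean P(UP):", prob_up.mean())
```

On a pure random walk these probabilities hover near 0.5, which is exactly why the benchmarks below measure profit on real BTC-USDT / ETH-USDT history instead.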

2.1 ClassifLinearSVM Benchmarks

2.1.1 Predictoor Profitability

The ClassifLinearSVM model achieved the greatest Predictoor profit of all benchmarks so far, 5831.64 OCEAN, when trained on 5000 samples of BTC-USDT data with None calibration. The previous best benchmark for Predictoor profit was achieved by the linear regression model ClassifLinearRidge at 4937.0429 OCEAN on the BTC-USDT & ETH-USDT training data at 5000 iterations. The max Predictoor profit also surpasses the profitability achieved by the ClassifLinearLasso & ClassifLinearElasticNet models.

Predictoor Profit Tuning ClassifLinearSVM

Additional tuning of the ClassifLinearSVM model boosted profits greatly at max_n_train = 10000: 6879.16 OCEAN with BTC-USDT training data and 6122.31 OCEAN with BTC-USDT & ETH-USDT data, all-time highs for the model.

2.1.2 Trader Profitability

While the ClassifLinearSVM model excelled in Predictoor profits, it did not outperform the previous maximum trader profit benchmark of $268.4489 USD achieved by the ClassifLinearLasso model.

Trader Profit Tuning ClassifLinearSVM

Nor did ClassifLinearSVM trader profit improve with tuning the max_n_train amount to 10000 & 15000 samples. The max trader profits for ClassifLinearSVM occurred around 5000 samples.

3. Comparison Analysis

3.1 Highest Predictoor Profits

The ClassifLinearSVM model emerged as the top performer for maximizing Predictoor profits and outperformed the ClassifLinearLasso, ClassifLinearRidge, and ClassifLinearElasticNet models across all Predictoor profit benchmarks.

3.2 Highest Trader Profits

Despite the impressive Predictoor profits, the ClassifLinearSVM model fell short in maximizing trader profits. The ClassifLinearLasso model remained the winner from prior benchmarks at a maximum trader profit of $268.4489 USD achieved with an Isotonic calibration, autoregressive_n =1, and 5000 samples of BTC-USDT training data. This suggests that while the ClassifLinearSVM model is highly effective in making accurate predictions, the logistic regression models’ calibrated probabilities better support confidence-based trading strategies.

4. Conclusion

The benchmarking results highlight the strengths and trade-offs between linear SVC and linear logistic regression classifier models for predicting crypto price movements in the Ocean Predictoor platform. The ClassifLinearSVM model with None calibration achieved the highest Predictoor profits, showcasing its superior predictive power in identifying profitable trades. However, for trader profitability, the ClassifLinearLasso model with Isotonic calibration continued to achieve max profits, emphasizing the importance of well-calibrated probabilities for trading success.

These insights encourage benchmarking of more models to optimize both prediction accuracy and trading profitability. The results show that while SVMs are powerful for prediction, linear logistic regression models with proper classifier calibration remain crucial for confidence-based trading strategies.

5. Appendix: Tables

5.1 ClassifLinearSVM Data Table

The data for the ClassifLinearSVM model shows the maximum Predictoor profit at 5831.64 OCEAN, which surpasses all of the previous Predictoor Profit benchmarks. To achieve this, the model was trained on 5000 samples of BTC-USDT data with None calibration.

5.2 ClassifLinearLasso Data Table

The data for ClassifLinearLasso evidences the maximum trader profit of $268.4489 USD in the row with Isotonic calibration, autoregressive_n = 1, and max_n_train = 5000 trained on BTC-USDT data.

5.3 ClassifLinearRidge Data Table

The ClassifLinearRidge data contains the previous maximum Predictoor profit benchmark of 4937.0429 OCEAN in the row containing Sigmoid calibration with autoregressive_n = 1, and 5000 training samples on BTC-USDT & ETH-USDT data. Notice that this model also demonstrated strong trading profits.

5.4 ClassifLinearElasticNet Data Table

The ClassifLinearElasticNet table demonstrates similar patterns seen in the ClassifLinearLasso and ClassifLinearRidge benchmarks, which makes sense because this model is regularized with both L1 & L2 penalties.

6. Appendix: Details on Linear SVC Regularization

Linear SVC is a Support Vector Machine model with classification and a linear hyperplane. It includes two main parts: the first part minimizes the hinge loss, and the second part involves regularization, which can be either “L1” or “L2”. In these benchmarks, we used the default L2 regularization.

6.1 L2 Regularization

L2 regularization minimizes the sum of the squared hinge loss errors and includes an additional penalty proportional to the sum of the squared values of the model parameters (weights). The L2 term prevents the model from assigning too much importance to any single feature by penalizing large weights. This helps in preventing overfitting and enhancing the model’s generalization to new data.

Mathematically, L2 regularization is represented as:
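A standard way to write this objective (the default LinearSVC formulation with L2 penalty and squared hinge loss; the notation below is the textbook convention rather than a copy of the original figure) is:

```latex
\min_{w,\,b} \;\; \frac{1}{2}\lVert w \rVert_2^2 \;+\; C \sum_{i=1}^{n} \max\bigl(0,\; 1 - y_i\,(w^\top x_i + b)\bigr)^2
```

Here the first term is the L2 penalty on the weight vector w, and C controls the trade-off between the penalty and the squared hinge loss over the n training samples.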

About Ocean, DF and Predictoor

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data. Follow Ocean on Twitter or TG, and chat in Discord. Ocean is part of the Artificial Superintelligence Alliance.

In Predictoor, people run AI-powered prediction bots or trading bots on crypto price feeds to earn $. Follow Predictoor on Twitter.

Predictoor Benchmarking: Linear SVM Classifier with Calibration was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


FindBiometrics

Thailand Launches Selfie-based Digital Wallet Registration

Starting July 25, the Thai government is launching a digital wallet initiative allowing citizens to receive 10,000 baht (approximately $275 USD). Managed by the Ministry of Finance, this program aims […]

UK Media Regulator Fines TikTok £1.875M for Failing to Provide Accurate Information

TikTok has been fined nearly £1.9 million by Ofcom, the UK’s media regulator, for failing to provide accurate information regarding its parental controls in response to a freedom of information […]

SC Media - Identity and Access

UN cybercrime treaty draft condemned for human rights threat

Such a draft, which expands cybercrime definitions to include crimes involving international communications technology, would prompt revisions of criminal laws to include more police powers without much consideration for human rights protections for dissidents.



Nuggets

Nuggets Partners With Carahsoft to Bring Private, Reusable Identity and Passwordless Solutions to…

Nuggets Partners With Carahsoft to Bring Private, Reusable Identity and Passwordless Solutions to US Government Agencies

The partnership advances interoperability and security of existing CIAM systems through trusted and verified decentralized identity and verifiable credentials

We’re delighted to announce that we have partnered with Carahsoft Technology Corp., The Trusted Government IT Solutions Provider®. Through this partnership, Carahsoft will serve as Nuggets’ Master Government Aggregator®, making our products available to the Public Sector through Carahsoft’s reseller partners and NASA Solutions for Enterprise-Wide Procurement (SEWP) V, Information Technology Enterprise Solutions — Software 2 (ITES-SW2), National Association of State Procurement Officials (NASPO) ValuePoint and OMNIA Partners contracts.

With the growing need for trusted digital solutions, decentralization offers unmatched privacy and security. Partnering with Carahsoft will amplify our reach within the Government, providing them with our cutting-edge private decentralized identity solutions. We are excited to leverage Carahsoft’s extensive network and expertise to enhance operational efficiency and security for our customers.

For some background, Public Sector organizations face unique challenges surrounding credentials, identity reusability, and interoperability. They need tools to combat the ever-evolving issues around fraud, AI, deep fakes, ransomware, and data privacy that will also deliver a seamless and frictionless user experience while increasing operational efficiencies.

Government agencies face two significant obstacles to data integrity: the increasing cost and challenges associated with data privacy and the acceleration of sophisticated scams and rampant fraud. Nuggets solves for both. Our fully decentralized wallet and platform protect organizations from data breaches, ransomware, and fraud while ensuring digital identities always remain verified, private, and secure.

Today’s first-generation systems contain enticing silos of sensitive personal data, making them a prime target for hackers and data breaches. These siloed components from multiple service providers can be difficult to integrate, creating poor visibility for individuals who don’t have an integrated stack.

Additionally, legacy systems have created a host of significant issues both in terms of security and business objectives and are hindering growth. They are exposed to high levels of fraud, have low assurance and utilize multiple authentication factors while their authorization controls are often limited.

By implementing Nuggets across existing customer identity and access management (CIAM) solutions, agencies can adopt a more modern and adaptive infrastructure, enabling transformational shifts.

Brian O’Donnell, Vice President of Cybersecurity Solutions at Carahsoft said: “As Government agencies face growing demands for secure and efficient digital processes, Nuggets’ advanced technology offers an ideal solution. Together with our reseller partners, we are dedicated to providing these innovative tools to enhance security and decrease fraud with a premier user experience.”

Nuggets is a Decentralized Self-Sovereign Identity and payment platform that guarantees trusted transactions, verifiable credentials, uncompromised compliance, and the elimination of fraud — all with a seamless user experience and increased enterprise efficiencies.

We’re building a future where digital identity is private, secure, user-centric, and empowering.

We’d love to hear from you if you want to enhance your data privacy and security offering.

You can learn more about our solutions here or get in touch with us here.

Nuggets Partners With Carahsoft to Bring Private, Reusable Identity and Passwordless Solutions to… was originally published in Nuggets on Medium, where people are continuing the conversation by highlighting and responding to this story.


Tokeny Solutions

Tokeny’s Talent | Omobola

The post Tokeny’s Talent | Omobola appeared first on Tokeny.
Omobola Giwa is a Marketing Intern at Tokeny.

Tell us about yourself!

Hello, I am Omobola (most people prefer Bola because it’s shorter). I come from the vibrant city of Lagos, Nigeria, and I am married to my birthday mate—a delightful coincidence that makes our shared celebrations even more special. Now, we’re a trio with the addition of our lively and adorable 2-year-old son, who fills our lives with joy and laughter.

What were you doing before Tokeny and what inspired you to join the team?

I started my career in the legal field, having obtained a law degree from the University of Lagos, Nigeria, but the fast-paced world of business soon piqued my interest, so I transitioned into business development and project management roles and spent a few fulfilling years doing these.

Upon relocating to Luxembourg, I seized the opportunity to deepen my business acumen by enrolling in a master’s program in Entrepreneurship and Innovation at the University of Luxembourg. As part of this program, I undertook a 3-month internship working in the marketing team at Tokeny, a decision that has proven to be both strategic and rewarding.

Choosing Tokeny for my internship was driven by a desire to explore new frontiers. Firstly, I craved uncharted territories. Fintech, especially tokenization, is a cutting-edge frontier, and I wanted to be a part of it! Secondly, the marketing role offered a fresh challenge. While I had engaged in marketing activities in my previous roles, I had never dedicated myself solely to this discipline. Finally, after exploring Tokeny’s website and employee testimonials, I was struck by the company’s people culture. The stories I read reflected a supportive and inclusive environment, perfect for professional growth.

How would you describe working at Tokeny?

My experience of working at Tokeny can be described as a unique blend of challenging work and supportive colleagues. Challenging because, even as an intern, you are entrusted with significant autonomy. You tackle diverse tasks, encouraging you to take ownership and innovate. This environment pushes you to grow and develop new skills continuously.

On the other hand, Tokeny is incredibly supportive. The company fosters a safe space where you are encouraged to give your best without fear of judgment. The sense of value and backing from the team is palpable. Colleagues are open, quick to acknowledge mistakes, and always ready to provide constructive feedback and assistance.

This culture of trust and collaboration not only boosts your confidence but also instills a strong commitment to excel and contribute meaningfully to the company’s success. It’s an environment where I felt empowered to go above and beyond.

What are you most passionate about in life?

Children are my greatest passion. Their innate curiosity, honesty, and unfiltered joy never cease to amaze me. I am captivated by their simplicity and the way they view the world with such wonder and trust. Having children around brings me immense joy, and their genuine nature is both inspiring and heartwarming.

However, I am also deeply concerned about their vulnerability. I dream of one day establishing an entity that is dedicated to supporting underprivileged children, providing them with the basic needs and opportunities they deserve.

What is your ultimate dream?

My ultimate dream is to live a life filled with happiness and to be able to look back with no regrets. I aspire to create a meaningful and fulfilling life, both personally and professionally, where I can cherish every moment and be proud of the impact I’ve made.

What advice would you give to future Tokeny employees?

Embrace the challenges and opportunities that come your way. Tokeny is a place where you can grow and thrive if you are willing to take initiative and be open to learning.

What gets you excited about Tokeny’s future?

Tokenization is the future and Tokeny is part of the creators of that future—that in itself is exciting! We’re building the future of finance, and I can’t wait to see what incredible things Tokeny achieves next.

She prefers (a quick-fire either/or round):

Coffee or Tea
Movie or Book
Work from the office or Work from home
Dogs or Cats
Call or Text
Burger or Salad
Mountains or Ocean
Wine or Beer
Countryside or City
Slack or Emails
Casual or Formal
Crypto or Fiat
Night or Morning




SC Media - Identity and Access

Identity resilience: What it is and how to achieve it

Five identity-security experts discussed and defined identity resilience in a recent CyberRisk Alliance webcast and provided tips on how to realize it in your organization.



Ocean Protocol

DF99 Completes and DF100 Launches

Predictoor DF99 rewards available. DF100 runs Jul 25 – Aug 1, 2024.

1. Overview

Data Farming (DF) is Ocean’s incentives program. In DF, you can earn OCEAN rewards by making predictions via Ocean Predictoor.

Data Farming Round 99 (DF99) has completed.

DF100 is live today, July 25. It concludes on August 1. For this DF round, Predictoor DF has 37,500 OCEAN rewards and 20,000 ROSE rewards.

2. DF structure

The reward structure for DF100 consists solely of Predictoor DF rewards.

Predictoor DF: Actively predict crypto prices by submitting a price prediction and staking OCEAN to slash competitors and earn.

3. How to Earn Rewards, and Claim Them

Predictoor DF: To earn: submit accurate predictions via Predictoor Bots and stake OCEAN to slash incorrect Predictoors. To claim OCEAN rewards: run the Predictoor $OCEAN payout script, linked from Predictoor DF user guide in Ocean docs. To claim ROSE rewards: see instructions in Predictoor DF user guide in Ocean docs.

4. Specific Parameters for DF100

Budget. Predictoor DF: 37.5K OCEAN + 20K ROSE

Networks. Predictoor DF applies to activity on Oasis Sapphire. Here is more information about Ocean deployments to networks.

Predictoor DF rewards are calculated as follows:

First, the DF Buyer agent purchases Predictoor feeds using OCEAN throughout the week, distributing those rewards evenly. Then, ROSE is distributed at the end of the week to active Predictoors that have been claiming their rewards.

Expect further evolution in DF: adding new streams and budget adjustments among streams.

Updates are always announced at the beginning of a round, if not sooner.


DF99 Completes and DF100 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

Unleashing the Power of Orchestration in Self-Managed PingFederate Environments

PingAM helps PingFederate customers orchestrate seamless and secure user experiences for complex use cases at scale.  

Lockstep

Making data valuable


In the latest episode of our podcast Making Data Better, George Peabody and I are joined by the former NSW Chief Data Scientist Dr Ian Oppermann. Among many other things, Ian helped set up the NSW Data Analytics Centre, he led the development of a series of superb papers on data sharing and the digital self at the Australian Computer Society, and he represented Australia in the development of the new international standard for Data Quality, ISO 8000.

We covered a lot of ground, but I especially liked our discussion with Ian about the value of data and monetisation of digital assets.

It’s almost a taboo topic. Surveillance capitalism has come to dominate and poison how people regard data monetisation yet there are legitimate interests in realising the value of data.

So we asked Ian about making data value a respectable idea; how do we make it governable?

In conversation, Ian highlighted the problem that “no one’s really sure how [data is] valuable”.

“We are, to this day, still without an accounting standard which values data. We don’t have a way of measuring it from a finance perspective”.

Yet we all know that data is an asset. Data contributes the majority of the value of digital companies like Facebook and LinkedIn.

So how do we measure data quality? Ian explains that “it’s not about simple things like format. It’s about the entire governance process of data. How does data flow into your organization? What are the controls and the chain of custody, the chain of authorizing frameworks?”

With government being the source of so much critical foundational data, George and I have been trying to conceptualise distribution networks to make verifiable data accessible at scale. Ian shares his practical experience about change management and the role of government.

Take a listen! And please let us know what you think.

The post Making data valuable appeared first on Lockstep.

Wednesday, 24. July 2024

FindBiometrics

NIST Updates PIV Standard, Seeking FIPS 201 Alignment

The National Institute of Standards and Technology (NIST) has updated its PIV standards to better align with revisions to the Federal Information Processing Standard (FIPS) 201 dating back to January […]

Holochain

Holochain 0.3, a new Launcher, and… HC on Mobile!

Dev Pulse 140

Holochain 0.3.1 is the newest recommended release for you to build your hApps on. It comes with a raft of performance improvements and bug fixes, and not too many breaking changes. The happy news with this release is that validation is considerably more performant, hitting fewer dependency deadlocks. (There’s other big news that developers need to know about — read below.)

This release also comes with a companion Launcher, which returns to Electron for the UI. This should help front-end devs have a more predictable development and debugging experience.

In the ecosystem, our friends at darksoil studio have come out with p2p Shipyard, a super powerful tool for building self-contained installables of your hApp for Windows, macOS, Linux, and Android (and iOS in the future)!

And finally, you can see what others are doing with Holochain on mobile; at the very end I’ll share a demo video of Relay, an Android messaging app that’ll ship on Volla Phone’s upcoming Quintus flagship smartphone.

Holochain 0.3.1: Validation performance, isolated and authenticated app interfaces, HDI/HDK changes

Release date: 11 June 2024
HDI compatibility: 0.4.x
HDK compatibility: 0.3.x
JavaScript client compatibility: 0.17.x
Rust client compatibility: 0.5.x
Tryorama compatibility: 0.16.x
Scaffolding tool: 0.3000.1
Launcher: 0.300.1

You should notice a decent improvement in the time it takes for published data to appear in the DHT, especially data that has lots of validation dependencies. That’s because the validation workflow has been rewritten. What’s changed? There’s now a single validation thread per DHT space, rather than multiple threads, which means that the validation queue can more intelligently prefetch dependencies and avoid deadlocks due to missing dependencies. There are other tweaks to validation as well, such as reducing the op validation retry timeout and making it adapt to the number of dependencies the op is waiting for.

There are some breaking changes to the P2P protocol and the database structure, meaning that peers using 0.3 won’t be able to communicate with peers using 0.2 or import an 0.2-based application’s database.

But that’s okay, because there are some breaking changes to the HDI and HDK anyway. There aren’t too many, and they should be easy to update your code for. Our core team has written a great 0.2 → 0.3 upgrade guide to help you out.

Lastly, there’s a big change to the application API: There is now only one app API WebSocket port, and each UI must pre-request an authentication token and supply it when it establishes a session. The UI will only have access to the one app it requested access for.

Fortunately most of this change has been abstracted away for you by the new releases of the JavaScript and Rust clients, and if your UI is deployed in the Launcher it’ll get an access token behind the scenes. (In fact, connecting to an app interface requires fewer parameters now — see the upgrade guide for the changes you need to make.) But if you’re writing your own client, you’ll need to get familiar with what these changes mean for establishing an authenticated connection.

If you want to read the changes in detail, read the changelog all the way back to the 20230503.003735 release.

Get it via Holonix or upgrade your existing app.

JavaScript client 0.17.0 and 0.17.1, Tryorama 0.16.0, Rust client 0.5: Updates for Holochain 0.3

The header says it all! This line of releases makes the client libraries and testing framework compatible with the preauthentication and per-app binding of the app API’s new security model. There’s also a small bugfix in the JS client to address a bug in the hash utility functions.

Check out the changelog for the Rust client for details on what’s changed — the process of preauthenticating a client connection is a little more manual than for the JavaScript client.

You can get the JS client and Tryorama from NPM and the Rust client from crates.io by updating the dependency in your package.json and Cargo.toml files, then following the upgrade instructions for your UI code.

hc-scaffold 0.3000.1: Streamlined flow, React and headless templates, simplified CSS

This release contains more than just an update to Holochain 0.3. A lot of polish has gone into this release, and as a sometime code reviewer I feel like this tool is coming into maturity. The developer UX has been tweaked in a lot of little ways that make it a pleasure to work with — particularly the newly streamlined “I only want a single-zome web app” flow. Now, when you run hc scaffold web-app you’re asked if you want to create a DNA and an integrity/coordinator zome pair.

The templates, which traditionally used Material UI to make it look nice, now use a single, clean, basic stylesheet built with Tailwind CSS. The intention is that this will be easier to work with, and easier to just delete and replace with your own stylesheet.

Alternative package managers are now supported via a new -p or --package-manager flag — in addition to npm, you can also scaffold a project that uses pnpm, yarn, or bun to run all of the building, testing, and UI tasks in its package.json files.

The codebase has also been cleaned up a lot. What does this mean for you? Well, if you’d like to try creating your own template, you don’t have to do everything from scratch or copy any non-UI templates — the common templates (like Tryorama tests) are now shared among UI templates.

This scaffolding tool is now available in the Holonix dev environment, so all you need to do to use it in a new project is go

nix run github:holochain/holochain#hc-scaffold -- web-app

(once you’ve installed Holonix, of course).

I’m still updating the Getting Started guide, so it’s a bit out of sync with the current state of the scaffolding tool. But honestly, it’s so easy to use that you probably won’t need my guide anyway.

Launcher 0.300.0: It’s smaller on the outside

When my colleagues showed me the latest Holochain Launcher, I was kinda startled. The Launcher I was familiar with was gone, replaced by… less. Much less. And I rather liked it!

You can tell that a lot of user experience work has gone into this Launcher. The startup UI consists of a tray icon and a little search box, similar to Spotlight on macOS.

That’s it!

I don’t have any apps installed right now, so I’ll try looking for something.

Not much there yet, but Kando looks useful, and I see a nice green ‘Verified’ badge (note: this helps protect you from accidentally installing a malicious HoloFuel lookalike that steals all your fuel — the Holochain team will be vetting all apps for now, but you can still sideload ‘unverified’ apps manually or find them in the hApp store, and there will eventually be a more scalable verification process).

The downloading spinner is simpler and easier to understand. This’ll matter less as Holochain gets faster, but at any stage of maturity it’s nice feedback.

The installation dialogue is easier to understand too. The network seed field has been renamed with less jargon, and a handy tooltip tells you what it does.

Nice, now I’ve got a kanban app installed!

I know from talking to my colleagues that there’s a lot more planned to make Launcher even nicer to use. Pretty excited about the direction things are heading.

And everything — App Store searches, app installation, and the KanDo app itself — feels a little snappier. It’s hard to measure this sort of thing as an end-user, but it feels like Holochain 0.3 is just faster and lighter on resources.

If you’re a developer, you can check that gear icon in the upper right corner, go into ‘System settings’, and install the developer tools. This lets you help host the DevHub package repository that powers the App Store, and you can also add a new hApp to the repository with only a couple clicks. (Note that we’re also working on a DevHub CLI that gives you more power over things like sharing reusable components.)

And one more thing for developers — this version of Launcher reverts back to Electron for all the UI stuff. You may remember that we replaced Electron with a similar library Tauri a year or two ago, and while it seemed to hold promise, the reality wasn’t as great as expected. The biggest issue was the webview — Tauri uses your OS’ system webview, which might be ancient and out of date with current web standards. This was causing a lot of debugging headaches for devs, because they didn’t know which browser their UI needed to target. Electron comes with a recent chromium binary, which means there’s a stable front-end target for you to work with.

Get Launcher 0.300.0 from GitHub, give it a spin, and let us know what you think!

If you’re on Ubuntu 24.04, take note: this version of Ubuntu made some annoying changes to protect you from malicious apps. You’ll need to follow our recommended steps to get the Launcher to run. (The binary you want to target is /opt/Holochain Launcher (0.3)/holochain-launcher-0.3 for the deb package and <path-to-your-app-image>/holochain-launcher-0.3-0.300.0.AppImage for the AppImage package. In the future we’ll ship an AppArmor profile in the deb package to fix this problem.)

p2p Shipyard: build your hApp for mobile and desktop OSes

darksoil studio is a small dev shop building core infrastructure for Holochain and “simple peer to peer apps for groups of people to meet their non-digital needs”. And, in order for those apps to meet people’s needs, they have to be available in ways that work for them. Nowadays, that often means an app that you can download from an app store. And mobile is a must.

So darksoil studio set out to make Holochain ready for mobile. It required a huge amount of work, but now, by bridging Holochain and Tauri, they’re able to bundle Holochain, a hApp, and a UI into an Android APK that can be submitted to the Play Store or f-droid (and EXEs for Windows, DMGs for macOS, and AppImages for Linux too).

This is pretty huge. We’ve long recognised it was necessary to get Holochain working on mobile, but needed to focus our energy on getting Holochain itself (and Holo Hosting) ready for general use. So it’s fantastic that a dev team in the ecosystem has done the work to build the critical infrastructure.

Now you can build your hApp with the tool they created, p2p Shipyard, and start testing it on (almost) all the OSes. It’s not Open-Source — yet — but it is Source-Available, so anyone can audit the code for security. Towards patterns for sustainable Open-Source development, darksoil is running an experiment with something they’re calling “retroactive crowdfunding”. Here’s what they’re promising and asking:

The p2p Shipyard is currently Source-Available, and once the retroactive crowdfunding goal is reached, it will be Free and Open Source, Forever.

Until then, a license is needed to use it, with all license fees going towards the crowdfunding goal. If you’d like a license to use the p2p Shipyard for your project, get in touch with us here! 

Here’s what’s been done to make Holochain ready for mobile, if you’re curious:

Mobile nodes can be set to have ‘zero-width arcs’ — that is, they’re full DHT peers but don’t contribute to the storage of other peers’ data. (This means you’ll want to set up something to keep the DHT available — if your userbase is big enough, the number of desktop users might be sufficient, but otherwise you may want to establish a cultural practice of asking people to leave their computer running so their peers can still get data. darksoil is also working on a local-server solution and, of course, Holo hosting will also be an option soon.)

p2p Shipyard uses Tauri, which is currently the most viable way to integrate Holochain and a UI into a native Android or iOS app. (You can’t do it with Electron.) The darksoil team have built it as a Tauri plugin which you can fit into your build pipeline.

There’s been work to make wasmer, the WASM VM that Holochain uses, ready for mobile. Holo itself has contributed some funds to the wasmer project to get it working in iOS, which will make it possible to build iOS apps using p2p Shipyard in the future.

The darksoil team have done a lot of work to get Holochain building for Android and iOS — experimenting, wrestling build systems, equipping Holonix to do the right thing, reporting bugs, fixing them, and of course creating p2p Shipyard as a way to reproducibly build binaries for these OSes.

Here are a couple of screenshots of my colleague Eric using p2p Shipyard to bundle up two hApps his spinoff dev shop has been building.

And here are two more (kando & emergence)!... Check out https://t.co/GeVWNgqrE7 tooling for building and deploying decentralized Holochain based applications on both desktop AND mobile. #holochain pic.twitter.com/BJhgBA4CW5

— Eric Harris-Braun (@zippy314) June 6, 2024

And speaking of mobile…

Holochain on the Volla Phone Quintus

You’ve probably seen this already, but here’s a demo of a hApp running on a smartphone from Volla, a small German manufacturer who are passionate about serving their customers — as in, the people who buy their phones, not the ad companies peering through the airwaves at them. This means privacy-respecting alternatives to what we’re stuck with now.

The app isn’t anything ground-breaking — it’s just a chat app — but what’s amazing is that it’s running fully peer-to-peer (Volla opted to have the phone owners be active DHT participants, hosting each other’s data rather than using the zero-width arc strategy). It’s going to be accompanied by a backup app, and probably more apps in the future.

And it was made possible thanks to the wonderful community Holochain finds itself in – particularly Hedayat Abedijoo and Nick Stebbings who instigated a relationship with Volla, and darksoil who produced p2p Shipyard!

Cover photo by Bernd 📷 Dittrich on Unsplash 


FindBiometrics

NEC to Lead NZ Immigration’s ‘Biometric Capability Upgrade’

Immigration New Zealand (INZ) says that its Biometric Capability Upgrade (BCU) did not require Cabinet approval, an assertion aimed at evading any misbegotten scrutiny. The project, estimated at $35 million, […]

Philippines Authorities Urge Residents to Use Digital ID, Report Businesses That Refuse It

A Philippine Statistics Authority (PSA) official overseeing one of the country’s major regions is urging residents to take advantage of the country’s new digital ID program, especially if they have […]

Live FRT Leads to Arrests in England’s Bedfordshire County

Police in the English county of Bedfordshire used Live Facial Recognition (LFR) technology during the Bedford River Festival, resulting in arrests. The festival, held July 20-21, saw LFR systems deployed […]

Trust Stamp Announces Strategic Alliance for Global Identity Consortium

Trust Stamp has entered into a Letter of Intent with Qenta Inc. for a strategic alliance aimed at integrating into a global Identity Consortium. This partnership will serve a federated […]

Anonym

Gartner Confirms Anonyome Labs’ Solutions Offer Competitive Edge

The 2024 Gartner Emerging Tech Impact Radar confirms that Anonyome Labs’ enterprise solutions are among the highest impact technologies for gaining a competitive business advantage. 

The latest annual impact radar nominates privacy and transparency as one of four emerging tech themes with the most potential to disrupt a broad cross-section of markets. Anonyome Labs’ market leading solutions fit squarely within this theme. 

Gartner then pinpoints decentralized identity (DI) and privacy-enhancing technologies (PETs) as two emerging technologies within that theme that product leaders should be factoring into their strategic and investment planning. Anonyome Labs offers both DI and PETs through its B2B solutions. 

According to Gartner, “Increasing digitization of assets, information and experiences and the usage of AI are making privacy and transparency issues increasingly important by increasing opportunities for bad agents to mimic, disrupt and intercept our activities. They are also intensifying concerns around negative consequences of AI tools and techniques.  

“Rapid innovation in critical enabling technologies, like Web3, scalable vector databases and neuromorphic computing are creating new possibilities for IT solutions.” 

Through that lens, Gartner recommends, “Stimulat[ing] growth while mitigating risk and restrictive regulation by building user trust via systems such as decentralized identity and behavioral analytics and applying human-centered AI and responsible AI principles … [and] support[ing] your strategic product roadmap by identifying relevant emerging technologies and business values that they can enable and identifying relevant innovation tech partners.” 

Gartner describes DI or self-sovereign identity systems as technologies that address privacy and transparency challenges with traditional identity systems, and PETs as robust approaches that allow the processing of information while protecting underlying personal data. See below for further reading. 

Gartner says all 30 emerging technologies and trends are critical for product leaders to evaluate as part of their competitive strategy. And it seems many have already started in the privacy and transparency themed space: More than 62 per cent of US companies plan to incorporate a DI solution into their operations, with 74 per cent likely to do so by June 2024.  

Anonyome Labs is the leader in privacy and digital identity protection technologies. From verifiable credentials to VPNs and encrypted communications, we leverage our cryptography and blockchain technology expertise to take data privacy and security to the next level. Talk to us today to find out how your enterprise can get ahead of the curve on Gartner’s recommendations.  

Learn more about Anonyome Labs’ DI and PETs offerings 

You might like: 7 Benefits to Enterprises from Proactively Adopting Decentralized Identity 

Want more on decentralized identity from Anonyome Labs? 

- Can Decentralized Identity Give You Greater Control of Your Online Identity?
- Simple Definitions for Complex Terms in Decentralized Identity
- 17 Industries with Viable Use Cases for Decentralized Identity
- Inside the Massive Projected Growth in the Decentralized Identity Market
- Why More Companies are Turning to SaaS for Decentralized Identity Solutions
- What our Chief Architect said about Decentralized Identity to Delay Happy Hour
- 6 Ways Web3 and Decentralized Identity Technologies Could Stop Deep Fakes
- 5 Aha! Moments About Decentralized Identity from the Privacy Files Podcast
- Our whitepapers

Want more on privacy-enhancing technologies from Anonyome Labs? 

- Want to Monetize Privacy? Here’s How to Do It, Fast
- 2 Ways to Give Your Customers Privacy Products, Not Just Privacy Advice
- This is How You Go Fast to Market with Privacy and Identity Protection Apps
- 5 Easy Ways to Become Your Customers’ Go-to for Privacy
- 2 Big Problems with Passwords – and How You Can Easily Solve Them for Your Customers
- How to Use the Sudo Platform to Deliver Customer Privacy Solutions
- How the Sudo Digital Identity Can Help Stop the Attack on Personal Privacy
- 5 Predictions for Data Privacy in 2023 and Beyond
- 3 Signs the US Public is Taking Data Privacy in its Own Hands

Check out our podcast, Privacy Files, to hear what your peers and experts are saying about the state of member and consumer privacy in real time. 

The post Gartner Confirms Anonyome Labs’ Solutions Offer Competitive Edge appeared first on Anonyome Labs.


SC Media - Identity and Access

US-based lawsuits against NSO Group supported by leading tech firms

Numerous major U.S. tech firms, including Microsoft and Google, have issued an amicus brief supporting NSO Group victims' filing of lawsuits against the Israeli spyware firm.


Civic

Tokenized Identity: Quadratic Voting As A Public Good with Dean Pappas, Dean’s List

In this episode of Tokenized Identity, Titus Capilnean, our VP of Go-To-Market, speaks with Dean Pappas of Dean’s List. They explore everything from the state of DAOs, quadratic voting, $POLL and beyond. They discuss cultivating better communities in Web3. If you’re not familiar with Dean, he’s been instrumental in the development of DAOs on Solana […]

The post Tokenized Identity: Quadratic Voting As A Public Good with Dean Pappas, Dean’s List appeared first on Civic Technologies, Inc..


SC Media - Identity and Access

KnowBe4 targeted by fake North Korean IT worker

Within 25 minutes of having received his Mac workstation, the North Korean operative — who used VPN to conceal the location of the IT mule farm where the workstation was sent — leveraged Raspberry Pi to facilitate malware downloads, session history file alterations, file transfers, and unauthorized software execution.


UNISOT

ENSURING NON-TOXIC TAMPONS AND SAFE SUPPLY CHAINS

Recent studies have uncovered alarming levels of toxic metals such as arsenic and lead in tampons. Notably, lead concentrations were higher in non-organic tampons, while arsenic levels were higher in organic tampons. This raises significant health concerns given the high absorption potential of the vaginal mucosa​ (Berkeley Public Health)​​ (University of California)​.

Jenni A. Shearston, a postdoctoral scholar at the UC Berkeley School of Public Health, highlighted the importance of testing: “I really hope that manufacturers are required to test their products for metals, especially for toxic metals. It would be exciting to see the public call for this, or to ask for better labeling on tampons and other menstrual products”​ (Berkeley Public Health)​​ (University of California)​.

How UNISOT’s Technology Ensures Safety

UNISOT’s Asset Traceability Platform and Digital Product Passports provide essential tools for ensuring the safety of menstrual products. By leveraging blockchain technology, these tools offer an immutable record of the entire supply chain, from raw material sourcing to manufacturing processes. This transparency allows consumers to verify the safety and quality of the products they use, ensuring they are free from harmful toxins.

Addressing Arsenic in Organic Tampons

A key finding from the study was that the arsenic found in organic tampons can increase the risk of cancer, reproductive and developmental harm, cardiovascular disease, neurological effects, endocrine disruption, and kidney and liver damage. It likely comes from natural fertilizers used in cotton farming. This highlights a significant issue of due diligence within the supply chain. UNISOT’s Supply Chain Due Diligence (SDD) tool is designed to address this challenge by automating and securing the process of mapping a company’s supplier network across multiple tiers upstream.

SDD protects supplier trading secrets by anonymizing company names within the supply chain network. This ensures that sensitive commercial information remains confidential while still allowing for a comprehensive mapping and analysis of the supply chain. This level of transparency and protection is crucial for identifying and mitigating sources of contamination, such as arsenic from natural fertilizers, and ensuring that only safe products reach consumers.

Addressing Lead in Non-Organic Tampons

The discovery of higher lead concentrations in non-organic tampons is particularly concerning due to the severe health risks associated with lead exposure, including neurological damage, reproductive issues and increased cancer risk. Lead can enter the cotton used in tampons through various environmental pathways, such as contaminated soil, water or air, often near industrial sites or due to the use of certain pesticides.

UNISOT’s Asset Traceability Platform can help address this issue by providing detailed insights into every step of the supply chain. By tracking the origins of raw materials and monitoring manufacturing processes, the platform can identify and mitigate sources of lead contamination. This includes ensuring that cotton fields are in safe, uncontaminated environments and that any chemicals used in the cultivation or processing of cotton are free from harmful substances.

Moreover, by implementing UNISOT’s Supply Chain Due Diligence, companies can continuously monitor and verify their suppliers’ compliance with safety standards. This proactive approach not only helps in identifying potential contamination sources but also ensures that suppliers adhere to stringent safety protocols, thereby reducing the risk of toxic contamination in tampons, guaranteeing that all menstrual products are safe for women to use.

Women deserve to have access to products that are completely free from harmful toxins, ensuring their health and well-being.

Sources:
https://news.sky.com/story/arsenic-lead-and-other-toxic-metals-found-in-tampons-study-says-13175436
https://www.cbsnews.com/news/toxic-metals-tampons-arsenic-lead/
https://www.npr.org/2024/07/11/nx-s1-5036484/tampons-heavy-metals-study

The post ENSURING NON-TOXIC TAMPONS AND SAFE SUPPLY CHAINS appeared first on UNISOT.


Ontology

Ontology Weekly Report (July 16th — July 22nd, 2024)


This week at Ontology was marked by significant developments and updates across our platform, enhancing our community’s experience and advancing our technical capabilities. Here’s a recap of the activities and progress:

Latest Developments

- Community Update Post-ETHCC: Don’t miss our community update, which is making a return post-ETHCC! Dive into what you might have missed and catch up on all the insightful discussions.
- MPost’s Hack Seasons Content Available: An episode of our sessions at MPost’s Hack Seasons has been uploaded. Check it out to learn more about our contributions and discussions on decentralized identity.
- X World Games Giveaway Conclusion: Our exciting giveaway with X World Games has come to an end. Thank you to everyone who participated!
- Update on Galxe Event with KIMA: We have an important update regarding our event with KIMA on Galxe. Stay tuned for details on what’s next!

Development Progress

- Go Toolkit Upgrade: We’ve upgraded the Go toolkit for Ontology, enhancing the efficiency and functionality for developers.
- ONT Leverage Staking Design: Progress on the leverage staking design has reached 65%, bringing us closer to offering more versatile staking options.
- RPC Port Service Isolation: We’ve successfully fixed the Ontology RPC port service isolation issue, improving security and performance.

Product Development

- Node Setup on ONTO: You can now set up your own node in ONTO! This new feature allows users to manage their nodes directly within the app, streamlining the process and enhancing user control.

On-Chain Activity

- Stable dApp Ecosystem: The total number of dApps on our MainNet remains robust at 177.
- Transaction Growth: This week, we observed an increase of 1,595 dApp-related transactions, totaling 7,776,938. Overall transactions on MainNet grew by 6,662, reaching a total of 19,505,222.

Community Growth

- Vibrant Community Discussions: Our social platforms, particularly Twitter and Telegram, continue to be hubs of activity, with ongoing discussions about the latest developments. Join us to stay engaged!
- Telegram Discussion on Interoperable DID Solutions: This week, led by Ontology Loyal Members, we delved into “Exploring Interoperable DID Solutions: Web2, Web3, and Beyond,” discussing how decentralized identities can revolutionize user verification, login mechanisms, and peer interactions.

Stay Connected 📱

Engage with us and stay updated on the latest happenings by following our social media channels. Your participation and feedback are crucial as we continue to advance the blockchain and decentralized identity landscapes.

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Thank you for your ongoing support and engagement. We are excited about the future as we continue to innovate and enhance the Ontology ecosystem. Stay tuned for more updates next week!

Ontology Weekly Report (July 16th — July 22nd, 2024) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Crypto regulatory affairs: Hong Kong publishes stablecoin consultation response and announces sandbox participants

Financial sector watchdogs in Hong Kong have taken an important step to pave the way for stablecoin regulation, and to bolster Hong Kong’s reputation as a leading hub for crypto development in the Asia-Pacific (APAC) region.

Tuesday, 23. July 2024

Lockstep

In praise of metadata

The term metadata has become rather loaded. Perhaps even poisoned, for its association with telecommunications surveillance. But I want to sing its praises, for it is metadata that tells us if any given information is accurate or reliable, or trustworthy, fit for purpose, valuable.

National security hawks advocating stronger surveillance powers have tried to whitewash metadata collection. They liken telecom metadata to the visible details on an ordinary envelope and insist it’s innocuous compared with the contents of the message.

On the other hand, U.S. General Michael Hayden, former head of the National Security Agency, once stated plainly and simply “we kill people based on metadata” (although he denied that any telecommunications metadata collected on regular citizens was used for that).

Data and metadata: same but different

From a privacy perspective, metadata should certainly not be distinguished from data in general.

As I understand prevailing principles-based privacy law, if a piece of metadata is personally identifiable, then it constitutes personal data and falls within the scope of such law.

So from one perspective, metadata is merely more data. Nevertheless, I find it useful to distinguish data and metadata, because the properties of data that make it valuable or reliable (or unreliable) are often codified in metadata.

For example:

- the age of a cell phone number or email address can suggest it may be a burner account being used in a fraud
- data presented online for identification purposes or in card-not-present payments really needs to be “original” in the sense that it’s presented by the rightful subject
- watermarks generated within digital camera hardware can prove that an image is genuine, rather than AI-generated
- clinical trial results should be based on patient data collected under proper consent conditions.

We might argue that the value of any data lies in the metadata.
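The first of those signals can be sketched in code. The 30-day cutoff and the function names below are illustrative assumptions, not any real fraud rule:

```python
from datetime import date

def account_age_days(created: date, today: date) -> int:
    """Age of an identifier (phone number, email address) in days."""
    return (today - created).days

def burner_risk(created: date, today: date, min_age_days: int = 30) -> bool:
    """Flag identifiers younger than a threshold as possible burner accounts.

    The 30-day cutoff is an illustrative assumption; real fraud systems
    weigh many metadata signals together, not a single threshold.
    """
    return account_age_days(created, today) < min_age_days

# A week-old address trips the heuristic; a two-year-old one does not.
print(burner_risk(date(2024, 7, 19), date(2024, 7, 26)))  # True
print(burner_risk(date(2022, 7, 26), date(2024, 7, 26)))  # False
```

Note that the phone number or address itself is unchanged in both cases; the risk signal lives entirely in its metadata.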

Verifiable credentials are really about metadata

This distinction between data and metadata also illuminates how verifiable credentials convey rich data quality signals.

Verifiable credentials are tools that convey machine readable assertions made by a third party about a subject — usually a human, though verifiable credentials for non-human subjects such as IoT devices are expanding fast.

The important elements of verifiable credentials (and verifiable presentations) are:

- they name (or point to) the subject of the credential
- they name the issuer of the credential
- they bear the digital signature of the issuer — which gives the credential provenance
- the presentation bears the signature of the subject (ideally generated within a secure wallet) — which indicates their consent or control
- the credential carries a range of administrative metadata, such as validity date, applicable terms & conditions, and details of the device carrying the credential.
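As a rough sketch in code, these elements can be modelled as plain data. The field names below loosely echo the W3C Verifiable Credentials data model, but the structure and the trusted-issuer check are simplified illustrations, not a conformant implementation:

```python
# Illustrative credential: the claim (the data) is wrapped in metadata that
# names the subject and issuer and carries validity and proof information.
credential = {
    "issuer": "did:example:dmv",           # who asserts the claim
    "credentialSubject": {                 # whom the claim is about
        "id": "did:example:alice",
        "licenceClass": "C",               # the actual data
    },
    "validUntil": "2029-12-31",            # administrative metadata
    "proof": {"type": "ExampleSignature", "signatureValue": "..."},
}

# Relying-party policy: which issuers this verifier accepts (an assumption).
TRUSTED_ISSUERS = {"did:example:dmv"}

def issuer_acceptable(vc: dict) -> bool:
    """The issuer name is metadata, and here it drives the accept/reject decision."""
    return vc.get("issuer") in TRUSTED_ISSUERS

print(issuer_acceptable(credential))  # True
```

The relying party’s trust decision here turns entirely on metadata — the issuer name — not on the claim itself.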

The digital signature on a verifiable presentation is typically created automatically in a wallet or chip. The signing process uses a private key embedded in the firmware, unique to the subject, but not visible to them.

The issuer of a credential is one of the most important factors used to determine whether to accept that credential or not. That is, the issuer confers value; some issuers are valued more than others.

So the name of the issuer of the credential is metadata of the credential: it is something that a first party wants to know about a second party before deciding to do a transaction.

About the credit cards in the banner graphic

The banner at the top of this blog shows some favourite images from my archive, from the very first charge card in 1955 through a series of technological innovations. First, the printed cardholder details were coded on a magnetic stripe for automated reading; tougher plastic cards supported antifraud measures like holograms and guilloche printing; the magnetic stripe was superseded by smart chips that prevent copying; and smartcards gave way to smart phones with biometrics to further protect the cardholder.

This evolution is really all about metadata!

Cards and phones, as far as the card payment system is concerned, are just data carriers. They store details about the card holder (in particular the Primary Account Number or PAN) and facilitate the presentation of that data to a merchant. The move from mag stripe to chip was the most important security measure in sixty-odd years; the chip provides signals about the originality of the PAN and the consent of the cardholder to each presentation.
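The security difference can be sketched conceptually. Here an HMAC over the PAN and a per-presentation counter stands in for the chip’s cryptogram; real EMV is far more involved, so treat this purely as an illustration of why a static copy replays while a chip presentation does not:

```python
import hashlib
import hmac

# Toy key standing in for the secret that never leaves the chip (assumption).
CARD_KEY = b"secret-key-inside-the-chip"

def magstripe_presentation(pan: str) -> str:
    # A mag stripe just replays the same static data every time,
    # so a copied stripe is indistinguishable from the original.
    return pan

def chip_presentation(pan: str, counter: int) -> str:
    # A chip combines the PAN with a transaction counter under its secret
    # key, so every presentation yields a different, non-replayable value.
    msg = f"{pan}:{counter}".encode()
    return hmac.new(CARD_KEY, msg, hashlib.sha256).hexdigest()

pan = "4111111111111111"
print(magstripe_presentation(pan) == magstripe_presentation(pan))  # True: copies replay
print(chip_presentation(pan, 1) == chip_presentation(pan, 2))      # False: each use differs
```

The primary data (the PAN) is identical in both cases; what the chip adds is per-presentation metadata about originality and consent.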

Every major upgrade of credit card technology has improved the metadata that protects the primary data. All along, over several decades, the primary data has remained the same. But it has got better, thanks to metadata.

I analysed the evolution of cards in more detail here: A CMM for personal data carriers and digital wallets.

“Attributes” and “Claims”

The late great Kim Cameron — author of the prized Laws of Identity — carefully used the term “claim” in defining digital identity. The words claim, attribute and assertion might seem interchangeable but Kim singled out claim as “an assertion of the truth of something, typically one which is disputed or in doubt”.

He stressed that there is always uncertainty in the real world, and when authenticating another party, there is always going to be doubt over important attributes.

The trick is to reduce that doubt to acceptable levels. A relying party will always reserve the right to decide for itself if its doubts have been resolved.

Verifiable credentials technology provides multiple mechanisms for doing just that.

For one thing, a verifiable credential bears the name of the credential issuer. In many cases, there is a natural issuer of a credential of interest: driver licences are issued by government departments of motor vehicles, employee numbers are issued by employers, credit card numbers are issued by banks. When verifying a credential, one of the most important things to check is the issuer.

The familiarity of metadata in real life

This coupling of data and metadata is routine in the analogue world.

In courtroom dramas, stories turn on facts and evidence.  The facts tendered in a court case are only as good as the evidence. There are rules of evidence governing how information is obtained and safeguarded.

Facts and evidence in court procedures correspond to claims and proofs in digital identity. It’s all data and metadata.

How do you know?

In science, it’s not just what you know that matters but how do you know. What is the source of a statement or claim? Where is the evidence?

Children know this instinctively. As they develop a sense of how knowledge and trust are fluid, plucky kids will challenge the things they are told, with the riposte “How do YOU know?”.

Metadata and the stories behind the data

Metadata can tell the story behind the data, a story that is increasingly important in all things digital.

As data supply chains become ever more complicated, we need enhanced abilities to interrogate the information we receive and depend on — whether that’s a news report, a photographic image, a student’s essay, a CV, the results of an automobile’s emissions test, or a scientific report on climate change.

Where did a given piece of data come from? Who and/or what contributed to it?

The products of generative AI are starting to be watermarked, but that’s only a start. It will be important to know more: which algorithms and version numbers were used, where the models ran, how they were trained, and whether the training data was audited.

Looking at signatures as metadata

With this orientation, everywhere I look now, I see metadata!

A less obvious example of metadata is the digital signature.

A digital signature is a data value (technically a “cryptogram”) usually calculated by hashing and/or encrypting a record using a private key controlled by some actor. Note that I’m referring here to asymmetric or public key digital signatures.

The signature on a record can be checked at any future time to verify that a particular actor had something to do with that record, such as creating it or agreeing to it.

There are many different applications for digital signatures — but they are all used to create evidence that a given record at a certain time was touched in some way by a certain actor. That is, the signature tells a story about the history of a record. The digital signature is more metadata.
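A toy sketch makes the mechanics concrete. This uses textbook RSA with deliberately tiny primes, which is insecure and purely illustrative; real systems use vetted cryptographic libraries and much larger keys:

```python
import hashlib

# Toy RSA key pair (tiny primes, for illustration only).
p, q = 1000003, 1000033
n, e = p * q, 65537                  # public key, shared with verifiers
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent, known only to the signer

def digest(record: bytes) -> int:
    return int.from_bytes(hashlib.sha256(record).digest(), "big") % n

def sign(record: bytes) -> int:
    # "Hashing and/or encrypting a record using a private key": the
    # resulting signature is metadata carried alongside the record.
    return pow(digest(record), d, n)

def verify(record: bytes, signature: int) -> bool:
    # Anyone holding the public key (n, e) can check, at any later time,
    # that the key holder touched exactly this record.
    return pow(signature, e, n) == digest(record)

record = b"lab result: sample 42, negative"
sig = sign(record)
print(verify(record, sig))                   # True
print(verify(b"lab result: tampered", sig))  # False
```

The record itself carries no hint of its history; it is the signature, as metadata, that tells the story of who touched it and whether it has changed since.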

The post In praise of metadata appeared first on Lockstep.


SC Media - Identity and Access

3rd party cookies here to stay after Google changes privacy plan

After years of delays, Google ditched its plan to kill cookies, instead opting to let users make an “informed choice.”


Finicity

CFPB 1033 and Open Banking: Opportunities and Challenges for Banks

In this webinar from April 24, 2024 Tom Carpenter, Senior Vice President of Industry, Policy and Standards Engagement from Mastercard, along with panelists from Sidley and i2c, discussed the potential of CFPB Section 1033 for open banking initiatives with banks. 

They discussed how banks can leverage data sharing to enhance customer experiences, create new product offerings and navigate competition from fintechs. You will also learn about the opportunities represented by the rule and how to develop strategies to capitalize on the evolving landscape. 

Find out why the CFPB Section 1033 rule is crucial in advancing open banking, any potential risks associated with the rulemaking, best practices for compliance with Section 1033 and new opportunities to leverage data sharing to innovate and offer new services. 

You can watch the webinar here

The post CFPB 1033 and Open Banking: Opportunities and Challenges for Banks appeared first on Finicity.


Why keep open banking top of mind? CFPB regulation and new opportunities

In this webinar with the Consumer Bankers Association from April 25, 2024, Mastercard’s Ben Soccorsy and Jenny Ziegler tackled the impact open banking regulation will have on banks and the opportunity having a regulated ecosystem provides. 

They covered everything from the basics of open banking data, and how consumers can access their data from any of their financial institutions, to the way that data can be used for lending, financial management, wealth management, and payments. 

You can learn how the regulatory environment is accelerating the shift towards open banking. CFPB Dodd Frank Section 1033 is intended to break down barriers to accessing financial products, jump-start competition between financial institutions and fintechs and provide consumers more control and access to their financial data. 

Regulation will mandate that data providers must share their financial data with third parties or consumers via APIs safely and securely. The compliance deadline varies depending on the size of the financial institution. 

For banks, this means being on top of API enablement, consent management, information security, third party risk management, risk and compliance, data governance and data monetization strategies. 

You can watch the webinar here

The post Why keep open banking top of mind? CFPB regulation and new opportunities appeared first on Finicity.


KuppingerCole

CrowdStrike’s Cyber Blackout

by Mike Small

Some years ago, the book Blackout by Marc Elsberg described the impact on society of malicious software infecting the electricity supply network in Europe. Last week the whole world experienced the consequences of a faulty software update to the security software CrowdStrike Falcon. According to Mr. Kurtz, the CEO of CrowdStrike, this was not a malicious act. However, the impact of this error was considerable.

We Are All Now Dependent on Digital Systems

I can understand the feelings of the CrowdStrike team. When I was VP of development of security software, I remember very clearly how I felt when one night I was called and told that our security software was preventing clinicians in a paediatric hospital from accessing the systems and that unless they regained access within the hour, babies would start to die.

Developing security software places an extra burden on the development teams. This software is intended to protect organisations from malicious actors, however by its very nature that security software can also prevent the systems it is intended to protect from operating.

Over the past several years governments have recognised the increasing dependence of society on IT systems and how this brings the need for greater resilience of these systems. In Europe, this has resulted in legislation that includes NIS2 and DORA. The best control to ensure resilience is diversity. However, in the world of IT, most organisations are heavily dependent on systems that are delivered by a few suppliers. This is especially true for desktop systems, where Microsoft is the dominant supplier, and for cloud services, where AWS, Google, and Microsoft have the lion’s share of the market.

In this instance, the problem was that an update to security software caused the systems that were running it to crash, and the fix required a significant amount of manual intervention on each affected machine. The end users had no control over the deployment of this patch, as the security vendor pushed it out to all endpoints across the world within a very short period of time. The issue affected all CrowdStrike customers running Windows-based systems including PCs, servers, kiosks and other forms of specialist terminals.

This is actually not the first time it has happened – in fact, CrowdStrike had a similar issue with the Linux version of its software just a few months ago. An update incompatible with the latest version of Debian Linux was released, causing servers to crash and refuse to boot. Back then, it took the company weeks to acknowledge the issue and reveal that Debian Linux wasn’t covered by their test procedures, despite being officially supported.

Other cybersecurity vendors, including McAfee, Sophos, and Symantec, had similar issues over the last two decades, although they have never had such a global impact.

What an End User Organisation Must Do

Since this occurred through a defect in security software, the normal advice relating to the use of up-to-date security software is not very helpful. Additionally, these are infrequent but high-impact events which make planning hard. Here are some actions that organisations can take:

Include this in your Business Continuity Plan – Consider this kind of risk as part of your business continuity planning. Remember that as your organisation goes digital, it becomes more dependent upon IT, and cyber risks require special treatment. Cyber incidents spread very rapidly across interconnected components, so moving to another physical location does not help.

Resilience through Diversity – The most powerful control to ensure resilience is diversity, but this is difficult to achieve given the dominance of a small number of major suppliers. Consider whether the trade-off between cost and ease of management on the one hand, and the risk of cyber failure due to dependence upon a single IT environment on the other, is acceptable for your critical business systems. For life-critical systems, best practice requires three different software elements provided by three different suppliers to minimize risk. This is impractical for most situations, but you could consider deploying security software from multiple vendors across different parts of your IT estate.

Evaluate Vendor Risk – When choosing security software, include consideration of this kind of risk in your vendor assessment process. Evaluate the kinds of controls that the vendor has to prevent and mitigate this kind of error. These can include the software design and development processes, including testing and deployment. Does the vendor phase the deployment of updates with inbuilt feedback? Does the vendor allow you any control over the deployment of updates, and can updates be selectively deployed to groups of systems? Does it follow the standard-based practices of software supply chain security?

Incident Plan – Have a well-tested incident response plan and include this kind of event in your planning. Include and test how you would manage having to reimage or reboot a large portion of your IT estate. Do you have the tools and skills to manage this? Don’t forget that you need to verify whether you have backed up your data and are able to restore it in time.

Keep Calm and Carry On – Unfortunately, a lot of cybercriminals and even a handful of security vendors have already recognized this massive incident as an opportunity to exploit victims’ insecurity and vulnerable state. We can already observe a massive increase in phishing and other criminal activities focusing on CrowdStrike’s and Microsoft’s products. Some vendors are trying to push their own products as “more resilient” alternatives. However, the best thing you can do now is to avoid making rash decisions. Focus on addressing the immediate consequences of the outage and start looking for neutral expert guidance for adjusting your long-term security strategies, architectures, and portfolios. Focus on methods that can be proven and validated and avoid snake oil at all costs.


Indicio

Verifiable credentials like the DTC are transforming travel by rewiring the way information is shared


By Ken Ebert

When it comes to verifiable credentials, writes travel expert Mitra Sorrells, “there is little doubt that travel – as well as many other parts of everyday life – is on the verge of a fundamental, radical shift.”

For travel, that means a special kind of verifiable credential, called — unsurprisingly — a Digital Travel Credential or DTC. A DTC is based on standards from the International Civil Aviation Organization (ICAO) for creating digital equivalents of a passport.

In essence, a DTC is a digital version of a passport that a passenger creates by scanning their passport with an app on their smartphone. The app reads the machine-readable information on the photo page of the passport in order to unlock and read the data in the passport’s chip, and then checks the cryptographic information that proves the chip was issued by a bona fide passport authority. In addition, the app asks the passport holder to perform a biometric and liveness check, which is matched against the digital image in the passport’s chip. Only after all this checks out is a traveler issued a DTC.

The combination of cryptographic verification and biometric binding means that a government can be confident that the digital version of a passport is authentic, and belongs to the passport holder.

This is only a first security step. When a DTC is built on a verifiable credential it gains privacy and security superpowers (and this is why anything claiming to be a DTC that isn’t a verifiable credential isn’t really a DTC). 

A verifiable credential is 1) a way of “sealing” information such that if the information is altered, the seal breaks, and 2) a way of sourcing information so that you can be absolutely certain of where it originated and who it came from. 

Technically, there are new global standards and protocols and multiple layers of cryptographic wizardry — complex math — triangulating digital information so it can’t be copied, altered, or used by someone else. 
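As a toy illustration of the “seal” idea above (not the actual DTC cryptography, which uses issuer public-key signatures), here is a minimal Python sketch in which an HMAC stands in for the issuer’s signature; any alteration of the sealed claims breaks verification:

```python
import hmac, hashlib, json

ISSUER_KEY = b"demo-issuer-secret"  # stand-in for a real issuer signing key

def seal(claims: dict) -> dict:
    """Issue a credential: canonicalize the claims and 'seal' them."""
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "seal": sig}

def verify(credential: dict) -> bool:
    """The seal 'breaks' if any claim was altered after issuance."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["seal"])

cred = seal({"name": "A. Traveler", "over_18": True})
assert verify(cred)                    # untampered credential verifies
cred["claims"]["over_18"] = False
assert not verify(cred)                # altered claims break the seal
```

A real verifiable credential uses asymmetric signatures, so anyone can check the seal with the issuer’s public key without holding any secret.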

The result is the most powerful digital identity anyone has yet created – one that meets the standards of security that governments expect when it comes to protecting borders. 

And what can we do with this power? We can simplify the way information is used and verified so that processes and operations that were complicated, stressful, or just too difficult to do because of the risk, now become easy, instant, and secure.

Here are three ways a DTC simplifies travel:

The journey: Because airports, airlines, and governments can be confident that a person presenting a DTC is who they say they are, anything that once required manual checking can be automated. Travelers can be authorized to travel long before they leave for the airport. Airlines can quickly board passengers, certain that they are allowed to visit their destination. Borders can be crossed in seconds. All the congestion that currently slows down and stresses out travelers can be smoothed away. 

Privacy: DTCs and verifiable credentials do something radical that was previously impossible: they allow people to hold their personal data on their mobile device in a way that only they can access. Until now, personal data had to be stored in centralized databases by verifying parties in order to authenticate a person’s identity. This has turned into an enormous privacy and a security risk. But because a verifiable credential means you can be certain of the source of information and that it hasn’t been tampered with, this centralized model of identity access management is no longer needed, addressing many of the critical issues in data privacy and protection laws like GDPR. 

Security: Governments can do security and immigration checks ahead of travel. Airlines can be certain that all their passengers are allowed to visit the countries they are flying to, avoiding fines and repatriation. Manual document checks are replaced by cryptography.  And, of course, the risk of centralized data storage has been mitigated.

There are two other important opportunities that this technology creates. By removing friction and lines, airports can increase flow and capacity without increasing resources. This is critical as air travel is projected to keep growing. 

Verifiable credentials also expand the scope of customer personalization. In addition to being trusted digital identities, verifiable credential technology comes with its own way of communicating. It’s direct, one-to-one, and encrypted. This means each DTC has a consent-based way to create a secure trusted relationship with a traveler and take loyalty programs and service integration to the next level (again, this is why it’s important to make sure DTCs are verifiable credentials and not just fancy digital certificates).

In short, DTCs and verifiable credentials benefit everyone in the travel and tourism sector. And that’s because they rewire the way information travels so that it gets us to where we want to go, faster, more smoothly, and more securely.

For more information about verifiable credentials and digital travel, please visit us here or contact us.

###

The post Verifiable credentials like the DTC are transforming travel by rewiring the way information is shared appeared first on Indicio.


Shyft Network

Zero-Knowledge: The Future of More Secure and Scalable Blockchain

Zero-knowledge technology (ZK) enhances blockchain privacy and scalability. Recent innovations include ZK verifiers for Bitcoin and improved ZK protocols. Future ZK developments will focus on speed, usability, and efficiency.

Blockchain technology has transformed the world with its inherent transparency, immutability, and security. However, amidst the rising dominance of a few tech companies, concerns about increasing centralization and censorship underscore the need for enhanced privacy.

This is where zero-knowledge technology comes into play. It not only offers improved privacy but also provides a better solution to blockchain’s scalability issues.

Zero-knowledge is a broader category of cryptographic methods designed to preserve privacy by allowing one party to prove to another cryptographically that they possess knowledge about a piece of information without revealing the actual underlying details.

In blockchain, zero-knowledge proofs (ZKPs) use algorithms to process data and confirm its truthfulness.

Zero-knowledge proofs focus on three main criteria:

Zero-knowledge: The verifier cannot access the original input but can only confirm the statement’s validity.
Soundness: The protocol cannot validate invalid input as true.
Completeness: The protocol always validates the statement, provided the input is valid.

So, a basic zero-knowledge proof consists of three components. A witness who provides the secret information; a challenge, in which the verifier selects a question for the prover to answer; and a response, in which the prover answers the question.
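The witness/challenge/response flow described above can be sketched with a toy interactive Schnorr protocol, proving knowledge of a discrete logarithm. This is an illustration only, not from the article, and production systems use vetted, standardized groups and parameters:

```python
import secrets

# Toy interactive Schnorr protocol: the prover convinces the verifier
# that it knows x with y = g^x mod p, without revealing x.
p = 2**255 - 19          # a large prime (illustrative choice)
g = 2
q = p - 1                # exponents may be reduced mod p - 1 (Fermat)

# Witness: the prover's secret, with y as the public statement.
x = secrets.randbelow(q)
y = pow(g, x, p)

# Commitment: prover picks a random nonce r and sends t = g^r.
r = secrets.randbelow(q)
t = pow(g, r, p)

# Challenge: verifier selects a random question c.
c = secrets.randbelow(2**128)

# Response: prover answers using the witness.
s = (r + c * x) % q

# Verification: g^s == t * y^c holds only if the prover knows x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```

Repeating the exchange, or deriving the challenge from a hash (the Fiat–Shamir transform), drives a cheating prover’s success probability toward zero; that transform is what turns an interactive protocol into a non-interactive one.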

There are several types of Zero-Knowledge Proofs (ZKPs):

Interactive ZKPs: These require several exchanges between the prover and the verifier.
Non-interactive ZKPs: Once set up, these do not require any further interaction between the prover and the verifier.
SNARKs and zk-SNARKs: SNARKs provide succinct proofs that can be quickly verified, while zk-SNARKs add the zero-knowledge property on top of that succinctness.

ZKPs Taking Center Stage

While the basic concept of cryptographic primitives has existed for the past couple of decades, its development accelerated substantially only after the introduction of Bitcoin and Ethereum due to the technology’s ability to scale blockchains.

By enabling one person to demonstrate to another that a computation was performed correctly without redoing the work or sharing the data used, ZKPs streamline and speed up the verification process.

This efficiency reduces costs and accelerates transactions on blockchains like Bitcoin, as it eliminates the need for every node to re-execute each transaction.

Instead, a single node handles the processing and then uses a ZKP to prove its accuracy, while the other nodes only need to verify this proof. Thus, ZKPs facilitate the development of a financial system that, unlike traditional finance, does not depend on social trust.

With the help of zero-knowledge technology, crypto users can also maintain their anonymity on public blockchains, where all the transaction history is for everyone to see, track, and monitor.

The tech actually allows for private identity verification, allowing for compliance while eliminating the need to reveal the data itself. This way, it even takes the load off the blockchain, offering scalability benefits.

In the crypto world, teams like Polyhedra and Lambda Class are actively exploring this topic. The venture studio and investment firm Lambda Class sees SNARKs having a significant impact on shaping our world. Earlier this year, it proposed a simple and modular bridge that uses multi-storage proofs.

Ethereum co-founder Vitalik Buterin, too, has expressed his support for implementing zero-knowledge (ZK) technology to achieve user privacy, censorship resistance, and autonomy. He believes in the future, all rollups, a solution to improve blockchain scalability, will actually be ZK.


“zk-SNARKs will be as important as blockchains in the next ten years,” Buterin said last year. He has long been a proponent of this cryptographic tech to help overcome the problem of scalability and privacy.

Beyond crypto, ZKPs will also become essential for verifying whether content was produced by AI models. Moreover, in the coming decades, ZKPs are likely to play a key role in enhancing efficiency, securing devices, and ensuring national security.

The Latest Developments

Over the past few years, zero-knowledge proofs have been widely used to scale Ethereum. Now, they are also recognized as a crucial element for unlocking Bitcoin’s programmability. Weikeng Chen, a PhD graduate from UC Berkeley sponsored by infrastructure company StarkWare, achieved the first implementation of a zero-knowledge verifier using Bitcoin script.

The historic milestone came after three months of exploring the possibilities of the technical proposal “OP_CAT” to expand Bitcoin’s capabilities by introducing smart contract functionality to the network. StarkWare’s ZK verifier marks the first large-scale practical application of the proposal’s opcode on the testnet, Bitcoin Signet.

“This was a tremendous effort and took a significant amount of time,” said Chen in an interview. “We started with nothing… We had to build the full stack, which eventually led to the implementation of the STARK verifier.”

While challenges remain, contributors at StarkWare believe the project’s success represents “a monumental leap forward” towards Bitcoin scaling solutions that can use its ZK roll-up technology.

Other developments in the space include Plonky3, the latest version of Polygon Labs’ zero-knowledge proving system, which aims to enhance efficiency and security in distributed networks. While its previous versions had limitations in flexibility and adaptability, Plonky3 empowers developers to leverage ZK technology to build their own zkVM or zkEVM virtual machines.

With Plonky3, which has undergone auditing, the goal is to encourage innovation and community collaboration. Its versatility stems from its capability to adapt to various finite fields such as Goldilocks, BabyBear, and Mersenne31, as well as hash functions such as Keccak-256, BLAKE3, and Poseidon.

This development comes a month after Polygon Labs acquired the tech firm Toposware, its third ZK-focused team. So far, the organization has invested $1 billion in zero-knowledge technology.

“ZK is easier, not only from a development perspective but for users and user experience too,” a Polygon spokesperson told local media.

These efforts by Polygon Labs aim to enhance interoperability. Yet, ZK-based technology struggles with compatibility issues within EVM networks and requires a Type 1 Prover to confirm the validity of a transaction to a blockchain. In response, Polygon Labs and Toposware are jointly developing such a Prover.

In April this year, VC giant Andreessen Horowitz (a16z) unveiled the release of its zero-knowledge virtual machine (zkVM) to help its portfolio companies scale their operations. a16z is also an investor in Matter Labs, a leading zkEVM maker.

ZK Proofs, according to the firm’s researcher and associate professor at Georgetown University, Justin Thaler, scale blockchains by doing the hard work off-chain. Not all nodes have to do all the work, but they get the guarantee that the work was done correctly.

The Future of Zero-Knowledge Tech

With scalability continuing to be a big challenge for blockchains, zero-knowledge technology has vast potential in the crypto space. For now, the implementation of ZK tech is in its early stages. However, the growing demand for privacy on public blockchains is expected to lead to growth and advancement in this technology.

The future of this tech focuses on prioritizing speed, improving developer tooling, reducing hardware requirements, enhancing flexibility, and widening support. With these advancements, we’ll truly see ZK’s transformative potential, leading to a more scalable and secure blockchain world.

About Shyft Network

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and the fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution while protecting user data.

Visit our website to read more, and follow us on X (Formerly Twitter), GitHub, LinkedIn, Telegram, Medium, and YouTube. Sign up for our newsletter to keep up-to-date on all things privacy and compliance.

Zero-Knowledge: The Future of More Secure and Scalable Blockchain was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


SC Media - Identity and Access

Three ways to thwart non-human identity attacks

Why having a plan for managing NHIs has become a requirement.



Linx Security emerges from stealth with $33M

Linx has touted its platform to significantly ease the discovery and removal of unused employee accounts.



Ockto

How source data speeds up the granting of sustainability loans at Warmtefonds

Seven billion euros in loans: that is how much the Warmtefonds wants to provide to households and educational institutions by 2030 to make homes and buildings more sustainable.



Spruce Systems

Provably Forgotten Signatures: Adding Privacy to Digital Identity

We can enhance existing digital identity systems to support an important privacy feature known as “unlinkability”: sharing attributes without attribution.

Thank you to Ryan Hurst (SpruceID Advisor, former Google/Microsoft), Dan Boneh (Stanford), Abhi Shelat (Google/Northeastern), Foteini Baldimtsi (GMU), and Dick Hardt (Hellō) for reviewing the technical approach in this article, and providing several suggestions which improved the work. 

At SpruceID, our mission is to let users control their data across the web. We build systems based on Verifiable Digital Credentials (VDCs) to make the online and offline worlds more secure, while protecting the privacy and digital autonomy of individuals.

Developing models to implement this VDC future requires carefully thinking through every risk of the new model, including risks that lie in the future. One of the edge-case risks privacy researchers have identified is sometimes known as “linkability.”

Linkability refers to the possibility of profiling people by collating data from their use of digital credentials. This risk commonly arises when traceable digital signatures or identifiers are used repeatedly, allowing different parties to correlate many interactions back to the same individual, thus compromising privacy. This can create surveillance potential across societies, whether conducted by the private sector, state actors, or even foreign adversaries.

In this work, we explore an approach that adds privacy by upgrading existing systems to prevent linkability (or “correlation”) rather than overhauling them entirely. It aims to be compatible with already-deployed implementations of digital credential standards such as ISO/IEC 18013-5 mDL, SD-JWT, and W3C Verifiable Credentials, while also aligning with cryptographic security standards such as FIPS 140-2/3. It is compatible with, and can even pave the way for, future privacy technologies such as post-quantum cryptography (PQC) and zero-knowledge proofs (ZKPs), while unlocking beneficial use cases today.

Why This Matters Now 

Governments are rapidly implementing digital identity programs. In the US, 13 states already have live mobile driver’s license (mDL) programs, with over 30 more considering them, and the number is growing. Earlier this year, the EU approved a digital wallet framework that will mandate live digital wallets across its member states by 2026. This continues the momentum of the last generation of digital identity programs, which saw remarkable uptake, such as India’s Aadhaar, used by over 1.3 billion people. However, it is not clear that these frameworks plan for guarantees like unlinkability in the base technology, yet the adoption momentum keeps building.

Some think that progress on digital identity programs should stop entirely until perfect privacy is solved. However, that train has long left the station, and calls to dismantle what already exists, has sunk costs, and seems to function may fall on deaf ears. There are indeed incentives for the momentum to continue: demands for convenient online access to government services or new security systems that can curb the tide of AI-generated fraud. Also, it’s not clear that the best approach is to design the “perfect” system upfront, without the benefit of iterative learning from real-world deployments.

In the following sections, we examine two privacy risks that may already exist in identity systems today, and mitigation strategies that can be added incrementally.

Digital ID Risk: Data Linkability via Collusion

One goal for a verifiable digital credential system is that a credential can be used to present only the necessary facts in a particular situation, and nothing more. For instance, a VDC could prove to an age-restricted content website that someone is over a certain age, without revealing their address, date of birth, or full name. This ability to limit disclosures allows the use of functional identity, and it’s one big privacy advantage of a VDC system over today’s identity systems that store a complete scan of a passport or driver’s license. However, even with selective disclosure of data fields, those presentations can unintentionally become linkable if the same unique values are reused across verifiers.

In our example, if a user proves their age to access an age-restricted content website (henceforth referred to simply as “content website”), and then later verifies their name at a bank, both interactions may run the risk of revealing more information than the user wanted if the content website and bank colluded by comparing common data elements they received. Although a check for “over 18 years old” and a name don’t have any apparent overlap, there are technical implementation details such as digital signatures and signing keys that, when reused across interactions, can create a smoking gun.

Notably, the same digital signature is uniquely distinguishable, and also new signatures made from the same user key can be correlated. This can all work against the user to reveal more information than intended.

Verifier-Verifier Collusion

To maximize privacy, these pieces of data presented using a VDC should be “unlinkable.” For instance, if the same user who’d proven their age at a content website later went to a bank and proved their name, no one should be able to connect those two data points to the same ID holder, not even if the content website and the bank work together. We wouldn’t want the bank to make unfair financial credit decisions based on the perceived web browsing habits of the user.

However, VDCs are sometimes built on a single digital signature, a unique value that can be used to track or collate information about a user if shared repeatedly with one or more parties. If the content website in our example retains the single digital signature created by the issuing authority, and that same digital signature was also shared with the bank, then the content website and the bank could collude to discover more information about the user than what was intended.

The case where two or more verifiers of information can collude to learn more about the user is known as verifier-verifier collusion and can violate user privacy. While a name-age combination may seem innocuous, a third-party data collector could, over time, assemble a variety of data about a user simply by tracking their usage of unique values across many different verifiers, whether online or in-person. At scale, these issues can compound into dystopian surveillance schemes by allowing every digital interaction to be tracked and made available to the highest bidders or an unchecked central authority.

Cycling Signatures to Prevent Verifier-Verifier Collusion

Fortunately, a simple solution exists to help prevent verifier-verifier collusion by cycling digital signatures so that each is used only once. When a new VDC is issued by a post office, DMV, or other issuer, it can be provisioned not with a single signature from the issuing authority that produces linkable usage, but with many different signatures from the issuing authority. If user device keys are necessary for using the VDC, as in the case of mobile driver’s licenses, several different keys can be used as well. A properly configured digital wallet would then use a fresh signature (and potentially a fresh key) every time an ID holder uses their VDC to attest to particular pieces of information, ideally preventing linkage to the user through the signatures.

Using our earlier example of a user who goes to a content website and uses their VDC to prove they are over 18, the digital wallet presents a signature for this interaction, and doesn’t use that signature again. When the user then visits their bank and uses a VDC to prove their name for account verification purposes, the digital wallet uses a new signature for that interaction.

Because the signatures are different across each presentation, the content website and the bank cannot collude to link these two interactions back to the same user without additional information. The user can even use different signatures every time they visit the same content website, so that the content website cannot even tell how often the user visits from repeated use of their digital ID.
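A minimal sketch of this signature-cycling idea, with an HMAC standing in for the issuer’s real asymmetric digital signature (the names and structure here are illustrative assumptions, not SpruceID’s implementation):

```python
import hmac, hashlib, secrets

# Illustrative only: an HMAC stands in for the issuer's real digital
# signature so the example stays self-contained.
ISSUER_KEY = secrets.token_bytes(32)

def issue_credential(claims: bytes, batch_size: int = 5):
    """Provision a credential with a batch of one-time signatures.

    Each signature covers the same claims plus a fresh random salt,
    so no two presentations share a correlatable value."""
    batch = []
    for _ in range(batch_size):
        salt = secrets.token_bytes(16)
        sig = hmac.new(ISSUER_KEY, salt + claims, hashlib.sha256).digest()
        batch.append((salt, sig))
    return batch

class Wallet:
    """Holds the batch and uses each signature exactly once."""
    def __init__(self, claims: bytes, batch):
        self.claims = claims
        self._batch = list(batch)

    def present(self):
        if not self._batch:
            raise RuntimeError("out of one-time signatures; re-provision from the issuer")
        salt, sig = self._batch.pop()
        return self.claims, salt, sig

def verify(claims: bytes, salt: bytes, sig: bytes) -> bool:
    expected = hmac.new(ISSUER_KEY, salt + claims, hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

# Two presentations to two different verifiers share no common signature,
# so colluding verifiers cannot link them through the values they saw.
wallet = Wallet(b"over_18=true", issue_credential(b"over_18=true"))
p1 = wallet.present()
p2 = wallet.present()
assert verify(*p1) and verify(*p2) and p1[2] != p2[2]
```

Note that this symmetric stand-in does not solve issuer-verifier collusion: the issuer still knows every salt and signature it minted, which is exactly the problem the provably forgotten approach below addresses.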

Issuer-Verifier Collusion

A harder problem to solve is known as “issuer-verifier” collusion. In this scenario, the issuer of an ID–or, more likely, a rogue agent within the issuing organization–remembers a user’s unique values (such as keys or digital signatures) and, at a later time, combines them with data from places where those keys or signatures are used. This is possible even in architectures without “phone home” because issuing authorities (such as governments or large institutions) often have power over organizations doing the verifications, or have been known to purchase their logs from data brokers. Left unsolved, the usage of digital identity attributes could create surveillance potential, like leaving a trail of breadcrumbs that can be used to re-identify someone if recombined with other data the issuer retains.

Approaches Using Zero-Knowledge Proofs

Implementing advanced cryptography for achieving unlinkability, such as with Boneh–Boyen–Shacham (BBS) signatures in decentralized identity systems, has recently gained prominence in the digital identity community. These cryptographic techniques enable users to demonstrate possession of a signed credential without revealing any unique, correlatable values from the credentials.

Previous methods like AnonCreds and U-Prove, which rely on RSA signatures, paved the way for these innovations. Looking forward, techniques such as zk-SNARKs and zk-STARKs, which, when implemented with certain hashing algorithms or primitives such as lattices, can support requirements for post-quantum cryptography, offer potential advancements originating from the blockchain ecosystem.

However, integrating these cutting-edge cryptographic approaches into production systems that meet rigorous security standards poses challenges. Current standards like FIPS 140-2 and FIPS 140-3, which outline security requirements for cryptographic modules, present compliance hurdles for adopting newer cryptographic algorithms such as the BLS12-381 curve used in BBS and many zk-SNARK implementations. High-assurance systems, like state digital identity platforms, often mandate cryptographic operations to occur within FIPS-validated Hardware Security Modules (HSMs). This requirement necessitates careful consideration, as implementing these technologies outside certified HSMs could fail to meet stringent security protocols.

Moreover, there's a growing industry shift away from RSA signatures due to concerns over their long-term security and increasing emphasis on post-quantum cryptography, as indicated by recent developments such as Chrome's adoption of post-quantum ciphers.

Balancing the need for innovation with compliance with established security standards remains a critical consideration in advancing digital identity and cryptographic technologies.

A Pragmatic Approach for Today: Provably Forgotten Signatures

Given the challenges in deploying zero-knowledge proof systems in today’s production environments, we are proposing a simpler approach that, when combined with key and signature cycling, can provide protection from both verifier-verifier collusion and issuer-verifier collusion by using confidential computing environments: the issuer can forget the unique values that create the risk in the first place, and provide proof of this deletion to the user. This is implementable today, and would be supported by existing hardware security mechanisms that are suitable for high-assurance environments.

It works like this:

1. During the final stages of digital credential issuance, all unique values, including digital signatures, are processed in plaintext exclusively within a confidential-computing Trusted Execution Environment (TEE) on the issuer’s server-side infrastructure.
2. Issuer-provided data required for credential issuance, such as fields and values from a driver’s license, is securely transmitted to the TEE.
3. Sensitive user inputs, such as unique device keys, are encrypted before being transmitted to the TEE, ensuring they remain accessible only within its secure confines.
4. Within the TEE, the assembled issuer and user values are digitally signed using a dedicated security module accessible solely by the TEE, generating a digital credential payload.
5. The resulting digital credential payload is encrypted using the user’s device key and securely stored within the device’s hardware.
6. Upon completion, an attestation accompanies the credential, verifying that the entire process adhered to stringent security protocols.
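The flow just described might be sketched as follows. This is a minimal illustration under stated assumptions, not a real TEE or HSM API: HMAC stands in for an HSM-backed signature, SHAKE-derived XOR stands in for an AEAD cipher, and all names are invented.

```python
import hashlib
import hmac
import secrets

# Reachable only by the TEE's dedicated security module (illustrative).
HSM_SIGNING_KEY = secrets.token_bytes(32)

def tee_issue_credential(issuer_fields: dict, user_device_key: bytes) -> dict:
    """Assemble, sign, encrypt, and then forget -- all inside the enclave."""
    # Assemble the payload from issuer-provided fields received by the TEE.
    payload = "&".join(f"{k}={v}" for k, v in sorted(issuer_fields.items())).encode()

    # Sign with the dedicated module key (HMAC stands in for a real signature).
    signature = hmac.new(HSM_SIGNING_KEY, payload, hashlib.sha256).digest()

    # Encrypt payload plus the fixed 32-byte signature under the user's device
    # key, so only the user's hardware can decrypt the credential.
    blob = payload + signature
    keystream = hashlib.shake_256(user_device_key).digest(len(blob))
    ciphertext = bytes(a ^ b for a, b in zip(blob, keystream))

    # Attest that the plaintext signature never left the enclave and has been
    # erased ("provably forgotten"); nothing unique is retained by the issuer.
    attestation = {"process": "signed-then-erased-in-TEE",
                   "payload_digest": hashlib.sha256(payload).hexdigest()}
    return {"encrypted_credential": ciphertext, "attestation": attestation}

# In practice the device key would itself be encrypted in transit to the TEE.
device_key = secrets.token_bytes(32)
result = tee_issue_credential({"family_name": "Example", "dob": "1990-01-01"}, device_key)
```

Only the user-encrypted blob and the attestation leave the enclave; the plaintext signature exists solely within the TEE's memory during issuance.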

This approach ensures:

Protection Against Collusion: By employing confidential computing and strict segregation of cryptographic operations within a TEE, the risk of verifier-verifier and issuer-verifier collusion is mitigated.

Privacy and Security: User data remains safeguarded throughout the credential issuance process, with sensitive information encrypted and managed securely within trusted hardware environments.

Compliance and Implementation: Leveraging existing hardware security mechanisms supports seamless integration into high-assurance environments, aligning with stringent regulatory and security requirements.

By prioritizing compatibility with current environments instead of wholesale replacement, we propose that existing digital credential implementations, including mobile driver’s licenses operational in 13 states and legislatively approved in an additional 18 states, could benefit significantly from upgrading to incorporate this technique. This upgrade promises enhanced privacy features for users without necessitating disruptive changes.

New Approach, New Considerations

However, as with all new approaches, there are some considerations when using this one as well. We will explore a few of them, but this is not an exhaustive list.

The first consideration is that TEEs have been compromised in the past, and so they are not foolproof. Therefore, this approach is best incorporated as part of a defense-in-depth strategy, where there are many layered safeguards against a system failure. Many of the critical TEE failures have resulted from multiple things that go wrong, such as giving untrusted hosts access to low-level system APIs in the case of blockchain networks, or allowing arbitrary code running on the same systems in the case of mobile devices.

One benefit of implementing this approach within credential issuer infrastructures is that the environment can be better controlled, and so more forms of isolation are possible to prevent these kinds of vulnerability chaining. Issuing authorities are not likely to allow untrusted hosts to federate into their networks, nor would they allow arbitrary software to be uploaded and executed on their machines. There are many more environmental controls possible, such as intrusion detection systems, regular firmware patching, software supply chain policies, and physical security perimeters.

We are solving the problem by shifting the trust model: the wallet trusts the hardware (TEE manufacturer) instead of the issuing authority.

Another consideration is that certain implementation guidelines for digital credentials recommend retention periods for unique values for issuing authorities. For example, AAMVA’s implementation guidelines include the following recommendations for minimum retention periods: 

Source: AAMVA Mobile Driver's License Implementation Guidelines, r1.2

To navigate these requirements, it is possible to ensure that the retention periods are enforced within the TEE by allowing for deterministic regeneration of the materials only during a fixed window when requested by the right authority. The request itself can create an auditable trail to ensure legitimate usage. Alternatively, some implementers may choose to override (or update) the recommendations to prioritize creating unlinkability over auditability of certain values that may be of limited business use.
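As a rough sketch of that idea (the root-secret derivation, window check, and field names here are assumptions for illustration, not a specification): the TEE stores no unique values, re-derives them deterministically only inside the retention window, and records every request.

```python
import hashlib
import hmac
import time

# Held only inside the TEE; never exported (illustrative).
TEE_ROOT_SECRET = b"enclave-held root secret (illustrative)"
RETENTION_SECONDS = 60 * 60 * 24 * 365  # e.g. a one-year minimum retention period

audit_log = []

def regenerate_material(record_id, issued_at, requester, now=None):
    """Deterministically re-derive a record's unique values inside the window."""
    now = time.time() if now is None else now
    if not (issued_at <= now <= issued_at + RETENTION_SECONDS):
        raise PermissionError("outside the enforced retention window")
    # The request itself creates an auditable trail of who asked and when.
    audit_log.append({"record": record_id, "requester": requester, "at": now})
    # Deterministic derivation: the same record always yields the same bytes,
    # so the plaintext never needs to be stored between requests.
    return hmac.new(TEE_ROOT_SECRET, record_id.encode(), hashlib.sha256).digest()
```

After the window closes, the derivation refuses to run, and because nothing was stored in plaintext there is nothing left to retain or leak.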

A third consideration is increased difficulty for the issuing authority to detect compromise of key material if they do not retain the signatures in plaintext. To mitigate this downside, it is possible to use data structures that are able to prove set membership status (e.g., was this digital signature issued by this system?) without linking to source data records or enumeration of signatures, such as Merkle trees and cryptographic accumulators. This allows for the detection of authorized signatures without creating linkability. It is also possible to encrypt the signatures so that only the duly authorized entities, potentially involving judicial processes, can unlock the contents.
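A minimal Merkle-tree sketch of this set-membership check: the issuer commits to the set of issued signatures via a single root and discards the plaintext, and later any one signature can be proven to belong to the set without enumerating or linking the others. (The signature strings below are placeholders.)

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root over hashed leaves; the issuer retains only this commitment."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate last node on odd levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling path proving membership of one leaf without revealing the rest."""
    level = [h(leaf) for leaf in leaves]
    path = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index + 1 if index % 2 == 0 else index - 1
        path.append((level[sibling], index % 2 == 0))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(root, leaf, path):
    node = h(leaf)
    for sibling, leaf_on_left in path:
        node = h(node + sibling) if leaf_on_left else h(sibling + node)
    return node == root

# The issuer hashes each issued signature into the tree, publishes the root,
# and deletes the signatures themselves.
signatures = [f"signature-{i}".encode() for i in range(8)]
root = merkle_root(signatures)
proof = merkle_proof(signatures, 3)
assert verify(root, signatures[3], proof)
```

A cryptographic accumulator offers the same "was this signature issued by this system?" check with constant-size witnesses; the Merkle variant is shown here because it needs only a hash function.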

Paving the Way for Zero-Knowledge Proofs

We believe that the future will be built on zero-knowledge proofs that support post-quantum cryptography. Every implementation should consider how it may eventually transition to these new proof systems, which are becoming faster and easier to use and can provide privacy features such as selective disclosure across a wide variety of use cases.

Already, there is fast-moving research on using zero-knowledge proofs in wallets to demonstrate knowledge of unique signatures and possibly the presence of a related device key for payloads from existing standards such as ISO/IEC 18013-5 (mDL), biometric templates, or even live systems like Aadhaar. In these models, it’s possible for the issuer to do nothing different, and the wallet software is able to use zero-knowledge cryptography with a supporting verifier to share attributes without attribution.

These “zero-knowledge-in-the-wallet” approaches require both the wallet and the verifier to agree on implementing the technology, but not the issuer. The approach outlined in this work requires only the issuer to implement the technology. They are not mutually exclusive, and it is possible to have both approaches implemented in the same system. Combining them may be especially desirable when there are multiple wallets and/or verifiers, to ensure a high baseline level of privacy guarantee across a variety of implementations.

However, should the issuer, wallet, and verifier (and perhaps coordinating standards bodies such as the IETF, NIST, W3C, and ISO) all agree to support the zero-knowledge approach atop quantum-resistant rails, then it’s possible to move the whole industry forward while smoothing out the new privacy technology’s rough edges. This is the direction we should go towards as an industry.

Tech Itself is Not Enough

While these technical solutions can bring enormous benefits to baseline privacy and security, they must be combined with robust data protection policies to result in safe user-controlled systems. If personally identifiable information is transmitted as part of the user’s digital credential, then by definition it is correlatable; privacy cannot be addressed at the technical protocol level and must be addressed by policy.

For example, you can’t unshare your full name and date of birth. If your personally identifiable information was sent to an arbitrary computer system, then no algorithm on its own can protect you from the undercarriage of tracking and surveillance networks. This is only a brief sample of the kind of problem that only policy is positioned to solve effectively. Other concerns range from potentially decreased accessibility if paper solutions are no longer accepted, to normalizing the sharing of digital credentials towards a “checkpoint society.”

Though it is out of scope of this work, it is critical to recognize the important role of policy to work in conjunction with technology to enable a baseline of interoperability, privacy, and security.

The Road Ahead

Digital identity systems are being rolled out in production today at a blazingly fast pace. While they utilize today’s security standards for cryptography, their current deployments do not incorporate important privacy features into the core system. We believe that ultimately we must upgrade digital credential systems to post-quantum cryptography that can support zero-knowledge proofs, such as ZK-STARKs, but the road ahead is a long one given the timelines it takes to validate new approaches for high assurance usage, especially in the public sector.

Instead of scorching the earth and building anew, our proposed approach can upgrade existing systems with new privacy guarantees around unlinkability by changing out a few components, while keeping in line with current protocols, data formats, and requirements for cryptographic modules. With this approach, we can leave the door open for the industry to transition entirely to zero-knowledge-based systems. It can even pave the path for them by showing that it is possible to meet requirements for unlinkability, so that when policymakers review what is possible, there is a readily available example of a pragmatic implementation. 

We hope to collaborate with the broader community of cryptographers, public sector technologists, and developers of secure systems to refine our approach toward production usage. Specifically, we wish to collaborate on:

Enumerated requirements for TEEs around scalability, costs, and complexity to implement this approach, so that commercial products such as Intel SGX, AMD SEV, Arm TrustZone, AWS Nitro Enclaves, Azure Confidential Computing, IBM Secure Execution, or Google Cloud Confidential Computing can be considered against those requirements.
A formal paper with rigorous evaluation of the security model using data flows, correctness proofs, protocol fuzzers, and formal analysis.
Prototyping using real-world credential formats, such as ISO/IEC 18013-5/23220-* mdocs, W3C Verifiable Credentials, IMS OpenBadges, or SD-JWTs.
Evaluation of how this approach meets requirements for post-quantum cryptography.
Drafting concise policy language that can be incorporated into model legislation or agency rulemaking to create the requirement for unlinkability where deemed appropriate.

If you have any questions or interest in participation, please get in touch. I will be turning this blog post into a paper by adding reviews of related work, explanations, and some other key sections.

About SpruceID: SpruceID is building a future where users control their identity and data across all digital interactions. We believe that instead of people signing into platforms, platforms should sign into people’s data vaults. Our products power privacy-forward verifiable digital credentials solutions for businesses and governments, with initiatives like state digital identity (including mobile driver’s licenses), digital learning and employment records, and digital permits and passes.

Monday, 22. July 2024

Indicio

Gartner names Indicio as a representative vendor in decentralized identity market guide

Research firm says over half-a-billion people will be using verifiable credentials by 2026 and the decentralized identity market will likely be worth over $3 billion by 2031.

By Tim Spring 

Research firm Gartner says the market for decentralized identity is rapidly evolving and that decentralization will fundamentally change how we manage identity, privacy, and security. 

As conventional identity access management continues to struggle with identity authentication, data breaches, and privacy issues, decentralized identity, says Gartner, is now a viable market solution.

And with the European Union forging ahead with digital identity (eIDAS 2.0) and specifications for a digital identity wallet (EUDI), hundreds of millions of people will soon be able to use verifiable credentials to seamlessly access government services from simple digital wallet applications on their mobile devices.

These insights and more come from Gartner’s 2024 Market Guide for Decentralized Identity, which lists Indicio as a representative vendor. Indicio is the first company specializing in decentralized identity technology to have a complete solution — Indicio Proven® — available in both AWS and Google Cloud Marketplace, and we have customers around the world deploying solutions. Indicio is also the first company to create certified training in all aspects of decentralized identity technology with Indicio Academy.

But to underscore just how rapidly the technology is evolving, the Market Guide omits several emerging sector use cases and technical developments driving adoption and innovation.

Digital Travel Credentials (DTC) based on International Civil Aviation Organization (ICAO) standards for digital passports are now being used by a government — Aruba — and multiple airlines (this is a key area for Indicio and our partner SITA; in collaboration, we successfully launched the world’s first DTC). 

With a DTC, a traveler is able to create a “government-grade” digital identity from their passport and can be preauthorized for border crossing before they leave home (thus enabling the actual border to be crossed seamlessly, in a few seconds). 

Given the power of a DTC as a digital identity, it can be used for much more than border crossing. As Phocuswire’s Mitra Sorrells recently noted, the result of this technology is that “Travel… is on the verge of a fundamental, radical shift.” 

Verifiable credentials are also being used to verify more than a person’s identity. New Zealand’s Digital Farm Wallet project is using verifiable credentials to authenticate devices and sustainability claims in agriculture, so that farmers can easily and repeatedly prove compliance requirements in order to access funding and meet other needs.

Given the number of connected devices and their uses, verifiable identity means verifiable data. It’s not just about identity management; it’s about being able to create seamless operations and processes based on trusted data.

One particularly powerful aspect of decentralized identity that many businesses and governments are not yet aware of is the ability to add a secure communications channel between decentralized identities. Decentralized Identifier Communication, or DIDComm, enables trusted secure communication, such that a bank is able to seamlessly authenticate a customer and a customer their bank before any data is shared. 

This feature provides a simple, cost-effective way not only to implement zero-trust security but to manage the threat of generative AI identity fraud, aka “deepfake phishing.” A bank can use the trusted channel to confirm that the call center caller is, in fact, their customer.

DIDComm has significant business value beyond security in that it enables rich, consent-based communication between businesses and customers, which when combined with verifiable credentials allows for an entirely new level of trust. At the recent KuppingerCole European Identity and Cloud Conference, DIDComm was the emergent protocol, notably through ID Union’s project to make it work with the European Union’s favored protocols, OID4VC.

The key takeaway from the Gartner report is that this technology is gaining traction a lot faster than you (and even Gartner) think. So what should you do if you are interested in learning more?

If you’re a Gartner subscriber, the report is here.

If you want to get your head around the technology, we recommend our Beginner’s Guide to Decentralized Identity.

And if you really want something to chew on, book a free workshop with us, where we’ll take a use case of your choice and show you how we can either save you money, make you money, or mitigate risk by using verifiable credentials. Contact us here!

####

Sign up to our newsletter to stay up to date with the latest from Indicio and the decentralized identity community

The post Gartner names Indicio as a representative vendor in decentralized identity market guide appeared first on Indicio.


auth0

A Developer’s Journey with User Authentication: Species360

How Auth0 enhances this nonprofit startup’s use case

Extrimian

DWN free Community Node for Data Management


At Extrimian, we use Decentralized Web Nodes (DWNs) to enhance data management and security. Our approach ensures that sensitive information is accessible when needed, especially in critical situations where immediate and active authorization is not possible.

This blog post will explain how to use DWNs to manage access to private information, highlighting our solutions and the vital role of authorization protocols in emergency scenarios.

Emergency Access Use Case

Imagine being involved in an accident and losing consciousness. Emergency responders need instant access to your medical records to provide optimal care. With a DWN setup, this can be efficiently managed through pre-authorized access to specific health information for emergency services.
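As a rough sketch of such pre-authorized emergency access (the issuer names, roles, and policy fields below are illustrative assumptions, not Extrimian’s actual API):

```python
from dataclasses import dataclass

# Authorities whose credentials the user has chosen to trust in advance.
TRUSTED_ISSUERS = {"city-health-authority"}

@dataclass
class Credential:
    subject: str  # who is asking, e.g. "paramedic-unit-12"
    role: str     # e.g. "emergency-responder"
    issuer: str   # authority that issued the credential

# The user's pre-defined permission: which roles may read which record fields
# without an active, in-the-moment authorization.
emergency_policy = {"emergency-responder": {"blood_type", "allergies"}}

medical_record = {"blood_type": "O-", "allergies": ["penicillin"],
                  "full_history": "private"}

def request_access(cred: Credential, fields: set) -> dict:
    if cred.issuer not in TRUSTED_ISSUERS:
        raise PermissionError("credential not issued by a recognized authority")
    allowed = emergency_policy.get(cred.role, set())
    if not fields <= allowed:
        raise PermissionError("requested fields exceed pre-authorized scope")
    # Only the pre-authorized subset of the record is released.
    return {f: medical_record[f] for f in fields}

cred = Credential("paramedic-unit-12", "emergency-responder", "city-health-authority")
print(request_access(cred, {"blood_type"}))
```

The responder sees only what the user pre-authorized; any request beyond that scope, or from an unrecognized issuer, is refused.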

Authorization Protocols:

Pre-Defined Permissions: Users can pre-authorize emergency services to access their medical records stored in DWNs. This is facilitated by granting access based on credentials issued by recognized authorities, verifying the identity and purpose of the requester.

Seamless Data Access: Once authenticated, emergency personnel can swiftly access the necessary data, ensuring they have the information required to treat you effectively, even in the absence of active authorization.

Additional Scenarios for Authorization Protocols

The versatility of DWNs extends beyond emergency scenarios, offering pre-authorized access in various other contexts:

Health Applications: Health monitoring applications can periodically access medical data to provide personalized recommendations, thanks to pre-set authorization protocols.

Financial Services: Financial advisors or applications can securely access financial records for tasks like tax filing or investment management without requiring repeated manual permissions.

Travel and Safety: Travelers can authorize applications to access travel documents, vaccination records, and emergency contacts, ensuring assistance is readily available when needed.

Detailed Functionality of Extrimian’s DWN Technology

Decentralized Data Storage

DWNs provide a decentralized storage mechanism, where data is encrypted and distributed across various nodes, ensuring no single point of failure. This architecture enhances data security and availability.

Granular Access Control

Users have fine-grained control over who can access their data and under what conditions. Authorization protocols can be tailored to specific use cases, ensuring that only authenticated and authorized entities can access sensitive information.

Interoperability

Extrimian’s solutions are designed to be interoperable with various identity management systems and applications. This ensures seamless integration and functionality across different platforms and services.

Scalability

Our DWN implementation is highly scalable, capable of handling large volumes of data and numerous access requests without compromising performance or security.

Case Study: Secure Health Records Management

In our health records management use case, Extrimian leverages DWNs to securely store and manage patients’ medical information. Patients can authorize healthcare providers to access their records using credentials issued by recognized authorities. This setup ensures that healthcare providers can access the necessary information quickly and securely, enhancing patient care and operational efficiency.

Balancing Privacy and Accessibility

A core challenge in data management is balancing privacy with accessibility. At Extrimian, we ensure that all data within DWNs is encrypted and user-managed, enhancing data privacy. However, our authorization protocols allow predefined entities to access specific information under specified conditions, providing a seamless balance between privacy and necessary access. These protocols define authentication requirements and the precise data accessible to authenticated entities, ensuring that information remains private yet available when required.

Extrimian’s Technology and Solutions

Extrimian uses DWNs to empower users with greater control over their data while enhancing the ability to respond to various critical situations. Our solutions integrate advanced authorization protocols to ensure essential information is accessible without compromising privacy or security. For more information on how Extrimian’s technology supports secure and efficient data management, visit Extrimian.io and explore our academy at Extrimian Academy.

Extrimian’s Vision for Scalable Decentralized Digital Identity Solutions

Extrimian envisions a future where decentralized digital identity (SSI) solutions are scalable and widely adopted. To achieve this, we actively create alliances with governments and private sectors, fostering mass adoption and building a critical mass of SSI technology users. A prime example of this effort is our collaboration with the Government of Buenos Aires City, leading to the development of the QuarkID protocol.

QuarkID Protocol: A Model for Interoperability

The QuarkID protocol is designed to be a foundational layer upon which other protocols can be built, enabling seamless interoperability. This protocol provides a robust framework that can be adapted by various private sectors and service industries, enhancing data security, user experience, and privacy.

Alliances for Enhanced Security and Privacy

To ensure the highest standards of security and privacy, Extrimian partners with niche experts and leading crypto technology firms. For instance, we collaborate with zkSync for implementing Zero Knowledge Proof (ZKP) systems. These alliances enable us to incorporate advanced cryptographic techniques, ensuring that users’ data remains secure and private.

Benefits for Users and Citizens

By integrating these technologies and forming strategic alliances, Extrimian delivers a decentralized digital identity system that is more secure, interoperable, and user-friendly. Citizens benefit from enhanced privacy, improved data security, and a better overall user experience.

DIF announces DWN Community Node

In the article DWNs: The Next Frontier in Decentralized Identity, the Decentralized Identity Foundation (DIF) introduces the concept of Decentralized Web Nodes (DWNs) as a pivotal development in the landscape of decentralized identity. DWNs enable secure, private data storage and exchange without relying on centralized intermediaries, thus enhancing user control and privacy. The article highlights the role of DWNs in fostering interoperability and scalability within decentralized ecosystems, underscoring their potential to revolutionize data management across various sectors. This innovation is crucial for the advancement of Self-Sovereign Identity (SSI) systems, offering a robust infrastructure for future decentralized applications.

Web3 DWN free community node now available
Source: Decentralized Identity Foundation website: https://blog.identity.foundation/dwn-community-node/

Extrimian and DIF Partnership

Extrimian is a proud member of the Decentralized Identity Foundation (DIF), collaborating to advance the development and adoption of decentralized identity technologies. This partnership enhances our commitment to providing cutting-edge solutions in data security and decentralized identity.

As a result of this collaboration, Extrimian and DIF are launching a virtual Self-Sovereign Identity (SSI) training called HackAlong this August. For more information, visit this link:

Extrimian and DIF
Educational Resources

To deepen your understanding of DWNs and how they are transforming data management, Extrimian Academy offers comprehensive courses. In our course on self-sovereign identity, you can find two informative videos specifically focused on DWNs:

Introduction to Self-Sovereign Identity (SSI)
Understanding DWNs in Practice

These videos provide a thorough overview of DWN technology, its implementation, and practical applications in data security.

Also, for further details on our secure health records management and other use cases, check out the Extrimian Use Cases.

Conclusion

Extrimian’s deployment of DWNs not only provides users with enhanced control over their data but also ensures that critical information is accessible during emergencies through robust authorization protocols. By balancing privacy and accessibility, our solutions deliver security and efficiency when it matters most. Visit Extrimian.io to learn more about how we are transforming data management with DWNs.

Document for Further Information

For more in-depth information on this topic, please visit our Extrimian Product Page that offers additional insights into our DWN technology and its applications in emergency access and data management.

The post DWN free Community Node for Data Management first appeared on Extrimian.


auth0

New Auth0 Integration for Vercel: Available Now

Get started fast with two best-in-class developer platforms: Auth0 and Vercel

Caribou Digital

DPI or Digital Transformation? Identifying DPI-specific risks in a Caribou Digital — UNDP convening

DPI or Digital Transformation? Three takeaways from a Caribou Digital — UNDP convening to identify DPI-specific risks Credit: AI generated

Caribou Digital and UNDP recently held a convening of practitioners and scholars to explore the distinction between digital public infrastructure (DPI) and mainstream digital transformation, and the specific risks and safeguarding requirements of DPI. As a ‘Chatham House’ style convening of practitioners, policy makers and academics, this blog contains three key takeaways from the discussion.

Digital public infrastructure (DPI) is, as defined by the G20 New Delhi Leaders Declaration, made up of ‘secure and interoperable digital systems that enable the delivery of public services, together with an enabling governance environment and public value goals.’ The UN Secretary General’s Office of the Special Envoy on Technology (OSET) and the UN Development Programme (UNDP) convened the DPI Safeguarding Initiative to support the development of a safe, inclusive, and rights-protecting DPI framework; the initiative’s working groups, also convened by OSET and UNDP, carry that work forward.

This convening brought together leading stakeholders from the digital development academic and practitioner communities to consider four scenarios that present DPI and ‘generic’ approaches to common aspects of digital transformation. Participants’ discussion was structured around two specific questions:

What distinguishes a ‘digital public infrastructure approach’ from a ‘generic’ approach to digital transformation?
What are the specific risk profiles of a ‘digital public infrastructure’ approach to digital transformation?

Takeaway 1: Countries don’t think of DPI; instead, they think of systems. So the distinction between DPI and ‘generic’ digital transformation is more of a ‘how’ difference rather than a ‘what’ difference.

One of the recurring themes throughout the conversation was that, from a purely technology perspective, there is not much difference between a DPI and generic approach to digital transformation. The various components or attributes of DPI are not in themselves new. It’s a policy perspective and implementation approach — the ‘how’ — where DPI’s distinction lies.

One of the main characteristics of a DPI perspective is the idea of ‘thinking horizontally, rather than vertically’ so that systems work across ministries or sectors, rather than vertically siloed in one — for example, enabling a single registration to verify identity and eligibility for multiple systems and services. One of the main attributes that characterize this is the interoperability necessary to enable the horizontal flow of data. Interoperability is of course not new, and there is a wealth of literature and knowledge around mitigating the risks of interoperability that the DPI Safeguarding Initiative Working Groups and wider DPI community can draw from.

Another dimension that participants flagged was that ‘how’ questions are also often normative questions; that is, they introduce considerations of rights and inclusion. Examples include the policy dimensions of identification systems and the rights and entitlements that identification brings: in other words, what legal rights does being identified grant holders of that identity?

Takeaway 2: The distinction of DPI is the implications for development pathways and choice.

The second takeaway was DPI’s significance for development pathways and choice, namely in two dimensions. First, some participants discussed how DPI could strengthen states to make sovereign choices about their digital transformation path. Once DPI reduces digital transformation, from a technical architecture-systems perspective, to minimal building blocks and their core functionality, it opens a conversation about the importance of control for those blocks, instead of a conversation about a pre-selected system and its features. Participants felt this was particularly significant in the context of countries in the Global South who often depend on external financing for their development trajectory and are often forced to adopt the path of their funders. This is particularly the case in the context of donors who provide financing or investment for specific sectors or silos such as health, education, or welfare. Focusing on a DPI approach instead can put power back in the hands of the government.

The second aspect of sovereignty and choice was focused on individuals. Some participants highlighted how in some cases making systems mandatory — such as identification systems — can force users, whether they want to or not, to adopt systems that may serve state interests before individual interests. This consideration introduced another way of thinking about DPI risk: the risk of doing it too well, which could enable authoritarianism and autocracy, or not well enough, which could lead to service failure and a loss of trust and confidence in the state.

Takeaway 3: DPI’s complex business models present risks to both owners and systems.

Another takeaway from the discussion was the significance of DPI’s business models. One of the challenges that participants flagged was around the ownership of systems. For example, if DPI is government led, this can have implications for the business case of systems — for example around the necessary internal capacity to build or buy systems, and manage technical development and procurement. Another challenge that participants flagged was around the scaling of business models. Scaling competitively, and thus justifying public investment, was as important to consider as the potential for scale that the system introduced.

The characteristics of DPI also lead to other dimensions of business model risk. For example, interoperability introduces challenges for commercial suppliers of systems, as interoperability (should) make DPI elements (identity, payments, data sharing) commodity services. This is significant because, historically, companies prefer to make profits rather than compete on price in a commodity service market.

Another risk of DPI that participants flagged is its emphasis on open source. A number of participants flagged the challenge of open source and ‘abandonware’ — systems and technologies that are developed by a community (or lone developer) and then abandoned. This is a key risk for critical infrastructure and requires governments to develop the relevant capacity to mitigate the risk of infrastructure failure.

Reflection: Evaluation is key. How do we know if an intervention is a DPI approach, and what difference does DPI make?

A more general reflection on this discussion is the importance of monitoring and learning: establishing the basis for evaluating impact and whether a particular initiative is actually DPI in nature. What are the distinctive aspects of a deployment that can tell us whether it is upholding the principles and values of DPI? If DPI is an approach and a policy shift, how can we measure whether an approach and policy agenda is DPI in nature?

At Caribou Digital, we have a particular focus on monitoring, evaluation, and learning. We’re often asked to assess the impact of interventions. So we find it striking that in the conversation around DPI there has been very little attention paid to the difference a DPI approach makes, compared to a generic approach to digital transformation. Being clear about that difference is important on its own terms, but also to justify investments and to support the case for a DPI approach to policymakers and decision-makers, especially elected representatives.

The work to unpack distinctions between DPI and generic digital transformation is critical to developing appropriate safeguards, and to the broader effort to advance the case for and understanding of DPI. The work of developing instructive scenarios to break down the differences will continue, and there is an open invitation to provide feedback that helps refine the existing scenarios, and to suggest and contribute new ones.

DPI or Digital Transformation? Identifying DPI-specific risks in a Caribou Digital — UNDP convening was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


Holochain

Volla Partnership Announcement

Holochain on Mobile

You might have already heard about Volla, but today we are excited to make it official. We are announcing our partnership with Volla Systeme GmbH (Volla for short). A privacy-first mobile phone company with its own Android-based OS, Volla is preparing to ship its new Quintus phone this fall with Holochain applications pre-installed.

Relay

Built with Holochain, the secure messenger app "Relay" will offer 1-to-1 chats and group chats with secure authentication, allowing you to share photos and other files — all 100% encrypted. Users connect to each other via public key, providing a decentralized and secure system of identifiers. Relay is entering beta testing this August.

Recover

Recover is a Holochain based application allowing users to create encrypted backups of their data without a cloud provider. Enabling (selective) recovery in the event of theft or defect, it automatically and incrementally backs up Volla phones.

Reinvent 

Volla is reinventing what it means to hold a mobile phone in your hands, and we are glad to be a part of that. These are just the first Holochain apps to be built for Volla, but as a partnership committed to privacy first, distributed tech, we expect that there will be many more. Holochain on mobile is a huge leap, and we are thankful to Volla for blazing this trail.

Get Involved

Volla will be doing a pre-sale of the Quintus via Kickstarter. Keep your eyes on our socials to learn more.


KuppingerCole

Eviden DirX Audit

by Nitish Deshpande

This KuppingerCole Executive View report looks at Eviden’s DirX Audit solution from its DirX portfolio. DirX Audit is an analytics and audit intelligence solution that stores historical identity data and recorded events from IAM processes. This collection of data also allows it to provide insights into access risks and report on them through a user-friendly interface.

DHIWay

Tarento and Dhiway collaborate to Foster Trust and Certainty in the Digital Landscape

Tarento Technologies Private Ltd., an IT services company providing data and AI services, and Dhiway, a leading provider of enterprise Web 3.0 trust infrastructure, are excited to announce a partnership to collaborate on the development of scale-out applications and solutions with the CORD Blockchain framework.

As a Nordic-Indo IT Services company, Tarento has digitized business processes through low-code/no-code approaches, advanced NLP, Image and Speech solutions, and Deep Machine Learning.

Dhiway and Tarento have signed a Memorandum of Understanding (MoU) to collaboratively explore the design, development and production of various applications and services, including Dhiway’s credentialing platform MARK Studio and the trust infrastructure enabled by CORD Blockchain. The joint effort will leverage the modular trust infrastructure from Dhiway to provide verifiability of data with continuous assurance.

This partnership will transform how data pipelines are managed and how data exchanges are set up and utilized across many use cases, improving business efficiency and growth.

Mohit Agarwal, Vice President – Digital at Tarento, stated, “This is a strategically important decision for us. We already have great work going on in govtech and startuptech spaces in AI engineering, product development and automotive/IoT spaces. With this partnership, we want to expand our open-source, open-standards technology services footprint to co-create meaningful blockchain rollouts for digital public infrastructure and country-scale impact.”

K P Pradeep, CEO at Dhiway, emphasized that “The CORD Blockchain framework has been designed to offer authentic data streams at scale using the composable trust infrastructure. This partnership enables significant opportunities for both companies to extend the impact of Web 3.0 technologies and especially distributed ledger technologies.” 


About Tarento Technologies Private Limited

Tarento is a Nordic-Indian technology services company with an unflinching desire to be the trusted global partner for startups, enterprises, government organizations and foundations/not-for-profits. We are proudly associated with architecting, building and operating some of India’s largest digital public infrastructure initiatives, government capacity building, edtech, and automotive digital platforms.


About Dhiway

Dhiway is a trust infrastructure company reshaping the digital future through population-scale technology solutions. We enable enterprises and government agencies to address key challenges around data stores, data exchange and data assurance through the CORD Blockchain – a Layer 1 enterprise blockchain technology.



The post Tarento and Dhiway collaborate to Foster Trust and Certainty in the Digital Landscape appeared first on Dhiway.


KuppingerCole

Trust in an AI Interconnected World

by Scott David

Trust is an emotional state and belief held by human beings that is built upon a sense of reliability and predictability regarding future interactions.  The concept of trust is broadly applied to cover relationships among people, or between people and organizations.  The concept of a legal trust extends and formalizes the reliability of future interactions to create legally enforceable fiduciary obligations to elevate the subjective emotions and beliefs to become trustworthy, objective, reliable and actionable for future relationships. 

The concept of trust is not, however, usually applied to relationships AMONG organizations.  It seems naive to assert that one company (or any other purely legal person) trusts another.  At that point, the concept of reliability and predictability is more usually characterized as risk, rather than trust.   Organizations have developed myriad metrics for assessing risk across business, operating, legal, technical and social (BOLTS) domains as surrogate signals in the absence of human trust. 

Organizations do not have qualia, emotions, or beliefs, and therefore cannot be said to trust something.  However, as noted above, the reverse is not true, i.e., humans can trust organizations, and that is the source of brand loyalty (companies) and patriotism (countries), etc.

Trust is built on consistency of behaviors through time and space, and is encoded in signals associated with consistent behaviors.  With the advent of networked interaction and information systems (the Internet), the signals and behaviors upon which trust is built became mediated by multiple unseen layers, attenuating trust.  The caption of a well-known cartoon from the early Internet years, “On the Internet, nobody knows you’re a dog,” speaks to the challenges of trust in such unfamiliar, intangible domains.

With the advent of myriad systems and applications of so-called Artificial Intelligence, the signals, behaviors, and interactions online are rendered even more remote and unfamiliar, which further challenges human trust.  How can we trust interactions with AI (and mediated by AI) if we don’t even know what to expect of it?  The AI black box problem is not just confined to internal AI processing steps, it is also present in online interactions (and the information associated with such interactions) where AI is involved.  The advent of “agentic” AI systems, i.e., multi-step and AI P2P interactions, will create vast interaction complexities that will, in effect, enclose all online interactions in a black box rendering control to be stochastic at best, and illusory at worst.

In fact, the concept of trust with AI is a trap.  AI is a form of computational intelligence that processes human (mostly English) text purely computationally.  AI does not, at present, have any sense or understanding of the content or concepts that it is processing.  Putting aside the remarkable phenomenon that computational intelligence can produce outputs that reflect content and style familiar to humans, AI merely processes, it does not understand. 

In fact, AI’s ability to computationally derive such subtle, textured, and human-like patterns from text alone, supports the notion that a significant portion of human cognition (thinking) takes place in language (and material culture) itself, and not in the wet-ware organ of the brain.  The brain is just tuned to the mind that actually resides in language.  From this perspective, AI is a computational mind reader when it processes human text.  That possibility is at the same time creepy and beautiful, for it suggests a future hybridization of humans and AI systems into virtual chimera through a process that is most closely associated with symbiogenesis (from which eukaryotic mitochondria and chloroplasts are derived), but in an intangible form that might be called “sym-info-genesis.”

To put a finer point on this, human survival depended in part on overfitting perceptions of risk vectors from the environment.  Individual humans who perceived a lion hiding in the tall grass tended to live longer than humans that did not perceive the lion, even if the lion was not always there.  This quality of pareidolia (pattern detection in the environment) is responsible for humans perceiving AI output as presenting readable text and capturing author styles, etc. 

It is frequently said that AI hallucinates.  It is, in fact, the humans that are hallucinating AI outputs, not just the AI systems themselves.  In fact, human perception of meaning and content from AI system outputs is akin to predators (mis)perceiving that the eyespots on moth wings are the eyes of a much larger animal.  It is hallucination prompted by mimicry prompted by evolution. 

In the case of AI, the source of the mimicry is not evolution but computation AND human preference in selecting more useful forms of output.  Of course, when humans marvel at the efficacy and beauty of AI outputs, they are also creating a (virtual) fitness landscape, and selecting for those systems that are most evolved for survival in that (human interaction/information) landscape.  We were not around at the time that fish first crawled onto land, but humans are privileged to be able to watch the evolution of a new (non-physical) living form as AI evolves its way into the human trusted interaction landscape.

What are the implications for trust in the future internet where computational intelligence, i.e., AI, can mimic all sorts of content in ways that can benefit or harm interacting parties?  At cyberevolution in Frankfurt this December, we will explore the implications of these and related phenomena, with the intention of understanding the dynamics at the crossroads of trust and risk.  We hope that you can join us.


Ontology

Comprehensive Guide for Ontology’s $10 Million Boost for Decentralized Identity Innovation

Hello, Ontonauts! 🌐

🎉 We’re excited to announce Ontology’s $10 million fund aimed at boosting innovation and adoption of Decentralized Identity (DID) using ONT & ONG tokens. This initiative is designed to empower, educate, and evolve our ecosystem in exciting new ways! Whether you’re a creator, educator, developer, or business, there’s an opportunity for you to contribute and benefit. Let’s check out the various funding opportunities available!

Introduction to the Ontology DID Fund

The Ontology DID Fund represents a strategic initiative designed to promote and accelerate the understanding and implementation of Decentralized Identity (DID) technologies. Through financial support and resources, the fund aims to empower creators, educators, and innovators to develop impactful educational content and projects that showcase the benefits and functionalities of DID, particularly those leveraging Ontology’s advanced protocols and products.

APPLY TODAY

Purpose of the Fund

The fund seeks to catalyze widespread adoption and literacy of DID technologies across various sectors by financing projects that can:

Educate and Engage Diverse Audiences: From tech enthusiasts and developers to businesses and policymakers.
Develop Comprehensive Technical Documentation: Ensuring that Ontology’s DID solutions are accessible and easy to implement.
Integrate DID with Existing Projects: Supporting seamless identity verification and other applications across various platforms.
Build Innovative New Projects: Encouraging the development of fresh, sustainable solutions that utilize ONT ID in groundbreaking ways.

Metrics for Assessing Projects

To ensure that the funds are allocated to projects with the highest potential for impact, the following metrics are considered during the evaluation process:

User Base: The size of the audience that the project can reach or influence.
Daily Active Users (DAU): For interactive or digital platforms, DAU is a key indicator of engagement.
Social and Community Impact: How the project influences or contributes to community and social spheres.
Previous Grants: Indications of the project’s credibility and history of successful outcomes.

Application and Funding Process

Proposal Submission: Applicants submit comprehensive proposals detailing project objectives, target audience, content strategy, and expected outcomes.
Review and Evaluation: Proposals undergo rigorous review based on relevance to DID education, innovation in content delivery, potential outreach, and impact metrics.
Grant Awarding: Successful proposals are awarded funding based on alignment with the fund’s objectives and their potential contribution to the DID ecosystem.

Funding Allocation

Projects that align with the fund’s goals and meet the high standards set by these metrics will receive support in the form of grants. These grants cover costs related to content creation, project development, and dissemination activities. The Ontology DID Fund aims to foster a robust ecosystem where Decentralized Identity technologies are better understood, more widely accepted, and effectively utilized across various sectors.

🎓 Empowering Education on DID

Objective:

We’re committed to spreading knowledge about the power of decentralized identity. We’re calling all creatives and educators to help us demystify the world of DID. Whether you’re a writer, a filmmaker, or an event organizer, there’s a place for you to contribute! We’re supporting all kinds of content to help everyone from beginners to experts understand and utilize DID more effectively.

Purpose:

To create educational content that makes DID concepts accessible and engaging for a wide audience, including developers, businesses, and end-users.

Allocation:

Funds will be distributed to projects that develop educational content, such as articles, videos, tutorials, and interactive learning tools.

Metrics:

Number of educational materials created.
User engagement and feedback.
Reach and impact of educational content.

Target Audience:

Beginners: Basic introductions and foundational knowledge.
Intermediate Users: Deeper knowledge of ONT ID’s technical specifics and use cases.
Advanced Users: Detailed information on integrating ONT ID, smart contract development, and security practices.

Guidelines for Content Creation:

1. Balance Technical Detail and Accessibility

For Developers: Clear explanations, code snippets, API documentation, architectural diagrams. For Businesses and End-users: Simplify jargon, highlight practical benefits, and explain solutions to common identity issues.

2. Use of Visuals and Analogies

Diagrams, infographics, and animations to explain processes and benefits. Analogies comparing DID to familiar concepts, like digital passports.

3. Interactive and Practical Examples

Quizzes, interactive diagrams, simulations, real-world use cases.

4. Consistent and Clear Messaging

Emphasize the importance of DID, highlight Ontology’s contributions, maintain a consistent tone.

📘 Developing Technical Documentation for ONT ID

Objective:

To provide comprehensive technical documentation for ONT ID, making it accessible to users of all skill levels. These resources are designed to enhance both user and developer experiences by providing clear and detailed information on the implementation and use of ONT ID.

Purpose:

To ensure that Ontology’s DID solutions are well-documented and easy to implement, supporting developers and technical users in integrating ONT ID into their projects.

Allocation:

Funds will be distributed to projects that develop detailed technical documentation, ranging from beginner guides to advanced technical manuals.

Metrics:

Quality and clarity of documentation.
User engagement and feedback.
Number of users successfully implementing ONT ID.

Target Audience:

Beginners: Introduction to ONT ID, basic guides.
Intermediate Users: Detailed tutorials, practical applications.
Advanced Users: In-depth technical documentation, integration guides.

Implementation Steps:

1. Identify Learning Objectives: Define clear objectives for each segment.
2. Develop Engaging Content: Create content tailored to the target audience using diverse formats.
3. Incorporate Interactive Elements: Enhance documentation with examples, diagrams, and practical exercises.
4. Gather and Implement Feedback: Use community feedback to refine content.
5. Monitor and Evaluate Impact: Track user engagement metrics and update content accordingly.

🔗 Integration and Partnership Opportunities

Objective:

We’re looking to expand the reach of ONT ID by integrating it across various platforms and forming strategic partnerships. If you have a project that could benefit from seamless identity verification or if you’re looking to innovate within your current platform, we want to support your journey.

Purpose:

To integrate ONT ID into various platforms and foster strategic partnerships to enhance its adoption.

Allocation:

Funds will be distributed to projects that propose strategic partnerships and integration plans.

Metrics:

Effectiveness of integration.
Impact and reach of integrations.
User feedback and adoption rates.

Target Audience:

Businesses and platforms looking to integrate ONT ID for seamless identity verification.
Developers and innovators seeking to enhance their projects with ONT ID.

Criteria for Fund Distribution:

Strategic Partnerships: Support initiatives establishing impactful partnerships.
Integration Proposals: Present clear, actionable integration plans.
Support for ONTO Wallet and Orange Protocol: Explore innovative enhancements using ONT ID features.

Additional Support:

Technical resources, including APIs, SDKs, and development tools.
Market validation guidance.
Idea incubation support, such as brainstorming sessions and mentorship.

🌟 Innovate with ONT ID

Objective:

Got a groundbreaking idea? We’re here to help turn it into reality. Projects that utilize ONT ID in innovative ways are eligible for funding to bring fresh and sustainable solutions to the market. Let’s build the future of digital identity together!

Purpose:

To encourage innovative applications and services utilizing ONT ID, demonstrating its versatility and utility.

Allocation:

Funds will be distributed to innovative projects that propose new uses for ONT ID.

Metrics:

Innovation and feasibility of proposed projects.
Potential market impact and need.
Long-term sustainability and growth potential.

Target Audience:

Innovators and developers with groundbreaking ideas.
Businesses looking to implement innovative DID solutions.

Criteria for Fund Distribution:

Innovation and Feasibility: Propose innovative uses of ONT ID.
Market Potential: Ensure solutions address clear problems or opportunities.
Sustainability and Growth: Evaluate long-term viability and scaling potential.

Ensuring Legacy:

Milestone-based funding.
Community involvement in project selection and feedback.
Follow-up support, including mentoring and networking opportunities.

General Guidelines for All Initiatives

1. Define Clear Objectives: Establish specific, measurable, achievable, relevant, and time-bound (SMART) goals for each project to ensure focused and effective implementation.
2. Tailor Solutions to the Target Audience: Understand the needs and preferences of your audience, whether they are developers, businesses, or end-users, and design your project to meet those needs.
3. Leverage Ontology’s Resources: Utilize Ontology’s APIs, SDKs, and other technical resources to ensure your project is built on a solid foundation and can fully leverage Ontology’s capabilities.
4. Incorporate Practical Use Cases: Demonstrate the real-world applications of your project, showing how it can solve existing problems or improve current processes.
5. Gather and Implement Feedback: Actively seek feedback from the community and stakeholders throughout the project lifecycle to ensure it remains relevant and effective. Use this feedback to make necessary adjustments and improvements.
6. Monitor and Evaluate Impact: Track key performance indicators (KPIs) and other relevant metrics to measure the success and impact of your project. Regularly evaluate progress and make data-driven decisions to enhance outcomes.
7. Ensure Scalability and Sustainability: Design your project with future growth in mind, ensuring it can scale to meet increasing demand and remain sustainable over the long term.
8. Foster Collaboration and Partnerships: Engage with other projects, businesses, and community members to build synergies and enhance the overall impact of your initiative.

Let’s make a mark in the digital identity landscape together! 🚀 Submit your proposals and join us on this exciting journey. Stay connected and share your thoughts on our social media channels. 🌟

APPLY TODAY

Comprehensive Guide for Ontology’s $10 Million Boost for Decentralized Identity Innovation was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 21. July 2024

KuppingerCole

AI and Digital Trust: Ensuring Fairness and Transparency

In this episode of the KuppingerCole Analyst Chat, Matthias Reinwarth talks to Marina Iantorno, Research Analyst at KuppingerCole Analysts. They explore the concept of digital trust in our AI-driven, interconnected world. The discussion explores the definition and importance of digital trust, the current landscape of AI systems, and examples of successful and failed attempts to build trust. Marina also breaks down key tenets crucial for fostering digital trust, including transparency, data privacy, security, accountability, and more. The episode provides actionable strategies for implementing these tenets and highlights tools and technologies that support digital trust.



Saturday, 20. July 2024

Safle Wallet

Introducing Safle LENS

Weekly Safle Update! 🚀

Greetings Sentinels,

We’re excited to share the latest progress and milestones achieved with Safle this week. Here’s the update on this week’s journey:

🚨 Exciting Release Alert: SafleID Documentation is Live!

Our comprehensive SafleID Documentation has finally been released. Delve into the detailed intricacies of SafleID and discover the exceptional value it offers.

Explore Now 👉

🔗SafleID Documentation

🛠️ Portfolio Website Overhaul

Introducing Safle LENS, our new portfolio viewer. Join us on this exciting journey and look forward to a sleek, user-friendly experience that’s truly exceptional.

Here’s a sneak peek at the design 👇

Stay tuned and keep your eyes on the stars — Safle LENS is coming soon! 🌟

Sign-Up Flow Development

The new sign-up flow is nearly complete and we’re gearing up for testing. Soon, onboarding will be smoother and faster for all new Saflenauts, Sentinels, and users.
Be among the first to try it out by signing up for our beta release program!

🔗Click here

🤝🏻 New Partnership with Coinshift!

We’re happy to announce our new partnership with Coinshift for Safle’s treasury management. Stay tuned as we will share more about it in the coming days.

🔗Follow us

💬 Join us on Discord

Join our Discord channel to connect with the team, share your thoughts, ask questions, and stay updated on all things Safle. We’re building in public, and our team is live every day on Discord. Come say hey in the Safle Build in Public channel!

🔗Join us here

🚀 We are Hiring!

Got any DevOps Ninjas, Kickass Growth Marketers, Detail-Oriented QA Experts, or Innovative Blockchain Engineers in your circle? Send them our way, and we’ll take it from there. Check out our openings and join Safle’s journey.

🔗Here

Thank you for being an integral part of our journey. Together, we’re reaching for the stars!

Stay stellar,
The Safle Team

Download the Safle App Now!

Experience the power of Safle at your fingertips 🚀

🔗SafleWallet

Thursday, 18. July 2024

SC Media - Identity and Access

Five ways to boost identity intelligence to enhance visibility and decision-making

Implementing modern IAM platforms and policies will strengthen organizational cybersecurity as a whole, a collection of cybersecurity leaders said in a recent roundtable discussion.


This week in identity

E56 - Emergency Episode Discussing the Global Crowdstrike Issue

Simon and David convene for a special episode to discuss the ongoing global IT outages caused by a Crowdstrike update. Note this was released Friday 19th July 9am PST / 5pm BST


SC Media - Identity and Access

Microsoft-signed driver leveraged by HotPage adware

Aside from performing code injections into remote processes, the distributed kernel driver also allows system data exfiltration to a remote server connected to Hubei Dunwang Network Technology Co., Ltd, according to an ESET analysis.



Third-party postal address sharing resolved by USPS

The U.S. Postal Service has confirmed halting the sharing of its online customers' postal addresses with Meta, Snap, and LinkedIn following a TechCrunch report detailing its disclosure of customer details via tracking pixels across its website.



Ocean Protocol

Predictoor Benchmarking on Regularized Linear Classifiers with Calibration

Comparing Lasso (L1) vs Ridge Regression (L2) vs ElasticNet (L1-L2), and Calibration of Classifier Probabilities

Summary

This post describes benchmarks on Ocean Predictoor simulations across various approaches to (a) linear regularization, and to (b) calibrating the models’ output probabilities.

It then proceeds to do a walk-through of each of the benchmarks for predictoor/trader profit, and comparison plots for the models & their calibrations.

1. Introduction

1.1 What is Classification?

Classification models (“classifiers”) predict the category or class label of an input. For example, in email spam detection, a classifier might predict whether an email is Spam or Not Spam. Since it has just two possible categories, this makes it a binary-valued classification model (True or False).

To compare, regression models (“regressors”) predict a continuous-valued output. In an email spam application, a regressor might predict the number of spam messages per day.

1.2 What is Regularization?

“Regularization” is a technique in fitting models that aims to reduce the uncertainty of future predictions (i.e., minimize the volume of the confidence ellipsoid of future predictions). There are many approaches to regularization. When learning linear models, they occupy a spectrum from:

- L1 / Lasso: Try to get sparse models with few parameters
- L2 / Ridge Regression: Allow each variable to have a weight, however small
- ElasticNet: Shades of gray between L1 & L2

The appendix holds details of each.
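The three regularization approaches above can be sketched with scikit-learn's LogisticRegression. This is an illustrative stand-in for Predictoor's model wrappers (which are not shown in the post), run on synthetic data rather than real candle features:

```python
# Sketch: L1 (Lasso), L2 (Ridge), and ElasticNet regularization on a
# logistic-regression classifier. Synthetic data: only the first two of
# ten features carry signal, so L1 should zero out most of the rest.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))  # 10 candidate features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

models = {
    "lasso (L1)": LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
    "ridge (L2)": LogisticRegression(penalty="l2", solver="liblinear", C=0.1),
    "elasticnet": LogisticRegression(penalty="elasticnet", solver="saga",
                                     l1_ratio=0.5, C=0.1, max_iter=5000),
}
for name, model in models.items():
    model.fit(X, y)
    n_zero = int(np.sum(model.coef_ == 0))  # L1 tends to drive weights to exactly 0
    print(f"{name}: {n_zero} zeroed coefficients")
```

Running this shows the expected pattern: Lasso zeroes out several noise-feature weights, Ridge keeps every weight (shrunken but nonzero), and ElasticNet sits in between.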

1.3 What’s Calibration of Classifier Probabilities?

Beyond just predicting True or False, classification models can also output their estimated probability of True or False (or more general category).

Probabilities fall out naturally with some classifier formulations, such as logistic regression. Here we are using logistic regression.

We can “calibrate” output probabilities by warping the model’s estimated probability to a training set. This means its probability estimates can be more trustworthy. Some techniques include: None, Isotonic, and Sigmoid.
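A minimal sketch of producing raw ("None") and calibrated probabilities with scikit-learn; CalibratedClassifierCV here is a stand-in for however pdr-backend wires calibration internally, and the data is synthetic:

```python
# Sketch: "None" calibration = the model's raw predict_proba;
# Isotonic and Sigmoid calibration via CalibratedClassifierCV.
import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 5))
y = (X[:, 0] + rng.normal(scale=1.0, size=600) > 0).astype(int)

raw = LogisticRegression().fit(X, y)  # "None" calibration
iso = CalibratedClassifierCV(LogisticRegression(), method="isotonic", cv=5).fit(X, y)
sig = CalibratedClassifierCV(LogisticRegression(), method="sigmoid", cv=5).fit(X, y)

x_new = rng.normal(size=(3, 5))
for name, model in [("none", raw), ("isotonic", iso), ("sigmoid", sig)]:
    p_up = model.predict_proba(x_new)[:, 1]  # estimated P(UP) per sample
    print(name, np.round(p_up, 3))
```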

- None calibration: uses the model’s raw, uncalibrated probabilities.
- Isotonic calibration: maps predicted probabilities to calibrated probabilities by fitting a piecewise-constant, non-decreasing function.
- Sigmoid calibration: maps predicted probabilities to calibrated probabilities by transforming the raw scores with a sigmoid function, whose S-shaped curve squeezes the input values into the range [0, 1].

1.4 Ocean Predictoor, Classification and Benchmarks

In Ocean Predictoor, people run AI-powered prediction bots or trading bots on crypto price feeds to earn $.

Classification models are used to predict whether the next BTC/USDT close value will go UP or DOWN. (Or ETH/USDT, BNB/USDT, etc.)

We developed a simulation tool (“pdr sim”) for data scientists to run simulations of their predictoor & trader bots, benchmarking modeling strategies, trading strategies, parameters, etc. on $ made by predicting or trading. The simulator also outputs classifier accuracy, f1 / recall / precision, and other classifier performance metrics.
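The classifier metrics the simulator reports can be reproduced with scikit-learn. A toy example on hypothetical UP(1)/DOWN(0) predictions:

```python
# Sketch: the classifier performance metrics the simulator outputs,
# computed on a toy set of candle-direction predictions.
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]  # actual candle direction
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]  # the model's predictions

print("accuracy :", accuracy_score(y_true, y_pred))   # 8/10 correct -> 0.8
print("precision:", precision_score(y_true, y_pred))  # 4 of 5 predicted UPs correct
print("recall   :", recall_score(y_true, y_pred))     # 4 of 5 actual UPs caught
print("f1       :", f1_score(y_true, y_pred))         # harmonic mean of the two
```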

1.5 Benchmarks Outline

To help users in setting parameters, we ran benchmarks using the “pdr multisim” tool, which invokes the simulator in a loop across various parameter settings. This blog post describes the results of those benchmarks.

We run benchmarks on each of the regularization approaches:

- ClassifLinearLasso — L1 regularization
- ClassifLinearRidge — L2 regularization
- ClassifLinearElasticNet — the spectrum between L1 & L2

And for each of those approaches, we run the three calibration approaches (None, Isotonic, Sigmoid).

1.6 Experimental setup

These parameters were defined in our my_ppss.yaml file, a customized version of the ppss.yaml file of the pdr-backend repo:

- ClassifLinearRidge, ClassifLinearLasso, and ClassifLinearElasticNet ML models were tested with None, Isotonic, and Sigmoid calibrations.
- Models predicted 5min candle UP/DOWN feeds for BTC-USDT and ETH-USDT for 5000 epochs.
- Models were trained on historical Binance 5min close candle data, using either a BTC-USDT training set or a combined BTC-USDT and ETH-USDT training set, from January 1, 2024 to June 30, 2024.
- The # of training samples (max_n_train) tested were 1000, 2000, and 5000.
- Autoregressive n = 1 and 2 were tested. (Number of candles in the past to build models from.)
- Trading fees were set to 0%.
- The predictoor bot stake was set to 100 OCEAN per epoch.
- Other predictoors’ accuracy was set to 50.001% (barely better than random).

2. ClassifLinearLasso

Ocean Predictoor’s ClassifLinearLasso model is a Python scikit-learn logistic regression model implemented with Lasso regularization (L1).

2.1 ClassifLinearLasso Benchmarks

2.1.1 Predictoor Profitability

[Figures: Predictoor profits using ClassifLinearLasso, trained on BTC-USDT data and on both BTC-USDT & ETH-USDT data]

Sigmoid calibration of ClassifLinearLasso generated the best results for the Predictoor bot’s profitability at 5000 training samples using either of the training sets. However, None calibration outperformed Sigmoid & Isotonic at only 1000 training samples of BTC-USDT, demonstrating that the calibrations are most effective when generalizing patterns over larger datasets.

2.1.2 Trader Profitability

[Figures: Trader profits using ClassifLinearLasso, trained on BTC-USDT data and on both BTC-USDT & ETH-USDT data]

While Sigmoid calibration generated the highest Predictoor profits, indicating accurate predictions, Isotonic calibration improved trader profits more than both Sigmoid and None. This suggests that Isotonic calibration fits a confidence-based trading model better. High accuracy is great — but it’s not enough to reliably make $, because high accuracy with low confidence can still yield trading losses.

Why can trading losses occur even when prediction accuracy is >50%?

Losses occur when the magnitude of the losses on DOWN candles is greater than the gains on the UP candles. Calibrations differ in how well they limit losses or capture gains, but adding confidence intervals to the models can help increase returns regardless of fixed or variable trade sizes.
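A tiny worked example, with hypothetical numbers, of how better-than-random accuracy can still lose money when wrong calls cost more than right calls earn:

```python
# 60% directional accuracy, yet a net loss: the 4 wrong calls
# (-0.20% each) outweigh the 6 correct calls (+0.10% each).
returns = [+0.10] * 6 + [-0.20] * 4   # per-trade returns in percent

accuracy = 6 / len(returns)           # 0.6 -> 60% of calls correct
net = sum(returns)                    # 6*0.10 - 4*0.20 = -0.20% net
print(accuracy, round(net, 2))
```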

3. ClassifLinearRidge

Ocean Predictoor’s ClassifLinearRidge model is a Python scikit-learn logistic regression model implemented with Ridge regression (L2).

3.1 ClassifLinearRidge Benchmarks

3.1.1 Predictoor Profitability

[Figures: Predictoor profits using ClassifLinearRidge, trained on BTC-USDT data and on both BTC-USDT & ETH-USDT data]

Sigmoid calibration of the ClassifLinearRidge model generated the greatest profits and even outperformed the ClassifLinearLasso model’s max profit. None calibration at 1000 training samples also demonstrated good profitability when trained on BTC-USDT data, as was seen in the ClassifLinearLasso benchmark, but this one was significantly more profitable.

3.1.2 Trader Profitability

[Figures: Trader profits using ClassifLinearRidge, trained on BTC-USDT data and on both BTC-USDT & ETH-USDT data]

Sigmoid & Isotonic calibrations of ClassifLinearRidge performed similarly in generating trader profits: both calibrations achieved trader profits above $200 USD when trained on either data set. Isotonic calibration, however, generated the max profit of all the benchmarks when the ClassifLinearRidge model was trained using BTC-USDT data only.

4. ClassifLinearElasticNet

Ocean Predictoor’s ClassifLinearElasticNet model is a Python scikit-learn logistic regression model implemented with Elastic Net classification.

4.1 ClassifLinearElasticNet Benchmarks

4.1.1 Predictoor Profitability

[Figures: Predictoor profits using ClassifLinearElasticNet, trained on BTC-USDT data and on both BTC-USDT & ETH-USDT data]

The Predictoor profit benchmark for ClassifLinearElasticNet behaves very similarly to the ClassifLinearLasso & ClassifLinearRidge benchmarks, with Sigmoid again the best-performing calibration & None calibration again performing well at 1000 training samples of BTC-USDT data. This is somewhat to be expected, because ClassifLinearElasticNet includes both the L1 & L2 penalties from the Lasso & Ridge models in its formula.

4.1.2 Trader Profitability

[Figures: Trader profits using ClassifLinearElasticNet, trained on BTC-USDT data and on both BTC-USDT & ETH-USDT data]

Isotonic calibration, as in the previous two trader profit benchmarks, generated the best profits. The consistently profitable Isotonic calibration should be benchmarked with other classifier models — SVC, Gaussian, XGBoost, etc. — to see if it continues to generate good profits & can be a natural fit for pairing with confidence intervals in a trading-optimized model.

5. Comparison Analysis

5.1 Highest Predictoor Profits

[Figures: similar performance among all three models with Sigmoid calibration using BTC-USDT data; ClassifLinearLasso beats both ClassifLinearRidge & ClassifLinearElasticNet when training on BTC-USDT & ETH-USDT data]

5.1.1 The Winning Combination for Predictoor Profit

The max Predictoor profit was 4,937.0429 OCEAN and was generated by the ClassifLinearRidge model. The model was tuned with Sigmoid calibration, a lookback of 1 (autoregressive_n=1), and 5000 training samples (max_n_train=5000), from a training set including both BTC-USDT and ETH-USDT data.

In all the Predictoor profit benchmarks, Sigmoid calibration resulted in the best accuracy & Predictoor profits. The inclusion of ETH-USDT data to the training set improved profits for ClassifLinearRidge and ClassifLinearElasticNet too.

5.2 Highest Trader Profits

5.2.1 The Winning Combination for Trader Profit

The highest trading profit was $268.49 USD and was gained by the ClassifLinearLasso model. Tuning included Isotonic calibration, a lookback of 1, and 5000 training samples of only BTC-USDT data. In five of the six most profitable results, Isotonic calibration was used, so this calibration should be tested in future logistic regression benchmarks that optimize trader profit. The 6 most profitable models & tunings all clustered around $250 USD.

6. Conclusion

The ClassifLinearRidge model with Sigmoid calibration emerged as the top performer for maximizing Predictoor profits (4,937.0429 OCEAN profit) by having the highest accuracy predictions of all the models & tunings. Including ETH-USDT data in the training set further boosted profits for both ClassifLinearRidge and ClassifLinearElasticNet models, highlighting the importance of diversifying the data to improve model performance.

In contrast, the ClassifLinearLasso model with Isotonic calibration achieved the highest trader profits ($268.49 USD profit), showing that while accuracy is crucial to making $, the confidence in predictions plays an important role in trading success. Isotonic calibration provided the highest trader profits across all three models, suggesting that Isotonic calibration might be a natural fit for confidence-based trading strategies.

In sum, while Sigmoid calibration optimizes for predictive accuracy and Predictoor profits, Isotonic calibration enhances trader profits by offering better-calibrated probabilities, making it suitable for confidence-based trading models. These results encourage more Predictoor benchmarking tests of logistic regression models, especially with more iterations, to see how to make the most $.

7. Appendix: Tables

7.1 ClassifLinearLasso Data Table

The data for ClassifLinearLasso shows that Sigmoid calibration generates the highest Predictoor profits consistently. Notably, the table also includes the maximum trader profit of $268.4489 USD, in the row with Isotonic calibration, autoregressive_n = 1, and max_n_train = 5000, trained on BTC-USDT data.

7.2 ClassifLinearRidge Data Table

The data for ClassifLinearRidge also shows consistently high Predictoor profits for Sigmoid calibration, especially around 5000 training samples. The data contains the maximum Predictoor profit of 4937.0429 OCEAN in the row containing Sigmoid calibration with autoregressive_n = 1, and 5000 training samples on BTC-USDT & ETH-USDT data. Notice that this row also demonstrated a strong trading profit.

7.3 ClassifLinearElasticNet Data Table

The ClassifLinearElasticNet table demonstrates similar patterns to the previous two tables, but with the notable difference of generating all the 2nd-best Predictoor & trader profits regardless of the training set. This is congruent with the model being balanced between both L1 & L2 penalization. The 2nd-best Predictoor profit is 4,808.649 OCEAN, in the row with Sigmoid calibration, autoregressive_n = 1, and 5000 training samples of BTC-USDT & ETH-USDT data. The 2nd-best trader profit is $246.8255 USD, in the row for Isotonic calibration with autoregressive_n = 1 and 5000 training samples of BTC-USDT data.

8. Appendix: Details on Regularization Approaches

8.1 Lasso Regularization

Lasso is an optimization formulation to learn the parameters of a linear model. It has two parts: the first part minimizes sum-squared errors; the second part is the “L1” regularization part. The absolute-value term has the effect of driving small weights to zero, so Lasso drives out weights that are zero or near-zero by mathematically applying the L1 penalty. This helps identify the most relevant features and often enhances predictive performance on new data.
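In symbols, a standard form of the Lasso objective for a linear model, consistent with the description above, is (notation assumed here: w is the weight vector, λ the regularization strength, and (x_i, y_i) the training pairs):

```latex
\hat{w} \;=\; \arg\min_{w} \sum_{i=1}^{n} \bigl(y_i - w^{\top} x_i\bigr)^{2} \;+\; \lambda \sum_{j=1}^{p} \lvert w_j \rvert
```

The first sum is the least-squares error term; the second, absolute-value sum is the L1 penalty.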

8.2 Ridge Regression

Ridge regression is an optimization formulation to learn the parameters of a linear model, just as Lasso is. However, whereas Lasso drives negligible parameter values to zero, ridge regression lets most variables stick around with some weight. Ridge regression’s formulation is called L2 regularization.

The L2 penalty, equivalent to the sum of the squared values of the coefficients, does not exclude features (as Lasso does) but shrinks all coefficients, maintaining all features in the model with smaller weights for the less important ones. This approach encourages the model to prioritize features that contribute the most to the prediction, leading to better performance on unseen data.
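In the same notation as for Lasso (w weights, λ regularization strength, (x_i, y_i) training pairs — notation assumed), Ridge regression's L2-regularized objective can be written:

```latex
\hat{w} \;=\; \arg\min_{w} \sum_{i=1}^{n} \bigl(y_i - w^{\top} x_i\bigr)^{2} \;+\; \lambda \sum_{j=1}^{p} w_j^{2}
```

The only change from Lasso is the penalty term: squared weights instead of absolute values, which shrinks coefficients without driving them exactly to zero.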

8.3 ElasticNet

Elastic Net is an optimization formulation to learn the parameters of a linear model. Just like Lasso and Ridge Regression, its first term minimizes least-squares error. Its second term combines both the Lasso term (absolute values of weights) and the Ridge Regression term (squares of weights). The “lambda” parameter weights the latter against the former; this means one can choose something closer to Lasso, closer to Ridge Regression, or in between.

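In symbols (again with assumed notation: w weights, λ₁ and λ₂ the L1 and L2 strengths), the Elastic Net objective can be written:

```latex
\hat{w} \;=\; \arg\min_{w} \sum_{i=1}^{n} \bigl(y_i - w^{\top} x_i\bigr)^{2} \;+\; \lambda_1 \sum_{j=1}^{p} \lvert w_j \rvert \;+\; \lambda_2 \sum_{j=1}^{p} w_j^{2}
```

Setting λ₂ = 0 recovers Lasso and λ₁ = 0 recovers Ridge; intermediate ratios give the shades of gray in between.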

About Ocean, DF and Predictoor

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data. Follow Ocean on Twitter or TG, and chat in Discord. Ocean is part of the Artificial Superintelligence Alliance.

In Predictoor, people run AI-powered prediction bots or trading bots on crypto price feeds to earn $. Follow Predictoor on Twitter.

Predictoor Benchmarking on Regularized Linear Classifiers with Calibration was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Complying with MiCA’s stablecoin requirements using Ecosystem Monitoring

On June 30, new rules for stablecoin issuers came into effect across the European Union under the bloc’s Markets in Crypto-Assets (MiCA) regulation. Stablecoin issuers must now obtain approval from EU member state supervisory authorities and meet a range of regulatory requirements prior to offering their tokens to consumers within the EU.



Tokeny Solutions

The Journey to Becoming the Leading Onchain Finance Operating System

The post The Journey to Becoming the Leading Onchain Finance Operating System appeared first on Tokeny.

Product Focus

The Journey to Becoming the Leading Onchain Finance Operating System

This content is taken from the monthly Product Focus newsletter in July 2024.

We are thrilled to share the exciting evolution of Tokeny. After years of continuous development and collaboration with asset owners and leading financial institutions across major financial hubs, our platforms and systems have significantly advanced. Today, we are proud to offer a robust and competitive DLT-based infrastructure that encompasses the entire spectrum of financial services.

As we reach this milestone, we are now focused on helping financial institutions transition to onchain finance. This means providing them with the necessary tools to conduct all their operations on a shared IT infrastructure powered by blockchain technologies.

A Complete Onchain Finance Operating System
Financial institutions have been investing significant resources for years to explore and understand the potential of blockchain technology and its application to their business use cases. Most of the time, they were not successful. They need a fully integrated onchain operating system, allowing them to quickly experiment and launch real-world applications in just a matter of days, with a proven ecosystem and use cases. Shaped by the demands of hundreds of real-life tokenization projects, our products provide exactly that.

Three Products for Businesses of All Kinds
In turn, we have developed and packaged our solutions into 3 layers to offer comprehensive solutions for businesses of all kinds:

- T-REX Platform: The no-code, white-label solution that enables you to quickly launch your digital asset marketplace and compliantly manage tokenized assets.
- T-REX Engine: A set of onchain finance APIs that let you tailor the solutions to different business use cases and integrate them into your existing systems and applications.
- T-REX Protocol: An advanced implementation of the open-source ERC-3643 token standard for ecosystem builders to build on and enrich the ecosystem.

With these tools, you can tokenize any asset, on any EVM blockchain, tailor compliance setups to meet regulations in any jurisdiction, create custom workflows, manage tokens, serve investors, and authorize agents for corporate actions effortlessly.

Establishing An Incomparable Ecosystem
Another issue is that each stakeholder may use different service providers, such as custodian wallets. The role of the operating system is to ensure that regardless of stakeholders’ preferences, everything works seamlessly.

By partnering with over 200 service providers, we have formed an incomparable ecosystem to overcome these challenges. Our unique value proposition emerged as an onchain finance enabler for any type of business from large asset managers, fund administrators, distributors, and investment banks, to innovative entrepreneurs.

Thrive in the Onchain Era
Onchain finance represents a monumental shift in capital markets, where real-time operations and transactions have long lagged behind other sectors. Imagine a future where acquiring assets is as intuitive as shopping on Amazon, and transferring assets is as effortless as sending money via PayPal, even when you are a large and regulated financial institution.

Asset management will never be the same again. Onchain assets become smart and easy to manage. Investors receive instant, interactive, and personalized services. This marks the dawn of a truly modern era for finance. Equipped with cutting-edge tools, deep know-how, and a complete ecosystem, we are here to propel you to the forefront of onchain finance.

Xavi Aznal, Head of Product

This monthly Product Focus newsletter is designed to give you insider knowledge about the development of our products. Fill out the form below to subscribe to the newsletter.

Tokenize securities with us

Our experts with decades of experience across capital markets will help you to digitize assets on the decentralized infrastructure. 

Contact us



PingTalk

Transformative Approaches to Reduce Identity Fraud in Banking

New technologies and architectures from Ping Identity are now available to counter traditional fraud vectors and frustrate generative and adversarial AI.

Banking fraud becomes costlier each year, and the threat of generative and adversarial AI technologies being misused adds additional approaches and sophistication of attack vectors never experienced before. Identity fraud is a major contributor to the rise in overall bank fraud, driven by many factors including an explosion in identity theft, with experts believing there is a new victim of identity theft every 22 seconds, and total fraud and identity theft cases up 47% from the previous year to $10.2 billion according to the Federal Trade Commission (FTC). Meanwhile, the Financial Crimes Enforcement Network (FinCEN) has released a Financial Trend Analysis in January 2024 that reveals approximately 1.6 million, or 42% of around 3.8 million total Bank Secrecy Act (BSA) reports, equivalent to $212 billion in suspicious activity, were related to identity.


These government agencies are sounding the alarm because banks and other financial institutions are increasingly challenged by sophisticated, motivated cybercriminals who are constantly finding new and creative ways to commit fraud. At the same time, customer demands mean that financial institutions are under significant pressure to provide Open Banking APIs and other new federated connections with business partners, despite the fact that this significantly increases their attack surface. 


Fortunately, new technologies and architectures are now available that can help banks counter the traditional attacks and future-proof against new and enhanced AI-based attacks.


Compromised Identity Is Central to Banking Fraud

Identity crimes often precede the many types of fraud common in banking. Whether fraudsters are aiming to open new accounts or apply for loans or new credit cards under a stolen or synthetic identity, or are seeking to gain access to existing accounts in order to make fraudulent transfers or harvest sensitive information, they must commit identity fraud first.


It is unsurprising, then, that the cost of identity fraud in banking as well as the volume of fraud cases related to identity continues to go up. Andrea Gacki, Director of FinCEN revealed in June 2024 some preliminary results of an early assessment of the Suspicious Activity Reports (SARs) from 2022 and 2023. Director Gacki revealed that in just two years, the percentage of the 4.7 million reported SARs tied to some impersonation, circumvention or compromise of identity has jumped from 42% (2021 assessment) to 75%. Director Gacki said, “Based on initial indications, by 2023, identity-related SARs accounted for around half of value and almost three quarters of volume.”


AI Has Created New Threat Vectors

Developments in artificial intelligence technologies have been a boon to fraudsters, who can now use generative AI to commit fraud more effectively and at scale. As just one example of how this might play out, many European banks and regulators have instituted remote video interviews as a requirement for opening a bank account. However, what our eyes see and our ears hear can no longer be relied upon, thanks to generative and other AI technologies being exploited by adversaries. Rapid-implementation tools, now available as layers on top of the core AI tech, enable video and audio deepfakes to be created and injected into a digital interaction with little effort.


Fraud departments already struggle to keep up with the number of cases that need their attention, and AI is likely to make this problem much worse. Ping recently surveyed 700 IT decision makers from around the world about the topics of AI, fraud, and decentralized identity, and found that only 52% of respondents felt fully confident that they could detect a deepfake of their CEO. Meanwhile, AI emerged as the top area of significant concern among the professionals surveyed, and 54% of organizations admitted to being extremely concerned that AI technology would increase identity fraud.

Digital and Open Banking Increases Attack Surface

Digital and online banking continues to grow at a rapid pace, with customer demand to execute routine financial transactions online driving adoption. 81% of US users surveyed say they have linked their bank account to third parties online. Regulation demanding Open Banking, so that customers are not locked into one bank and can move between banks, has added further pressure.


Enabling access using traditional methods like server-side APIs and federation (such as OIDC) does not lend itself to increasing security. Every time account access APIs are published for consumption by third parties or federated integrations are created between the bank and a third party, the attack surface of the bank increases, making it more vulnerable and statistically more likely to experience an attack that must be mitigated. As sophistication increases with generative and adversarial AI, securing these connections and mitigating attacks will become increasingly expensive with a higher probability of failure to mitigate.


New Technologies and Architectures Open Up New Protective Fronts to Fight Fraud

Fortunately, new technologies and architectures are now available that can help banks counter the traditional attacks and future-proof for the fast-approaching AI-based attacks. One such solution is the PingOne Neo product suite, which includes identity verification with liveness and data injection detection (deep fake protection), verifiable credentials, and decentralized identity and integration.


To see how these technologies can help, let’s examine some of the functional areas requiring protection in banking and how these new technologies can help.

Thursday, 18. July 2024

Extrimian

Essential Workflows in an SSI Ecosystem

A Focus on the Trust Triangle for Digital Identity

In a previous article, Functional Analysis for Implementing Self-Sovereign Identity (SSI) in Your Business, we discussed how decentralized identity is transforming digital identity management. It offers users complete control over their personal data and emphasizes the importance of a detailed functional analysis for successful implementation. In this article, we will focus on the main use cases within an SSI ecosystem, exploring the interactions and roles between the key players.

What is the SSI Trust Triangle?

The trust triangle is a fundamental concept describing the trust relationship among three actors in the SSI ecosystem: the issuer, the holder, and the verifier. This trust triangle ensures that digital credentials are issued, managed, and verified securely and reliably.

[Image: Trust Triangle — decentralized identity and reputation, Extrimian and QuarkID]

Roles and Responsibilities:

- Issuer: The entity that issues digital credentials based on certain attributes or information of the holder and digitally signs them. Examples include universities issuing digital diplomas, governments issuing digital IDs, or companies issuing employment certificates.
- Holder: The person or entity that receives and possesses the digital credential. The holder stores these credentials in their identity wallet and presents them when needed. They have full control over who can view and verify their credentials, ensuring privacy and control over their identity.
- Verifier: The entity that verifies the authenticity and validity of the digital credential presented by the holder. They ensure that the credential was issued by a trusted issuer and that the information contained in the credential is valid. Examples include employers verifying employment certificates, airlines verifying digital passports, or financial institutions verifying customer information.

Credential Issuance

Credential issuance involves the issuer and the holder and can be initiated in two ways:

1. User-Initiated Request

The holder initiates the action by requesting the issuer to generate a credential. This process can be done through an application provided by the issuer. Once the request is approved, the credential is sent to the user’s identity wallet.

2. Automatic Issuance

Automatic issuance occurs without an explicit request from the user. It happens when a specific action within a system triggers the issuance of a credential, which is then sent automatically to the holder’s identity wallet without requiring additional confirmation.

Credential Reception

The holder receives credentials either through mobile applications or web applications.

Mobile Identity Wallet

If the holder initiates the credential issuance request, the issuer’s site or application generates a credential embedded in a QR code or a deeplink.

QR Code: The holder scans the QR code with their phone’s camera or the wallet’s integrated camera. This initiates the Wallet and Credential Interactions (WACI) flow, involving a message exchange with the SSI backend. The user accepts and saves the generated credential in their wallet.

Deeplink: The holder accesses the deeplink received from the issuer, automatically initiating the WACI flow. The user accepts and saves the generated credential in their wallet.

Web Identity Wallet

For web wallets, the reception can be automatic. The generated credential appears directly in the wallet without needing user confirmation. This process also involves the WACI protocol.

Credential Presentation and Verification

Credential presentation involves both the holder and the verifier. The verifier can be a web or mobile application, adapting to the user’s and verification context’s needs.

Presentation Methods:

1. QR Code Scan

The verifier presents a QR code that the user scans with their device. The user selects the credential they wish to present and can choose to use Selective Disclosure, showing only the necessary data from the credential or presenting the entire credential.

2. Automatic Presentation

The user can select the credential they wish to present from their web wallet and choose the verifier to whom they wish to present it.
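The selective disclosure option described above can be pictured as a salted-hash commitment scheme: the issuer commits to each claim individually, so the holder can later reveal only a subset. This is a minimal sketch for illustration only, not the WACI protocol or a production scheme (real systems use standards such as SD-JWT or BBS+ signatures), and all function names are invented:

```python
import hashlib
import secrets

def commit_claims(claims):
    """Issuer: commit to each claim with a salted hash; the digests (not the
    raw values) are what the issuer would sign into the credential."""
    salts = {name: secrets.token_hex(8) for name in claims}
    digests = {
        name: hashlib.sha256(f"{name}={value}:{salts[name]}".encode()).hexdigest()
        for name, value in claims.items()
    }
    return salts, digests

def disclose(claims, salts, reveal):
    """Holder: reveal only the selected claims, each with its salt."""
    return {name: (claims[name], salts[name]) for name in reveal}

def check_disclosure(disclosed, digests):
    """Verifier: recompute each revealed claim's digest and compare it against
    the committed digest; undisclosed claims stay hidden."""
    for name, (value, salt) in disclosed.items():
        recomputed = hashlib.sha256(f"{name}={value}:{salt}".encode()).hexdigest()
        if recomputed != digests[name]:
            return False
    return True
```

Here a holder could reveal only a degree claim while keeping name and date of birth private, and the verifier can still check the revealed value against the issuer's commitments.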

Verifier Validations

The verifier validates the credential, ensuring it is current, valid, and issued by an authorized issuer. After validating the credential, specific business rules of the verifier are applied. The validation results are shown to both the user and the verifier.
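As a rough illustration of those checks, the sketch below issues and verifies a toy credential: the verifier confirms the issuer is trusted, the credential is current, and the proof is intact. For simplicity it uses an HMAC shared secret rather than the asymmetric signatures real verifiable credentials use, and the issuer registry and field names are hypothetical:

```python
import hashlib
import hmac
import json
import time

# Hypothetical registry of issuers the verifier trusts. Real systems resolve
# DIDs to public keys; shared secrets are used here purely for illustration.
TRUSTED_ISSUERS = {"did:example:university": b"issuer-secret"}

def issue_credential(issuer, subject, claims, secret, ttl_days=365):
    credential = {
        "issuer": issuer,
        "subject": subject,
        "claims": claims,
        "expires": time.time() + ttl_days * 86400,
    }
    payload = json.dumps(credential, sort_keys=True).encode()
    credential["proof"] = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return credential

def verify_credential(credential):
    """Check issuer trust, expiry, and proof integrity, in that order."""
    secret = TRUSTED_ISSUERS.get(credential["issuer"])
    if secret is None:
        return False, "issuer not trusted"
    if time.time() > credential["expires"]:
        return False, "credential expired"
    unsigned = {k: v for k, v in credential.items() if k != "proof"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, credential["proof"]):
        return False, "invalid proof"
    return True, "valid"
```

After these checks pass, the verifier's own business rules would then be applied, as described above.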

Conclusion

These use cases highlight the flexibility and control that decentralized identity offers, allowing users to manage their credentials securely and efficiently. Understanding these flows and the information exchange between SSI ecosystem actors is crucial to appreciating the benefits and innovation brought by this system.

For more information on Self-Sovereign Identity (SSI), use cases, applications, and industries that can implement decentralized identity systems, visit:

Self-Sovereign Identity (SSI) Use Cases
Decentralized Identity Foundation

The post Essential Workflows in an SSI Ecosystem first appeared on Extrimian.


SC Media - Identity and Access

Why the AT&T breach matters – and how to respond

Here are five tips for security pros in the wake of the recent AT&T breach.


liminal (was OWI)

Balancing UX and Security in Customer Authentication

In the rapidly transforming digital landscape, ensuring secure and seamless customer authentication has become a critical priority for businesses across various sectors. Customer authentication is essential for optimizing user experience (UX) and security. Companies striving to implement robust customer authentication encounter significant challenges that can impact their bottom line. Outdated account recovery methods, persistent reliance on passwords, and the growing threat of phishing and fraud present considerable obstacles. However, integrating advanced customer authentication solutions offers promising avenues for mitigating these issues and achieving substantial cost savings.

The Importance of Customer Authentication

Customer authentication solutions are integral to regulating user access to online applications, digital resources, and transaction flows. Historically dependent on passwords and knowledge-based methods, contemporary solutions now employ various passive and active authentication techniques to confirm identities and secure login attempts. Businesses invest in these solutions to protect against account takeover attacks and to ensure that only authorized users can access their digital platforms. Furthermore, regulatory influences like the second Payment Services Directive for online transactions drive the adoption of these advanced authentication methods.

Challenges in Implementing Customer Authentication

Despite the availability of sophisticated authentication technology, businesses continue to rely on legacy solutions. Though familiar to consumers, these solutions create challenges in implementation and efficacy. Traditional account recovery methods remain costly, friction-filled, and vulnerable to phishing attacks. According to recent surveys, 59% of authentication practitioners are dissatisfied with their current account recovery capabilities, which heavily rely on passwords. This dissatisfaction leads to higher operational costs, increased call center volumes, and elevated fraud risks. Additionally, educational gaps and legacy systems hinder the widespread adoption of passwordless solutions, with 41% of businesses acknowledging these barriers.

Another significant issue is balancing user experience and security in authentication flows. While 49% of businesses prioritize enabling convenient user experiences, 51% place greater emphasis on preventing unauthorized access. This delicate balance often results in trade-offs that frustrate users and compromise security.

The complexity of the customer authentication landscape further complicates the situation. The market is crowded with over 50 companies offering various solutions, from global tech giants like Google and Microsoft to specialized vendors like 1Kosmos and Curity. Each vendor presents unique capabilities and approaches, making it challenging for businesses to select the most suitable solution. For instance, while some companies provide end-to-end authentication platforms, others focus on niche areas like passkeys, OTPs, or biometrics. 

Moreover, manual risk decisioning, idiosyncratic authorization methods, and unprotected one-time passwords (OTPs) prevent current authentication solutions from realizing their full potential. Although 93% of practitioners seek AI-based adaptive or continuous authentication capabilities, few vendors leverage AI and machine learning to produce real-time automated recommendations based on context and risk levels. Similarly, without standard protocols and frameworks, deploying customized access control policies is complex, limiting the effectiveness of fine-grain authorization capabilities.

The Cost of Inadequate Customer Authentication

Businesses also struggle with the high costs associated with not successfully authenticating customers. The average cost per successful phishing attack is $5,285, and the cost per successful telephone fraud attack is $792. Additionally, 29% of call center volumes relate to account recovery, contributing to significant operational expenses. These factors underscore the need for more robust and efficient authentication solutions.

Advanced Customer Authentication: A Path Forward

Leading customer authentication solutions offer a path forward by addressing these challenges and providing substantial benefits. By adopting advanced authentication methods like FIDO2 passkeys, standardized protocols such as OAuth 2.0 and OpenID Connect, and expanding native capabilities, businesses can enhance security and user experience.

FIDO2 Passkeys: Emerging as a phishing-resistant replacement for passwords and OTPs, FIDO2 Passkeys address account recovery challenges by ensuring the authentication process remains entirely on the user’s device. This hardware-based authentication method uses strong cryptography and biometrics, reducing the need for easily compromised static credentials.

Standardized Protocols: Standardizing authentication and authorization protocols through OAuth 2.0 and OpenID Connect promotes interoperability among customer authentication solutions. OAuth 2.0 provides secure delegated access, allowing third-party services to request resources on behalf of users without exposing credentials. OpenID Connect adds an authentication layer to verify user identity, ensuring secure and compatible integrations between service providers, identity providers, and authentication providers.
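To make the OpenID Connect authentication layer concrete, the sketch below mints and validates a toy ID token: the relying party verifies the signature, then the issuer, audience, and expiry claims. This is a simplified illustration using an HS256-style HMAC over base64url-encoded JSON; the issuer and client identifiers are invented, and a real integration would use a vetted JWT library and the provider's published asymmetric keys:

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs do."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_id_token(claims, secret):
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    signature = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"

def validate_id_token(token, secret, issuer, audience):
    """Verify the signature first, then the iss/aud/exp claims."""
    header, payload, signature = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(expected, signature):
        return False
    padded = payload + "=" * (-len(payload) % 4)
    claims = json.loads(base64.urlsafe_b64decode(padded))
    return (
        claims.get("iss") == issuer
        and claims.get("aud") == audience
        and claims.get("exp", 0) > time.time()
    )
```

Checking the audience claim is what stops a token issued for one client from being replayed against another, which is part of the interoperability story these standards provide.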

Expanding Native Capabilities: Vendors are broadening their capabilities beyond managing daily customer access by integrating fraud detection, prevention, and identity verification into their platforms. Emerging orchestration capabilities enable businesses to customize authentication flows with pre-integrated partners, starting with basic MFA and adding third-party authenticators as needed to optimize user flows.

Cost Savings and Efficiency: Businesses adopting leading customer authentication solutions can achieve significant cost savings. For instance, prevention of account resets directed to call centers can reduce call center volumes by 38%. Intuitive passwordless authentication and self-service account recovery can decrease call center labor requirements by 60%. Additionally, these solutions can reduce customer churn by 13% and successful phishing attacks by 12%.

Integrating advanced customer authentication solutions enhances security and user experience and drives substantial cost savings and operational efficiencies. As businesses navigate the complexities of the authentication landscape, embracing these innovative solutions will be crucial for staying ahead of evolving threats and maintaining competitive advantage.

Related Content:

Customer Authentication Market and Buyer’s Guide
Link Index for Customer Authentication
Account Takeover Market and Buyer’s Guide

The post Balancing UX and Security in Customer Authentication appeared first on Liminal.co.


Spruce Systems

Meet the SpruceID Team: Scotty Matthewman

Get to know Scotty Matthewman, Senior Designer here at SpruceID.

Name: Scotty Matthewman
Team: Design
Based in: New York City

About Scotty

Applying my industrial design background to tech products, I have worked in the startup and agency worlds and, most recently, in corporate innovation. After finding my way into Blockchain tech and innovative tech in general, decentralized and digital identity became a very interesting area to explore.

I was excited to join SpruceID because I’d have the opportunity to design things that have never been designed before, and both the company and industry's values align with the impact I’d like to have: improving accessibility and inclusion through experience optimization.

Can you tell us about your role at SpruceID?

I am the Senior Designer on the team, so I work on product design, website design, brand and marketing materials, demo prototypes and videos, and communication artifacts. In addition to design work, my role involves product strategy discussions and user research.

What advice would you give to a designer who is early in their career?

I would say that being genuinely curious is necessary. You have to seek out how things work or why things are the way they are if you want to make educated decisions in your designs; otherwise, you’re just shooting in the dark.

I think the life of a professional designer is a lot different than that of a design student. Nothing is as simple as it is when you receive an assignment, so you will always have to go out and proactively understand the constraints, capabilities, business value, etc., of anything you’re working on. 

Additionally, it’s way easier to spend a lot of time on something when it’s interesting to you, so find what’s interesting, and you’re more likely to find a great path. In my opinion, every other skill in design stems from curiosity and wanting to learn.

What are you currently learning, or what do you hope to learn?

The obvious answer is all the tech behind our product. I’m not an engineer, but understanding the moving parts, at least at a high level, helps me make more educated decisions in the designs.

What has been the most memorable moment for you at SpruceID so far?

Our team offsite in Rio de Janeiro, Brazil was an awesome experience. I had just recently joined SpruceID, and because we had the majority of the team in one place, I got to know many people early on. I think in-person experiences like this can help build rapport and chemistry in a team, which can enable them to work quickly and effectively.

How do you define success in your role, and how do you measure it?

Success in my role is taking abstract problems and delivering the most intuitive, effective solutions.

These may not be official measurements, but it’s a big win when someone says, “Oh! Yeah, that would work!” because you bring a new, creative way of designing something.

You feel like you solved the puzzle. It’s very gratifying and promising when your chosen solution feels like the obvious choice.

What is your favorite part about working at SpruceID?

I have really enjoyed being on a team made up of very smart people who are all trying to make ambitious strides. It is clear how great everyone is at what they do, so I am humbled to be included in this group.

Fun Facts

What do you enjoy doing in your free time?: I like to play basketball, walk through NYC, spend time with friends, go to comedy shows, surf when I can, and take singing classes, which have been such a fun part of living in New York.

If you could be any tree, what tree would you be and why?: I would be a maple tree. I used to make furniture, and I always loved using maple. The color and grain are so pretty and clean, and it gives off Nordic and Japanese vibes.

About SpruceID: SpruceID is building a future where users control their identity and data across all digital interactions.

Want to work with us? Check out our open roles here.


Tokeny Solutions

MOCHAX and Tokeny Partner to Provide Unprecedented Values to Equity Investment through RWA Tokenization


Luxembourg, 18th July – MOCHAX, a leading digital asset management firm, is launching a Real World Asset (RWA) tokenization project in collaboration with Tokeny, the pioneering onchain finance operating system for tokenized securities. This initiative tokenizes the equity of the firm, bringing new levels of liquidity, accessibility, and efficiency to the equity market.

According to Pitchbook, private equity has historically outperformed public markets in return comparisons over periods ranging from 5 to 20 years. However, the traditional equity market is inaccessible and illiquid as firms generally require a large minimum investment, high transaction costs, and a long lock-up period.

RWA tokenization, the process of representing an asset on the blockchain, provides a solution to solve these challenges. MOCHAX leverages Tokeny’s white-label T-REX Platform to tokenize its equity, offering an e-commerce-like digital experience for investors. This approach simplifies the investment process and replaces traditional manual methods. By enabling 24/7 peer-to-peer automated transactions among qualified investors, MOCHAX is reshaping the equity market, making previously impossible features accessible to equity investors and increasing liquidity.

An EY survey shows that high-net-worth investors and institutional investors rank tokenized equity as the top choice among tokenized alternative assets, due to its increased liquidity, lower transaction costs, improved performance, and enhanced transparency. Since 2021, the team at MOCHAX has been at the forefront of blockchain and digital asset investments, achieving a 44X return on invested capital. To meet this demand, MOCHAX is positioning itself as the go-to platform for tokenized equity with the launch of its security token offering.

“Teaming up with Tokeny unlocks a new era for us, harnessing their unmatched technical prowess to revolutionize our onchain equity capabilities. The integration of the ERC-3643 standard for RWA tokenization guarantees seamless interoperability across the entire ecosystem, eliminating the inefficiencies of isolated systems. This strategic move empowers us to stay agile, scalable, and primed for innovation, ready to seize future market opportunities with confidence.”

Gregory Griffiths, General Partner at MOCHAX

“Onchain finance is revolutionizing value exchange with real-time transactions, automated compliance, and seamless interoperability. Private equity will benefit immensely, transforming slow, cumbersome processes into efficient onchain operations. Early adopters like MOCHAX will gain a competitive edge. We're proud to partner with them to deliver unparalleled user experiences and drive industry change.”

Luc Falempin, CEO of Tokeny

About MOCHAX

MOCHAX represents a pioneering endeavor in the realm of venture capital investment, introducing a novel approach through tokenization of real-world assets like equities. As a security token offering (STO), MOCHAX aims to democratize access to venture capital opportunities by leveraging blockchain technology to tokenize traditional venture capital assets. By transforming startup equity and tokens into tradable digital assets, MOCHAX enables investors to participate in venture capital investments with increased liquidity, transparency, and accessibility.

www.mochax.xyz | info@mochax.xyz

About Tokeny

Tokeny is a leading onchain finance operating system. Tokeny has pioneered compliant tokenization with the open-source ERC-3643 standard and advanced white-label software solutions. The enterprise-grade platform and APIs unify fragmented onchain and offchain workflows, integrating essential services to eliminate silos. It enables seamless issuance, transfer, and management of tokenized securities. By automating operations, offering innovative onchain services, and connecting with any desired distributors, Tokeny helps financial actors attract more clients and improve liquidity. Trusted globally, Tokeny has successfully executed over 120 use cases across five continents and facilitated 3 billion onchain transactions and operations.

The post MOCHAX and Tokeny Partner to Provide Unprecedented Values to Equity Investment through RWA Tokenization first appeared on Tokeny.



Elliptic

$235 million lost by WazirX in North Korea-linked breach

Earlier today, Indian exchange WazirX suffered a major loss of funds in a suspected hack:


Ocean Protocol

DF98 Completes and DF99 Launches

Predictoor DF98 rewards available. DF99 runs Jul 18 — Jul 25, 2024

1. Overview

Data Farming (DF) is Ocean’s incentives program. In DF, you can earn OCEAN rewards by making predictions via Ocean Predictoor.

Data Farming Round 98 (DF98) has completed.

DF99 is live today, July 18. It concludes on July 25. For this DF round, Predictoor DF has 37,500 OCEAN rewards and 20,000 ROSE rewards.

2. DF structure

The reward structure for DF99 is comprised solely of Predictoor DF rewards.

Predictoor DF: Actively predict crypto prices by submitting a price prediction and staking OCEAN to slash competitors and earn.

3. How to Earn Rewards, and Claim Them

Predictoor DF:
To earn: submit accurate predictions via Predictoor Bots and stake OCEAN to slash incorrect Predictoors.
To claim OCEAN rewards: run the Predictoor $OCEAN payout script, linked from the Predictoor DF user guide in the Ocean docs.
To claim ROSE rewards: see instructions in the Predictoor DF user guide in the Ocean docs.

4. Specific Parameters for DF99

Budget. Predictoor DF: 37.5K OCEAN + 20K ROSE

Networks. Predictoor DF applies to activity on Oasis Sapphire. Here is more information about Ocean deployments to networks.

Predictoor DF rewards are calculated as follows:

First, DF Buyer agent purchases Predictoor feeds using OCEAN throughout the week to evenly distribute these rewards. Then, ROSE is distributed at the end of the week to active Predictoors that have been claiming their rewards.

Expect further evolution in DF: adding new streams and budget adjustments among streams.

Updates are always announced at the beginning of a round, if not sooner.

About Ocean, DF and Predictoor

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data. Follow Ocean on Twitter or TG, and chat in Discord. Ocean is part of the Artificial Superintelligence Alliance.

In Predictoor, people run AI-powered prediction bots or trading bots on crypto price feeds to earn $. Follow Predictoor on Twitter.

DF98 Completes and DF99 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 17. July 2024

HYPR

Identity Evolved: The Rise of Multi-Factor Verification

Identity verification has traditionally played an important but limited role in the world of identity and access management (IAM). To establish someone’s identity, you need to prove that they are who they say they are, linking their digital identity to their real-world identity. For employees, this verification typically occurs during onboarding; for customers, it happens when they open a new account. Once validated, they receive credentials, are granted appropriate authorizations, and enter the vast identity access flow universe — with identity verification rarely called upon again.

This system is fundamentally flawed.

Help desk social engineering, synthetic identities and AI-powered attacks are exploiting inadequate identity verification systems to completely circumvent IAM security. The $100 million attack on MGM resorts occurred when attackers impersonated an employee, convinced the IT help desk to reset credentials, and then escalated privileges until gaining control of the entire system. Just a few months later, a finance worker at a multinational firm was tricked into wiring out $25 million when cybercriminals posed as senior executives using video and audio deepfakes. In fact, 78% of organizations were targeted by identity-related attacks last year.

Unmasking Social Deception

The industry urgently needs to evolve its approach to combat these modern threats. Multi-factor verification (MFV) offers the answer. A recent article by Susan Morrow makes the case eloquently — I highly recommend the read. Multi-factor verification moves beyond relying on authentication as the primary gatekeeper, making identity verification, which uses multiple verification factors and risk assessment, an intrinsic part of daily access flows.

This transformation is the next step in identity security maturation, similar to authentication’s progression from passwords, to multi-factor authentication, to phishing-resistant MFA and passkeys. Authentication had to adapt to combat escalating phishing and password-related attacks. Multi-factor verification is essential to stem the onslaught of sophisticated social engineering threats.

Fake Passport Used to Bypass Crypto Exchange IDV System. Image Source: 404 Media

The Current State of Authentication vs. Verification

To understand what makes multi-factor verification such a powerful tool, it’s helpful to go back to IAM basics.

What Is Authentication?

In the digital world, authentication is the process of confirming the identity of a user before allowing them to access a device or account. Note that I say user, not person, because that’s what they are in this process — a user in the system. Common authentication factors are something the user knows (like a password), something the user owns (like a mobile phone or hardware security key) or something the user is (biometric data like a fingerprint). Multi-factor authentication (MFA) requires two or more factors from different categories to confirm identity.
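That rule — two or more factors from different categories — can be sketched in a few lines. The factor catalogue below is illustrative, not an exhaustive or official taxonomy:

```python
# Illustrative mapping of authenticators to MFA factor categories.
FACTOR_CATEGORIES = {
    "password": "knowledge",       # something the user knows
    "security_key": "possession",  # something the user owns
    "otp_app": "possession",
    "fingerprint": "inherence",    # something the user is
}

def satisfies_mfa(presented_factors):
    """True only if the presented factors span at least two distinct categories."""
    categories = {
        FACTOR_CATEGORIES[f] for f in presented_factors if f in FACTOR_CATEGORIES
    }
    return len(categories) >= 2
```

Note that two possession factors alone (say, a security key plus an OTP app) would not qualify as multi-factor under this definition, since both come from the same category.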

What Is Identity Verification?

Also referred to as identity proofing, identity verification makes sure a person is who they claim to be, and that the identity is genuine. Verification can be done in person or digitally, using various methods, depending on the level of identity assurance required. Methods include location checks, comparing user-supplied personal information against official databases, examining government-issued documents, matching a selfie against an official ID, and personal interactions, among others.

Authentication vs. Verification

In a nutshell, verification involves establishing a legitimate, proven user identity in a system. Authentication is about keeping unauthorized users out of the system.

What Is Multi-Factor Verification (MFV)?

Today, access to an organization’s systems and resources is primarily controlled by the authentication process. Yes, there are variations and layers — adaptive authentication, risk-based authentication, access controls like RBAC and PAM — but essentially the act of providing the right combination of credentials gets you through the door. Multi-factor verification (MFV) brings deeper identity verification checks and risk assessment into this daily access process.

Multi-factor verification integrates multiple verification factors dynamically and contextually throughout the user session. This approach combines continuous verification with authentication mechanisms so that you are not just validating the user, you are validating the human.

Multi-factor authentication vs. multi-factor verification

How MFV Works

Today, comprehensive identity verification checks are generally performed only at specific points in time, such as when opening a new account or beginning a job. At other critical moments, such as resetting a credential or registering a new phone, most organizations rely on knowledge-based answers or calling the helpdesk, which are notoriously vulnerable to social engineering.

Anatomy of the Help Desk Social Engineering Attack on MGM Resorts

By contrast, MFV continuously verifies the person's identity based on a combination of factors such as behavior, context, and biometrics. This dynamic verification adapts in response to behavior anomalies, device telemetry, environment and other risk signals, making it more difficult for attackers to exploit. By integrating these factors in real-time, MFV offers a secure, fast and less intrusive verification process.
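One way to picture that dynamic behavior is a toy risk scorer that maps live signals to a step-up verification requirement. The signal names, weights, and thresholds below are invented for illustration and are not HYPR's actual model:

```python
# Invented weights for a few common risk signals.
SIGNAL_WEIGHTS = {
    "new_device": 0.4,
    "impossible_travel": 0.5,
    "behavior_anomaly": 0.3,
    "untrusted_network": 0.2,
}

def risk_score(signals):
    """Sum the weights of the signals that fired, capped at 1.0."""
    return min(1.0, sum(w for name, w in SIGNAL_WEIGHTS.items() if signals.get(name)))

def required_verification(score):
    """Escalate verification strength as risk rises (thresholds are illustrative)."""
    if score < 0.3:
        return "passive checks only"
    if score < 0.6:
        return "biometric re-verification"
    return "document check plus live video"
```

A routine login from a known device stays frictionless, while a new device combined with anomalous behavior would trigger stronger verification, mirroring the adaptive flow described above.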

Benefits of Multi-Factor Verification

Nearly 4 in 10 organizations name identity verification as a top identity security challenge. MFV addresses their pain points on multiple fronts.

Stop Social Engineering and other Identity Threats: Last year saw a 71% increase in attacks abusing valid accounts. MFV's continuous verification significantly reduces the risk of account takeover (ATO), session hijacking and other attacks. By continuously adapting to the user's behavior and context, multi-factor verification provides greater resistance to sophisticated threats, ensuring that only legitimate users can maintain access.

Improved User Experience: Most organizations struggle with real-time verification, spending more than two hours verifying identity during employee onboarding, when replacing a device, recovering an account or during other high-risk scenarios. MFV provides a seamless and less intrusive verification process, with basic checks conducted behind the scenes. Additional forms of proof are only required at times of increased risk, creating a smoother and more personalized experience.

Scalability and Flexibility: MFV is easily adaptable to different industries and use cases. Its flexibility allows integration with existing identity stacks, making it a scalable solution for organizations of all sizes.

How HYPR Uses Multi-Factor Verification

Multi-factor verification is core to HYPR’s Identity Assurance Platform. The HYPR Platform unifies  phishing-resistant passwordless authentication, adaptive risk mitigation and automated identity verification into a seamless, user-centric access flow. Organizations can easily choose and configure the identity verification processes that suit their environment and use cases. For example, secure self-service options at times of low risk, with additional steps such as live video chat in higher risk scenarios or when security anomalies are detected. They can also enforce a range of phishing-resistant authenticators including device-bound Enterprise Passkeys, hardware keys and smart cards.

Example Multi-Factor Verification Flow With HYPR

Toward Identity-Centric Security With MFV

Organizations worldwide have an identity problem. The vast majority of breaches today are related to identity issues. As Gartner’s Cybersecurity Chief of Research, Mary Ruddy, pointed out, “Digital security is reliant on identity whether we want it to be or not. In a world where users can be anywhere and applications are increasingly distributed across datacenters in the multi-cloud, identity IS the control plane.”

Current access processes are no match against attackers’ nimble and incessant tactics. Initiatives like FIDO’s recently announced identity verification certification program bring critical advancement, but are just part of the answer. Multi-Factor Verification (MFV) marks a major leap forward in identity security, offering stronger protection and a better user experience. As organizations plan to build a more identity-centric security approach, it’s imperative they include MFV in their identity security protocols. Emerging technologies like decentralized identity systems hold promise for even more secure and efficient verification methods. Continuous innovation will drive MFV’s evolution, ensuring it remains a strong defense against emerging threats.


1Kosmos BlockID

Blockchain Identity Management: A Complete Guide

Introduction

Traditional identity verification methods show their age, often proving susceptible to data breaches and inefficiencies. Blockchain emerges as a beacon of hope in this scenario, heralding a new era of enhanced data security, transparency, and user-centric control to manage digital identities. This article delves deep into blockchain’s transformative potential in identity verification, highlighting its advantages and the challenges it adeptly addresses.

What is Blockchain?

Blockchain technology is a decentralized digital ledger of transactions. The ledger is distributed across a network of computers, so every transaction gets recorded in multiple places. This decentralized design ensures that no single entity controls the entire blockchain, and all transactions are transparent to every user.


Types of Blockchains: Public vs. Private

Blockchain technology can be categorized into two primary types: public and private. Public blockchains are open networks where anyone can participate and view transactions. This transparency ensures security and trust but can raise privacy concerns. In contrast, private blockchains are controlled by specific organizations or consortia and restrict access to approved members only. This restricted access offers enhanced privacy and control, making private blockchains suitable for businesses that require confidentiality and secure data management.

Brief history and definition

The concept of a distributed ledger technology, the blockchain, was first introduced in 2008 by an anonymous entity known as Satoshi Nakamoto. Initially, it was the underlying technology for the cryptocurrency Bitcoin. The primary goal was to create a decentralized currency, independent of any central authority, that could be transferred electronically in a secure, verifiable, and immutable way. Over time, the potential applications of blockchain have expanded far beyond cryptocurrency. Today, it is the backbone for various applications, from supply chain and blockchain identity management solutions to voting systems.

Core principles

Blockchain operates on a few core principles. Firstly, it’s decentralized, meaning no single entity or organization controls the entire chain. Instead, multiple participants (nodes) hold copies of the whole blockchain. Secondly, transactions are transparent. Every transaction is visible to anyone who has access to the system. Lastly, once data is recorded on a blockchain, it becomes immutable. This means that it cannot be altered without altering all subsequent blocks, which requires the consensus of most of the blockchain network.

The Need for Improved Identity Verification

Identity verification is a cornerstone of many online processes, from banking to online shopping. However, traditional methods of identity verification have proven problematic. They often rely on centralized databases of sensitive information, making them vulnerable to data breaches. Moreover, these methods often require users to share personal details repeatedly to prove their identity, increasing the risk of data theft or misuse.

Current challenges in digital identity

Digital identity and credential systems today face multiple challenges. Centralized systems are prime targets for hackers: a single breach can expose the personal data of millions of users. Additionally, users often need to manage multiple usernames and passwords across various platforms, leading to password fatigue and increased vulnerability. There’s also the issue of privacy. Centralized digital identity systems often share user data with third parties, sometimes without the user’s explicit consent.


Cost of identity theft and fraud

The implications of identity theft and fraud are vast. For individuals, it can mean financial loss, credit damage, and a long recovery process. For businesses, a breach of sensitive information can result in significant financial losses, reputational damage, and loss of customer trust. According to reports, the annual cost of identity theft and fraud runs into billions of dollars globally, affecting individuals and corporations alike.

How Blockchain Addresses Identity Verification


Blockchain offers a fresh approach to identity verification. By using digital signatures and leveraging its decentralized, transparent, and immutable nature, blockchain technology can provide a more secure and efficient way to verify identity without the pitfalls of traditional methods.

Decentralized Identity

Decentralized identity systems on the blockchain give users complete control over their identity data. Instead of relying on a central authority to store records and verify identity, users can provide proof of their identity directly from the blockchain. This reduces the risk of a centralized data breach and gives users autonomy over their identities and personal data.

Transparency and Trust

Blockchain technology fosters trust through transparency, but the scope of this transparency varies significantly between public and private blockchains. Public blockchains allow an unparalleled level of openness, where every transaction is visible to all, promoting trust through verifiable openness. On the other hand, private blockchains offer a selective transparency that is accessible only to its participants. This feature maintains trust among authorized users and ensures that sensitive information remains protected from the public eye, aligning with privacy and corporate security requirements.

Immutability

Once identity data is recorded on a blockchain, it cannot be altered without consensus. This immutability ensures that sensitive, personally identifiable information remains consistent and trustworthy. It also prevents malicious actors from changing identity data for fraudulent purposes.

Benefits of Blockchain Identity Verification


Blockchain’s unique attributes offer a transformative approach to identity verification, addressing many of the challenges faced by traditional verification methods.

Enhanced Security

Traditional identity verification systems, being centralized, are vulnerable to single points of failure. If a hacker gains access, the entire system can be compromised. Blockchain, with its decentralized nature, eliminates this single point of failure. Each transaction is encrypted and linked to the previous one. This cryptographic linkage ensures that even if one block is tampered with, it would be immediately evident, making unauthorized alterations nearly impossible.

User Control

Centralized identity systems often store user data in silos, giving organizations control over individual data. Blockchain shifts this control back to users. With decentralized identity solutions, individuals can choose when, how, and with whom they share their personal information. This not only enhances data security and privacy but also reduces the risk of data being mishandled or misused by third parties.

Reduced Costs

Identity verification, especially in sectors like finance, can be costly. Manual verification processes, paperwork, and the infrastructure needed to support centralized databases contribute to these costs. Blockchain can automate many of these processes using smart contracts, reducing the need for intermediaries and manual interventions and leading to significant cost savings.

Interoperability

In today’s digital landscape, individuals often have their digital identities and personal data scattered across various platforms, each with its verification process. Blockchain can create a unified, interoperable system where one’s digital identity documents can be used across multiple platforms once verified on one platform. This not only enhances user convenience but also streamlines processes for businesses.

The Mechanics Behind Blockchain Identity Verification

Understanding its underlying mechanics is crucial to appreciating blockchain’s benefits for identity verification.

How cryptographic hashing works

Cryptographic hashing is at the heart of blockchain security. When a transaction occurs, it’s converted into a fixed-size string of numbers and letters using a hash function. This unique hash is practically impossible to reverse-engineer. Each new block contains the previous block’s hash, linking the blocks into a chain. Any alteration in a block changes its hash, breaking the chain and alerting the system to potential tampering.
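
The chaining just described can be sketched in a few lines of Python using the standard library’s hashlib (a toy illustration, not a production blockchain):

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    """Hash a block's contents together with the previous block's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

# Build a tiny three-block chain.
genesis = block_hash("0" * 64, "genesis")
block1 = block_hash(genesis, "alice->bob:5")
block2 = block_hash(block1, "bob->carol:2")

# Tampering with block 1's data changes its hash, so the hash embedded
# in block 2 no longer matches and the break is detectable.
tampered1 = block_hash(genesis, "alice->bob:500")
assert tampered1 != block1
assert block_hash(tampered1, "bob->carol:2") != block2
```

Because each hash depends on the one before it, altering any historical block would force recomputing every subsequent block, which is exactly what makes tampering evident.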

Public and private keys in identity verification

Blockchain uses a combination of public and private keys to secure transactions. A public key serves as a user’s address on the blockchain, while a private key is a secret that allows them to initiate transactions. For identity verification, only the individual holding the correct private key can access and share their data, ensuring its integrity and security.
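
As a toy illustration of how a private key authorizes an action and the matching public key verifies it, here is a Schnorr-style signature over a deliberately tiny group. This is a sketch only: real blockchains use elliptic curves such as secp256k1 with 256-bit keys, and all parameter values below are illustrative assumptions.

```python
import hashlib
import secrets

P = 10007   # prime, P = 2*Q + 1 (toy-sized for illustration)
Q = 5003    # prime order of the subgroup
G = 4       # generator of the order-Q subgroup

def h(*parts) -> int:
    """Hash arbitrary values into the exponent group."""
    data = "|".join(str(x) for x in parts).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % Q

def keygen():
    x = secrets.randbelow(Q - 1) + 1      # private key, kept secret
    return x, pow(G, x, P)                # (private, public)

def sign(x: int, msg: str):
    k = secrets.randbelow(Q - 1) + 1      # one-time nonce
    r = pow(G, k, P)
    e = h(r, msg)
    s = (k + x * e) % Q
    return e, s

def verify(y: int, msg: str, sig) -> bool:
    e, s = sig
    # g^s * y^(-e) = g^(k + x*e) * g^(-x*e) = g^k = r, so hashing it
    # with the message must reproduce e.
    r = (pow(G, s, P) * pow(y, -e, P)) % P
    return h(r, msg) == e

priv, pub = keygen()
sig = sign(priv, "share credential with bank")
assert verify(pub, "share credential with bank", sig)   # holder proves control
```

Anyone can check the signature with the public key alone, which is why a blockchain node can confirm that a data-sharing request really came from the key holder without ever seeing the private key.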

The role of consensus algorithms

Consensus algorithms are protocols that deem a transaction valid once the majority of participants in the network agree on it. They play a crucial role in maintaining the trustworthiness of the blockchain. In identity verification, consensus algorithms ensure that once a user’s identity data is added to the blockchain, it is accepted and recognized by the majority, ensuring data accuracy and trustworthiness.
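
A heavily simplified sketch of this idea, majority voting over values reported by nodes, might look like the following (real consensus protocols such as Proof of Work or PBFT are far more involved; the node values here are made up for illustration):

```python
from collections import Counter

def reach_consensus(votes, threshold=0.5):
    """Accept the value reported by a strict majority of nodes, if any."""
    value, count = Counter(votes).most_common(1)[0]
    if count / len(votes) > threshold:
        return value
    return None

# Five nodes validate a claimed identity record; one node is faulty.
honest = "did:example:123 -> verified"
votes = [honest, honest, honest, honest, "did:example:123 -> forged"]
assert reach_consensus(votes) == honest

# Without majority agreement, nothing is committed.
assert reach_consensus(["a", "b", "a", "b"]) is None
```

The key property carries over to real systems: a single faulty or malicious node cannot override what the majority has agreed to record.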

Conclusion

Through its unique attributes, blockchain presents a compelling and transformative alternative to the pitfalls of conventional identity management and verification systems. By championing security, decentralization, and user empowerment, it sets a new standard for the future of digital identity and access management solutions. To understand how this can redefine your identity management and verification processes, book a call with us today and embark on a journey toward a stronger security posture.

The post Blockchain Identity Management: A Complete Guide appeared first on 1Kosmos.


auth0

Use Private Key JWTs to Authenticate Your .NET Application

Add Private Key JWT authentication to your .NET application to empower security in sensitive contexts.

liminal (was OWI)

Navigating the Account Takeover Threat Landscape: Prevention Strategies for Phishing and Social Engineering

The post Navigating the Account Takeover Threat Landscape: Prevention Strategies for Phishing and Social Engineering appeared first on Liminal.co.

Indicio

Digital ID and SSI to cause radical shift in travel experience by 2035

SITA The post Digital ID and SSI to cause radical shift in travel experience by 2035 appeared first on Indicio.

UNISOT

Protecting Olive Oil Authenticity with UNISOT


In light of the recent significant seizure of counterfeit olive oil by Italian authorities, the need for robust traceability and authenticity in the olive oil industry has never been more critical. Italian authorities confiscated nearly €900,000 worth of fake extra virgin olive oil – about 42 tons – highlighting the pervasive issue of olive oil fraud.

“Some of the 42 tons of oil was already packaged ready for sale. Authorities confiscated 71 tons of what was referred to as an “oily substance” in plastic tanks, as well as 623 liters of chlorophyll, a component of extra virgin olive oil that was being added to oil of a lesser value.

They found packaging equipment, labels purporting that the oil was “extra virgin” when it was clearly not, and commercial documentation including 1,145 customs excise duty stamps that are being studied for forgery, the statement said.
– Barbie Nadeau – CNN”

This fraudulent activity not only undermines consumer trust but also poses serious health risks. UNISOT’s Asset Traceability Platform can play a crucial role in preventing such fraudulent activities.

One of our AgriOnChain customers is Az. Agricola Francesco Pepe. Francesco has successfully implemented UNISOT’s Digital Product Passports to secure his supply chain. This implementation has been instrumental in proving and maintaining his reputation for producing exceptional olive oil, winning numerous awards, and achieving recognition in the Olive Oil Bible FlosOlei. By using UNISOT, he has ensured that his products remain authentic and traceable, reinforcing consumer confidence and loyalty.

Feel free to scan the QR code on the image, which will take you directly to the Digital Product Passport for this Erede Extra Virgin Olive Oil.

“AgriOnChain’s innovative traceability solutions have enabled me to share the story of my premium Italian olive oil with the world. With Digital Product Passports and Smart QR-codes, I can highlight the authenticity, quality, and sustainability of my products to customers everywhere. Through this partnership, I have not only expanded my market reach but also deepened the connection with my customers, creating a community that values the traditions and principles I hold dear. UNISOT’s AgriOnChain technology empowers me to focus on what I do best – producing exceptional olive oil – while they handle the technology seamlessly behind the scenes.” – Francesco Pepe, Az. Agricola Francesco Pepe

How UNISOT’s Asset Traceability Platform can help

COMPREHENSIVE TRACEABILITY

UNISOT’s Platform ensures complete traceability of olive oil from the grove to the consumer. Every step of the production process, including harvesting, pressing and bottling, is securely recorded, digitally signed and timestamped on the blockchain. This transparency ensures that every bottle of olive oil is traceable back to its origin, making it extremely difficult for counterfeit products to infiltrate the supply chain.

PRODUCT AUTHENTICATION

Utilizing unique digital identities and QR codes, UNISOT allows consumers to verify the authenticity of Agricola Francesco Pepe’s olive oil instantly. By scanning the QR code on the bottle, consumers can access detailed information about the product’s origin, production process and quality certifications.

SUPPLY CHAIN INTEGRITY

Our platform enhances the integrity of the supply chain by enabling real-time monitoring and alerts for any anomalies. This capability helps in early detection of potential fraud and ensures that only high-quality olive oil reaches the market.

CONSUMER CONFIDENCE

With increasing incidents of food fraud, consumer trust is paramount. AgriOnChain not only protects the reputation of premium producers like Agricola Francesco Pepe but also assures consumers of the product’s authenticity and superior quality. This trust is vital for sustaining and growing market share in a highly competitive industry.

COMBATING DOCUMENTATION FRAUD

In addition to ensuring product authenticity, AgriOnChain can also help prevent fraud related to commercial documentation. The recent case involving 1,145 customs excise duty stamps suspected of forgery underscores the need for secure and verifiable documentation. By leveraging UNISOT’s Secure Document Collaboration solution, every document related to the production, distribution and sale of olive oil can be securely recorded and verified. This reduces the risk of forgery and ensures that all commercial documents, including excise duty stamps, are legitimate and traceable.

The recent olive oil fraud cases underscore the urgent need for effective measures to safeguard the authenticity of olive oil. UNISOT’s AgriOnChain provides a comprehensive solution to combat fraud, enhance traceability and build consumer trust. As we continue to support high-quality producers like Francesco Pepe, we are committed to ensuring that consumers receive genuine, high-quality olive oil.

We can turn the tide against olive oil fraud and ensure a future where consumers can enjoy genuine, high-quality olive oil with confidence. For more information on how UNISOT can help protect your olive oil brand, visit our website or contact our sales team.

Sources:

The Grocer: https://www.thegrocer.co.uk/commodities/dozens-of-tonnes-of-fake-olive-oil-confiscated-by-italian-authorities/693330.article
Food Safety News: https://www.foodsafetynews.com/?s=olive+oil

The post Protecting Olive Oil Authenticity with UNISOT appeared first on UNISOT.


Ontology

Ontology Weekly Report (July 9th — July 15th, 2024)

Ontology Weekly Report (July 9th — July 15th, 2024)

Welcome to another edition of the Ontology Weekly Report. This week, we’re excited to share updates on our development milestones, new product features, and active community engagements. Here’s a detailed look at our activities this week:

Latest Developments

Decentralized Identity Discussion at MPost’s Hack Seasons: Geoff and Humpty delivered an insightful talk on decentralized identity during MPost’s Hack Seasons, discussing the impact and future of DID in the blockchain space.

DXSALE Giveaway: We partnered with DXSALE for an exciting giveaway, providing our community members with more opportunities to engage and win rewards.

Development Progress

Ontology EVM Trace Trading Function Completion: We’re thrilled to announce that the Ontology EVM Trace Trading Function is now fully operational at 100%, enhancing our trading capabilities within the EVM significantly.

Added API for ONT RPC Trading: A new API for ONT RPC trading has been added, improving functionality and accessibility for developers and users.

ONT Leverage Staking Design: Development progress has reached 60%, moving us closer to providing innovative and flexible staking options for our community.

Product Development

New Feature in ONTO: A new feature has been launched in the ONTO app, allowing users to manage their nodes directly within the app, streamlining node operations and user interactions.

Ongoing ONTO Giveaways: The giveaway with Fiat24 continues. Don’t miss out on the chance to participate and win exciting prizes.

On-Chain Activity

Stable dApp Ecosystem: The number of dApps on our MainNet remains robust at 177.

Transaction Growth: This week, there was an increase of 1,828 dApp-related transactions, totaling 7,775,343, and an overall increase of 8,277 transactions on MainNet, reaching 19,498,560.

Community Growth

Engaging Community Discussions: Our community platforms, including Twitter and Telegram, have been buzzing with lively discussions and updates. This week’s Telegram discussion led by Ontology Loyal Members focused on “Exploring Interoperable DID Solutions: Web2, Web3, and Beyond,” covering topics like KYC, login systems, and peer-to-peer interactions.

Stay Connected 📱

Engage with us and stay updated on the latest happenings by following our social media channels. Your participation and feedback are invaluable as we continue to advance the blockchain and decentralized identity landscapes.

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHubDiscord

Thank you for your ongoing support and engagement. Together, we are paving the way for a more secure and decentralized future. Stay tuned for more updates and developments next week!

Ontology Weekly Report (July 9th — July 15th, 2024) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

Are pig butchering scammers using AI? Here’s what the latest trends show


Recently, Elliptic published a report into pig butchering and the role of Cambodia-based Huione Guarantee marketplaces in selling a range of tools for scammers. Among the tools on sale was AI face-changing software – marketed for numerous live chat platforms and messaging apps.

Elliptic has conducted separate research into deepfakes and their role in the crypto romance scams that such groups are notorious for facilitating. The good news is that, despite their experimentation with AI, the actual use thereof currently appears to be low. Elliptic has not identified the use of AI tools to enhance on-chain obfuscation of pig butchering proceeds, although there is a small but growing indication of AI-generated content being used to initiate these scams.


Dock

Community AMA: Binance Delisting and Future Plans for DOCK


Hey everyone,

For those who missed the AMA on July 15th with Nick Lambert (CEO) and Elina Cadouri (COO), here are the key highlights:

1. Nick and Elina expressed their heartfelt gratitude to the community for their support, especially during the recent challenges with the Binance delisting. They emphasized how much the community's loyalty means to them.

2. Binance Delisting: What happened with the Binance delisting?

DOCK was delisted from Binance without any prior warning and only 7 days after being added to the Monitoring List. Although Binance did not provide one specific reason for adding DOCK to the Monitoring List, the team believes it was likely trading volume or liquidity, given that none of the other possible reasons apply to the project. Dock had already begun hiring a Market Maker to improve liquidity, and that work started immediately. But Binance delisted DOCK after only a week, whereas projects are typically given months or even a year to work on removing the monitoring tag. Despite this setback, the fundamentals of DOCK remain strong, and the team is committed to moving forward.

3. How is the team ensuring the stability and growth of the project?

We increased validator rewards to support and encourage network stability, and we engaged a regulated Market Maker to ensure liquidity for the DOCK token. We are also exploring additional exchange listings to broaden DOCK’s availability and reach. Dock’s platform currently serves enterprise clients, and client acquisition remains a top priority.

4. Strategy and Goals: What are the main goals and strategies for the next year?

We are continuing to execute the published roadmap with exciting new developments. We filed a patent for "Verifier Pay Issuer", a feature that enables issuers to charge for the verification of a credential, showcasing DOCK’s innovative approach to decentralized identity. Several partnership announcements are anticipated, which will drive adoption and demand for the DOCK token.

5. Community: How can the community help support DOCK?

Community members can share and amplify DOCK’s social media content to raise awareness. Positive engagement and constructive feedback are important, and Dock remains transparent and accessible for any concerns or questions from the community.

6. Roadmap for 2024: What can we look forward to in the second half of 2024?

Dock will develop the ability to verify eIDAS 2.0 and mDL credentials, launch a Cloud Wallet Beta, integrate Biometric-bound credentials, launch an Embeddable Wallet SDK, and roll out support for the OpenID4VC standard.

7. Adoption and Token Demand: Why is client acquisition important for DOCK?

Every transaction on the DOCK network uses DOCK tokens, driving demand as more companies adopt the technology. DOCK is working to ensure that even companies without technical expertise can easily integrate and benefit from their innovative solutions.

8. Audience questions: Are team members personally invested in DOCK?

Yes, several team members hold DOCK in their personal portfolios and have received tokens as part of their compensation.

9. When will DOCK Wallet staking be available?

DOCK Wallet staking will be launched soon, with ongoing efforts to integrate with Nova Wallet.

10. Is there a chance of being relisted on Binance?

The likelihood of being relisted on Binance is low. The team is focusing on other growth opportunities and new exchange listings.

Thank you all for your continued support. We are excited about the future and look forward to sharing more updates soon!

You can watch the entire AMA here: https://youtu.be/mfn6jVKaN60


Shyft Network

The Rising Focus on L2 Solutions in the Crypto Ecosystem

Layer 2 solutions enhance scalability by resolving the high fees and slow processing times associated with Layer 1 blockchains like Bitcoin and Ethereum. By bundling transactions, L2s significantly boost transactions per second, improving speed and reducing costs. Increasing cryptocurrency adoption is driving L2 innovation and investment, leading to diverse technologies and substantial capital inflow.

Time and again, we have seen popular L1 networks — from Bitcoin and Ethereum to Solana — clogged with pending transactions, especially during periods of high activity. This not only leads to a significant increase in transaction fees but also prevents users from capitalizing on opportunities in time.

Such occurrences show us that speed and cost remain the biggest technical challenges in the crypto world. The root cause is the blockchain trilemma, which involves trade-offs among the technology’s three most critical aspects: decentralization, security, and scalability.

Layer 1 blockchains prioritize security and decentralization, achieved through a distributed, global network of participants. The downside is that Layer 1 blockchains often face scalability issues.

At the largest annual European Ethereum event, the Ethereum Community Conference (EthCC), Ethereum co-founder Vitalik Buterin highlighted Ethereum’s limitations, including its struggle to handle high volumes of transactions. This issue often results in increased fees and delays. He also pointed out the complexities newcomers face when interacting with decentralized applications (dApps) and the challenges associated with becoming a network validator.

To resolve these issues, scalability in particular, developers have taken to building layer 2 solutions.

Layer 2 blockchains are off-chain solutions built on top of L1s. Some of the current popular L2 solutions are Lightning Network, Stacks, Merlin Chain for Bitcoin and Optimism, Arbitrum, Base, and zkSync for Ethereum, to name a few.

Unlike L1, where every transaction has to go through the distributed network for processing and broadcasting, L2s take the load off by performing most of the functions off-chain. This is how popular payment platforms like Visa work. Instead of managing thousands of daily transactions separately, which ends up clogging the network, they batch the transactions for final settlement.

Similarly, L2 solutions offload the burden of managing thousands of transactions from the mainnet. To achieve this, L2s bundle a large number of transactions into a single transaction, which increases the throughput, i.e., transactions per second (TPS). For instance, Bitcoin handles about 5 TPS and Ethereum about 7, while L2 solutions boast tens of thousands of TPS.

Higher throughput helps increase speed and lower fees on these layer 2 solutions. Higher TPS and lower fees improve user experience and enhance the utility. Meanwhile, by settling transactions on the mainnet, they also retain security and decentralization.

L2s, however, aren’t of just one type; they utilize different technologies. Rollups are a popular one: transactions are executed off L1 and then rolled into a single piece of data before being posted back to the mainnet, where it is verified. There are variations of rollups, such as optimistic and ZK rollups. Then there are sidechains, which work as independent blockchains running parallel to the main blockchain and use bridges to interact with L1.
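
The batching idea can be sketched as follows. This is a simplified illustration: a real rollup also posts compressed transaction data or a validity proof so the batch can be verified or disputed on L1, and the field names here are invented for the example.

```python
import hashlib
import json

def batch_rollup(transactions):
    """Bundle many off-chain L2 transactions into one L1 submission.

    Toy sketch: the batch is represented by a single hash commitment
    over the canonical JSON encoding of all transactions.
    """
    data = json.dumps(transactions, sort_keys=True).encode()
    return {
        "tx_count": len(transactions),
        "commitment": hashlib.sha256(data).hexdigest(),
    }

# One thousand off-chain transfers become a single L1 submission,
# which is how L2s raise effective transactions per second.
txs = [{"from": f"user{i}", "to": f"user{i+1}", "amount": 1} for i in range(1000)]
batch = batch_rollup(txs)
assert batch["tx_count"] == 1000
assert len(batch["commitment"]) == 64   # one SHA-256 hash for all 1000 transfers
```

Since the L1 only stores one commitment per batch instead of one record per transfer, per-transaction cost drops roughly in proportion to the batch size.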

A Massive Wave of L2s

With crypto adoption rising significantly, the need for greater TPS is more important than ever. As of 2024, over half a billion users currently own crypto, and this number is projected to double by the end of this decade.

So, the greater the number of crypto users, the higher the number of transactions happening daily, and the greater the need for higher network capacity. Hence, there is an increasing need for and interest in L2s.

Today, going by Coingecko’s data alone, there are over a hundred projects working on improved scalability. Top L1 coins are worth $1.8 trillion, with Bitcoin alone accounting for $1.16 trillion. L2 coins, meanwhile, have a combined market cap of almost $20 billion. Breaking that down, top sidechain coins are worth $1.43 billion, top Bitcoin sidechains have a collective market size of $2.6 billion, and top rollup coins have an $11.8 billion market cap.

According to L2Beat.com, more than $40 billion worth of capital is locked (TVL) across L2 projects.

The data clearly shows that a lot is happening in the L2 space, but this is just the beginning. As their usage and capital inflow continue to surge, many exciting things are coming up.

Some of the exciting developments currently happening in the sector include L2 network Starknet introducing staking on its ecosystem before the year is over. For scalability, it produces STARK proofs off-chain and then sends them on-chain. In the future, Starknet users will be able to lock their tokens for a 21-day period and earn rewards in proportion to the STRK tokens staked. Its CEO, Eli Ben-Sasson, called this “an important step in building the staking community and technology, offering new opportunities for users and developers.”

Popular Bitcoin L2 Stacks is currently preparing for a big upgrade called Nakamoto to honor the trillion-dollar crypto asset’s pseudonymous creator. With this upgrade, the L2 solution aims to decouple the Stacks block production schedule from that of Bitcoin to solve the congestion issues.

Hong Kong’s licensed crypto exchange operator, HashKey Group, is also planning to launch its Ethereum layer-2 solution, HashKey Chain, in Q4. Even meme coins like Shiba Inu have launched their very own L2 called Shibarium to handle a greater number of users and bring additional value to their ecosystem.

Then, new waves of L2s are entering the space. Blockchain platform Celo is launching its Dango Layer 2 testnet, for which it is utilizing Optimism’s OP Stack. Bitcoin and Ethereum-powered hybrid L2 project BOB raised $1.6 million in a funding round led by Ledger Cathay Fund and contributions from BlackRock, Rarible, Ordinals, Aave, Curve, Magic Eden, Mechanism, Injective, and Babylon.

While Solana boasts a high TPS, projects like Rome are raising funds from Polygon Ventures, HashKey, and angel investors, including Solana’s Anatoly Yakovenko and Austin Federa, to allow Ethereum-based rollups to use Solana as a shared sequencer.

Given L2s’ focus on higher throughput and, as a result, greater transaction inclusion, it makes sense that everyone is boarding the L2 train. However, equal effort must go into attracting users to engage on these platforms. For that, we need to focus on simplifying user onboarding and providing a more seamless user experience.

About Shyft Network

Shyft Network powers trust on the blockchain and economies of trust. It is a public protocol designed to drive data discoverability and compliance into blockchain while preserving privacy and sovereignty. SHFT is its native token and fuel of the network.

Shyft Network facilitates the transfer of verifiable data between centralized and decentralized ecosystems. It sets the highest crypto compliance standard and provides the only frictionless Crypto Travel Rule compliance solution while protecting user data.

Visit our website to read more, and follow us on X (Formerly Twitter), GitHub, LinkedIn, Telegram, Medium, and YouTube. Sign up for our newsletter to keep up-to-date on all things privacy and compliance.

The Rising Focus on L2 Solutions in the Crypto Ecosystem was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


IDnow

How to harness the data-sharing capabilities of the private sector, with Lloyd Emmerson.

We sit down with the Director of Strategic Solutions at Cifas – the UK’s largest not-for-profit fraud prevention service – to discuss what changes the company would make to the UK’s fraud strategy, the importance of a whole-of-system solution to a whole-of-system problem and much more. Beginning on a personal note, you seemed to have spent most of your career in fraud prevention in some capacity. What was it that first attracted you to the industry? 

What attracted me to the fraud prevention community initially was the rewarding nature of keeping people safe and, more importantly, trying to stay one step ahead of an adversary that has no moral boundaries. Feeling like you have really made a difference to people’s lives at the end of each working day is what gets me out of bed in the mornings too. 

Perhaps a good place to get into this is with the UK’s fraud strategy, released in 2023. Do you think the earmarked £100 million and 400 new specialist fraud officers is enough to help reduce fraud by 10% by 2025? How likely do you think it will be to achieve? 

Fraud represents almost 40% of all crime in England and Wales and has more than doubled in Scotland over the past nine years. It’s the most prevalent crime in the UK: it devastates individuals financially and emotionally, damages business reputations, and targets the public purse, meaning wider communities miss out on critical resources and support. 

While the publication of the government’s Fraud Strategy in 2023 was a positive first step, we must do more to turn the tide on the fraud epidemic sweeping the UK. 

We want a future without fraud, so we welcome the additional investment and extra resources. However, we also recognize we can’t arrest our way out of the problem. The focus needs to be on prevention – that means a clear focus on intelligence-led responses that go beyond traditional policing and build on the premise of organizations working together. 

What would you say are the most common forms of fraud (e.g. money mules, social engineering fraud) in the UK? 

The latest data recorded by our 750-plus membership to the Cifas National Fraud Database (NFD) revealed that account takeover – where a criminal utilizes compromised personal data to hijack an existing account or product – rose by 13% in 2023 compared to 2022. Additionally, we saw a 5% increase in account abuse (often referred to as ‘misuse of facility’). Identity fraud also remained our most dominant case type, accounting for 64% of all 374,000-plus NFD cases. 

Filings to the Cifas Insider Threat Database continued to increase, up 14% in 2023 compared to the year before. Just under half of these (49%) related to dishonest action by employees.

On May 15, Cifas delivered its ‘Fraud Pledges 2024’ to Number 10 Downing Street. What areas of the government’s approach to tackling fraud do you think need to be improved? What was the reaction and have any next steps been agreed? 

As our Cifas Fraud Pledges make clear, there is no silver bullet for tackling fraud. It is a whole-of-system problem requiring a whole-of-system solution. However, we think a good starting point for the next government would be to appoint a Minister for Economic Crime to drive proper cross-system leadership on this issue. 

Beyond this, it is essential we invest properly in fraud policing and the criminal justice response as well as harness the capabilities of the private sector, through enhanced data-sharing, to disrupt fraud and financial crime and act as the first line of defense. 

Cifas is advocating for social media companies to collaborate more to combat fraud. As social media fraud is such a huge arena, spanning almost every platform, from job search platforms to dating platforms, which parties do you think should be collaborating? And, as many platforms offer a very light-touch identity verification process, how important is it to ensure all users are required to undergo a thorough customer onboarding process? 

No one sector can single-handedly tackle fraud. It is an issue which cuts through industries and across the public and private sectors. The only way we can tackle the problem is by breaking down cross-sector barriers, finding meaningful ways to collaborate, and sharing data and intelligence to ensure there is no weak link in the chain. 

Additionally, it is essential that the biggest online platforms and services most abused by criminals join the counter-fraud community and multi-sector data-sharing initiatives and, where appropriate, ensure robust customer screening. 

With most fraud in the UK happening online, across every medium and industry, how can people and companies best protect themselves? 

At Cifas, our whole purpose is to eradicate fraud in the UK. To do so, we must build products and services that help organizations, and their customers fight economic crime and protect themselves against fraud. 

We’re rolling out several preventative solutions to help businesses scale their counter-fraud efforts and keep people safe. One currently being developed is a consumer-based app that proactively protects consumers from identity fraud and puts them in complete control of their personal information, effectively making stolen data worthless. It’s in beta testing and on track to launch in 2025. 

More generally, people and companies must work collaboratively to gain maximum protection against the threats of fraud because criminals will always find new ways to exploit weaknesses, particularly when they’re able to hide in relative anonymity online. We would always urge individuals and companies to stay vigilant and think if something seems too good to be true, it probably is.

What dangers does AI pose to UK’s fraud landscape? What steps can business, government and the local public take to weather the storm? 

Technology and AI are going to be critical in how we deal with the threat posed by criminals’ use of AI and other technologies – it presents both opportunities and challenges to the UK’s fraud landscape. 

Typically, criminals exploit weaknesses and rely on panic and urgency to get personal information and/or financial details. Both businesses and consumers should take a moment and challenge where a piece of communication has come from. Where possible, getting a second opinion is incredibly important and if something doesn’t look right, report it at the earliest opportunity. 

Overall, eradicating fraud is not down to one individual, industry, the government or law enforcement – it’s a collective effort. Our mission at Cifas is to create the largest counter-fraud community that shares data, intelligence and knowledge – all of which can be used to create products and services that protect everyone. To do so, we want to be a central component of a new and more effective UK-wide, data-sharing architecture that unlocks cross-sector collaboration across the public, private and law enforcement domains. 

How have fraud attacks changed over the last 20 years and how do you envision it changing in the next 20? 

Today, you no longer need a complex piece of malware to trick a consumer into handing over key personal data. All that’s required is a mobile phone, a few vague details about an individual and some social engineering for an attack to be successful time and time again. 

I feel that fraud will evolve with AI over the next 20 years in such a way that the AI component, whatever that may be, will automate many of the attack vectors we see today and make them even more scalable and lucrative for criminals to execute. 

Again, it comes back to cross-sector collaboration – if we create the right data flows, safeguards, and frameworks to share real-time risk data, we can defeat fraud.

If you’re interested in more insights from industry insiders and thought leaders from the world of fraud and fraud prevention, check out one of our interviews from our Spotlight Interview series below.

Jinisha Bhatt, financial crime investigator
Paul Stratton, ex-police officer and financial crime trainer
David Birch, global advisor and investor in digital financial services

By

Jody Houton
Senior Content Manager at IDnow
Connect with Jody on LinkedIn

Tuesday, 16. July 2024

SC Media - Identity and Access

Squarespace botch of Google handover leads to domain hijack

A hiccup in the handover of former Google Domains has led to a rash of site takeovers.



Indicio

Digital travel and decentralized identity reveal the value of simplicity

The ability to seamlessly share and trust information is the key to a new era of safe, secure travel.

By Heather Dahl

Technology transformations start slowly and then seem to go so fast that they rapidly become just how things are done almost without noticing. When friction dissipates, we rapidly cross into a new normal. This is the case with decentralized identity. For years, people talked about what could be done with the technology, now companies are implementing it to drive digital transformation, with air travel leading the way.

Over the coming weeks, our partner SITA — the world’s leading specialist in air transport communications and information technology — will explain how decentralized identity is driving this transformation in digital travel, from border crossing to passenger loyalty programs to baggage and beyond.

Here, I’d like to give a brief overview of the technology. Most people have no idea what decentralized identity means let alone what it can do and why it will transform how we share data, prove our identities, enable privacy, do business, and — of course — travel.

Much like our physical landscape of roads, rail lines and rivers, we connect, digitally, through an infrastructure of APIs, databases, emails, SMS, and webpages. To share information, we need a way to travel from one point to another and an identity, like a passport, to prove who we are when we get there.

But our digital journeys are filled with friction and danger. Digital identity is easily faked, passwords are guessed, access credentials are stolen, databases are plundered, connections between disparate systems are complex and costly to create, you can’t be sure that information hasn’t been altered, and you don’t know who is tracking you or what they’re doing with your information. Security solutions like multifactor authentication and privacy solutions like GDPR just add more complexity, cost, and friction — and they don’t actually solve the underlying problems.

Our current system relies on checking in with a source to establish proof of identity or the veracity of data. You have multiple digital identities held as accounts accessed and verified by logins and passwords: multiple accounts for airlines, booking sites, hotels, event ticketing — in addition to employment, school, social services, utilities, ecommerce, and social media. The assumption is that when you need to use these services, you can prove you created your account. Sometimes, you can use one account for several services (as in social media logins), but all these accounts store personal data about you, making identity theft and fraud a constant risk and making data tracking a challenging privacy issue.

What if we could declutter this mess without having to rip it up and start over? What if we could take our digital landscape of roads and borders and identities and streamline it so that we can go from anywhere to anywhere — any system to any system — and prove that we are who we say we are and that the data we present hasn’t been altered?

This is what decentralized identity does.

Decentralized identity uses decentralized identifiers — a new global standard for identity — cryptography, and verifiable digital credentials to prove that you and only you are at one end of a secure communication and that the person or organization or device at the other end — your airline, bank, hotel, — can only be who they claim to be. A verifiable credential is like a digital container that can seal any kind of digital information. You always know the source of the container. You can’t alter the contents of the container without breaking the seal. If you trust the source of the container, you can trust the contents.
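The "sealed container" idea can be sketched in a few lines. Note this is a toy illustration of tamper evidence only: it uses an HMAC with a shared key, whereas real verifiable credentials (per the W3C Verifiable Credentials data model) use public-key signatures so anyone can check the seal without holding the issuer's secret. All names and the key below are made up for the sketch.

```python
import hashlib
import hmac
import json

def seal(contents: dict, issuer_key: bytes) -> dict:
    """Toy 'container': the issuer seals a payload so changes are detectable.
    Real credentials use digital signatures, not a shared-key HMAC."""
    payload = json.dumps(contents, sort_keys=True).encode()
    return {
        "payload": payload.decode(),
        "seal": hmac.new(issuer_key, payload, hashlib.sha256).hexdigest(),
    }

def verify(container: dict, issuer_key: bytes) -> bool:
    """Recompute the seal over the payload and compare in constant time."""
    expected = hmac.new(issuer_key, container["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, container["seal"])

key = b"airline-issuer-demo-key"  # hypothetical issuer secret
cred = seal({"name": "A. Traveler", "status": "boarding-pass-issued"}, key)
assert verify(cred, key)  # untouched container: seal checks out

# Altering the contents without re-sealing breaks verification
tampered = dict(cred, payload=cred["payload"].replace("A.", "B."))
assert not verify(tampered, key)
```

If you trust the source of the container (here, whoever holds the issuer key), a valid seal tells you the contents are exactly what that source put in.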

In practice, this means no more logins, passwords, third-party identity providers, multifactor authentication, or direct integrations. It means we can hold and share our data in a secure, consent-based way to anyone or any system with the software to verify it. It means our mobile devices can have the functionality of APIs but with a significantly better security profile.

Removing the key sources of data insecurity and identity fraud — centralized databases, user accounts and passwords— makes it a game changer. But there’s a bigger picture: The more easily information can be shared and the more it can be trusted, the faster and more efficient market decisions can be. This decluttering is a significant cost and time saver. Once information is issued in a verifiable format, it can be confidently reused over and over again.

At a certain point in this digital transformation, the combination of decreased friction and increased speed and scale will spark innovation. Think about how the telegraph went in a few decades from wiring messages over a few miles to enabling rail networks, expanding markets, and to wiring money.

This is the rapidly changing landscape we’re seeing as decentralized identity accelerates. Companies like SITA see how a digital identity that can be trusted by governments everywhere can be used in myriad ways to simplify operations for airlines, airports, and passengers. As the technology can be easily integrated with any existing system, there’s no need for costly infrastructure upgrades. And, because data authentication, sharing, and consent is now technically easy, airlines and airports can integrate new partners to provide new services to passengers.

Because the technology frees airlines and airports from having to store and manage huge amounts of passenger personal data, it also makes compliance with emerging data protection and privacy policies like eIDAS, GDPR, CCPA, vastly simpler: companies don’t have to hold people’s personal data for verification. There is a genuine mechanism for people to consent to share their data.

We also see decentralized identity as essential to the successful deployment of AI (combating deepfakes) and, in the longer term, something that will have just as much, if not more, impact. This is because the ability to instantly and seamlessly trust identity and data simplifies markets and allows for much more complex interactions at minimal cost. Whenever, in history, people can trust each other to share information that can be trusted, we see rapid innovation and increasing prosperity.

In an age of fragmentation, mistrust, fraud, and friction, decentralized identity is both the solution we knew we urgently needed and the path to opportunities. Some of those opportunities are quickly coming into view: seamless digital travel will reshape air travel, the tourism sector, and the passenger experience, as SITA will explain.

But behind the absence of friction, the simplicity of authentication, the capacity to consent, and the respect for privacy, there’s a transformation in the way our data and our identities move, connect, and interact across digital infrastructure. That’s the hidden travel story, and it will change everything.

For those who want to read more about the technology, read our “Beginner’s Guide to Decentralized Identity”.

####

Sign up to our newsletter to stay up to date with the latest from Indicio and the decentralized identity community

The post Digital travel and decentralized identity reveal the value of simplicity appeared first on Indicio.


HOW VERIFIABLE CREDENTIALS AND DIGITAL WALLETS WILL TRANSFORM TRAVEL

Phocuswire

The post HOW VERIFIABLE CREDENTIALS AND DIGITAL WALLETS WILL TRANSFORM TRAVEL appeared first on Indicio.

Civic

Civic Milestones & Updates: Q2 2024

A few important milestones marked the second quarter of 2024, implying new circumstances for the crypto sector. Most importantly, the SEC approved 8 Ethereum ETFs, including BlackRock and Fidelity, ushering in new growth. At the same time, Bitcoin ETFs grew rapidly after their January launch to about $50 billion. On the US regulatory front, the […]

The post Civic Milestones & Updates: Q2 2024 appeared first on Civic Technologies, Inc..


SC Media - Identity and Access

Patagonia accused of privacy violations related to AI use

Patagonia, a U.S. outdoor recreation clothing retailer, was hit by a class action lawsuit alleging California privacy law violations stemming from its usage of services from artificial intelligence-powered customer service provider Talkdesk.



Python repositories threatened by inadvertently exposed GitHub token

PyPI immediately moved to revoke the authentication token, which had been given to PyPI Admin EE Durbin before March 3, 2023, reported JFrog researchers.



KuppingerCole

LoginRadius CIAM Platform

by John Tolbert

This KuppingerCole Executive View report looks at the issues and options available to IT managers and security strategists to manage consumer and customer identity access management. A technical review of the LoginRadius CIAM platform is included.


SC Media - Identity and Access

New Okta products aim to address security gaps and identity concerns

Identity compromise is the leading entry vector into most cyberattacks and data breaches. A collection of new tools and upgrades from Okta may help even the odds.



KuppingerCole

Oct 01, 2024: Transforming Access Management: Strategies for the New Digital Landscape

In today's rapidly evolving digital landscape, organizations face increasing complexity in managing application access. The proliferation of diverse applications, coupled with the end-of-life (EOL) for traditional solutions like Oracle and SAP GRC, necessitates a reevaluation of access governance strategies. Traditional methods often fall short in addressing these challenges, requiring a shift towards more comprehensive and integrated approaches.

Monday, 15. July 2024

Lockstep

Thinking about coding


This week in the Boston Globe, Mike Orcutt writes about how old arguments over coding as free speech are being revisited.

A First Amendment fight for the future of the internet: Is computer code a form of speech? A case emerging from the wreckage of a cryptocurrency heist is putting that question to the test.

“[For] nearly three decades US courts have recognized software code as protected speech.”

The “crypto wars” were raging in 1995 when I started in PKI (and “crypto” didn’t mean cryptocurrency at that time!).

When Bruce Schneier first published his classic textbook “Applied Cryptography” in 1993, he was prohibited from including a floppy disk containing executable encryption programs. But the First Amendment enabled him to include the C code for the algorithms as appendices in the printed work. As a middle finger to the NSA, the publishers included extra indicia marks to make the pages easier for OCR.

The court ruling that printed human readable code could be distributed but executables could not became known as the “Terrorists Can’t Type” interpretation.

I can see the free speech nuance.

Some high-level programming is highly creative. There can be non-obvious ways to write a program; recursive solutions can be quite sublime for their elegance and the strange intuition needed to draft them. I suspect there might be an infinite number of possible programs to implement most given specifications, so it’s certainly plausible to see programming as a form of expression. [I’m using “program” here to mean code written in a high-level language such as Python, C++ or Lisp].

But OTOH at the machine level, code is literally just a matter of throwing switches. Compiled software in memory connected to a processor looks a lot like a machine; indeed, at that level, there is little or no difference between programmed and wired logic.

I am not a lawyer but as I understand the right to free speech, I am largely free to write a book that describes bomb making for example, but I may not be allowed to make the bomb. So there are differences between text (and thought) and action. We can regulate actions but not thoughts.

Similarly, maybe the action of throwing switches is not protected. That is, compiled machine code is not speech. It’s certainly not the sort of speech that humans freely produce!

Both books and high-level computer programs are capable of being translated into actions.

Books can be read in different ways; a book about making a bomb doesn’t necessarily lead to the bomb being made let alone used. But there is only one intended reader of a computer program — the computer. And it’s only going to read the program one way.

The post Thinking about coding appeared first on Lockstep.


Indicio

Identity Verification Using Government-Issued ID (政府発行の身分証明書を使用した本人確認)

The post 政府発行の身分証明書を使用した本人確認 appeared first on Indicio.

auth0

Developer Day 2024 Is Here!

Join the Okta event for developers who care about the security of their applications and the identity of their users.

Coming in Hot… Auth0’s Q2 Developer Releases

Keep scrolling to learn more about our newest releases, updates, and all things developer.

Verida

Verida Airdrop 1 & 2 Claim Open Now

Introduction

Verida is excited to announce the opening of claims for Airdrop 1 and Airdrop 2 on 15 Jul 2024 11:00 UTC. This initiative aims to reward our early adopters and participants in our Verida Missions, Galxe, and Zealy campaigns with Verida Storage Credit Token (VDA).

The claim process will be open for 60 days from 15 Jul 2024 11:00 UTC, giving all eligible participants ample time to claim their rewards. After 60 days, the claim window will close.

What is the Verida Inaugural Early Adopters Airdrop?

This airdrop is the first stage of Verida’s Community Rewards program, designed to reward early participants in Verida’s Missions program. Verida has allocated 1.3 million VDA Storage Credit Tokens for this initial community airdrop to reward these early contributors.

Early adopters have provided valuable insights and feedback to the Verida Team. In recognition of this community input, Verida is pleased to reward all eligible members with the Early Adopters Airdrop of Verida Storage Credit Tokens (VDA), which will drive Verida’s privacy-preserving data economy.

Check the eligibility criteria in our FAQs.

What is the Airdrop 2 for Early Galxe and Zealy Campaign Participants?

This airdrop is the second stage of Verida’s Community Rewards program, designed to reward early participants in Galxe and Zealy campaigns. Verida allocated another 2.2 million VDA Tokens to the campaign and increased the cumulative allocated award amount to 3.2 million VDA Tokens. This airdrop rewards those who participated in Verida’s Galxe and Zealy campaigns in 2023 and early 2024.

Check the eligibility criteria in our FAQs.

Claim Process:

1. Head to Verida Missions.
2. Connect your Verida identity via Polygon.
3. Go to the ‘Airdrops’ tab, select the airdrop, click ‘Claim’, then confirm the claim in your wallet.
4. Add the VDA contract to your wallet: 0x683565196C3EAb450003C964D4bad1fd3068D4cC
5. You should now be able to see VDA in your wallet.
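Under the hood, "seeing VDA in your wallet" means the wallet asks the token contract for your balance via the standard ERC-20 balanceOf function. As a minimal sketch (not part of the official claim flow), the helper below only builds the raw eth_call JSON-RPC payload for such a query; the holder address is a placeholder, and you would still need to POST the payload to a Polygon RPC endpoint of your choosing.

```python
import json

# Token contract address from the announcement above
VDA_CONTRACT = "0x683565196C3EAb450003C964D4bad1fd3068D4cC"
# First 4 bytes of keccak256("balanceOf(address)") - the standard ERC-20 selector
BALANCE_OF_SELECTOR = "70a08231"

def balance_of_call(holder: str, contract: str = VDA_CONTRACT) -> dict:
    """Build an eth_call JSON-RPC payload asking an ERC-20 contract for
    `holder`'s balance. The 20-byte address is left-padded to 32 bytes."""
    arg = holder.lower().removeprefix("0x").rjust(64, "0")
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "eth_call",
        "params": [{"to": contract, "data": "0x" + BALANCE_OF_SELECTOR + arg},
                   "latest"],
    }

# Placeholder holder address, purely for illustration
payload = balance_of_call("0x1111111111111111111111111111111111111111")
print(json.dumps(payload, indent=2))
```

The RPC node returns the balance as a 32-byte hex value in the token's smallest unit, which the wallet then formats using the token's decimals.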

Reminder: The claim process will be open for 60 days from 15 Jul 2024 11:00 UTC, giving all eligible participants ample time to claim their rewards. After 60 days, the claim window will close.

After claiming the airdrop, holders can use their VDA within the network to pay for data storage and access various Verida ecosystem initiatives.

Make the most of your Verida Storage Credits

Maximize Earnings Through Staking and Lending: With your VDA tokens, you can participate in Verida’s lending pools and incentivized liquidity programs. These initiatives offer up to 20% APY, significantly increasing your token holdings over time compared to a one-time sale.
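For a rough sense of what the quoted 20% APY ceiling would mean (purely illustrative; real yields vary and are not guaranteed), yearly compounding works out as follows:

```python
def compound(principal: float, apy: float, years: int) -> float:
    """Balance after `years` of yearly compounding at a fixed APY."""
    return principal * (1 + apy) ** years

# A hypothetical 10,000 VDA position at the advertised 20% ceiling
for years in (1, 2, 3):
    print(years, round(compound(10_000, 0.20, years)))
# 1 -> 12000, 2 -> 14400, 3 -> 17280
```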

Personal Data Bridge to enable Personal AI: VDA storage credits enable users to reclaim ownership of their personal data from centralized platforms like Google, Meta, and LinkedIn. This data can be securely stored and managed within Verida’s decentralized network. Users can harness their own data for upcoming personal AI applications and web3 platforms, thus directly benefiting from their data’s value.

Access to Personalized AI: Verida’s integration of confidential compute allows for the development of personal AI applications. Holders of VDA tokens will get early access to these advanced, privacy-preserving AI services that utilize your personal data securely.

Enhanced Privacy and Security: Data stored on the Verida network is encrypted and protected against unauthorized access. This ensures privacy, reduces the risk of data breaches, and provides a secure environment for sensitive personal data. VDA is a crucial asset to ensure data security across the network of storage nodes.

User-Centric Applications: With VDA, users can enable hyper-personalized applications such as AI agents and healthcare solutions. For instance, AI-powered personalized care and secure messaging become possible by utilizing user-controlled data, showcasing practical, everyday benefits of holding and using VDA.

Explore more VDA use-cases here: https://community.verida.network/

Future Initiatives:

Beyond the initial reward program, Verida intends to implement a series of future reward initiatives. The Verida Foundation has been allocated a pool of 200 million VDA tokens (20% of the overall supply) that will be distributed through various community incentive programs.

Verida’s Personal Data Bridge to enable Personal AI

Explore how you can be rewarded for all the information you generate daily.

About Verida

Verida is a pioneering decentralized data network and self-custody wallet that empowers users with control over their digital identity and data. With cutting-edge technology such as zero-knowledge proofs and verifiable credentials, Verida offers secure, self-sovereign storage solutions and innovative applications for a wide range of industries. With a thriving community and a commitment to transparency and security, Verida is leading the charge towards a more decentralized and user-centric digital future.

Verida Missions | X/Twitter | Discord | Telegram | LinkedIn | LinkTree

Verida Airdrop 1 & 2 Claim Open Now was originally published in Verida on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

Identity-Centric Finance Regulations - Europe

See which European financial regulations have stringent identity standards, and how identity access management helps achieve compliance.

Like most other global regions, European countries have strict guidelines and compliance standards regulating the activities of banks and financial service providers. Specifically in today’s digital world, the European Union (EU) has been at the forefront of enacting cybersecurity and data privacy regulations, more so than the rest of the world.

 

In recent years, regulatory bodies like the EBA, ESMA, ESRB, EIOPA, and the European Commission have supported, proposed, and implemented identity-centric legislation. This legislation helps to protect consumers’ digital identities while securing financial transactions and ensuring fairness and transparency throughout the markets.

 

In this blog, we’ll provide an overview of identity-centric regulations impacting the finance industry in Europe and the regulatory bodies that influence and oversee their implementation.

Sunday, 14. July 2024

KuppingerCole

Identity Security - A Link between the Identity and Security Worlds


In this episode, Matthias has three guests: his colleagues Phillip, Warwick and Alejandro. They take a look back at EIC 2024 and discuss the most important topics at the upcoming cyberevolution conference. They reflect on the cybersecurity trends and challenges addressed at EIC, including zero trust, decentralized identity, and AI in security.

They also highlight the importance of regulations like NIS2 and DORA in driving cybersecurity practices, and mention the key topics for cyberevolution, such as zero trust, AI, and trust in an AI-driven world, as well as the focus on leadership and mental health in cybersecurity.




Spherical Cow Consulting

FedCM: The Tightrope Walk of Developing a Specification

I have thoughts on FedCM and the challenges of creating a good technical specification. It's not that easy! The post FedCM: The Tightrope Walk of Developing a Specification appeared first on Spherical Cow Consulting.

The Federated Credential Management API (FedCM), a specification currently under development through the W3C’s standardization process, was originally scoped to make sure federated authentication would still function in a world where third-party cookies didn’t exist. The problem statement as defined during a workshop in 2021 stated it this way:

Non-transparent, uncontrollable tracking of users across the web needs to be addressed and prevented.
Many applications and services need to work through the browser to support SSO/federated login, and yet federated login and tracking tools use the same features and are indistinguishable from the browser’s perspective.

One of the ongoing challenges, however, is that people who implement and design federated authentication services don’t think of those services just in terms of cookies. As soon as you start talking about federated authentication, those people also think about all the different web primitives like website redirects and long URLs.
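For concreteness, the browser-mediated flow FedCM defines hangs off a single entry point: `navigator.credentials.get()` with an `identity` option. Below is a minimal, browser-only sketch of how a relying party might invoke it; the `configURL` and `clientId` values are illustrative placeholders, not taken from any real identity provider:

```javascript
// Hedged sketch of a relying party's FedCM call. This only runs in a
// browser that supports FedCM; the URL and client ID are placeholders.
async function signInWithFedCM() {
  // The browser, not the website, fetches the IdP's config file and
  // renders the account chooser, so no third-party cookies are needed
  // to complete federated sign-in.
  const credential = await navigator.credentials.get({
    identity: {
      providers: [
        {
          configURL: "https://idp.example/fedcm.json", // IdP's FedCM config
          clientId: "rp-client-id", // RP identifier registered with the IdP
        },
      ],
    },
  });
  // The returned token is opaque to the browser; the RP validates it
  // with the IdP to finish signing the user in.
  return credential.token;
}
```

The design point worth noticing is that the exchange is mediated by the browser rather than by cross-site cookies, which is precisely why other web primitives such as redirects remain outside FedCM's current scope.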

Balancing a Specification: Focus vs Practicality

Let’s step back for a moment and think about creating a technical specification. If you ask me (and it’s my blog, so I get to say) a good specification is focused on a very specific problem. If you try to solve too many things in one spec, you’re going to limit innovation, miss important use cases, or create something too complicated to implement.

That said, if you don’t consider how a spec will actually be implemented, it probably won’t be of any practical use. Solving for one piece of the larger architecture without any idea as to how the other pieces will be treated is as bad as creating too broad a spec.

A spec can and should allow for extensions as people find new use cases that need a solution, or profiles that tighten the spec down where required. But at its core, the problem being solved should be clear and specific.

That’s the challenge facing FedCM right now: staying within the scope of a very specific problem when people are Very Concerned about what else will need to change to prevent non-transparent, uncontrollable tracking of users across the web.

The FedCM Specification

Coming back to FedCM, the Federated Identity Community Group in the W3C just published a blog post that talked about some of the problems FedCM doesn’t solve. (Full disclosure: I held the pen on that post as one of the co-chairs, with input from several members of the group.) FedCM originated from the need to preserve the functionality of third-party cookies when used in a federated authentication scenario. But there are more web primitives than cookies used for federated authentication. Should the specification consider those as well? Is it worth implementing at all if it doesn’t solve (or at least offer some direction on) how to handle the other web primitives (like redirects) that are also used for tracking on the web?

I think it comes down to a subtle lack of alignment and/or understanding of the problem being solved. Is FedCM supposed to solve for the deprecation of third-party cookies or is it supposed to solve for tracking across the web? When we first started, it was clearly just solving for the third-party cookie problem. Over time, however, there is confusion even within the groups working on the spec whether they are aiming for the larger problem.

Implementation Decisions

So, what are implementers to do? They could implement the specification (which, to be clear, is NOT a standard and does NOT have consensus yet in the groups working on it) to get ahead of what the browser vendors are doing, but have to deal with breaking changes as the spec evolves. They could wait and suddenly find out that their federated authentication infrastructure is breaking because they missed the latest changes as the specification gets closer to standardization.

Adapting to an Evolving Web

If there was an easy answer, we wouldn’t be talking about standards and tech. There are too many competing use cases, threat models, business considerations, and architectures to take into consideration. My only advice is to get involved in the standards process if you can or pay attention to developer sites (like this one or this one) and the people involved. The way the Internet and the web works is always evolving; designing your business or service to adapt to change is the only way to be successful.

I want to help you go from overwhelmed at the rapid pace of change in identity-related standards to prepared to strategically invest in the critical standards for your business. Follow me on LinkedIn or reach out to discuss my Digital Identity Standards Development Services.

The post FedCM: The Tightrope Walk of Developing a Specification appeared first on Spherical Cow Consulting.

Saturday, 13. July 2024

Safle Wallet

Revised Tokenomics is Live

Weekly Safle Update! 🚀 Attention all Sentinels, The Safle spaceship has re-emerged, fully powered and bursting with stellar discoveries! Now it’s time to embark for the next leg of our interstellar journey. Here’s the Week 6 mission debrief as we continue our voyage through the galaxy: New Tokenomics & Roadmaps - Now Live! Our latest tokenomics and roadmap are now live on t
Weekly Safle Update! 🚀 Attention all Sentinels,

The Safle spaceship has re-emerged, fully powered and bursting with stellar discoveries! Now it’s time to embark for the next leg of our interstellar journey.

Here’s the Week 6 mission debrief as we continue our voyage through the galaxy:

New Tokenomics & Roadmaps - Now Live!

Our latest tokenomics and roadmap are now live on the Safle website. Check them out to see what’s in store for our exciting journey ahead! 🌐📈

🔗safle.com

WalletConnect Integration Complete!

We’re thrilled to announce that our WalletConnect integration, supporting 6,000+ dApps, is now complete across all 10 chains, including the newest additions: Base, Polygon zkEVM, and Avalanche!
Hold on to your seats, we will be live soon! 🌐

Portfolio Designs - Ongoing!

Our team is wrapping up the new portfolio designs. Be ready to manage your assets with style. Users will soon be able to use the portfolio viewer to track and compare how their tokens and NFTs perform across all 10 chains. 🎨✨

Safle Wallet - We’ve heard our users!

We have taken your feedback and are on a mission toward a fully decentralised world. Soon you will have a smooth signup experience: no emails, no numbers needed, a Safle ID for all. 🛤️🚀

We’re Growing - Join Our Team!

We’re on the lookout for talented individuals to join our dynamic team. Open positions include:

- Growth Associate

🔗https://wellfound.com/jobs/3056266-growth-associate

- WEB3 Partnerships Manager

🔗https://wellfound.com/jobs/3056284-web3-partnership-manager

- Software Tester (Quality Analyst)

🔗https://wellfound.com/jobs/684071-software-tester-quality-analyst

- DevOps Engineer

https://wellfound.com/jobs/1690778-devops-engineer

Interested? Apply now and be a part of something big! 🚀

Join the Safle Community 🧑‍🚀

Stay updated with our latest news and announcements!

🔗https://linktr.ee/safle

Friday, 12. July 2024

Safle Wallet

Safle’s Strategic Roadmap: Revolutionizing the Blockchain Wallet Landscape

Safle stands out as a non-custodial, multi-chain, identity wallet and blockchain infrastructure provider, offering users a comprehensive solution for managing digital assets securely and efficiently. With our robust features and user-centric design, Safle is transforming the way people interact with blockchain technology. Impressive features Safle excels with its multichain accessibility and use

Safle stands out as a non-custodial, multi-chain, identity wallet and blockchain infrastructure provider, offering users a comprehensive solution for managing digital assets securely and efficiently. With our robust features and user-centric design, Safle is transforming the way people interact with blockchain technology.

Impressive features

Safle excels with its multichain accessibility and user-centric features. Its cross-resolving Web3 naming registry simplifies navigation across blockchains, while multi-chain and cross-chain swaps make asset exchanges seamless. Its on-the-go fiat-crypto transitions enhance user experience.

Safle’s impact is evident with over 3,000 assets available for swap, 15,000+ SafleIDs issued, and 150,000+ app downloads. Safle offers easily manageable multichain identities, seedless onboarding, and full self-custody for enhanced security. Safle supports NFT discovery and real-world asset access, blending digital and physical realms. Users enjoy cold wallet security with smartphone access. Soon users will be able to benefit from NFC recovery and biometric login, all through an intuitive dashboard that simplifies asset management. 

A Robust Roadmap unveiled

As Safle continues to establish itself as a powerhouse in the blockchain wallet and infrastructure space, our meticulously planned roadmap outlines an ambitious journey towards widespread adoption and technological innovation.

Currently, Safle supports major chains like Ethereum, Polygon, Arbitrum, Optimism, Mantle, and Bitcoin. Safle is also excited to announce upcoming additions, including Stacks, Avalanche, Polygon zkEVM, Base, Solana, and more.

Foundation and Initial Enhancements (Starting June 2024)

Q1 & Q2 2024: Laying the Groundwork

Month 1: Enhancing Accessibility

In the first month, Safle will kick off with listings on smaller centralized exchanges (Tier 2 & 3) to make Safle tokens easily accessible. By integrating Bitcoin L2 chains, Safle wallet will seamlessly support the entire Bitcoin L2 ecosystem, enhancing accessibility and usability for all users. Safle is poised to become the go-to identity wallet and gateway, providing a comprehensive suite of features including bridging, swaps, and more, all in one place.

Month 2: Streamlining User Onboarding and Connectivity

Next, Safle is simplifying the sign-up process with anonymous, email-free options to make it incredibly easy for new users to join. The launch of WalletConnect will provide secure connections across all supported EVM chains, allowing seamless interactions between the Safle wallet and dApps.

Month 3: Portfolio Viewer and Enhanced Security

In month three, Safle will be introducing a web-based tool for managing multi-chain portfolios, making it more convenient for users to view and manage their assets. It will also increase its visibility and accessibility by listing on a major Tier 1 exchange. To enhance security and trust, Safle is implementing AI tools to monitor and flag suspicious transactions.

Months 4–6: Expanding Functionality and Interoperability

As we move into months four to six, Safle will enable seamless swaps with Bitcoin Layer 2 solutions, offering users more options for efficient transactions. Expanding access to decentralized finance (DeFi) across all supported chains will allow users to engage with a wide range of DeFi applications. It also aims to facilitate easy asset transfers between different blockchains, breaking down barriers and enhancing interoperability. Lastly, introducing staking options will allow users to earn rewards, encouraging more active participation in our ecosystem.

Q3 & Q4 2024: New Features and Security Upgrades

Month 7: User Engagement and in-wallet Quests

In month seven, Safle will launch exciting in-wallet quests in a ‘learn to earn’ format, designed to captivate and reward users. By exploring and engaging with our innovative and user-friendly features, users will be able to earn rewards and incentives, making the user experience both educational and engaging.

Months 8–9: Enabling $SAFLE utility

Users will have the capability to utilize SAFLE tokens to settle gas fees across all available networks integrated into Safle Wallet. This feature will be seamlessly integrated with SafleID, effectively decoupling the blockchain interface layer from direct user interaction.

Months 10–12: Unified Identity and Enhanced Security

In the final months of our first year, Safle will launch a multichain identity solution, making it easier for users to manage their identities across different blockchains. Safle will begin transitioning existing users to the new SafleID system, ensuring a smooth upgrade. Additionally, it aims to continue expanding its supported chains and enhancing security measures to keep our platform robust and up-to-date.

Expansion and AI Integration (Starting January 2025) Q1 & Q2 2025: AI and Wallet Innovations

Months 13–14: Smart Financial Strategies

Entering the second year, Safle will enhance users’ financial strategies with AI-generated insights in the upgraded portfolio viewer, offering smart recommendations and data-driven analysis. It will also simplify account management with advanced abstraction techniques, making it easier for users to interact with blockchain applications.

Months 15–16: Enhanced Security and Smarter Wallets

To increase security, Safle will introduce multi-signature wallet support, allowing multiple approvals for transactions. It will also bring in AI-driven functionalities that anticipate your needs, streamlining wallet interactions based on user intents and conversational inputs.

Months 17–18: Embracing the Future — M2M economy

Safle is paving the way for the future with machine-to-machine wallet interactions, facilitating seamless transactions between IoT devices. Continuous improvements to the user interface and experience will ensure maximum satisfaction.

Q3 & Q4 2025: Continued Growth and Optimization

Months 19–20: Security and Flexibility

As we move forward, Safle will add hardware wallet compatibility to provide extra security. It will also expand swap functionality to more blockchain networks and revisit the security of the entire stack, commissioning an audit if needed.

Months 21–22: Advanced Monitoring and Security

Strengthening fraud detection with advanced AI algorithms will improve our ability to identify and prevent malicious activities. Safle will continuously upgrade security measures to stay resilient against evolving threats.

Months 23–24: Feature Optimization

Finally, Safle will continue ongoing updates and refinement of its features and user experience, driven by valuable user feedback and current market trends, to consistently achieve peak performance and enhance user satisfaction.

By focusing on user-friendly features, robust security, and cutting-edge technology, Safle is set to drive significant growth and innovation in the blockchain wallet space. Safle is committed to providing its community with the best tools to engage with blockchain and digital assets securely and efficiently.

This strategic plan highlights Safle’s commitment to enhancing user experience and expanding its ecosystem. Join us as we explore the exciting future of decentralized finance and continue to push the boundaries of innovation.

About Safle

Safle is a decentralized platform focused on creating secure digital identities and financial services. By leveraging blockchain technology, Safle provides users with a self-sovereign identity system, ensuring privacy and control over personal data. The platform also offers a suite of tools for managing digital assets, making decentralized finance accessible and user-friendly. With a commitment to transparency and innovation, Safle is poised to revolutionize the way individuals interact with digital services.

To learn more about Safle, visit:

Website | X | Discord | Telegram | Instagram | Medium | Github |

Thursday, 11. July 2024

KuppingerCole

Beyond Passwords: Revolutionizing Consumer Authentication

Explore the transformative journey toward passwordless authentication, a key trend that is reshaping the way consumers interact with digital services. This webinar will explore the technological advancements and market dynamics driving this shift, as well as practical insights into deploying consumer-centric passwordless solutions. We'll uncover the unique challenges and opportunities of eliminati

Explore the transformative journey toward passwordless authentication, a key trend that is reshaping the way consumers interact with digital services. This webinar will explore the technological advancements and market dynamics driving this shift, as well as practical insights into deploying consumer-centric passwordless solutions. We'll uncover the unique challenges and opportunities of eliminating traditional passwords, emphasizing the importance of user experience, security, and technological adaptability.

As an expert in digital identity and cybersecurity, KuppingerCole's analyst Alejandro Leal will guide you through the evolving landscape of passwordless authentication, highlighting recent developments and future trends. Drawing from extensive research, he'll discuss the critical role of user-friendly and secure authentication methods in enhancing consumer interactions, focusing on the practical steps organizations can take to implement these technologies effectively.

Join this webinar to: 

Learn how removing passwords can enhance user engagement and reduce friction in digital interactions.
Gain knowledge of the key strategies and considerations when transitioning to passwordless authentication.
Explore current trends driving the adoption of passwordless solutions in the consumer space.


Anonym

The 20+ Patents Behind Anonyome Labs’ Privacy and Identity Products 

As organizations globally race to meet growing consumer demand for privacy-first products and to comply with ever-expanding privacy regulations, Anonyome Labs has been excelling in the digital identity and privacy space for more than a decade. In that time, we’ve leveraged our deep expertise in security, identity management, authentication and authorization, cloud, privacy, and cryptography […]

As organizations globally race to meet growing consumer demand for privacy-first products and to comply with ever-expanding privacy regulations, Anonyome Labs has been excelling in the digital identity and privacy space for more than a decade. In that time, we’ve leveraged our deep expertise in security, identity management, authentication and authorization, cloud, privacy, and cryptography to develop an extensive range of market-leading consumer and enterprise products: 
 

Our persona-based consumer applications: the MySudo all-in-one privacy app, MySudo VPN, and our new personal data removal service, MySudo Reclaim.
Our enterprise solutions in decentralized identity (e.g. our reusable credentials IDV solution, credential issuance and verification services, and wallet solutions; we’ve just updated our market-leading wallet SDK) and cybersecurity applications, including white-label VPN, safe browser, password manager, and more. 
 

Our business solutions toolkit lets enterprise developers rapidly integrate digital identity services into their own consumer products, including email, telephony, VPN, safe browsing, password management, virtual credit cards, end-to-end encrypted communication, DI verifiable credential issuing and verifying, DI wallets, and more. 
 

At the heart of our offerings is the Sudo, a secure, customizable digital identity (‘profile’ or ‘persona’) that intentionally differentiates from a consumer’s personal identity and protects their personal data. When a user manages multiple identities (Sudos) they can easily and powerfully compartmentalize (organize) their daily activities for optimal privacy protection. (Read our popular deep dives on compartmentalization as a data privacy protection strategy, why data privacy matters, and what consumers want.) 

Underpinning all of Anonyome Labs’ success in the market are the 20-plus patents issued by the US Patent and Trademark Office (USPTO). Check back regularly for more additions to this patents list:  

Persona System 

Apparatus and Method for Masking a Real User Controlling Synthetic Identities, Patent #: US9372987, June 20th, 2016 (Issued) 

This patent outlines the design of a persona application and persona services that allow a user to create personas for their own use and where only they know the identity and can operate the personas that they own. The persona services do not have the information to relate users to personas. 

Apparatus and Method for Administering Proxy Identities, Patent #: US10356052, July 16th, 2019 (Issued) 

This patent outlines the design for the persona services. Each user can have multiple personas; each persona fitted with identity attributes such as phone number, email address, virtual payments information, contacts, and avatar, and with capabilities such as communications, browsing, anonymous payments, and reputation. The persona services also allow for a marketplace of personas so that personas can be bought and sold between users. 

Apparatus and Method for Building, Extending and Managing Interactions Between Digital Identities and Digital Identity Applications, Patent #: US10931650, February 23rd, 2021 (Issued) 

This is a more detailed examination of the design of the persona applications and persona services, including verifying the distribution of applications built from the persona services. 

Apparatus and Method for Evaluating and Modifying Data Associated With Synthetic Identities, Patent #: US11477178, October 18th, 2022 (Issued) 

This patent provides a design for a system to measure the similarity between personas, or personas and real identities, with the aim of alerting a user to instances where they inadvertently provide increased correlation between identities. 

Persona Anonymity with Recourse 

Apparatus and Method for Enabling Owner Authorized Monitored Stewardship Over Protected Data in Computing Devices, Patent #: US10963582, March 30th, 2021 (Issued) 

This is a design for an anonymity with recourse system that allows a user to maintain complete anonymity, possession of cryptographic keys, possession of encrypted communication and so on up to the point where a legal process allows uncovering of that data. The blockchain-based system records the implementation of the process steps so that any collusion or misbehavior by participants can be uncovered. 

Apparatus and Method for Managing Digital Identities and Controlling Their Correlation to Legal Identities, Patent #: US11159578, October 26th, 2021 (Issued) 

This builds on the generic anonymity with recourse system in the above patent to focus on the use of personas for anonymity and how under legal due process the identity of the user can be uncovered. Again, a blockchain-based system ensures that any misbehavior is uncovered. 

Persona Usage with Decentralized Identities 

Digital Wallet for Digital Identities and Interactions with a Digital Identity Services Platform, Patent #: US11507943, Nov 22nd, 2022 (Issued) 

This patent outlines the design of how to enhance personas with decentralized identity such that each persona can access a full range of compartmentalized decentralized identity services, such as DIDComm encrypted messaging, verifiable credentials (request, hold, present), passwordless login and so on. Each persona has a decentralized identity wallet holding encryption keys, DIDs, verifiable credentials and other information specifically for that persona.  

Persona Reputation 

A Decentralized Reputation Service for Synthetic Identities, Patent #: US9703986, July 11th, 2017 (Issued) 

This defines a system and method to calculate ongoing persona reputation. It also uses a blockchain as an immutable store of the reputation calculation. 

Apparatus and Method for Establishing Trust of Anonymous Identities, Patent #: US11177937, November 16th, 2021 (Issued) 

This disclosure is an extension to the reputation design outlined above. This disclosure presents a method whereby a reputation score can be calculated for anonymous online users (personas) by analyzing their public activities performed in online settings, analytics gathering, and/or other disclosed or discoverable data points. Inputs to the reputation score may come from the ratings or reviews submitted by other parties of verified transactions where an online user has participated, such as purchases, rentals, enrolments etc.  

Persona-Based Anonymous Payments 

Apparatus and Method for Processing Virtual Credit Cards for Digital Identities, Patent #: US11568408, January 31st, 2023 (Issued) 

This patent outlines a design for allowing anonymous payments by providing personas each with virtual payment cards. The system outlines how these cards are created, managed and closed. 

Persona-Based Browsing 

Apparatus and Method for Augmenting a Messaging Application with Cryptographic Functions, Patent #: US10491631, November 26th, 2019 (Issued) 

This patent is directed towards techniques for augmenting messaging applications with separate cryptographic functions. 

System and Method to Automate Website User Interface Navigation, Patent #: US10943063, March 9th, 2021 (Issued) 

This patent provides a design for a system to provide automated web site form fill using personas or real identities. This includes, for example, the use of persona or real information when creating web site accounts. 

Apparatus and Method for Persona Based Isolation Browsing, Patent #: US11290429, March 29th, 2022 (Issued) 

This patent extends the persona-based privacy browser application design above by allowing the creation of an individual isolation browser configuration/instance for each persona. 

A Persona-Based Privacy Browser, Patent #: US11860984, January 2nd, 2024 (Issued) 

This describes the design of a persona-based browser application system that allows for improved anonymization and compartmentalization of persona browsing by allowing the creation of an individual browser configuration/instance for each persona. 

Persona-Based Email 

Email Application for Synthetic Identities, Patent #: US9729519, August 8th, 2017 (Issued) 

This outlines the design for an email system for users with multiple personas. This provides compartmentalized email for each persona. The email communication is encrypted between personas. 

Method and System for Automating Secure Email for Multiple Personas, Patent #: US10382211, August 13th, 2019 (Issued) 

This patent extends the above patent to describe the design of a standards-based S/MIME persona-based email system to allow encrypted persona-to-persona email, or encrypted persona-to-third party S/MIME email. 

Apparatus and Method for Managing Digital Identities, Patent #: US10511493, December 17th, 2019 (Issued) 

This extends the above patents to specifically describe the management of multiple email accounts for each persona. These are machine-generated random email accounts such that a single persona could have an account for each service that the persona interacts with, giving the user very fine-grained control of email accounts for each persona. 

Persona-Based VPN 

Method and System for Providing Persona Masking in a Computer Network, Patent #: US10320753, June 11th, 2019 (Issued) 

This describes the design of a persona-based VPN system that allows for improved anonymization and compartmentalization of persona traffic by allowing the creation of a VPN configuration for each persona. 

Persona-Based Telephony 

Apparatus and Method for Supporting Telephonic Synthetic Identities, Patent #: US9374689, June 20th, 2016 (Issued) 

This outlines the design for a telephony application system for users with multiple personas. The telephony communication provides both SMS/MMS and voice calling, with each persona having one or more telephony numbers. This provides compartmentalized telephony for each persona. 

Persona Protection 

Apparatus and Method for Identifying and Warning of Synthetic Identity Behavior that Reduces User Privacy, Patent #: US10178106, January 8th, 2019 (Issued) 

This patent defines methods for protection from unwittingly disclosing the relationship between a persona identity and user identity, or between persona identities. The aim is to keep strong compartmentalization and to not reduce user privacy. 

Cleaning Up Personal Digital Footprint 

Apparatus and Method for Email Based Digital Footprint Sanitization, Patent #: US11836245, December 5th, 2023 (Issued)  

This patent includes a design for users to scrutinize their email accounts to sanitize them in a manner that protects individual privacy by deleting personal data from data holders. 

Digital Rights Management 

Apparatus and Method For Persistent Digital Rights Management, Patent #: US11928188, March 12th, 2024 (Issued) 

This patent includes a design for protecting content using persistent Digital Rights Management. 

Finally, did you know the 2024 Gartner Emerging Tech Impact Radar confirms that Anonyome Labs’ enterprise solutions are among the highest impact technologies for gaining a competitive business advantage? The latest annual impact radar nominates privacy and transparency as one of four emerging tech themes with the most potential to disrupt a broad cross-section of markets, and Anonyome Labs’ market-leading solutions fit squarely within this theme. 

If you’re ready to capitalize on what we do best, get in touch today. We’d love to partner with you. 

Explore our products: 

MySudo family of apps 

Sudo Platform services platform 
 

You might also like: 
  

What is a Sudo? 

Dr Paul Ashley Presents on How to Solve the Privacy Problem 
More on decentralized identity on our Anonyome Labs blog 

More on MySudo consumer app on our MySudo blog 
Our popular podcast, Privacy Files 

The post The 20+ Patents Behind Anonyome Labs’ Privacy and Identity Products  appeared first on Anonyome Labs.


Microsoft Entra (Azure AD) Blog

Microsoft Security Service Edge now generally available

Today, we announced the general availability of the Microsoft Entra Suite which brings together identity and network access controls to secure access to any cloud or on-premises application or resource from any location. It consistently enforces least privilege access to achieve your governance requirements while improving your employee experience.   Companies today have good reason to fo

Today, we announced the general availability of the Microsoft Entra Suite which brings together identity and network access controls to secure access to any cloud or on-premises application or resource from any location. It consistently enforces least privilege access to achieve your governance requirements while improving your employee experience.

 

Companies today have good reason to focus on security. On one hand, we’re reaping the advantages of increased scalability, efficiency, and cost reductions, including all the benefits gained from generative AI’s large language models. However, these advantages also make it possible for malicious actors to exploit advanced technologies to create malware, target network vulnerabilities, and generate phishing attacks that put organizations’ data and reputations at higher risk. 

 

When identity and network access solutions operate in isolation and not in tandem, they can lead to increased complexity, inconsistent policies, and a lack of unified context across standalone solutions. This can unintentionally result in a fragmented security posture and vulnerabilities that malicious actors could exploit, potentially disrupting business continuity and compromising the user experience.

 

Neither identity nor network security controls alone can protect all your access scenarios, highlighting the need for you to adopt a holistic strategy to counteract evolving threats and protect your critical assets—no matter where the users and resources are located. 

 

The case for unified security: A strategic imperative

 

Along with the Microsoft Entra Suite general availability, we also announced Microsoft’s Security Service Edge (SSE) solution general availability, Microsoft Entra Private Access and Microsoft Entra Internet Access. These two products coupled with our SaaS security-focused CASB—Microsoft Defender for Cloud apps—comprise Microsoft's Security Service Edge solution, a cloud-delivered, identity-centric networking model that transforms the way you secure access.

 

Microsoft’s SSE solution is all about helping you eliminate security gaps in your defenses, extending Conditional Access and continuous access evaluation to all your applications and resources, whether they’re on-premises or in any cloud.

 

Figure 1: Secure access to any app or resource, from anywhere, with an identity-centric Security Service Edge (SSE) solution.

 

Here, in more detail, are the key advantages of Microsoft’s SSE solution to your organization.

 

Eliminate security loopholes caused by identity and network access silos

 

Microsoft’s SSE Solution ensures that your identity and network access solutions work together. By unifying these separate elements, your security teams can bolster your organization’s security stance in the face of emerging threats. No more deciding which tool works for each app or how to bridge the policies your identity and network teams created. Now you can secure access with an easy-to-manage, unified, identity-centric approach to any application, resource, or destination—and not sacrifice user productivity due to complex, disjointed security controls.

 

Simplify access and improve end user experience at a global scale

 

Microsoft’s SSE solution is delivered from one of the largest global private networks: Microsoft’s Global Wide Area Network. The network connects Microsoft data centers across 61 Azure regions with more than 185 global network POPs and a vast array of growing SSE edge locations strategically placed around the world. This helps you optimally connect your users and devices to public and private resources seamlessly and securely, improving performance and boosting productivity by offering your people a fast, consistent, hybrid work experience.

 

Activate side-by-side, flexible deployment options with other SSE and networking solutions

 

Microsoft Entra Private Access and Microsoft Entra Internet Access can be deployed standalone or side-by-side with other SSE solutions. The Global Secure Access client allows control over network traffic at the user endpoint device, giving you the ability to route specific traffic profiles through Microsoft's SSE solution. The clients for the Windows and Android operating systems are now generally available, and the clients for iOS and macOS are in public preview. With flexible deployment options, the Global Secure Access client can acquire traffic based on the traffic forwarding profiles you configure for Private Access, Internet Access, and Microsoft traffic.

 

For example, you can configure Private Access profiles anywhere you replace your third-party legacy VPNs—with an identity-centric Zero Trust Network Access (ZTNA) solution. You can also configure your Microsoft profile to enable improved performance for Microsoft applications, while you keep your private and internet traffic protected with the SSE solution of your choice. 

 

A closer look at Microsoft Entra Private Access

 

Microsoft Entra Private Access is an identity-centric ZTNA solution that helps you secure access to all private apps and resources for your users—located anywhere. Private Access allows you to replace your legacy VPN with ZTNA to securely connect your users to any private resource and application—without providing full network access to all private resources. This solution embraces Zero Trust principles to protect against cyber threats and to mitigate lateral movement, while enforcing advanced app segmentation and adaptive least-privilege access policies. Using Microsoft’s global private network, you can give your users a fast, seamless access experience that balances security with productivity.

 

Figure 2: Secure access to all private apps and resources, for users anywhere, with an identity-centric Zero Trust Network Access (ZTNA).

 

Here, in more detail, are the key use cases of Microsoft Entra Private Access.

 

Replace legacy VPNs with an identity-centric ZTNA solution

 

With Microsoft Entra Private Access, easily start retiring your legacy VPN and level up to an identity-centric ZTNA solution that helps you reduce your attack surface, mitigate lateral threat movement, and remove unnecessary operational complexity for your IT teams. Unlike traditional VPNs, Microsoft Entra Private Access protects access by granting least privilege access to your network for all your hybrid users, whether remote or local, accessing any legacy, custom, modern, or private apps that are on-premises or in any cloud.

 

Enforce Conditional Access across all private resources

 

To enhance your security posture and minimize the attack surface, it's crucial to implement robust Conditional Access controls such as multifactor authentication (MFA), without making any changes to your private applications and resources. You can also seamlessly enable single sign-on (SSO) across all private resources and applications, including legacy or proprietary applications that may not support modern authorization.

 

Deliver fast and easy access at global scale

 

Enhance your workforce’s productivity by leveraging Microsoft’s vast global edge presence, providing fast and easy access to private apps and resources, whether on-premises or on private data centers, and across any cloud. Users benefit from optimized traffic routing through the closest worldwide points-of-presence (POP), reducing latency for a consistently swift hybrid work experience. 

 

A closer look at Microsoft Entra Internet Access

 

Microsoft Entra Internet Access is an identity-centric Secure Web Gateway (SWG) for SaaS apps and internet traffic. It’s the industry’s first truly identity-centric SWG solution capable of converging all enterprise access controls in one place. This advantage eliminates the security loopholes created by using multiple security solutions, while it also protects your enterprise from malicious internet traffic, unsafe or non-compliant content, and other threats from the open internet. Working alongside Microsoft Entra Private Access and the rest of the Microsoft Entra identity stack, it unifies your access policies across all internet resources and SaaS apps.

 

Figure 3: Secure access to all internet and SaaS apps and resources with an identity-centric Secure Web Gateway (SWG).

 

Protect your organization against internet threats

 

Microsoft Entra Internet Access provides robust web content filtering options to restrict enterprise users from accessing undesirable online content. With web category filtering, you can easily allow or block a vast range of internet destinations based on pre-populated web categories, which include liability, high bandwidth, productivity loss, general browsing, and security threat (malware, compromised websites, spam sites, etc.) sites. For more granular control, you can use fully qualified domain name (FQDN) filtering to establish policies that allow or block specific endpoints or to override general web category policies effortlessly. 
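The precedence rule described above, where a specific FQDN policy overrides the general web category policy, can be sketched in a few lines. This is a hypothetical illustration only: the domain names and the policy model are assumptions for the example, not the actual Entra configuration or API.

```python
# Hypothetical policy model for illustration; domain names, category assignments,
# and this evaluation logic are assumptions, not the actual Entra configuration.
CATEGORY_OF = {
    "gambling.example": "liability",
    "malware.example": "security-threat",
    "news.example": "general-browsing",
}
CATEGORY_POLICY = {"liability": "block", "security-threat": "block"}  # others default to allow
FQDN_OVERRIDES = {"gambling.example": "allow"}  # explicit per-domain exception

def decide(fqdn: str) -> str:
    """An FQDN rule, being more specific, overrides the general category policy."""
    if fqdn in FQDN_OVERRIDES:
        return FQDN_OVERRIDES[fqdn]
    category = CATEGORY_OF.get(fqdn, "uncategorized")
    return CATEGORY_POLICY.get(category, "allow")

print(decide("malware.example"))   # block (security-threat category)
print(decide("gambling.example"))  # allow (FQDN override beats the liability block)
print(decide("news.example"))      # allow (general browsing is not blocked)
```

The design point is simply that the most specific rule wins, which is what lets an administrator block an entire web category while effortlessly carving out exceptions for individual endpoints.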

 

Extend Conditional Access context richness to internet security

 

Modern businesses require versatile filtering policies that adjust to different scenarios. Microsoft Entra Internet Access gives you the ability to apply Conditional Access controls to your SWG policies leveraging the user, device, risk, and location signals to allow or block access to relevant internet destinations. Internet Access consolidates network and identity access controls into one policy engine and allows you to extend Conditional Access (and in future Continuous Access Evaluation) to cover all external destinations and cloud services, even those not federated with Microsoft Entra ID. Additionally, our deep integrations with Entra ID include valuable features like token theft protection, source IP restoration, and data exfiltration safeguards through Universal Tenant Restriction.  

 

Deliver fast and consistent access at global scale

 

Enhance your users' productivity by providing swift and smooth access through a global network edge, with POPs located near the user and private WAN. Utilize numerous peering agreements with internet providers to deliver top performance and reliability. Minimize additional hops and streamline traffic routing for all Microsoft services. Implement optimal traffic management for Microsoft applications in conjunction with solutions from third-party SSE providers using side-by-side access models.

 

Conclusion

 

Organizations need an easier, more agile approach to protect access to all their applications and resources, safeguarding critical assets no matter where they are located. Today's general availability of Microsoft Entra Internet Access and Microsoft Entra Private Access, the products that make up Microsoft's SSE solution, does just that. It makes it harder for bad actors to gain access to your sensitive data, even if they successfully infiltrate your network, by extending identity security controls and access governance to your network.

 

Now, you can benefit from a streamlined security environment where your users have access to only the necessary resources, simplifying their work. With Conditional Access, granular identity and network access policies are now unified, closing critical security gaps and reducing operational complexity. The global, private, wide area network provided by Microsoft ensures a seamless, efficient hybrid work experience. And integration with Microsoft’s extensive security portfolio and partner ecosystem supports the implementation of Zero Trust principles throughout the entire security landscape, enhancing your overall protection. 

 

Be sure to register for the Zero Trust spotlight on July 31, 2024, where Microsoft experts and thought leaders will dive deeper into these announcements. Also, stay tuned for product deep dive blogs and our upcoming Tech Accelerator product deep dive sessions on Aug 14, 2024. We'll expand on how our SSE solution and its two core products, Microsoft Entra Private Access and Microsoft Entra Internet Access, can uniquely and successfully provide a secure approach to access across your organization's entire digital estate.

 

To get started, contact a Microsoft sales representative, begin a trial, and explore Microsoft Entra Private Access and Microsoft Entra Internet Access general availability. Share your feedback to help us make this solution even better.  

 

Sinead O’Donovan

Vice President of Product Management, Identity and Network Access at Microsoft

 

 

Read more on this topic

Microsoft Entra Internet Access
Microsoft Entra Private Access
Get started and try Entra suite products
Simplify your Zero Trust strategy with the Microsoft Entra Suite and unified security operations platform, now generally available

 

Learn more about Microsoft Entra  

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds. 

Microsoft Entra News and Insights | Microsoft Security Blog
Microsoft Entra blog | Tech Community
Microsoft Entra documentation | Microsoft Learn
Microsoft Entra discussions | Microsoft Community

Microsoft Entra Suite now generally available


Today we announced the general availability of the Microsoft Entra Suite, the industry's most comprehensive secure access solution for the workforce. The Microsoft Entra Suite delivers the most comprehensive Zero Trust user access solution and enables organizations to converge their access policy engine across identities, endpoints, and private and public networks.

 

What is Microsoft Entra Suite? 

The Microsoft Entra Suite delivers a complete cloud-based solution for workforce access. It brings together identity and network access that secures employee access to any cloud or on-premises application and resource from any location, consistently enforces least privilege access, and improves the employee experience.​  

 

This new offering advances our vision for the Microsoft Entra product line that can serve as a universal trust fabric for the era of AI, securely connecting any trustworthy identity with anything, from anywhere. In a recent blog post we also shared the four stages of creating such trust fabric for your organization, starting with foundational Zero Trust controls, and extending it to protecting access for your workforce, protecting access for your customers and partners, and protecting access in any cloud. The Microsoft Entra Suite delivers the complete toolset for the second stage of this journey – secure access for your workforce.  

 

The Microsoft Entra Suite includes the following products:  

 

 

 

 

Microsoft Entra Private Access – an identity-centric Zero Trust Network Access (ZTNA) solution that secures access to private apps and resources and reduces operational complexity and cost by replacing legacy VPNs.
Microsoft Entra Internet Access – an identity-centric Secure Web Gateway (SWG) for SaaS apps and internet traffic that protects against malicious internet traffic, unsafe or non-compliant content, and other threats from the open internet.
Microsoft Entra ID Governance – a complete identity governance and administration solution that automates the identity and access lifecycle to ensure that the right people have the right access to the right apps and services at the right time.
Microsoft Entra ID Protection – an advanced identity solution that blocks identity compromise in real time using high-assurance authentication methods, automated risk and threat assessment, and adaptive access policies powered by advanced machine learning (also included in Microsoft Entra ID P2).
Microsoft Entra Verified ID – a managed verifiable credentials service based on open standards that enables real-time identity verification in a secure and privacy-respecting way. Included in the Microsoft Entra Suite are premium Verified ID capabilities, starting with Face Check.

The Microsoft Entra Suite enables you to:

Unify Conditional Access policies for identities and networks.
Ensure least privilege access for all users accessing all resources and apps.
Improve the user experience for both in-office and remote workers.
Reduce the complexity and cost of managing security tools from multiple vendors.

 

Check out the Microsoft Entra Suite introductory video below:

 

 

Unify Conditional Access policies for identities and networks 

You only have to manage one set of policies in one portal to configure access controls for both identities and networks. Conditional Access evaluates any access request, no matter where it’s coming from, performing real-time risk assessment to strengthen protection against unauthorized access.  

 

Ensure least privilege access for all users accessing all resources and apps 

You can automate the access lifecycle from the day a new employee joins your organization, through all their role changes, until the time of their exit. No matter how long or multifaceted an employee’s journey, Microsoft Entra ID Governance ensures that your employees have the right access to just the applications and resources they need, helping prevent an adversary’s lateral movement in case of a breach.  

 

Improve the user experience for both in-office and remote workers 

You can ensure that employees enjoy a faster and easier onboarding experience, faster and more secure sign-in via passwordless authentication, single sign-on for all applications, and superior performance. Using a self-service portal, your employees can request access to relevant packages, manage approvals and access reviews, and view request and approval history. Face Check with Microsoft Entra Verified ID enables real-time verification of your employee's identity, which streamlines remote onboarding and self-service recovery of passwordless accounts.  

 

Reduce the complexity and cost of managing security tools from multiple vendors 

Since traditional on-premises security solutions don’t scale to the needs of modern cloud-first, AI-first environments, organizations are seeking ways to secure and manage their assets from the cloud. With the Microsoft Entra Suite, you can retire multiple on-premises security tools, such as traditional Virtual Private Networks (VPNs), on-premises Secure Web Gateways (SWGs), and on-premises identity governance. 

 

Microsoft Entra Suite is currently priced at $12 per user per month. Microsoft Entra P1 is a licensing and technical prerequisite. Please refer to the Microsoft Entra Suite pricing page for more detail. 

 

 

Join us for upcoming events! 

We encourage you to register for the Zero Trust spotlight on July 31, 2024, when Microsoft experts and thought leaders will dive deeper into these and other announcements, including the general availability of Entra Internet Access and Entra Private Access, which is part of the Microsoft Entra Suite.  

 

Additionally, register for the Tech Accelerator to join us on August 14, 2024, for a deep dive into the Microsoft Entra Suite, and Private Access and Internet Access products. 

 

 

Learn More 

The availability of the Microsoft Entra Suite marks a key milestone in our commitment to continue to provide a more seamless and robust secure access experience that will empower the workforce anywhere and everywhere. Learn more from the official announcement.

 

Visit the Microsoft Entra Suite trial page to get started. 

 

Irina Nechaeva, General Manager, Identity and Network Access Product Marketing 

 

 

Read more on this topic 

Watch the Microsoft Entra Suite mechanics video
Microsoft Entra product page
Microsoft Entra portal

 

Learn more about Microsoft Entra  

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds. 

Microsoft Entra News and Insights | Microsoft Security Blog
Microsoft Entra blog | Tech Community
Microsoft Entra documentation | Microsoft Learn
Microsoft Entra discussions | Microsoft Community

1Kosmos BlockID

Understanding the Snowflake Data Breach and Its Implications


Recently, the cybersecurity world was rocked by another significant breach, this time involving Snowflake, a major player in the data storage and analysis industry. The breach, orchestrated by the hacking group Shiny Hunters, exploited a weakness in customer account security, bypassing the need for a direct vulnerability within Snowflake’s platform itself.

What Happened?

The hackers managed to gain access by exploiting unencrypted usernames and passwords stored on a worker’s machine and in a project management tool called JIRA. These credentials were used to access several Snowflake customer accounts, including those of Ticketmaster and Santander. Shockingly, none of these accounts had multi-factor authentication (MFA) enabled, making it easier for the hackers to infiltrate.

The Data Compromised

The breach resulted in the theft of extensive customer data:

Over 30 million bank account details, including 6 million account numbers and balances.
28 million credit card numbers.
Personally identifiable information about staff.

Other potential victims mentioned by the hackers include LendingTree and Advanced Auto Parts, indicating the broad scope of this data theft.

Lessons Learned

Enable MFA: This breach underscores the critical importance of multi-factor authentication. Despite its limitations, MFA adds a crucial layer of security that can deter many unauthorized access attempts.
Secure Third-Party Access: The initial compromise occurred through a third-party contracting firm, emphasizing the need for robust security measures extending beyond your organization. Ensure that all third parties adhere to stringent security protocols.
Encrypt Sensitive Data: Unencrypted usernames and passwords were a key vulnerability. Encrypting sensitive data can prevent it from being easily exploited if accessed.
Awareness and Training: Regularly train and remind employees about security best practices, such as the importance of not storing unencrypted sensitive information on personal devices or project management tools.

Looking Forward

At 1Kosmos, we continually strive to enhance security and protect our clients from such breaches. While no system can be completely immune, implementing comprehensive security measures, including MFA and strong data encryption, can significantly mitigate risks.

As we navigate through the evolving landscape of cybersecurity threats, staying informed and proactive is crucial. The Snowflake breach serves as a reminder of the continuous need for vigilance in protecting sensitive data.

For more insights on the Snowflake breach, watch our latest IBA Friday episode.

The post Understanding the Snowflake Data Breach and Its Implications appeared first on 1Kosmos.


Ocean Protocol

DF97 Completes and DF98 Launches

Predictoor DF97 rewards available. DF98 runs Jul 11–Jul 18, 2024

1. Overview

Data Farming (DF) is Ocean’s incentives program. In DF, you can earn OCEAN rewards by making predictions via Ocean Predictoor.

Data Farming Round 97 (DF97) has completed.

DF98 is live today, July 11. It concludes on July 18. For this DF round, Predictoor DF has 37,500 OCEAN rewards and 20,000 ROSE rewards.

2. DF structure

The reward structure for DF98 is comprised solely of Predictoor DF rewards.

Predictoor DF: Actively predict crypto prices by submitting a price prediction and staking OCEAN to slash competitors and earn.

3. How to Earn Rewards, and Claim Them

Predictoor DF:

To earn: submit accurate predictions via Predictoor Bots and stake OCEAN to slash incorrect Predictoors.
To claim OCEAN rewards: run the Predictoor $OCEAN payout script, linked from the Predictoor DF user guide in the Ocean docs.
To claim ROSE rewards: see the instructions in the Predictoor DF user guide in the Ocean docs.

4. Specific Parameters for DF98

Budget. Predictoor DF: 37.5K OCEAN + 20K ROSE

Networks. Predictoor DF applies to activity on Oasis Sapphire. Here is more information about Ocean deployments to networks.

Predictoor DF rewards are calculated as follows:

First, DF Buyer agent purchases Predictoor feeds using OCEAN throughout the week to evenly distribute these rewards. Then, ROSE is distributed at the end of the week to active Predictoors that have been claiming their rewards.

Expect further evolution in DF: adding new streams and budget adjustments among streams.

Updates are always announced at the beginning of a round, if not sooner.

About Ocean, DF and Predictoor

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data. Follow Ocean on Twitter or TG, and chat in Discord. Ocean is part of the Artificial Superintelligence Alliance.

In Predictoor, people run AI-powered prediction bots or trading bots on crypto price feeds to earn $. Follow Predictoor on Twitter.

DF97 Completes and DF98 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Safle Wallet

Growth Rush

Weekly Safle Update! 🚀 Attention all Saflenauts,

The Safle spaceship has re-emerged, fully powered and bursting with stellar discoveries! Now it’s time to regroup and prepare for the next leg of our interstellar journey. Here’s your Week 5 mission debrief as we continue our voyage through the galaxy:

💰We’ve Raised Funds!

Safle now boasts a $77M valuation and is gearing up for $SAFLE listings on major exchanges in July 2024, signaling significant growth and innovation in the Web3 space.

🔗https://sg.finance.yahoo.com/news/safle-identity-wallet-unveils-roadmap-145500273.html

🏦 ️Lbank Listing

We're thrilled to announce our debut CEX listing on LBank! This milestone is the first step towards seamless accessibility, enhanced liquidity, and Safle's larger goals.

🔗https://x.com/GetSafle/status/1808569125792260144

🌱 Community Growth

Our community is expanding faster than the universe! We’ve welcomed a galaxy of new members and boosted engagement through stellar partnerships. Don’t miss our partnership and AMA with PlayZap, the play-to-earn gaming platform with 1M+ players.

🔗https://x.com/GetSafle/status/1808841690884571506

🔗 Wallet Connect Integration

Say goodbye to complicated transactions! Our new Wallet Connect integration in the mobile app ensures a smooth and user-friendly experience.

🔗https://hgbopbs0vss.typeform.com/to/u2I3IQyG

🤖 AI R&D

Dive into our vision for AI in Web3. Discover how we’re leveraging AI to revolutionize blockchain technology.

🔗https://safle.medium.com/how-ai-is-impacting-web3-blockchain-benefits-for-developers-and-safles-integration-4b68ccaf083a

🌐 Bitcoin L2 and Inscriptions

Soon, see your ordinals and runes in Safle Wallet, with more chains supporting the Bitcoin ecosystem on Safle.

🎤 AMA Session

We’re going live on Twitter Space with Crypto Monster to discuss $Safle and its future plans.

Save the Date: 8th July

Time: 2PM UTC
Speakers: Apoorv & Shikha
Venue: Crypto Monster X’s Space
Reward Pool: $100 USDT

Get ready for an insightful conversation!
For more info -

🔗https://x.com/GetSafle/status/1809591139571240991

Download the Safle App Now!

🔗https://app.getsafle.com/signup

Join the Safle Community 🧑‍🚀

🔗https://linktr.ee/safle

Wednesday, 10. July 2024

KuppingerCole

IAM Systems Integrators North America: Insights Into the Current Market


The established players have vast resources and tend to leverage their global footprint to deliver projects in the North American market. These vendors support all known service offerings and IAM technologies, along with a vast network of strong partnerships with leading technology vendors. However, large firms come with their own set of challenges.

Boutique firms, on the other hand, are focused on specialized offerings as per their capacity. They have limited resources but are agile for that same reason. These firms can be very quick off the line to understand the project scope and provide initial support. Despite having these advantages over big firms, the boutique firms have their own set of challenges. However, there are certain vendors in North America that have limited resources compared to big firms, but they are successfully managing millions of identities and supporting most of the major IAM technologies.

Nitish Deshpande, Research Analyst at KuppingerCole Analysts, will discuss the types of IAM systems integrator vendors in the North American market. Additionally, Nitish will address the role of systems integrators and some of their core capabilities. In his recent research, Nitish conducted a thorough analysis of the current vendor landscape and identified industry leaders based on their commitment to innovation, market presence, and overall service capabilities.

Join this webinar to:

Get an overview of North American systems integrators
Understand the differences between small and large systems integrators
Learn the core capabilities of systems integrators
Discover other services supported by systems integrators
Explore the types, methods, and average duration of engagements supported by systems integrators


Indicio

Why banks need to use verifiable credentials to protect biometric data

The post Why banks need to use verifiable credentials to protect biometric data appeared first on Indicio.
Biometrics are a powerful solution to identity authentication: but they also escalate the risks of data breaches into the realm of catastrophe: here’s how to solve the problem with decentralized identity and verifiable credentials.

By Trevor Butterworth

You can’t reset a person. That is, cosmetic surgery aside, you can’t reset the physically distinguishing and unique characteristics that make us who we are.

Which isn't a problem, unless you create an identity system based on using these characteristics (aka biometrics) to authenticate access to high-value information like bank accounts.

And, of course, that’s just what we’ve done. To address the inherent weaknesses of passwords, we’ve turned to deeper, probabilistically unique, identifiers like facial recognition, fingerprint scans, iris scans, and behavioral and vocal recognition.

In one important way, these are much better ways of authenticating access than passwords and pin codes simply because they are unique to the person seeking authentication. They provide confidence that a person is who they say they are in a way that passwords never could. They are also fast and approach the seamless interaction that people desire and systems require.

The way biometric data is stored should scare you

But in one critical respect, biometrics are much worse than passwords: they can't be reset.

The way biometric data is used requires comparing a real-time scan of a particular biometric characteristic against stored data for that characteristic.

Therein lies the first of two problems: if the biometric data is stored in a centralized database, the risk of a data breach will never be zero. And if there is a data breach, it’s a privacy and security catastrophe. Fraudsters have access to your unique characteristics.

Now guess where biometric data are conventionally stored.

“Bring your own biometrics”

Verifiable credential technology provides a simple fix: it removes the need for centrally storing biometric data in order to facilitate authentication.

Instead, during onboarding, a person’s biometric data is captured by a relevant authority (e.g., a bank) in a biometric template. This biometric template is then issued as a verifiable credential, which the person holds in a digital wallet.

The credential is a unique cryptographic proof that their biometric data has been authenticated by the given authority. As the biometric data recorded in the template is digitally signed, any attempt to alter it will render the credential non-functional.

When the person performs a biometric scan, they also present their biometric template credential. Because of the way these presentations are made, only the person issued with the credential can present their credential. The biometric template in the credential is cryptographically verified as authentic and compared to the biometric scan.

In this way, a person “brings their own biometrics” for authentication. The source of the credential is verified by the relying party — e.g., a bank can cryptographically verify that it had issued you and only you with this biometric credential, while digital signatures show that the biometric data has not been tampered with.

With no more than a couple of screen swipes, the problem of centralized biometric data disappears, and with it the nightmare of data privacy, protection, and compliance.
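The tamper-evidence described above can be sketched in a few lines. This is a toy illustration, not Indicio's implementation: real verifiable credentials use asymmetric digital signatures and standard data models, whereas this sketch uses a shared-key HMAC purely to show that any change to the template invalidates the credential.

```python
import hashlib
import hmac
import json

def issue_credential(issuer_key: bytes, template: dict) -> dict:
    """Issuer signs the biometric template; the holder stores the result in a wallet."""
    payload = json.dumps(template, sort_keys=True).encode()
    signature = hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()
    return {"template": template, "signature": signature}

def verify_credential(issuer_key: bytes, credential: dict) -> bool:
    """Relying party checks the template has not been altered since issuance."""
    payload = json.dumps(credential["template"], sort_keys=True).encode()
    expected = hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

key = b"bank-issuer-secret"
cred = issue_credential(key, {"subject": "alice", "face_vector": [0.12, 0.87, 0.33]})
assert verify_credential(key, cred)        # an authentic credential verifies

cred["template"]["face_vector"][0] = 0.99  # any tampering breaks the signature
assert not verify_credential(key, cred)
```

In a real deployment the verifier would check the issuer's public key rather than share a secret, but the property is the same: the biometric template travels with the holder, not a central database.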

In our next piece, we’ll look at the second problem with biometrics and authentication: how verifiable credentials can defend against the challenge of generative AI deepfakes.

If you would like to see how you can easily get started with this technology you can learn more about Indicio’s full solution for decentralized identity, Indicio Proven®.

If you have questions about how decentralized identity could apply to your organization our team would be happy to discuss it with you. You can get in touch or book a demo here.

Sign up to our newsletter to stay up to date with the latest from Indicio and the decentralized identity community



auth0

JWT Access Tokens Profiles, Now in GA

Auth0 now offers the option to choose between two access token profiles: the Auth0 token profile, which remains the default, and RFC 9068.
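One visible difference between the two profiles is the JWT header: RFC 9068 access tokens carry an explicit `typ` of `at+jwt`. Below is a minimal sketch of building and decoding such a header; it is illustrative only, since a real Auth0 token is signed and carries the claims the RFC requires.

```python
import base64
import json

# An RFC 9068 access token declares its media type in the JWT header segment.
header = {"typ": "at+jwt", "alg": "RS256"}
segment = base64.urlsafe_b64encode(json.dumps(header).encode()).rstrip(b"=")

# Re-pad and decode, as a resource server would when inspecting the token type.
decoded = json.loads(base64.urlsafe_b64decode(segment + b"=" * (-len(segment) % 4)))
assert decoded["typ"] == "at+jwt"  # lets a resource server tell the profiles apart
```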

DHIWay

Stocks, Bonds, Real Estate or: How the real world Asset Tokenization is taking over capital markets?



In recent years, blockchain technology has emerged as a groundbreaking innovation with the potential to revolutionize various industries, particularly finance. One of the most promising applications of blockchain in the financial sector is asset tokenization. Asset tokenization refers to the process of representing ownership of real-world assets, such as real estate, stocks, bonds, or commodities, as digital tokens on a blockchain. This transformative approach not only enhances liquidity and accessibility but also introduces new opportunities for investment and innovation in global capital markets.

This post delves deep into the essence of asset tokenization, exploring its mechanisms, benefits, challenges, and future potential. By examining real-world examples and case studies, we aim to provide a comprehensive understanding of how asset tokenization is reshaping the financial landscape and what it means for investors, institutions, and regulators.


Understanding Asset Tokenization
What is Asset Tokenization?

Asset tokenization involves creating a digital representation of an asset on a blockchain. This process typically entails the issuance of digital tokens that represent ownership or a stake in the underlying asset. These tokens can be traded, transferred, and managed on blockchain platforms, enabling a more efficient and transparent way of handling assets.


How Does Asset Tokenization Work?

1. Identification of the Asset: The first step in asset tokenization is identifying the asset to be tokenized. This can include physical assets like real estate, precious metals, and artworks, as well as financial assets such as stocks, bonds, and derivatives.

2. Valuation and Structuring: Once the asset is identified, it is valued and structured for tokenization. This involves determining the total value of the asset and dividing it into smaller units or tokens. Each token represents a fraction of the total asset value.

3. Issuance of Tokens: The next step is issuing the tokens on a blockchain platform. These tokens are typically created using smart contracts, which are self-executing contracts with the terms of the agreement directly written into code.

4. Trading and Transfer: After issuance, the tokens can be traded and transferred on blockchain-based exchanges or platforms. This allows for seamless and instantaneous transactions, reducing the need for intermediaries and minimizing transaction costs.

5. Ownership and Governance: The ownership of the tokens is recorded on the blockchain, providing a transparent and immutable record of all transactions. Token holders may also have certain governance rights, such as voting on decisions related to the underlying asset.
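The five steps above can be condensed into a toy, in-memory sketch. This is purely illustrative and not a smart contract: on a real blockchain, balances and the transfer history would be recorded on-chain and transfers executed by contract code.

```python
class TokenizedAsset:
    """Toy ledger for a tokenized asset whose value is split into equal tokens."""

    def __init__(self, issuer: str, total_value: float, token_count: int):
        self.token_value = total_value / token_count  # step 2: valuation and structuring
        self.balances = {issuer: token_count}         # step 3: issuer holds all tokens
        self.history = []                             # step 5: append-only ownership record

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        """Step 4: peer-to-peer transfer of fractional ownership."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient tokens")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        self.history.append((sender, receiver, amount))

# A $1M property divided into 10,000 tokens of $100 each.
asset = TokenizedAsset("developer", total_value=1_000_000, token_count=10_000)
asset.transfer("developer", "investor_a", 250)            # a 2.5% fractional stake
print(asset.balances["investor_a"] * asset.token_value)   # prints 25000.0
```

Even this miniature version shows why fractionalization helps liquidity: a $25,000 stake can change hands without the whole $1M asset being sold.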


Benefits of Asset Tokenization
Increased Liquidity

One of the primary benefits of asset tokenization is increased liquidity. Traditional assets, such as real estate or fine art, are often illiquid, meaning they cannot be easily bought or sold without significantly impacting their price. Tokenization enables fractional ownership, allowing investors to buy and sell smaller portions of an asset. This fractionalization makes it easier to match buyers and sellers, enhancing market liquidity.


Fractional Ownership and Accessibility

Tokenization democratizes access to high-value assets by enabling fractional ownership. Investors can purchase tokens representing a fraction of an asset, lowering the barrier to entry and allowing more individuals to participate in the market. This increased accessibility can attract a broader range of investors, including those who may not have the financial resources to acquire entire assets.


Transparency and Security

Blockchain technology provides a transparent and immutable ledger of all transactions, enhancing trust and security in the market. Each transaction is recorded on the blockchain, providing a clear and verifiable record of ownership and transfer. This transparency reduces the risk of fraud and ensures that all parties have access to consistent and up-to-date information.


Efficiency and Cost Reduction

By eliminating intermediaries and automating processes, asset tokenization can significantly reduce transaction costs and improve efficiency. Traditional asset transactions often involve multiple intermediaries, such as brokers, custodians, and clearinghouses, each adding costs and delays to the process. Tokenization streamlines these processes by enabling direct peer-to-peer transactions and automated settlement through smart contracts.


Enhanced Regulatory Compliance

Blockchain’s transparent and immutable ledger facilitates enhanced regulatory compliance. Regulators can have real-time access to transaction data, enabling better oversight and quicker response to suspicious activities. Additionally, blockchain can simplify the compliance process by providing a single source of truth for all transactions, reducing the costs and complexities associated with regulatory reporting.



Applications of Asset Tokenization
Real Estate

Real estate is one of the most promising sectors for asset tokenization. Traditional real estate investments often involve significant capital outlay and lengthy transaction processes. Tokenization enables fractional ownership of real estate properties, making it easier for investors to buy and sell shares of properties and enhancing liquidity in the market.

For example, a real estate developer can tokenize a commercial property, issuing digital tokens that represent ownership shares. Investors can purchase these tokens, gaining exposure to the property and earning a share of the rental income. The tokens can be traded on blockchain-based platforms, allowing for seamless and instantaneous transactions.


Art and Collectibles

The art and collectibles market is another sector that can benefit from asset tokenization. High-value artworks and collectibles are often illiquid, with few opportunities for investors to buy and sell shares. Tokenization enables fractional ownership of these assets, providing greater liquidity and accessibility.

For instance, a rare painting can be tokenized, with each token representing a fraction of the painting’s value. Investors can purchase tokens, gaining exposure to the artwork and participating in its potential appreciation. The tokens can be traded on blockchain-based platforms, allowing for greater flexibility and liquidity in the market.


Stocks and Bonds

Tokenization can also be applied to traditional financial assets, such as stocks and bonds. By representing these assets as digital tokens on a blockchain, investors can benefit from increased liquidity, transparency, and efficiency.

For example, a company can issue digital tokens representing shares of its stock. Investors can purchase these tokens, gaining exposure to the company’s equity and participating in its potential growth. The tokens can be traded on blockchain-based platforms, enabling seamless and instantaneous transactions.

Similarly, bonds can be tokenized, with each token representing a fraction of the bond’s value. Investors can purchase tokens, earning interest payments and participating in the bond’s potential appreciation. The tokens can be traded on blockchain-based platforms, providing greater liquidity and flexibility in the market.


Commodities

Commodities, such as gold, silver, and oil, can also be tokenized, providing greater liquidity and accessibility for investors. Tokenization enables fractional ownership of commodities, making it easier for investors to buy and sell shares and participate in the market.

For instance, a gold producer can tokenize a portion of its gold reserves, issuing digital tokens that represent ownership shares. Investors can purchase these tokens, gaining exposure to the gold and participating in its potential appreciation. The tokens can be traded on blockchain-based platforms, allowing for seamless and instantaneous transactions.


Future Outlook and Potential Developments
Decentralized Finance (DeFi)

Decentralized finance (DeFi) is an emerging sector that leverages blockchain technology to create decentralized and open financial systems. DeFi platforms enable a wide range of financial services, such as lending, borrowing, and trading, without the need for traditional intermediaries.

Asset tokenization plays a crucial role in the DeFi ecosystem by enabling the creation of digital assets that can be seamlessly integrated into DeFi platforms. Tokenized assets can be used as collateral for loans, traded on decentralized exchanges, and integrated into various DeFi protocols, enhancing liquidity and accessibility in the financial system.
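As a concrete illustration of tokenized collateral, a lending protocol might cap borrowing at a loan-to-value (LTV) ratio of the collateral's market value. The sketch below is hypothetical; the 50% default LTV is an assumption for illustration, not any specific protocol's rule.

```python
def max_borrow(collateral_tokens: int, token_price: float, ltv: float = 0.5) -> float:
    """Maximum loan against tokenized collateral at a given loan-to-value ratio."""
    return collateral_tokens * token_price * ltv

# 100 tokens of a tokenized bond priced at $20 each, at 50% LTV:
assert max_borrow(100, 20.0) == 1000.0
```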


Security Tokens and STOs

Security tokens and security token offerings (STOs) represent a new paradigm in fundraising and investment. Security tokens are digital representations of traditional securities, such as stocks, bonds, and real estate, issued and traded on a blockchain. STOs provide a regulated and compliant framework for issuing and trading security tokens, combining the benefits of blockchain technology with the regulatory compliance of traditional securities.

The proliferation of security tokens and STOs has the potential to transform capital markets by enhancing liquidity, reducing transaction costs, and democratizing access to investment opportunities. As regulatory frameworks evolve and market participants embrace innovation, security tokens are poised to play a significant role in the future of finance.


Integration with Emerging Technologies

Blockchain technology can be integrated with other emerging technologies, such as artificial intelligence (AI), the Internet of Things (IoT), and big data, to create innovative solutions for asset management and trading. For example, AI can be used to analyze blockchain data and identify patterns and trends, providing valuable insights for investment strategies. IoT devices can generate real-time data that can be recorded on a blockchain, enhancing transparency and traceability in supply chain finance.


Global Adoption and Collaboration

The future of asset tokenization depends on global adoption and collaboration among market participants, regulators, and technology providers. Building a standardized and interoperable framework for asset tokenization requires international cooperation and alignment of regulatory and technical standards.

Collaborative efforts can drive innovation, reduce fragmentation, and ensure the seamless integration of tokenized assets into the global financial system. As more countries and institutions recognize the potential of asset tokenization, the financial landscape is poised for significant transformation.


Conclusion

Asset tokenization represents a revolutionary shift in how assets are managed, traded, and owned. By leveraging blockchain technology, asset tokenization enhances liquidity, accessibility, transparency, and efficiency in global capital markets. This transformative approach democratizes access to high-value assets, reduces transaction costs, and provides new opportunities for investment and innovation.

While challenges remain, such as regulatory compliance, technical infrastructure, and market adoption, the potential benefits of asset tokenization are significant. Real-world examples and case studies demonstrate the viability and advantages of tokenization across various asset classes, including real estate, art, stocks, bonds, and commodities.

As technology continues to evolve and market participants embrace innovation, asset tokenization is poised to play a crucial role in shaping the future of finance. The journey towards widespread adoption is ongoing, and the future holds exciting possibilities for the financial industry.

In the long term, asset tokenization has the potential to fundamentally alter the structure and operation of capital markets. By providing a more transparent, efficient, and secure framework, tokenization can enhance market integrity, reduce systemic risks, and create new opportunities for growth and development. As we move forward, the financial industry must navigate the challenges and seize the opportunities presented by this transformative technology.

The post Stocks, Bonds, Real Estate or: How the real world Asset Tokenization is taking over capital markets? appeared first on Dhiway.


Elliptic

Huione Guarantee: The multi-billion dollar marketplace used by online scammers

Huione Guarantee is an online marketplace that has become widely used by scam operators in South East Asia, including those involved in so-called “pig butchering”. Merchants on the platform offer technology, data and money laundering services, and have engaged in transactions totaling at least $11 billion.

Huione Guarantee is part of Huione Group, a Cambodian conglomerate with links to Cambodia’s ruling Hun family.

Our research indicates that another Huione business is actively involved in laundering proceeds of scams from around the world. 

Elliptic customers are able to protect themselves against exposure to this activity through the use of our cryptoasset transaction screening and investigative solutions.

 


Metadium

Understanding the Virtual Asset User Protection Act


The Virtual Asset User Protection Act began its legislative journey in July 2023 and is set to take effect on the 19th of this month. As interest in this new law grows, we will delve into the background, detailed provisions, impact on Metadium, and future challenges of the Virtual Asset User Protection Act.

What is the Virtual Asset User Protection Act?

The primary aim of the Virtual Asset User Protection Act is to safeguard the assets of virtual asset users and systematically regulate unfair trading practices to enhance the transparency and stability of the virtual asset market. Over the past few years, virtual assets, or cryptocurrencies, have experienced rapid growth, attracting a surge of investors. However, this growth has been accompanied by an increase in hacking, fraud, and other illegal activities in virtual asset exchanges, leading to significant investor losses. The Virtual Asset User Protection Act was introduced to address these issues and protect users.

Detailed Provisions of the Act

1. Protection of Customer Assets

One of the most notable aspects of the Act is the requirement for domestic virtual asset exchanges, wallets, and custodians registered with the financial authorities to separate and safeguard customer deposits. These deposits, which investors entrust to exchanges for coin purchases, must be held separately in a bank. This ensures that even in the event of an exchange’s bankruptcy or business closure, the deposits can be safely returned to the investors via the bank. It is important to note that virtual assets held by exchanges are not protected in the same manner as deposits. In the case of exchange bankruptcy, investors’ virtual assets may still be subject to creditor claims and potential losses.

2. Enhanced Security and Prohibition of Unfair Trading Practices

The Act also targets unfair trading practices, such as market manipulation. Exchanges must establish systems for continuously monitoring abnormal transactions and report any suspicious activities to the financial authorities. Should the Financial Supervisory Service confirm illegal activities during investigations, the exchanges could face criminal charges and fines.

3. Information Disclosure and Reporting Obligations

Virtual asset businesses are required to disclose key management information periodically and to report suspicious transactions to the authorities. Confirmed illegal activities will lead to criminal prosecution and fines.

Future Challenges

While the Virtual Asset User Protection Act represents a significant step towards a healthy market development, several challenges remain.

1. Strengthening International Cooperation

Given that virtual asset transactions frequently cross borders, international cooperation is essential. The Financial Services Commission (FSC) has announced plans to incorporate global standards into regulations for decentralized finance (DeFi) services, like Solana (SOL). Establishing and effectively enforcing regulations that align with global standards is crucial.

2. Adapting to Technological Advancements

Virtual asset technology is rapidly evolving. Regulatory authorities must continuously update regulations to keep pace with technological advancements and flexibly respond to market changes.

3. Investor Education

Strengthening education programs to ensure investors properly understand virtual assets is vital. This will further enhance investor protection.

4. Continuous Monitoring

The dynamic nature of the virtual asset market necessitates ongoing monitoring. Regulatory authorities should regularly assess market conditions and take appropriate measures as needed.

Impact on Metadium: Positive or Negative?

Metadium is committed to diligently disclosing key information such as circulating supply, as mentioned in the Act, and is not involved in unfair trading practices or market manipulation. Therefore, it is likely to avoid immediate drastic impacts. However, implementing the Act is expected to reduce illegal or unfair activities, thereby enhancing the transparency and stability of the virtual asset market. This can positively contribute to the long-term value and growth of the Metadium Community.

Conclusion

The Virtual Asset User Protection Act, taking effect on July 19, will play a crucial role in enhancing the transparency and safety of the virtual asset market and protecting investors. Ultimately, the introduction of this legislation is expected to contribute to the healthy development of the virtual asset market. It is important to remain vigilant about ongoing trends and adapt to changes flexibly. As always, Metadium will strive to protect user assets and foster a healthy and stable market. Thank you.

- The Metadium Team

Website | https://metadium.com

Discord | https://discord.gg/ZnaCfYbXw2

Telegram(EN) | http://t.me/metadiumofficial

Twitter | https://twitter.com/MetadiumK

Medium | https://medium.com/metadium

Understanding the Virtual Asset User Protection Act was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 09. July 2024

Indicio

ARUBA INTRODUCES AN INNOVATIVE SYSTEM FOR RAPID PASSENGER PROCESSING

SITA The post ARUBA TA INTRODUCI SISTEMA INNOVATIVO PA PROCESO PASAHERO RAPIDAMENTE appeared first on Indicio.

Microsoft Entra (Azure AD) Blog

Microsoft Entra certificate-based authentication enhancements


Howdy, folks! Today I'm excited to share the latest enhancements for Microsoft Entra certificate-based authentication (CBA). CBA is a phishing-resistant, passwordless, and convenient way to authenticate users with X.509 certificates, such as PIV/CAC cards, without relying on on-premises federation infrastructure, such as Active Directory Federation Services (AD FS). CBA is particularly critical for federal government organizations that are already using PIV/CAC cards and are looking to comply with Executive Order 14028, which requires phishing-resistant authentication. 

Today we're announcing the general availability of many improvements we introduced earlier this year: username bindings, affinity bindings, policy rules, and advanced CBA options in Conditional Access are all GA! I am also excited to announce the public preview of an exciting new capability: issuer hints. The issuer hints feature greatly improves the user experience by helping users easily identify the right certificate for authentication.

 

Vimala Ranganathan, Principal Product Manager on Microsoft Entra, will now walk you through these new features that will help you in your journey toward phishing-resistant multifactor authentication (MFA).    

Thanks, and please let us know your thoughts!    

Alex Weinert   

--  

Hello everyone, 

I’m Vimala from the Microsoft Entra PM team, and I’m excited to walk you through the new issuer hints feature, as well as the features that are going into general availability.   

The issuer hints feature improves the user experience by helping users easily identify the right certificate for authentication. When enabled by the tenant admin, Entra sends back a trusted CA indication as part of the TLS handshake. The trusted Certificate Authority (CA) list is set to the subjects of the CAs uploaded by the tenant to the Entra trust store. The browser or native application client uses the hints sent back by the server to filter the certificates shown in the certificate picker, showing only the client authentication certificates issued by the CAs in the trust store. 
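The picker filtering described here can be sketched as a simple predicate. This is a hypothetical illustration of the client-side behavior, not Microsoft's implementation; the certificate dictionaries and field names are assumptions made for the example.

```python
def filter_certificates(certs, trusted_ca_subjects):
    """Show only client-auth certificates issued by a CA named in the server's
    trusted CA indication (the 'issuer hints' sent during the TLS handshake)."""
    return [
        c for c in certs
        if c["issuer"] in trusted_ca_subjects and c["usage"] == "client_auth"
    ]

certs = [
    {"issuer": "CN=Contoso Issuing CA", "usage": "client_auth", "subject": "alice"},
    {"issuer": "CN=Some Other CA", "usage": "client_auth", "subject": "alice"},
    {"issuer": "CN=Contoso Issuing CA", "usage": "server_auth", "subject": "web01"},
]
picker = filter_certificates(certs, {"CN=Contoso Issuing CA"})
assert [c["subject"] for c in picker] == ["alice"]  # only the matching client cert is shown
```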

 

Figure 1: Enhanced certificate picker with issuer hints enabled

We’re also thrilled to announce that the features below are going into general availability. You can read more about each of the features in detail in our public preview blog: Enhancements to Microsoft Entra certificate-based authentication - Microsoft Community Hub.

CBA username bindings: CBA has added support for the three remaining username bindings and is now at parity with on-premises Active Directory. The three bindings being added are IssuerAndSerialNumber, IssuerAndSubject, and Subject. More at Configure Username binding policy.   

CBA Affinity Binding allows admins to set affinity binding at the tenant level, as well as create custom rules to use high affinity or low affinity mapping, covering many potential scenarios our customers have in use today. More at CBA Affinity Bindings.    

CBA Authentication policy rules help determine the strength of authentication as either single-factor or multifactor. Multiple custom authentication binding rules can be created to assign a default protection level for certificates based on certificate attributes: the Issuer, Policy Object Identifiers (OIDs), or a combination of Issuer and OID. More at Configure authentication binding policy.    

Advanced CBA options in Conditional Access allow access to specific resources based on the certificate Issuer or Policy OID properties. More at authentication strength advanced options.   

You can learn more about Microsoft Entra CBA here and Microsoft’s commitment to Executive Order 14028.    

What’s next     

Over the last year, we’ve seen many federal and regulated industry customers migrate off AD FS to Microsoft Entra ID seamlessly by leveraging staged migration and providing end users a familiar sign-in experience with CBA. In fact, in the last 12 months, we’ve seen a more than 1,400% increase in phishing-resistant authentication among United States government customers. 

Keep your feedback coming at Microsoft Entra Community! We’re working diligently to bring more enhancements, such as the removal of limits on Certificate Revocation Lists (CRLs), a new certificate authority trust store, CBA support on the resource tenant for B2B external guest users, and iOS UX enhancements, to name just a few! 

Learn more about Microsoft Entra  

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds. 

Microsoft Entra News and Insights | Microsoft Security Blog
Microsoft Entra blog | Tech Community
Microsoft Entra documentation | Microsoft Learn
Microsoft Entra discussions | Microsoft Community

auth0

Strong Customer Authentication Explained

An introduction to Strong Customer Authentication (SCA), a mechanism for providing a higher level of security in sensitive contexts.

Elliptic

Introducing Ecosystem Monitoring: The world’s first proactive alerting and analytics system for stablecoin and token issuers


Today, we’re excited to announce the launch of Elliptic Ecosystem Monitoring, a first-of-its-kind solution that enables stablecoin and token issuers to screen their ecosystems for financial crime risk in real-time. Leveraging a suite of proactive monitoring and asset analytics capabilities, issuers will be able to obtain a comprehensive view of licit and illicit activity in their cryptoasset’s ecosystem for the first time.

It has become more important than ever for stablecoin and token issuers to ensure they’re adopting appropriate risk detection and mitigation strategies within their ecosystems. With the stablecoin market expected to grow to $2.8 trillion in the next five years (up from $125 billion), along with new and proposed regulatory regimes emerging around the globe, issuers must be able to demonstrate to regulators that they understand who is utilizing their token or stablecoin and where financial crimes risks are stemming from.

This need is underscored by the increasing use of stablecoins by illicit actors. Stablecoins can prove attractive to illicit actors owing to a variety of factors such as the price stability they afford, transaction processing speeds, and their deep liquidity across multiple blockchains. Consequently, it is more important than ever for participants in the cryptoasset ecosystem to work to ensure that illicit actors aren’t exploiting stablecoin or token ecosystems.


Crypto regulatory affairs: MiCA’s stablecoin provisions go live, with Circle the first issuer to obtain full approval

On June 30, the European Union’s new rules for stablecoin issuers entered into force, ushering in a new era of regulatory oversight for innovators in the crypto space.


KuppingerCole

IAM Systems Integrators North America

by Nitish Deshpande

Identity and Access Management (IAM) systems integrators are companies that provide support in consulting, implementing, and managing services for operations of IAM technologies for businesses. These services can range from planning and designing to implementing and operating various IAM technologies based on customers’ requirements. This Buyer's Compass provides insight into what these services do and what companies should consider when selecting them.

Shyft Network

Shyft DAO June 2024 Recap and Community Update

Hello, Chameleons! As we have entered July, here’s the scoop on what happened at Shyft DAO in June and what’s brewing currently: 🦎🍦

Ambassador Program Update

Our Ambassador Program is currently on hold as we gear up for its 2.0 generation. This new phase is all about empowering our community and fostering leadership from within.

Empowering Community Leaders

We’re identifying standout community members to lead the charge. These leaders will help shape a more engaging and proactive community environment.

Collaborative Innovation

In our efforts to enhance our program, we’re consulting with these key community members to develop a more effective strategy that encourages taking initiative. This new approach moves away from task-based activities to more dynamic, self-driven contributions.

Looking Ahead

As we work on these exciting updates, stay tuned for more information. We’re eager to unveil Ambassador Program 2.0 and continue building our community together.

In the meantime, enjoy the summer if in the Northern Hemisphere ☀️💦

The Shyft DAO community is committed to building a decentralized, trustless ecosystem that empowers its members to collaborate and make decisions in a transparent and democratic manner. Our mission is to create a self-governed community that supports innovation, growth, and diversity while preserving the privacy and sovereignty of its users.

Follow us on Twitter and Medium for up-to-date news from the Shyft DAO.

Shyft DAO June 2024 Recap and Community Update was originally published in Shyft DAO on Medium, where people are continuing the conversation by highlighting and responding to this story.


Safle Wallet

$SAFLE Live on LBank: A Comprehensive Guide on How to Participate

The $SAFLE token has officially gone live on LBank, marking a significant milestone for the Safle ecosystem. This article aims to provide a comprehensive walkthrough on what $SAFLE is, its utility, and how you can participate in trading it on LBank.

What is $SAFLE?

$SAFLE is the native utility token of the Safle ecosystem, a decentralized identity and wallet management platform. Safle aims to revolutionize how users manage their digital identities and assets securely. The $SAFLE token plays a crucial role within this ecosystem, providing various utilities and benefits to its holders.

Governance: $SAFLE token holders have the right to participate in the governance of the Safle ecosystem. They can propose and vote on changes, ensuring the platform evolves in a decentralized manner.
Staking Rewards: Users can stake $SAFLE tokens to earn rewards. This not only provides an incentive for holding the token but also helps secure the network.
Transaction Fees: The token is used to pay for transactions and gas fees within the Safle ecosystem, ensuring seamless and cost-effective transactions.
Access to Premium Features: Holding $SAFLE tokens grants users access to premium features and services within the Safle platform.

$SAFLE Goes Live on LBank

Listing $SAFLE on LBank is a significant achievement for the Safle ecosystem. LBank is a global digital asset trading platform known for its innovation and security. Being listed on LBank’s Innovation Zone opens up $SAFLE to a broader audience, enhancing its liquidity and market reach.

Listing details:

Trading Pair: SAFLE/USDT
Trading Zone: Innovation Zone
Start Deposit: 10:00 on July 10, 2024 (UTC)
Start Trading: 10:00 on July 9, 2024 (UTC)
Start Withdrawal: 10:00 on July 10, 2024 (UTC)

SAFLE/USDT spot at: https://www.lbank.com/trade/safle_usdt

Note — This will be effective upon commencement of trading.

How to Participate in Trading $SAFLE on LBank?

Participating in trading $SAFLE on LBank is straightforward. Here is a step-by-step guide to get you started:

Step 1: Create an Account on LBank
Visit LBank’s Website: Go to the official LBank website at LBank.
Sign Up: Click on the “Sign Up” button and fill in the required details to create your account. You will need to provide a valid email address and set a strong password.
Verify Your Email: After signing up, verify your email address by clicking on the verification link sent to your email.

Step 2: Complete KYC Verification
Login to Your Account: Once your email is verified, log in to your LBank account.
Go to KYC Verification: Navigate to the “Account” section and select “KYC Verification.”
Submit Documents: Follow the instructions to submit the necessary identification documents. This process is essential for security and compliance purposes.

Step 3: Deposit Funds
Go to Wallet: Click on the “Wallet” tab on the top menu.
Select Deposit: Choose the cryptocurrency you want to deposit. If you are depositing USDT, select USDT and click “Deposit.”
Copy Address: Copy the deposit address provided and transfer your funds from your external wallet to LBank.

Step 4: Buy $SAFLE
Go to Markets: Navigate to the “Markets” section on the top menu.
Search for $SAFLE: In the search bar, type “$SAFLE” and select the $SAFLE/USDT trading pair.
Place an Order: You can place a buy order by specifying the amount of $SAFLE you wish to purchase and the price you are willing to pay. You can choose between a market order (buying at the current market price) or a limit order (setting your desired price).

Step 5: Secure Your $SAFLE
Go to Wallet: After purchasing $SAFLE, go to your wallet to see your balance.
Transfer to Secure Wallet: For added security, consider transferring your $SAFLE tokens to a secure wallet, such as the Safle wallet, which offers advanced security features and full control over your assets.

Conclusion

The listing of $SAFLE on LBank marks a significant step forward for the Safle ecosystem, providing increased liquidity, global exposure, and enhanced security for $SAFLE token holders. By following this guide, you can easily participate in trading $SAFLE on LBank and take advantage of the benefits it offers. As the Safle ecosystem continues to grow, holding and trading $SAFLE tokens positions you to be part of this exciting journey toward decentralized identity and asset management.

About Safle

Safle is a decentralized platform focused on creating secure digital identities. By leveraging blockchain technology, Safle provides users with a self-sovereign identity system, ensuring privacy and control over personal data. The platform also offers a suite of tools for managing digital assets, making decentralized finance accessible and user-friendly. With a commitment to transparency and innovation, Safle is poised to revolutionize the way individuals interact with digital services.

To learn more about Safle, visit:

Website | X | Discord | Telegram | Instagram | Medium | Github |


IDnow

KYC and signing in seconds: IDnow introduces two new e-signature solutions

IDnow InstantSign and eID eSign reduce the cost and complexity of contract signing

Munich, July 9, 2024 – IDnow, a leading identity verification platform provider in Europe, has launched two new e-signature solutions to the market. InstantSign empowers financial businesses to innovate friction-heavy and lengthy signing processes, such as loan contracts, by way of a reusable identity, while eID eSign enables users to verify their identity and digitally sign contracts with the help of Near Field Communication (NFC) technology.

AML compliance redefined: QES without reverification

Existing solutions traditionally either require a new identity verification before a Qualified Electronic Signature (QES) can be issued or the identification needs to be handled by the financial institution. IDnow’s latest signing solution InstantSign removes these obligations, allowing users who have already performed Anti-Money-Laundering (AML)-compliant identity verification at onboarding to digitally sign contracts at a qualified level, with no additional integrations required and in accordance with European eIDAS regulations.

“Eliminating the requirement for reverification by reusing an existing identification significantly streamlines the process of issuing a QES in an AML-compliant e-signing journey. InstantSign has been proven to reduce drop-offs and accelerate contract completion. With physical signing processes sometimes taking days, our customers can now cut this to mere seconds with InstantSign, by foregoing the printing, signing and return process of a hard copy contract”, explains Vikas Seth, Chief Product Officer at IDnow.  

“Additionally, if reverification is needed due to an expired identity document, InstantSign works seamlessly with IDnow’s entire range of identity verification solutions from automated and hybrid, to video, thus keeping identity data up-to-date, and providing a compliant solution for financial services organizations.”

eID eSign – Fast verification and secure signing thanks to NFC technology  

eID eSign allows end customers to quickly verify their identity and digitally sign contracts remotely. In less than a minute, users can verify themselves by utilizing compliant and secure NFC technology, for a QES that is the legal equivalent to a wet-ink signature on the Level of Assurance (LoA) High. IDnow’s eID eSign solution currently works with the German ID card, residence permit or EU citizen card, using a secure, compliant NFC chip readout of these documents. It expands IDnow’s existing signing solution portfolio and complements the available identity verification methods for various types of signatures with solutions like VideoIdent, AutoIdent and IDCheck.io.

“With eID eSign, our customers can future-proof their offering according to the German AML laws (GwG) and the European eIDAS regulations. The use of the NFC chip reduces the risk of identity theft and unauthorized access while cutting costs associated with printing, storage and manual handling of documents. Both solutions propel us and our customers into the next stage of compliance digitalization,” concludes Seth.  

Monday, 08. July 2024

Indicio

Indicio named in Gartner Market Guide for Decentralized Identity

SITA The post Indicio named in Gartner Market Guide for Decentralized Identity appeared first on Indicio.

Microsoft Entra (Azure AD) Blog

How to break the token theft cyber-attack chain

We’ve written a lot about how attackers try to break passwords. The solution to password attacks—still the most common attack vector for compromising identities—is to turn on multifactor authentication (MFA).

 

But as more customers do the right thing with MFA, actors are going beyond password-only attacks. So, we’re going to publish a series of articles on how to defeat more advanced attacks, starting with token theft. In this article, we’ll start with some basics on how tokens work, describe a token theft attack, and then explain what you can do to prevent and mitigate token theft now. 

 

Tokens 101 

 

Before we get too deep into the token theft conversation, let’s quickly review the mechanics of tokens.

 

A token is an authentication artifact that grants you access to resources. You get a token by signing into an identity provider (IDP), such as Microsoft Entra ID, using a set of credentials. The IDP responds to a successful sign-in by issuing a token that describes who you are and what you have permission to do. When you want to access an application or service (we’ll just say app from here), you get permission to talk to that resource by presenting a token that’s correctly signed by an issuer it trusts. The software on the client device you’re using takes care of all token handling behind the scenes.
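As a rough sketch of this flow (hypothetical names and a symmetric signing key for brevity; real IDPs such as Entra ID issue JWTs signed with asymmetric keys so apps can verify tokens without holding the signing secret):

```python
import base64
import hashlib
import hmac
import json

# Hypothetical issuer key, for illustration only.
IDP_KEY = b"idp-signing-secret"

def issue_token(claims: dict) -> str:
    """IDP side: sign a set of claims and hand back a token."""
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(IDP_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_token(token: str) -> dict:
    """App side: accept the claims only if the signature checks out."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(IDP_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("token not signed by a trusted issuer")
    return json.loads(base64.urlsafe_b64decode(payload))

token = issue_token({"sub": "alice", "scope": "files.read"})
print(verify_token(token)["sub"])  # alice
```

The point the rest of the article builds on: the app trusts whatever a correctly signed token says, regardless of who presents it.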

 

Figure 1: Basic token flow

 

 

The first token you get, called a session token, shows that you successfully signed into the IDP, and how you signed in. When you sign into an app, it can exchange that session token for an access token, which gives you access to a specific resource for a certain amount of time without having to reauthenticate. To use an analogy, think of an amusement park. The IDP is the ticket office, which issues a park pass that provides credits for different rides. If you want to go on the roller coaster, you go to the ticket office, show your season park pass, and receive a ticket for that ride.

 

Just as you might be able to buy a day pass, season pass, or lifetime pass to the park, each token has a lifetime, usually between one and 24 hours. And just as a 12-month season pass may get you a one-day pass to a specific ride, session tokens can have different—and usually much longer—lifetimes than access tokens. Moreover, access token lifetimes can differ, so the roller coaster pass may last an hour while the Ferris wheel pass is good for an entire day.
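The relationship between the two lifetimes can be sketched as client-side logic (the lifetimes and the in-process "exchange" below are illustrative stand-ins for the real round trip to the IDP):

```python
# Illustrative lifetimes only; real values vary by IDP policy.
SESSION_TTL = 24 * 3600   # the "season park pass"
ACCESS_TTL = 60 * 60      # the "individual ride ticket"

def mint(sub, ttl, now):
    return {"sub": sub, "iat": now, "exp": now + ttl}

def ensure_access(session, access, now):
    """Reuse the access token while it is valid; otherwise exchange the
    longer-lived session token for a fresh one (stand-in for the IDP call)."""
    if now >= session["exp"]:
        raise PermissionError("session expired - reauthenticate at the IDP")
    if access is None or now >= access["exp"]:
        access = mint(session["sub"], ACCESS_TTL, now)
    return access

session = mint("alice", SESSION_TTL, 0.0)
a1 = ensure_access(session, None, 0.0)
a2 = ensure_access(session, a1, 30.0)        # still valid: reused
a3 = ensure_access(session, a1, 2 * 3600.0)  # expired: silently refreshed
print(a2 is a1, a3 is a1)  # True False
```

The silent refresh in the middle branch is why users rarely notice token expiry; only when the session itself expires (or risk policy demands it) do they see a sign-in prompt.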

 

Traditionally, longer lifetimes are more convenient for users and more resilient against potential IDP outages (they save round trips to the IDP and associated latency) but riskier, while shorter lifetimes are safer (the IDP checks the integrity of the request more often). Technologies such as continuous access evaluation provide continuous assessment, so a shorter token lifetime isn’t a benefit when these are in place. When a token expires or continuous access evaluation reports heightened risk, the client goes back to the IDP and requests a refresh. This process is typically invisible to users, but if a risk condition has changed, and your organization policy requires it, then you may have to reauthenticate and get a new token. One last thing to note: while it’s a bummer to lose your roller coaster ticket, it’s really bad to lose your season park pass. An attacker can use your roller coaster ticket to get on a single ride for a short while, but with your season park pass, they can get on any ride for as long as they want. It’s similar with session tokens which, if stolen, give an actor a lasting ability to get access tokens.

 

How token theft works

 

Attackers steal tokens so they can impersonate you and access your data for as long as that stolen token lives. To do this, they get access to where a token is stored (on the client, in proxy servers, or in some cases in application or network logs) to acquire it and replay it from somewhere else.

 

Figure 2: Token theft cyberattack

 

 

 

Identity provider

Ticket office

Session token

Season park pass

Access token

Individual ride ticket

 

When an attacker steals your session token, it’s like picking your pocket after you’ve purchased your all-access season park pass at the fair’s ticket office. Because a token is digital, token theft is like stealing the pass from your pocket, making a photocopy, and then putting the original back in your pocket. The attacker can use their copy of your session token to get unlimited new access tokens to keep stealing your data, just as they can show a copy of a valid park pass to keep getting on rides without paying.

 

An attacker stealing your access token is comparable to someone stealing your ride ticket as you stand in line. They do the same copy-and-replace trick, using their copy of your token to access the resource, just as they could show a copy of a valid ticket to get on an individual ride without paying.

 

And because in both cases the attacker puts the original pass or ticket back in your pocket, you don’t even know an attacker is riding the rides in your name. Your token seems fine, even though an attacker is using an illegitimate copy of it, and it may take a while to determine that anything is amiss—if you ever do.
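The "photocopy" is easy to demonstrate with a toy bearer token (hypothetical format; real session tokens carry far more state): the verifier checks only the signature, so an attacker's byte-for-byte copy validates just as well, presented from anywhere.

```python
import hashlib
import hmac

KEY = b"idp-signing-secret"  # hypothetical issuer key

def sign(payload: str) -> str:
    return payload + "." + hmac.new(KEY, payload.encode(), hashlib.sha256).hexdigest()

def accept(token: str) -> bool:
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

victim_token = sign("sub=alice")
stolen_copy = "" + victim_token  # the attacker exfiltrates an exact copy

# Nothing in the token says which device presents it, so the copy passes.
print(accept(stolen_copy))  # True
```

This is exactly what token binding, discussed below, is designed to break.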

 

Here’s an example:

 

Contoso stores all their documents in a secure cloud storage service and requires all employees to verify their identity using MFA before accessing it.

 

One day, after starting their workday by signing into Contoso's cloud storage service, a user inadvertently installed malware on their device by clicking on a malicious 'phishing' link sent to them via email. The malicious code copied the user's session token and sent it to the attacker.

 

The attacker then used the stolen and MFA-validated session token, now copied to their machine, to gain access to Contoso's environment.

 

The attacker then downloaded as many documents as they could access, including a bunch of confidential reports, and leaked them on the internet.

 

Use of malware on the client to acquire the token is one common, easy method for attackers. Other tactics used to steal tokens include:

Copying tokens from the network as they pass through a proxy or router that the attacker controls.
Extracting tokens from unsecured server logs of the relying party.

 

While token theft still constitutes fewer than 5% of all identity compromises, incidents are growing. In the past year alone, we detected 147,000 token replay attacks, a 111% increase year-over-year.

 

Protecting tokens

 

IDPs and clients should handle tokens as securely as possible by only transmitting them over encrypted channels and not storing them in the open. But if an attacker infiltrates the device or network channel as in the example above, they can steal tokens and use them until they expire.

 

Ideally, a token would only work when used from the device to which it was issued; if replayed from a different device, such as one an attacker controls, it would be rejected.

 

A key part of Microsoft’s protections against token theft is the use of tokens that are cryptographically tied to the device that owns them. This is often called token binding, but may also be called sender-constrained tokens or token proof of possession. Token protection makes it harder to execute the main types of attacks designed to steal tokens, including network-based attacks and those using malware on the device, by restricting use of stolen tokens to the devices they were issued to.

 

In Microsoft Entra, token protection binds tokens to cryptographic keys specific to the device and ties them to the device registration. Once developers enable their applications to use protected tokens, you can enforce an Entra Conditional Access policy that requires client applications to use protected tokens to access a service. This policy rejects tokens which are not cryptographically tied to the device they were issued to. In the theme park analogy, this is like the ticket office taking your picture and printing it on your ride ticket and requiring ride operators to match the picture to your face before letting you ride.
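A simplified model of what binding buys (real token binding uses an asymmetric device key: the token names the public key, and the client proves possession by signing a challenge with the private key; here the device key is modeled as a secret registered with the service, and all names are hypothetical):

```python
import base64
import hashlib
import hmac
import json
import os

IDP_KEY = b"idp-signing-secret"  # hypothetical issuer key
registered_devices = {}          # key thumbprint -> device key

def register_device() -> bytes:
    key = os.urandom(32)
    registered_devices[hashlib.sha256(key).hexdigest()] = key
    return key

def issue_bound_token(sub: str, device_key: bytes) -> str:
    # The token names the device key it is bound to (compare the "cnf"
    # claim in proof-of-possession JWTs), under the issuer's signature.
    claims = {"sub": sub, "cnf": hashlib.sha256(device_key).hexdigest()}
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    return payload + "." + hmac.new(IDP_KEY, payload.encode(), hashlib.sha256).hexdigest()

def accept(token: str, nonce: bytes, proof: str) -> bool:
    payload, sig = token.rsplit(".", 1)
    if not hmac.compare_digest(sig, hmac.new(IDP_KEY, payload.encode(), hashlib.sha256).hexdigest()):
        return False
    claims = json.loads(base64.urlsafe_b64decode(payload))
    key = registered_devices.get(claims["cnf"])
    # Reject unless the presenter proves possession of the bound key.
    return key is not None and hmac.compare_digest(
        proof, hmac.new(key, nonce, hashlib.sha256).hexdigest())

victim_key = register_device()
token = issue_bound_token("alice", victim_key)
nonce = os.urandom(16)  # fresh server-chosen challenge

good_proof = hmac.new(victim_key, nonce, hashlib.sha256).hexdigest()
attacker_key = os.urandom(32)  # the attacker copied the token, not the key
bad_proof = hmac.new(attacker_key, nonce, hashlib.sha256).hexdigest()

print(accept(token, nonce, good_proof), accept(token, nonce, bad_proof))  # True False
```

The stolen token is now useless on its own: replaying it fails because the attacker cannot produce a proof with the bound device key, which never leaves the victim's device.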

 

Figure 3: Token protection in Microsoft Entra

 

 

This is a large project, spanning operating system platforms, native and web applications, all our cloud services, and the full range of different tokens in use for each case. It will be released in stages for specific scenarios. The first stage, in public preview now, protects the sign-in session tokens that native applications on Windows devices use when accessing Exchange, SharePoint and Teams services.

 

Token protection policy is available for Windows clients today. We’ll support Azure management scenarios and web applications that access Microsoft 365 resources and extend our cross-platform capabilities to Mac, iOS, Android, and other clients over the next year.

 

Practical steps for countering token theft

 

Token protection will offer the strongest protection against token theft; however, it will take the industry time to update all applications to use bound tokens. The good news is that Microsoft offers compelling countermeasures against attacks involving token theft that you can use today to reduce their risk and impact. We recommend a systematic defense-in-depth approach:

 

1. Reduce the risk of successful token theft.
2. Prevent malicious use of stolen tokens.
3. Be prepared to detect and investigate attacks that use stolen tokens.

 

Reduce the risk of successful token theft

 

The first line of defense is to reduce the chances of attackers stealing tokens in the first place, and below are some well-established techniques for building it. It’s the equivalent of keeping your ride tickets and park passes safe from pickpockets while you’re in the theme park.

 

Require managed and compliant devices. Use device management and define Conditional Access policies to require that users access resources from a compliant device. Compliance policies we recommend to reduce the risk of successful token theft from devices include:

 

To help prevent accidental infection with token-stealing malware, require users running on Windows to run as standard users rather than with device admin rights, and require that all devices run up-to-date anti-malware and antivirus tools.
Use storage encryption to protect device content, including tokens, in case someone steals the device itself.
Enable Local Security Authority (LSA) protection to help protect Entra ID tokens in LSA memory. LSA protection is on by default for new devices and can be enabled for other devices via Intune.
Use jailbreak or rooting detection for mobile devices. Jailbroken devices are more likely to expose tokens and cryptographic secrets to potential attacks.

 


 

Turn on Credential Guard for your Windows users. If your users are running Windows 10 or later, you can prevent theft of Active Directory credentials by configuring Credential Guard, which uses virtualization-based security (VBS) to isolate local and cached credentials so that only privileged system software—and not malware—can access them. Starting in Windows 11, version 22H2, Credential Guard is on by default for devices that meet requirements. This also helps protect cloud applications and resources when hybrid-joined devices using Active Directory authentication initiate a session to access cloud applications.

 

Find step-by-step instructions for enabling credential guard in our documentation.

 

Prevent malicious use of stolen tokens

 

While device management and strong credentials certainly reduce the risk of token theft, not everyone has them, and they’re still not completely foolproof. The next layer of defense is to prevent attackers from using stolen tokens for ongoing access by configuring policies to reject them wherever possible, and by detecting attempted use and responding automatically.

 

Require token protection in Conditional Access, and where possible, choose apps and services that use token protection. Microsoft is updating our apps, identity provider, and operating systems to support token protection, so if you’re using our apps and platforms, be sure to use the latest versions. Then configure Conditional Access to require token protection for sign-in sessions, so that services accept only applications and devices using bound sign-in session tokens, which can’t be used if they’ve been stolen and moved to another device.

 

Find step-by-step instructions for creating a Conditional Access policy that requires token binding in our documentation.

 

Create a risk policy to disrupt token theft in your environment automatically. When a user initiates a session or attempts to access an application, ID Protection will examine user and session risk factors to see if any have changed. Configure Conditional Access policies to protect both medium and high-risk sessions by either challenging users with MFA or by requiring reauthentication. This will make it difficult or impossible for an attacker to initiate a session using a stolen session token.

 

Wherever available, Continuous Access Evaluation (CAE) can automatically invalidate tokens when ID Protection raises the risk for a user or a service principal. This triggers the risk-based Conditional Access policies to mitigate in real-time, requiring re-authentication.

 

Find step-by-step instructions for creating risk-based Conditional Access policies in our documentation.

 

Reduce the risk of token reuse by restricting sessions for use within network boundaries. Most attackers use stolen tokens from untrusted IP addresses. You can establish network boundaries with policies that prevent users from accessing your resources if they’re coming from unknown locations or from known bad locations.

 

Restrict networks with Entra Conditional Access: Conditional Access includes controls that will block requests from outside a network compliance boundary that you define. This will prevent an attacker from refreshing a stolen Entra token, restricting its use to the lifetime of the token.
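The effect of such a boundary on a stolen token is essentially an IP-range membership check at token use time (the ranges below are hypothetical; in practice the boundary is configured as a named location in the Entra portal, not in code):

```python
import ipaddress

# Hypothetical trusted egress ranges for the organization.
TRUSTED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/25"),
]

def inside_boundary(client_ip: str) -> bool:
    """Would a token presented from this address be allowed a refresh?"""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in net for net in TRUSTED_NETWORKS)

print(inside_boundary("203.0.113.40"), inside_boundary("192.0.2.7"))  # True False
```

An attacker replaying a token from an address outside the boundary cannot refresh it, so the damage is capped at the remaining lifetime of the stolen token.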

 

Find step-by-step instructions for defining a network compliance boundary with Conditional Access in our documentation.

 

Enhance network controls with Microsoft’s Security Service Edge (SSE) solution: To prevent the attacker from using a token outside of a trusted network at all, Entra Internet Access and Entra Private access use agents installed on endpoints and a compliant network check (enforced in real-time via CAE) to verify whether a user is connecting from a trusted network. Find step-by-step instructions for enabling compliant network check with Conditional Access in our documentation.

 

CAE-capable applications and services such as Teams, Exchange Online, and SharePoint Online will continuously enforce the IP-based named location Conditional Access policies and compliant network policies to ensure that tokens can be used only from trusted networks to access services. CAE offers a strict location enforcement mode to maximize protection. Find the step-by-step instructions for enabling this in our documentation.

 

Revoke tokens using Continuous Access Evaluation

 

In addition to ensuring that the supported services can only be accessed from trusted locations, CAE can revoke tokens when admins (or users themselves) take action in response to detecting an account compromise or token theft. These include disabling accounts, changing passwords, and revoking refresh tokens. Learn more about Continuous Access Evaluation in our documentation.

 

Be prepared to detect and investigate attacks that use stolen tokens

 

Use Entra ID Protection and Microsoft Defender to monitor for token theft. When a threat actor replays a token, their sign-in event can trigger detections such as ‘anomalous token’ and ‘unfamiliar sign-in properties’ from both Entra ID Protection and Microsoft Defender for Cloud Apps. Premium detections recognize abnormal characteristics such as an unusual token lifetime, a token replayed from an unfamiliar location, or token attributes that are unusual or match known attacker patterns. Signals from Microsoft Defender for Endpoint (MDE) can indicate a possible attempt to access the Primary Refresh Token.
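A toy version of the kind of heuristic behind such detections (illustrative only; the real detections combine many more signals and machine-learned models): flag a token presented from a location never seen for that session, or one older than the issuer's configured lifetime.

```python
def assess(presentation, session_history, max_lifetime):
    """Return the list of risk reasons raised by this token presentation."""
    reasons = []
    seen_countries = {h["country"] for h in session_history}
    if presentation["country"] not in seen_countries:
        reasons.append("unfamiliar location")
    if presentation["time"] - presentation["token_iat"] > max_lifetime:
        reasons.append("anomalous token lifetime")
    return reasons

history = [{"country": "DE"}, {"country": "DE"}]
replay = {"country": "RU", "time": 90_000, "token_iat": 0}
print(assess(replay, history, max_lifetime=86_400))
# ['unfamiliar location', 'anomalous token lifetime']
```

Either reason alone is weak evidence; in practice such signals feed a risk score that Conditional Access policies can act on, as described above.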

 

Find step-by-step instructions for investigating token theft in our documentation.

 

Pull all your data into one Security Information and Event Management (SIEM), such as Microsoft Sentinel, to investigate potential token theft. If you receive an alert for an event that may indicate token theft, you can investigate it in the Microsoft Sentinel portal or in another SIEM. Microsoft Sentinel gives you important details about a specific incident, such as its severity, when it occurred, how many entities were involved, which events triggered it, and whether it reflects any MITRE ATT&CK tactics or techniques. You can then view the investigation map to understand the scope and root cause of the potential security threat.

 

Find step-by-step instructions for investigating incidents using Sentinel in our documentation.

 

Reduce the risk of successful token theft:
Require managed and compliant devices.
Turn on Credential Guard for your Windows users.

Prevent malicious use of stolen tokens:
Require token protection in Conditional Access and, where possible, choose apps and services that use token protection.
Create a risk policy to disrupt token theft in your environment automatically.
Reduce the risk of token reuse by restricting sessions for use within network boundaries.
Revoke tokens using Continuous Access Evaluation.

Be prepared to detect and investigate attacks that use stolen tokens:
Use Entra ID Protection and Microsoft Defender to monitor for token theft.
Pull all your data into one SIEM, such as Microsoft Sentinel, to investigate potential token theft.

 

As defenders building defenses to help everyone strengthen cybersecurity, Microsoft is in a big strategic fight against token theft. We’ll keep you updated on any advancements you can use to counter attacks that use token theft. In the meantime, to help defend your environment, configure your Conditional Access policies to take advantage of token protection wherever you can and employ the countermeasures we’ve described here.

 

Stay safe out there,

Alex Weinert

 

 

Learn more about Microsoft Entra

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds.

Microsoft Entra News and Insights | Microsoft Security Blog
Microsoft Entra blog | Tech Community
Microsoft Entra documentation | Microsoft Learn
Microsoft Entra discussions | Microsoft Community

 


liminal (was OWI)

Link Index for Privacy and Consent Management

The post Link Index for Privacy and Consent Management appeared first on Liminal.co.

Safle Wallet

Safle Tokenomics: Empowering a Decentralized Future


Safle is an innovative platform with a well-structured tokenomics model at its core, designed to drive its decentralized ecosystem. The Safle token (SAFLE) plays a crucial role in ensuring security, utility, and governance within the platform. This article provides an in-depth look at the tokenomics of Safle, highlighting its various facets and future potential.

The $SAFLE token is a utility token that helps token holders to propose changes and steer the Safle ecosystem by creating proposals and voting on them.

Token Details:

Token Name: Safle
Token Symbol: SAFLE
Token Type: Utility (In-wallet)
Total Supply: 1,000,000,000
Total Circulation: 178,000,000
Base Chain: Polygon

Bridge: Celer (Polygon, ETH, BSC)

Here is the utility of the $SAFLE token.

- Governance: SAFLE token holders are granted the power to participate in the decision-making processes of the platform. This decentralized governance model allows token holders to vote on key proposals, influencing the future development and direction of the Safle ecosystem. This ensures that the community has a say in important decisions, fostering a democratic and transparent environment.
- Staking: Tokens can be staked to earn a yield and gain access to premium services in the Safle Wallet. This enhances the wallet’s usability, provides liquidity for payments, and incentivizes users to hold the SAFLE token for a longer period of time.
- Gas Fees: The token can be used to pay for services and transaction fees within the Safle ecosystem. It will also serve as an intermediary for gas fees on multiple chains, providing users with seamless transactions across chains without holding the native tokens, thus easing the user experience.
- Payments: SAFLE tokens will be used as the native currency of the ecosystem for utilizing features and for rewarding developers and the community for their efforts in enhancing the Safle ecosystem.
- Incentives: The $SAFLE token is used as an incentive for users and developers to engage with the platform using network effect economics.

Re-vesting

The Safle team has re-vested their unlocked tokens for 2 years as a proof of commitment to the community. Moreover, the unlocked supply for the different pools has also been re-vested for 2 years. Only 17.8% of tokens are in circulation as of 5th July 2024.

Distribution Model

Safle has implemented a strategic token distribution model to ensure fair allocation and support the platform’s growth. The total supply of SAFLE tokens is fixed at 1,000,000,000, with the allocation structured as follows:

| Particulars | Allocation | Tokens allocated |
|------------------------------------|----------------|----------------------|
| Community | 12.900% | 129,000,000 |
| Staking Rewards | 19.464% | 194,640,000 |
| Ecosystem | 12.500% | 125,000,000 |
| Marketing | 9.400% | 94,000,000 |
| Reserves unlocked | 5.000% | 50,000,000 |
| Reserves vested | 8.840% | 88,400,000 |
| Team | 2.500% | 25,000,000 |
| Founders | 11.515% | 115,150,000 |
| Previous Releases                  | 2.881%         | 28,810,000           |
| Early backers & Prior Allocations | 15.000% | 150,000,000 |
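As a quick sanity check, each pool’s token count in the table follows from its percentage of the fixed 1,000,000,000 supply. The following Python sketch is purely illustrative (not official Safle code):

```python
# Hypothetical sanity check of the Safle allocation table (not official code).
TOTAL_SUPPLY = 1_000_000_000

# Pool name -> percentage of total supply, as listed in the table above.
ALLOCATIONS = {
    "Community": 12.900,
    "Staking Rewards": 19.464,
    "Ecosystem": 12.500,
    "Marketing": 9.400,
    "Reserves unlocked": 5.000,
    "Reserves vested": 8.840,
    "Team": 2.500,
    "Founders": 11.515,
    "Previous Releases": 2.881,
    "Early backers & Prior Allocations": 15.000,
}

def tokens_for(pct: float) -> int:
    """Convert a percentage of total supply into a token count."""
    return round(TOTAL_SUPPLY * pct / 100)

pool_tokens = {name: tokens_for(pct) for name, pct in ALLOCATIONS.items()}

# The percentages sum to exactly 100%, so the pools cover the whole supply.
assert sum(pool_tokens.values()) == TOTAL_SUPPLY
```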

Users who stake their SAFLE tokens can earn rewards, promoting long-term holding and contributing to the stability and security of the network. This incentivizes users to remain engaged with the platform and benefit from their contributions.

Safle is committed to expanding the utility of its token and integrating more services into its ecosystem. The platform aims to continuously enhance the value proposition of the SAFLE token by introducing new features and functionalities. As the ecosystem grows, so will the opportunities for SAFLE token holders, ensuring a dynamic and evolving economic model.

About Safle

Safle is a decentralized platform focused on creating secure digital identities. By leveraging blockchain technology, Safle provides users with a self-sovereign identity system, ensuring privacy and control over personal data. The platform also offers a suite of tools for managing digital assets, making decentralized finance accessible and user-friendly. With a commitment to transparency and innovation, Safle is poised to revolutionize the way individuals interact with digital services.

To learn more about Safle, visit:

Website | X | Discord | Telegram | Instagram | Medium | Github


Microsoft Entra (Azure AD) Blog

Microsoft Entra ID Governance licensing clarifications


In the past few weeks, we’ve announced the general availability of Microsoft Entra External ID and Microsoft Entra ID multi-tenant collaboration. We’ve received requests for more detail from some of you regarding licensing, so I’d like to provide additional clarity for both of these scenarios.

 

One person, one license

 

Included in the first announcement of more multi-tenant organization (MTO) features to enhance collaboration between users, we stated that only one Microsoft Entra ID P1 license is required per employee per multi-tenant organization. Expanding on that, the term “multi-tenant organization” has two meanings: an organization that owns and operates more than one tenant; and a set of features that enhance the collaboration experience for users between these tenants. However, your organization doesn’t have to deploy those capabilities to take advantage of the one person, one license philosophy. An organization that owns and operates multiple tenants only needs one Entra ID license per employee across those tenants. The same philosophy applies to Entra ID Governance: the organization only needs one license per person to govern the identities of these users across these tenants.

 

Note that this philosophy includes administrative accounts. In some organizations, administrators use standard user accounts for day-to-day tasks, and separate administrator accounts for privileged access. A person with a standard user account and an administrator account only needs one Entra ID Governance license for both identities to be governed. Of course, they could also leverage Entra ID Governance’s Privileged Identity Management (PIM) to temporarily elevate the access rights of a single account, instead of maintaining two accounts.

 

To illustrate this scenario, let’s consider an organization called Contoso, which owns ZT Tires and Tailspin Toys. Mallory is hired by Contoso, which uses Lifecycle Workflows in Entra ID Governance to onboard her user account and grant her access to the resources she needs for her job. Her account receives an access package with an entitlement to ZT Tires’ ERP app, and she requests access to Tailspin Toys inventory management app. Because Mallory has an Entra ID Governance license in the Contoso tenant, her identity can be governed in the ZT Tires and Tailspin Toys tenants with no additional governance licenses – one person, one license.

 

Diego is an identity administrator whose user account is in the ZT Tires tenant. He uses a separate administrator account for privileged access tasks in Contoso, Tailspin Toys, and ZT Tires tenants. Because Diego has an Entra ID Governance license in the ZT Tires tenant, both his user and administrator identities can be governed in all three tenants with no additional governance licenses – again, one person, one license.
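The examples above boil down to a counting rule: Entra ID Governance licenses are counted per distinct person, not per account or per tenant. The sketch below is a hypothetical illustration of that rule; the account tuples and function name are invented:

```python
# Hypothetical sketch of the "one person, one license" rule.
# Each account is (person, tenant, account_type); Mallory and Diego
# each appear in multiple tenants, and Diego holds two account types.
accounts = [
    ("mallory", "contoso", "user"),
    ("mallory", "zt-tires", "user"),
    ("mallory", "tailspin-toys", "user"),
    ("diego", "zt-tires", "user"),
    ("diego", "contoso", "admin"),
    ("diego", "tailspin-toys", "admin"),
    ("diego", "zt-tires", "admin"),
]

def licenses_required(accounts):
    """One Entra ID Governance license per distinct person,
    regardless of how many tenants or accounts they hold."""
    return len({person for person, _tenant, _kind in accounts})

assert licenses_required(accounts) == 2  # Mallory + Diego
```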

 

Entra ID Governance in Microsoft Entra External ID

 

The other announcement covered Entra External ID, Microsoft’s solution to secure customer and business collaborator access to applications. In November, I blogged about the licensing model to govern the identities of business guests in the B2B scenario for Entra External ID and shared that pricing would be $0.75 per actively governed identity per month. Because metered, usage-based pricing to govern the identities of business guests is a different model than the existing, license-based pricing model to govern the identities of employees, I’d like to share more detail.

 

A business guest identity in Entra External ID will accrue a single $0.75 charge in any month in which that identity is actively governed, no matter how many governance actions are taken on that identity. For example: 

 

A Contoso employee named Gerhart collaborates with Pradeep of Woodgrove Bank to produce Contoso’s quarterly financial statements. Contoso has deployed Entra External ID for its business partners such as Woodgrove Bank. In April, Pradeep accesses Contoso’s Microsoft Teams where Gerhart stores his quarterly reporting documents, but his Entra External ID identity has no identity governance actions taken on it, so it accrues no charges.

 

In May, Pradeep receives an access package with an entitlement to Contoso’s accounting system, and Gerhart reviews Pradeep’s existing access to Contoso’s inventory management database, as well as to the Teams with the quarterly reporting documents. Because Pradeep’s identity in Entra External ID had identity governance actions taken on it, Contoso will accrue a $0.75 charge. Note that the charge is applied once, even though there were three identity governance actions taken during the month. Once that Entra External ID identity was governed in May, additional identity governance actions do not generate additional charges for that identity in May.
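The metered model above reduces to a simple aggregation: an identity accrues at most one $0.75 charge in any month in which at least one governance action is taken on it. The following Python sketch is a hypothetical illustration of that billing rule (the function and event shapes are invented), not Microsoft’s implementation:

```python
from collections import defaultdict

CHARGE_PER_GOVERNED_IDENTITY = 0.75  # USD per actively governed identity per month

def monthly_charges(events):
    """events: iterable of (identity, month) pairs, one per governance
    action (e.g. access package assignment, access review).
    Returns {month: total_charge}. Repeat actions on the same identity
    in the same month do not accrue additional charges."""
    governed = defaultdict(set)  # month -> set of identities governed that month
    for identity, month in events:
        governed[month].add(identity)
    return {m: len(ids) * CHARGE_PER_GOVERNED_IDENTITY for m, ids in governed.items()}

# Pradeep: no actions in April, three actions in May -> one $0.75 charge in May.
events = [("pradeep", "2024-05"), ("pradeep", "2024-05"), ("pradeep", "2024-05")]
assert monthly_charges(events) == {"2024-05": 0.75}
```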

 

To learn more about Microsoft Entra ID Governance licensing, visit the Licensing Fundamentals page.

 

 

Read more on this topic 

Entra ID multi-tenant collaboration | Microsoft Entra External ID general availability

 

Learn more about Microsoft Entra  

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds. 

Microsoft Entra News and Insights | Microsoft Security Blog
Microsoft Entra blog | Tech Community
Microsoft Entra documentation | Microsoft Learn
Microsoft Entra discussions | Microsoft Community

 


YeshID

🚀 YeshID Monthly Release Notes: July 2024

📣 What’s New This Month

We’re excited to share our latest updates to YeshID! This month, we’ve focused on improving workflows, enhancing performance, and introducing features to make your experience smoother and more efficient.

🌟 New Features

- Slack Integration for Access Management 💬: Request and approve app access directly through Slack, streamlining the access management process.
- Custom User Fields 🏷️: Add organization-specific information to user profiles, such as employee ID or department code.
- Enhanced Email Customization 📧: Add your company logo to YeshID emails and send tailored messages for different workflows.
- Improved Google Workspace Integration 🔄: View and edit email aliases directly within YeshID for better Google Workspace management.
- Task Management Improvements ✅: Use new features like expandable subtasks and in-task discussions to improve team coordination.

💪 Enhancements

- Performance Optimization 🏎️: We’ve reduced the loading time of the Application view by 86%, significantly improving responsiveness.
- User Management Refinements 👥: Streamlined the user creation process with improved email validation, enhanced invite link functionality to work seamlessly for already active users, and added a check to prevent creation of users with email addresses that are aliases of existing users.
- Workflow Enhancements 🔧: Improved task completion tracking with clearer indications of who completed or rejected a task, enhanced notifications for task rejections in onboarding and offboarding processes, and added a text input for providing reasons when responding to access requests.
- UI Improvements 🎨: Added spacing between task status and assignments for better readability, improved navigation with direct links from ledger items to related tasklists, and introduced filters in the Google user directory for easier user lookup.

🛠️ Fixes and Polish

We’ve addressed several issues to ensure smoother operation:

- Resolved an issue where the completed state display was not showing correctly
- Fixed group reset functionality when cancelling certain operations
- Improved error handling in various workflows, particularly in task validation processes

————————

We hope these updates make your YeshID experience even better. As always, we value your feedback and look forward to hearing how these changes improve your day-to-day operations. Thank you for being a part of the YeshID community!

The post 🚀 YeshID Monthly Release Notes: July 2024 appeared first on YeshID.


Shyft Network

Carret Integrates Shyft Veriscope to Comply with FATF Travel Rule


Shyft Network is excited to announce that Carret, one of India’s leading crypto trading platforms, has integrated Veriscope and User Signing to comply with the FATF Travel Rule. Through this integration, Carret aims to enhance its regulatory adherence by enabling efficient and secure information sharing required under the Travel Rule for crypto transfers.

Shyft Veriscope facilitates the discovery and verification of counterparty Virtual Asset Service Providers (VASPs) in a non-intrusive manner, allowing direct, compliant data exchange without the need for intermediaries. User Signing, on the other hand, will enable Carret to directly request cryptographic proof from users’ non-custodial wallets, enhancing withdrawal validation and supporting secure asset custody.

The partnership with Shyft Network not only strengthens Carret’s compliance capabilities but also positions it as a trustworthy player in the global crypto market, committed to ensuring the safety and privacy of its users.

The FATF Travel Rule requires crypto service providers to identify and share specific information about parties involved in transactions. By integrating Shyft Veriscope and User Signing, Carret enhances its ability to assess risks and enables its users to conduct transactions in a safer crypto environment.

“We are thrilled to partner with Shyft Network and integrate Veriscope’s cutting-edge technology,” said Neha Kumari, Carret’s Co-founder. “This collaboration is key to our strategy of providing secure and compliant crypto trading services, and it further cements our position as a leader in the cryptocurrency market in India.”

Commenting on the partnership, Zach Justein, Veriscope co-founder, said:

“Carret’s decision to integrate Veriscope highlights the strong benefits our technology brings to VASPs, with its advanced compliance infrastructure ensuring seamless adherence to the FATF Travel Rule. This is a significant step forward for the entire cryptocurrency industry, as Carret and Veriscope lead the way in setting higher standards for safety, transparency, and user experience.”

About Carret
Carret, one of the leading names in India’s growing crypto market, is transforming the way investors approach the nascent digital assets space. As a leading platform, Carret simplifies the complexities of crypto investments while ensuring high security and compliance standards. Offering features like direct trading and high-yield earnings on crypto assets, Carret caters to a diverse user base by supporting multiple assets. Committed to user satisfaction, Carret continues to innovate, driving forward the future of cryptocurrency investments with a focus on transparency and ease of use.

About Veriscope
‍‍Veriscope, the compliance infrastructure on Shyft Network, empowers Virtual Asset Service Providers (VASPs) with the only frictionless solution for complying with the FATF Travel Rule. Enhanced by User Signing, it enables VASPs to directly request cryptographic proof from users’ non-custodial wallets, streamlining the compliance process. Veriscope brings trust, security, privacy, and reliability to the crypto compliance process and is a go-to solution for the leading virtual asset service providers worldwide.

For more information, please reach out to comms@shyft.network.

Carret Integrates Shyft Veriscope to Comply with FATF Travel Rule was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Veriscope Regulatory Recap: June 17th to July 7th


Welcome to the latest issue of the Veriscope Regulatory Recap. In this edition, we explore recent regulatory developments in the European Union and Turkey, examining their potential impacts on the crypto industry.

MiCA’s Stablecoin Regulations Are Now in Effect in the EU

The implementation of the Markets in Crypto-assets Regulations (MiCA) will occur gradually in the European Union. For example, the stablecoin-related regulations have been in effect since June 30. Additional regulations impacting crypto asset service providers (CASPs) will follow in December this year.

(Image Source)

One immediate effect of MiCA is the psychological shift it brings, dispelling doubts about the legitimacy of crypto businesses within the EU. After all, the regulations eliminate uncertainties regarding the future of crypto in Europe.

Focus on Stablecoins

On the downside, MiCA also introduces market disruptions. Take stablecoins, for example: they must now comply with stringent new requirements, which may lead to the delisting of non-compliant stablecoins from EU-based exchanges. Already, Uphold has delisted six stablecoins, including Tether USDT, the most dominant stablecoin globally.

Such moves might push issuers to exit the EU market altogether or transition towards euro-backed stablecoins.

Crypto Regulations Just One Step Away from Implementation in Turkey

Turkey passed a major crypto bill in the Turkish Grand National Assembly on June 27, 2024. This new law increases oversight of digital assets with strict penalties for non-compliance.

Key Provisions and Oversight

The bill gives control of crypto oversight to Turkey’s financial watchdog, the Capital Markets Board, which will also license digital asset exchanges. Crypto service providers must ensure that customer fund transfers are accessible and traceable by law enforcement.

(Image Source)

Platforms that break these rules will face fines from $7,500 to $182,600 and prison terms of up to five years. The Capital Markets Board can also approve audit firms for digital asset companies and temporarily shut down non-compliant platforms.

Next Steps

President Recep Tayyip Erdoğan is expected to approve the bill soon. Once approved, it will be published in the Official Gazette or returned to parliament for further review.

Implications for the Crypto Industry

The EU’s MiCA regulations aim to bring clarity and stability, which could attract more investors. Turkey’s strict rules and penalties, on the other hand, are intended to build trust and transparency in the sector.

As both regions update their rules, the global crypto market is watching closely. It’s important for everyone in the crypto industry to stay informed and adapt to these changes.

Interesting Reads

Carret Integrates Shyft Veriscope to Comply with FATF Travel Rule

FATF Travel Rule Compliance Guide for Gibraltar

A Guide to FATF Travel Rule Compliance in Liechtenstein

FATF Crypto Travel Rule Adoption: 6-Month Status Update

A Guide to FATF Travel Rule Compliance in the United States

‍About Veriscope

‍Veriscope, the compliance infrastructure on Shyft Network, empowers Virtual Asset Service Providers (VASPs) with the only frictionless solution for complying with the FATF Travel Rule. Enhanced by User Signing, it enables VASPs to directly request cryptographic proof from users’ non-custodial wallets, streamlining the compliance process.

For more information, visit our website and contact our team for a discussion. To keep up-to-date on all things crypto regulations, sign up for our newsletter and follow us on X (Formerly Twitter), LinkedIn, Telegram, and Medium.

Veriscope Regulatory Recap: June 17th to July 7th was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ocean Protocol

Season 3 of the Ocean Zealy Community Campaign!


We’re happy to announce Season 3 of the Ocean Zealy Community Campaign, an initiative that has brought together our vibrant community and rewarded the most active and engaged members.

💰 Reward Pool

3,000 FET tokens will be rewarded to the top 50 users on our leaderboard.

📜Program Structure

Season 3 of the Ocean Zealy Community Campaign will feature more engaging tasks and activities, providing participants with opportunities to earn points. From onboarding tasks to Twitter engagement and content creation, there’s something for everyone to get involved in and earn points and rewards along the way.

⏰ Campaign ends: 31st of July, 12:00 PM UTC

🤔How Can You Participate?

Follow this link to join and earn:

https://zealy.io/cw/oceanprotocol/questboard

Season 3 of the Ocean Zealy Community Campaign! was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Verida

Building a Personal Data Bridge to enable Personal AI

Building a Personal Data Bridge to enable Personal AI

We are excited to announce that Verida is building a “Personal Data Bridge” to enable user data to be unlocked from centralized platforms and made available for a new generation of personal AI applications and web3 applications that require user data.

Our personal data today is locked in centralized databases controlled by big tech, banks, healthcare providers, and governments. Verida is building the necessary infrastructure and tools to help individuals extract their data from these platforms and then use it for exciting new use cases such as personal AI.

Unlocking User Data

Verida is building the critical infrastructure to enable users to reclaim their personal data from any centralized platform, securely store it on decentralized database infrastructure and provide the tools to consent to the use of their data.

This unlocks user data to be integrated into the next generation of hyper-personal applications such as AI agents and disruptive user-centric web3 applications.

Flow of data from centralized web2 platforms, to user control, through to personal AI applications

The Personal Data Bridge is the connective tissue between centralized web2 platforms and end users, enabling them to easily take ownership and control of their personal data.

The Personal Data Bridge consists of a Data Connector API with corresponding framework for integrating with centralized APIs, along with a user interface in the Verida Vault for end users to easily connect, disconnect and configure the data they pull from centralized platforms.

Verida’s personal data bridge opens up numerous possibilities:

- DeFi: Private and reusable on-chain credentials for KYC and credit scores, enabling under-collateralized lending and insurance products.
- Healthcare: AI-powered personalized care with secure messaging and follow-up processes.
- Metaverse: Portable personal profiles and player data across games and guilds.
- AI: Personalized AI products that have access to a digital twin for highly personalized interactions.

These applications highlight the potential of combining decentralized identity, user-owned data, and crypto assets to create a more user-centric internet.

Verida’s approach to off-chain data storage is designed for privacy, speed, and security. Data is stored in encrypted databases, inaccessible to unauthorized users, and transactions are recorded quickly without waiting for network confirmations. This method reduces the risk of data breaches and provides a seamless user experience.

Data Connections in the Verida Vault

The Verida Vault is currently in development and will be made available in Q3 2024. It provides an easy web interface for users to manage their personal data, connect to third party applications, and pull their data from centralized platforms.

Mockup of the user interface for the Verida Personal Data Bridge in the Verida Vault web application

Users can pull their personal data from platforms such as Meta, Google, X, Email, LinkedIn, Strava, and much more. This data is encrypted and stored in a user-controlled personal data Vault on the Verida network.

Users can accept data access requests from third party applications, allowing them to connect their personal data with emerging personal AI applications.

Data Connector API

The Verida Data Connector API is currently in development and will be made available in Q3 2024. It is a secure web service that enables a user to authenticate with a centralized platform and take ownership of the personal data stored with that service, which is then stored, encrypted, in the user’s personal data vault.

This API will operate within Verida’s recently announced self-sovereign confidential compute infrastructure, ensuring user data is not visible to any infrastructure operator and is protected at all times in the data lifecycle.

The data migration status of each connector is tracked, so data can regularly synchronize and be kept up-to-date as new information becomes available. As such, this creates a near real time stream of data flowing into a user’s personal data vault.
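The incremental synchronization described above might be tracked roughly as in the following sketch. This is a hypothetical illustration (the `sync_state` structure and `fetch_since` callback are invented), not Verida’s actual connector framework:

```python
# Hypothetical sketch of per-connector sync tracking (not Verida's API).
sync_state = {}  # connector name -> timestamp of the newest item synced

def sync(connector, fetch_since):
    """Pull only items newer than the last recorded sync point, then
    advance the connector's sync state so the next run is incremental."""
    since = sync_state.get(connector, 0)
    items = fetch_since(since)
    if items:
        sync_state[connector] = max(item["timestamp"] for item in items)
    return items

# Fake platform returning timestamped items newer than a cutoff.
data = [{"timestamp": t, "body": f"item-{t}"} for t in (1, 2, 3)]
fetch = lambda since: [d for d in data if d["timestamp"] > since]

first = sync("x", fetch)   # initial run pulls all three items
second = sync("x", fetch)  # nothing new since timestamp 3 -> empty
assert len(first) == 3 and second == []
```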

User data is mapped into common data schemas to create an ecosystem of interoperable data. For example, social media posts from X, Facebook and LinkedIn are merged into a common “social post” schema for easy integration.
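The mapping into a common schema might look like the sketch below. The platform payload fields and the common “social post” field names are invented for illustration; Verida’s actual schemas may differ:

```python
def to_common_post(platform, raw):
    """Map a platform-specific post payload into a hypothetical
    common 'social post' schema (field names invented for illustration)."""
    if platform == "x":
        return {"source": "x", "text": raw["text"], "posted_at": raw["created_at"]}
    if platform == "facebook":
        return {"source": "facebook", "text": raw["message"], "posted_at": raw["created_time"]}
    if platform == "linkedin":
        return {"source": "linkedin", "text": raw["commentary"], "posted_at": raw["createdAt"]}
    raise ValueError(f"no connector for platform: {platform}")

# Posts from different platforms land in one interoperable shape.
posts = [
    to_common_post("x", {"text": "gm", "created_at": "2024-07-01T09:00:00Z"}),
    to_common_post("facebook", {"message": "hello", "created_time": "2024-07-02T10:00:00Z"}),
]
assert all(set(p) == {"source", "text", "posted_at"} for p in posts)
```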

All user data is signed, so it can be independently verified to have originated from a particular third party platform.

The Data Connector API is built on an open framework, enabling any developer to contribute new connectors for centralized platforms. In this way, we can rapidly scale the number of data sources available for end users and maximize the volume of personal data users can take ownership of.

Unlocking Innovation and Mass Adoption for Web3

Centralized platforms have a monopoly over how your data is used, with any innovation or products using your data limited to what those platforms build.

The future of web3 depends on the growth of user adoption and application development. Verida’s data bridge is a significant step towards achieving this vision, enabling a more connected, personalized, and user-controlled internet. As more off-chain data sources are integrated, the potential for new applications and services will continue to expand, driving the vibrant Web3 ecosystem we all anticipate.

There are significant global regulatory changes underway that are making it increasingly easy for end users to gain access to their data stored with centralized platforms. Through GDPR, the EU has introduced the right to data portability, which is forcing the big tech companies to release data portability APIs. The time is right to connect users to their data and unlock it for exciting new use cases.

Verida is building this crucial bridge, enabling users to pull their personal data from centralized platforms like Facebook, Google, Twitter, and GitHub into web3 and personal AI applications. This data is stored in Verida’s encrypted, user-controlled personal data Vault.

The Road Ahead

Verida is leading the charge in creating a bridge between web2 and web3, allowing users to reclaim their personal data and use it to enhance their digital experiences. By ensuring GDPR compliance, data privacy, and user control, Verida is setting the standard for the future of decentralized identity and personal data management.

This vision of data sovereignty, personal AI and hyper-personalized web3 applications is not just a dream; it’s becoming a reality, paving the way for a more secure, user-centric internet.

About Verida

Verida is a pioneering decentralized data network and self-custody wallet that empowers users with control over their digital identity and data. With cutting-edge technology such as zero-knowledge proofs and verifiable credentials, Verida offers secure, self-sovereign storage solutions and innovative applications for various industries. Verida’s ecosystem of KYC partners and technologies are ideally suited to help Kima expand into new markets, streamlining processes and efficiency for compliant transactions. For more information, visit Verida.

Verida Missions | X/Twitter | Discord | Telegram | LinkedIn | LinkTree

Building a Personal Data Bridge to enable Personal AI was originally published in Verida on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Sep 26, 2024: How to Build a Modern Approach to Identity Governance in a SaaS-First World

In today's tech landscape, the shift towards distributed software environments and diverse access standards has transformed identity governance into a complex maze. Our upcoming webinar, "How to Build a Modern Approach to Identity Governance in a SaaS-First World," addresses the challenges and solutions for managing identities and access in cloud-based SaaS environments.

Sunday, 07. July 2024

KuppingerCole

A Pragmatic View of Software Supply Chain Security

In this episode, Matthias Reinwarth and Alexei Balaganski discuss the topic of software supply chain security. They explore the different perspectives and challenges surrounding this topic, including the lack of common definitions and the need for practical recommendations. They also discuss the importance of trust in software and the potential solutions, such as software bill of materials and collaboration within the industry. The episode concludes with a call for further discussion and exploration of this complex and evolving topic.




Spherical Cow Consulting

Introducing Digital Identity Standards Development Service for Executives


Offer a new service for digital identity standards development, and suddenly, everyone has questions! (This makes me so happy! I love questions!) So, let’s look at why I think offering a service to inform executives about what’s happening in the digital identity standards space is a good idea.

I’m making a few assumptions as to why this service is valuable:

Executives aren’t opposed to conforming to standards, but they need a more structured way to understand when and how to assign resources to make the standards work for their products.
The standards process needs more input to ensure that the standards in question serve the critical use cases worldwide.
Engineers and standards architects are brilliant, but they can either engage in or facilitate the process, and they (usually) can’t do both.

Basically, whether it’s due to resource constraints, a lack of specific expertise, or simply the rapid pace at which changes occur, many organizations are unsure of how to engage effectively without overcommitting valuable resources. I want to help with that.

Let’s Talk Inspiration

Throughout my nearly 15 years working closely with Internet standards development efforts, I’ve seen brilliant engineers and busy executives struggle to juggle (heh, rhymes!) their day-to-day responsibilities with the necessity of staying aware of and (when necessary) influencing Internet standards. As much as the people who live and breathe standards development would prefer it otherwise, not every organization is ready, willing, or able to engage in standards development fully. Yet, staying on the sidelines isn’t a viable option either if you want to ensure that emerging standards will work for, and not against, your business interests.

If I were the queen of the universe, resources would magically appear; enough people with the right skills would be engaged in developing standards. I am not, however, queen of the universe (give me time). So, is there any part of this problem that I can solve? What if I could summarize what’s going on for people who need information that will let them decide when they need to engage? At the same time, can I act as a neutral party to help facilitate the process so that it runs as smoothly as possible while still including outreach to encourage participation and consensus?

Let’s find out.

Why A Digital Identity Standards Development Service Helps

My area of interest is in the digital identity space. I am actively engaged in the APIs under development in the W3C’s Federated Identity Community and Working Groups. I’m also kicking off as co-chair of the IETF’s Secure Patterns for Internet Credentials (spice) working group. Those are the areas I can immediately help an executive learn more about. If your organization has services that use federated authentication via a web browser or if they are interested in potential work on credentials to be used by both human and non-human identities, then these topics should be of interest.

And let’s be clear: I’m not here to push any specific solutions or architectures. Instead, I focus on providing the information your team needs to make decisions that align with your business goals.

There are undoubtedly other digital identity-related standards out there that may also be of interest. Adding more to my plate would involve a conversation.

Making the Case

If you need language to make the case internally for the services I’m building, this might help.

My Digital Identity Standards Development Service will help you:

Stay Informed by keeping track of critical developments without needing to sift through every detail of the standards discussion.
Save Resources by concentrating your efforts and resources on areas that impact your business the most without spreading your team too thin.
Make Strategic Decisions by applying insights and expert analysis. This will enable you to decide when and how to engage effectively, ensuring you’re always a step ahead.

Wrap Up

If you’ve read this far and haven’t clicked on the direct link to the page describing this new service in detail, now’s a good time for that! If you and your organization feel overwhelmed by the prospect of keeping up with Internet standards, or if you’ve struggled to figure out how to engage without overcommitting resources, this service is designed for you. It’s about strategic engagement and ensuring that you do so with the maximum impact and minimum fuss when you choose to engage.

The post Introducing Digital Identity Standards Development Service for Executives appeared first on Spherical Cow Consulting.

Friday, 05. July 2024

ARTiFACTS

4 Critical Challenges Facing Pharma Supply Chains


The pharmaceutical industry plays a pivotal role in safeguarding public health by ensuring the timely and reliable distribution of life-saving medications worldwide. However, the complexities of the global pharmaceutical supply chain are increasing, and the industry is facing several critical challenges that demand immediate attention. This article delves into the top four critical challenges confronting pharmaceutical supply chains today.

Supply Chain Resilience

Pharmaceutical supply chains are grappling with an ever-increasing need for resilience. Events like the COVID-19 pandemic have exposed vulnerabilities in the supply chain, demonstrating the urgency of having robust contingency plans in place. Supply chain disruptions due to natural disasters, political instability, or health crises can lead to drug shortages, jeopardizing patient well-being.

Building resilience entails diversifying sources of raw materials, active pharmaceutical ingredients (APIs), and finished products. Companies must also invest in data analytics and demand forecasting tools to predict and prepare for potential disruptions. Collaborative efforts across the industry to share information and resources during crises can further bolster supply chain resilience.

Regulatory Compliance and Quality Control

The pharmaceutical industry operates under stringent regulatory frameworks designed to ensure product safety, efficacy, and quality. Compliance with these regulations is paramount, yet it poses a significant challenge for supply chains. The intricacies of regulatory requirements vary across regions, necessitating harmonization efforts to simplify compliance.

Quality control is another critical aspect. Ensuring that each batch of medication meets the highest standards demands rigorous testing and monitoring throughout the supply chain. Any deviations can result in product recalls, financial losses, and reputational damage. Innovations in technology, such as blockchain and real-time monitoring, are helping companies maintain better control over the quality and authenticity of their products.

Cold Chain Management

Many pharmaceutical products, particularly vaccines and biologics, require strict temperature control throughout the supply chain. The maintenance of a cold chain is essential to preserving the efficacy of these products. This challenge becomes even more pronounced when dealing with global distribution, as variations in climate and infrastructure can affect temperature-sensitive medications.

To address this issue, pharmaceutical companies must invest in advanced cold chain logistics, temperature-monitoring devices, and packaging technologies. Ensuring the integrity of the cold chain is not only a matter of compliance but also a critical factor in patient safety and the overall success of pharmaceutical supply chains.

Counterfeiting and Supply Chain Security

Counterfeit pharmaceuticals pose a grave threat to public health and the reputation of the pharmaceutical industry. Criminal networks continuously devise sophisticated methods to infiltrate the supply chain and distribute fake medications. This challenge is exacerbated by the global nature of pharmaceutical supply chains, which involve multiple intermediaries and complex distribution networks.

To combat counterfeiting and enhance supply chain security, the industry is turning to innovative technologies like serialization, track-and-trace systems, and blockchain. These solutions – such as ARTiFACTS Verify – create transparency and traceability, making it easier to identify and remove counterfeit products from the supply chain. Collaboration among stakeholders, including governments, law enforcement, and pharmaceutical companies, is crucial in addressing this challenge effectively.
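As a toy illustration of how track-and-trace systems achieve tamper evidence (this is a generic sketch, not the actual design of ARTiFACTS Verify or any serialization standard), each custody event can be hashed together with its predecessor, so altering or reordering any earlier record invalidates every later link:

```python
import hashlib
import json

def chain(events):
    """Link each custody event to the previous one by hash,
    so editing or reordering any event breaks all later links."""
    prev, ledger = "0" * 64, []
    for event in events:
        record = {"event": event, "prev": prev}
        # The hash covers both the event and the previous hash.
        prev = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        ledger.append({**record, "hash": prev})
    return ledger

ledger = chain(["manufactured", "shipped to wholesaler", "received by pharmacy"])
# Each entry's "prev" field must equal the previous entry's "hash".
assert ledger[1]["prev"] == ledger[0]["hash"]
assert ledger[2]["prev"] == ledger[1]["hash"]
```

A verifier can recompute the chain from the raw events and compare hashes; any mismatch pinpoints where the record was tampered with.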

Technology solutions to pharma’s critical challenges

The pharmaceutical supply chain is at a crossroads, facing numerous critical challenges that demand immediate attention and innovative solutions. Supply chain resilience, regulatory compliance, cold chain management, and counterfeiting are among the top concerns that must be addressed to ensure the continued availability and safety of essential medications.

To navigate this complex landscape successfully, pharmaceutical companies need to invest in advanced technologies, strengthen collaboration across the industry, and adapt to evolving regulatory requirements. The ultimate goal is to safeguard public health by maintaining a secure, efficient, and reliable pharmaceutical supply chain that can withstand unforeseen disruptions and deliver life-saving medications to those who need them most. By tackling these challenges head-on, the pharmaceutical industry can continue to fulfill its vital mission in the years to come.

The post 4 Critical Challenges Facing Pharma Supply Chains appeared first on Artifacts VERIFY.


Infocert

Electronic Seal: What It Is and How It Works

What is the Electronic Seal and What Function Does It Serve?

What is the Electronic Seal?

The electronic seal is a digital technology that confirms the authenticity, integrity, and origin of the digital documents to which it is attached. It is regulated by the EU eIDAS Regulation (910/2014), which describes it as a set of data in electronic form, logically associated with other digital data to guarantee the origin and integrity of the document. Considered the digital equivalent of a physical stamp, the electronic seal provides certification of provenance and protection against unauthorized alterations.
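As a rough conceptual model of the integrity-and-origin guarantee, the Python sketch below binds a document to a sealing key with a symmetric HMAC. A real eIDAS seal instead uses asymmetric cryptography backed by a qualified certificate issued to the legal entity, so treat this purely as an illustration of the principle:

```python
import hashlib
import hmac

def seal(document: bytes, key: bytes) -> str:
    # Bind the document to the sealing key: any change to the
    # bytes produces a different value, revealing tampering.
    return hmac.new(key, document, hashlib.sha256).hexdigest()

def verify(document: bytes, key: bytes, seal_value: str) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(seal(document, key), seal_value)

key = b"organisation-secret"          # illustrative key, not a QSCD
doc = b"invoice #1042, total 99.00 EUR"
s = seal(doc, key)
assert verify(doc, key, s)            # untouched document verifies
assert not verify(doc + b"!", key, s) # any alteration breaks the seal
```

The qualified seal described below adds what this sketch lacks: a certified link between the key and a specific legal entity.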

Qualified Electronic Seal (QeSeal)

The qualified electronic seal represents the most secure and reliable form of seal available according to the eIDAS regulation. This type of seal is particularly suitable for organizations that need to ensure the integrity and authenticity of their digital documents at a high level of security. A qualified electronic seal is generated using a Qualified Signature Creation Device (QSCD). These devices are designed to be highly secure and resistant to tampering attempts, ensuring that the seals created are robust and secure.
• Certification and Compliance:
Each qualified electronic seal is supported by a qualified certificate, issued by a recognized certification authority. The certificate contains the identifying information of the legal entity responsible for the seal, ensuring transparency and traceability. Compliance with the requirements of the eIDAS Regulation ensures that these seals are recognized and accepted in all EU member states.
• Practical Applications:
The qualified electronic seal is used in a variety of business contexts, ensuring the security of important documents such as contracts, statements, and administrative documentation. It is frequently used in regulated sectors such as banking, insurance, healthcare, and the public sector, where the need to protect sensitive information is particularly critical.
• Strategic Advantages:
Adopting the qualified electronic seal allows companies to improve operational efficiency, reduce the risk of document fraud, and enhance trust in electronic transactions. Its implementation can significantly reduce operational costs associated with managing paper documents, contributing to the digital transformation of the company.

How to Obtain It

To obtain an electronic seal, an organization must undergo a rigorous verification process by a certification authority recognized by AgID. InfoCert is one of the leading providers recognized in Europe for the qualified validation of electronic seals, enabling German companies to use such seals for various applications, including registration with the European EPREL database for product energy classification.

Buy Now

The post Electronic Seal: What It Is and How It Works appeared first on infocert.digital.


KuppingerCole

Identity Security


by John Tolbert

We’ve been hearing about Zero Trust and that “identity is the new perimeter” for years. Maybe it’s time to say, “identity is a perimeter”, because even though network perimeters are porous, they still exist. There is a growing awareness of the centrality of identity and access management in the larger field of IT security. This is because so many cyber-attacks leverage digital identities. This has led to increased emphasis and spending on IAM, beyond just the full IAM suite solutions. For example, we see higher quality and more user-friendly modular solutions for multi-factor (and passwordless) authentication, fine-grained authorization, privileged access management, and identity governance and lifecycle management, often delivered as cloud-hosted services.

The need for identity security was evidenced at our recent European Identity and Cloud (EIC) conference, where our track on identity security was well attended. We had speakers addressing Zero Trust implementations, Identity Threat Detection & Response (ITDR), fraud techniques, the use of generative AI for perpetrating account takeovers (ATOs), gamification of cybersecurity and identity defenses to improve security postures, API security, and the role of identity in cloud security.

ITDR garnered much attention from both attendees and vendors. Vendors have developed solutions that are targeted at reducing threats and risks from enterprise ATOs that lead to asset compromise, data loss, and ransomware attacks. Sophisticated attackers today sometimes do not even use malware to attack victims. They simply take over legitimate accounts, often those that are not properly secured against remote attacks. In other cases, they buy access to real corporate accounts on the dark web. ITDR solutions primarily integrate with identity repositories and other security tools to understand normal activities in order to be able to identify abnormal activities that may be signs of malicious intent.
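The "learn normal, flag abnormal" idea behind ITDR can be sketched in a few lines of Python; real products correlate far richer signals (device, time, privilege changes), and the user/country pairs below are invented for illustration:

```python
from collections import Counter

def build_baseline(logins):
    """Count how often each (user, country) pair normally occurs."""
    return Counter((user, country) for user, country in logins)

def is_anomalous(baseline, user, country, min_seen=1):
    # A login from a location this user has never (or rarely)
    # used before is flagged for analyst review.
    return baseline[(user, country)] < min_seen

history = [("alice", "DE")] * 20 + [("bob", "FR")] * 15
baseline = build_baseline(history)
assert not is_anomalous(baseline, "alice", "DE")  # fits the baseline
assert is_anomalous(baseline, "alice", "KP")      # never seen before
```

The same pattern extends to any identity signal: build a per-identity baseline, then score deviations from it.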

Some ITDR solutions leverage user behavioral analytics from endpoints, and some have identity deception capabilities, whereby fake accounts, credentials, and other lures are used to draw in would-be attackers to learn about their Tactics, Techniques, and Procedures (TTPs). The ITDR market is relatively new and there is a lot of variety in how the products are implemented. We at KuppingerCole expect that the ITDR field will grow, mature, and standardize on more inclusive feature sets. For our latest research on ITDR, see the recently published Leadership Compass on the subject.

Fraud prevention is a desired outcome for identity security, especially within consumer and customer IAM. Fraudsters are constantly innovating on their techniques and finding new targets. Organizations must strive to keep up with mechanisms and services to detect fraud and prevent losses. Fortunately, numerous Fraud Reduction Intelligence Platforms (FRIPs) are available to help organizations of all kinds prevent ATOs, new account fraud, synthetic fraud, and ecommerce abuse. FRIP solutions typically have multiple capabilities to detect and stop fraud attempts, including identity verification, compromised credential intelligence, device intelligence, user behavioral analysis, behavioral biometrics, and bot detection and management. KuppingerCole will be updating our research on FRIPs later this year. 

Looking forward to our cyberevolution conference, we will continue our focus on identity security. We will have speakers addressing ITDR and fraud prevention, as well as practical sessions on the implementation of Zero Trust to bolster identity security. cyberevolution will take place in Frankfurt, Germany on December 3-5, 2024.  To register, click here.


Metadium

Metadium 2024 H1 Activity Report


Summary

The second quarter of 2024 saw a total of 6,920,279 transactions and 33,046 DID wallets created.
As part of our new Metadium plan, we updated our website. It reflects the growth of Metadium and the progress towards the 2024 vision map.
The 2024 vision map was released, outlining the goals for the new Metadium.
We unified non-circulating wallets for better management of Metadium wallets.
A new director was appointed.

Technology

Q2 Monthly Transactions

During the second quarter of 2024, there were 6,920,279 transactions and 33,046 DID wallets created (as of July 2nd).

Website Update

We are happy to announce a new look for our website. This update is part of our new Metadium plan, which represents our growth and progress and our 2024 vision map.

Find more information here.

Metadium 2024 Vision Map

The 2024 vision map envisions improvements to the blockchain network, governance changes, and expanded usability of DID.

Metadium aims to create new markets through its evolving technology and will continue to break down the boundaries between Web2 and Web3 and create new value-added service models.

Find more information here.

Non-circulating wallet unification plan

We’ve announced and completed the plan to unify Metadium’s non-circulating wallets.

Find more information here.

Major management changes

New Director Appointed :

As of June 6, 2024, Francisco E. Filho has joined Metadium. Francisco is a powerhouse of knowledge and expertise, now joining our team to revolutionize the field of blockchain technology. Read More

Hello, this is the Metadium team.

Metadium continued to make important progress in the first half of 2024. We thank everyone who has been with Metadium in 2024, and, following on from the previous half-year, we look back on and summarize Metadium's major achievements and changes.

Summary

Between April and June 2024, a total of 6,920,279 transactions and 33,046 DID wallets were created.
As part of the new Metadium plan, the website was updated. It reflects Metadium's growth and progress towards the 2024 vision map.
The Metadium 2024 vision map, outlining the goals of the new Metadium, was released.
Non-circulating wallets were unified for smoother management of Metadium wallets.
The Metadium digital asset circulation plan was disclosed.
Francisco E. Filho joined the Metadium board of directors as a new director.

Technology Updates

Q2 Monthly Transactions

Between April and June 2024, a total of 6,920,279 transactions and 33,046 DID wallets were created (as of July 2nd).

Metadium Website Update

Metadium's website has been given a new look. This update is part of the new Metadium plan and reflects Metadium's growth and progress towards the 2024 vision map.

Find more information here.

Metadium 2024 Vision Map

Through its 2024 vision map, Metadium plans improvements to the blockchain network, governance changes, and expanded DID usability.
Metadium aims to create new markets through its evolving technology and will continue working to break down the boundaries between Web2 and Web3 and to create new high-value-added service models.

Find more information here.

Non-circulating Wallet Unification Plan

For smoother management of Metadium wallets, we announced and completed the plan to unify Metadium's non-circulating wallets.

Find more information here.

Metadium Digital Asset Circulation Plan

To build trust with the community and promote healthy growth in the market, we disclosed the Metadium digital asset circulation plan.

Find more information here.

Major Management Changes

As of June 6, 2024, Francisco E. Filho joined the Metadium board of directors as a new director. We hope you will join Metadium on its ever richer journey. Read more

- The Metadium Team

Website | https://metadium.com

Discord | https://discord.gg/ZnaCfYbXw2

Telegram(EN) | http://t.me/metadiumofficial

Twitter | https://twitter.com/MetadiumK

Medium | https://medium.com/metadium

Metadium 2024 H1 Activity Report was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 04. July 2024

auth0

An Overview of Commonly Used Access Control Paradigms

When you are working on a complex web application, at some point, you’ll want to restrict or grant access. In this post we'll explore some of the most commonly used access control paradigms.

Tokeny Solutions

Tokeny Expands Chain Support to IOTA EVM

The post Tokeny Expands Chain Support to IOTA EVM appeared first on Tokeny.

Luxembourg, 4 July 2024 – Tokeny, the leading onchain operating system, continues to broaden its multi-chain compatibility by integrating with IOTA EVM, a pioneering EVM-compatible smart contract chain built on the highly interoperable IOTA network. This integration is poised to empower companies seeking to harness IOTA’s ecosystem for seamless tokenization of real-world assets (RWA) and regulated financial products, offering comprehensive and user-friendly tokenization solutions.

IOTA EVM represents a leap forward in blockchain interoperability, offering a high-velocity, plug-and-play environment for deploying smart contracts on the IOTA network. Leveraging IOTA’s feeless and scalable infrastructure, IOTA EVM opens new avenues for enterprises seeking innovative tokenization solutions.

Tokeny delivers institutional-grade tokenization solutions designed to cover the full lifecycle of securities, offering features tailored for front, middle, and back office operations. By leveraging the ERC-3643 permissioned token standard, Tokeny’s solutions automate compliance validation processes on-chain, ensuring that only qualified investors can transfer tokens. Issuers also maintain full control over tokens, with the ability to freeze or recover them as needed.

We are thrilled about Tokeny's integration as it perfectly aligns with our mission to democratize access to tokenized RWA and financial instruments in our ecosystem. Tokeny stands out as the most advanced institutional-grade tokenization platform supporting market standard ERC-3643, poised to accelerate institutional tokenization on IOTA EVM.

Dominik Schiener, Chairman of the Board of Directors and Co-Founder of IOTA

This partnership aligns with our vision of enabling institutions to leverage desired network benefits. IOTA's feeless transaction model empowers cost-effective transfers and operations of tokenized securities, addressing specific institutional needs. Our role is to facilitate rapid tokenization to accelerate adoption and meet evolving market demands.

Luc Falempin, CEO, Tokeny Solutions

About IOTA

Founded in 2015, IOTA is a public goods infrastructure that brings trust to our digital world. Through IOTA, governments, organizations, and people are able to interact with each other in a secure, trusted, and verifiable way. IOTA is one of the most established blockchain projects in the world and is primarily driven by a global ecosystem of non-profit organizations.

About Tokeny

Tokeny provides a compliance infrastructure for digital assets. It allows financial actors operating in private markets to compliantly and seamlessly issue, transfer, and manage securities using distributed ledger technology. By applying trust, compliance, and control on a hyper-efficient infrastructure, Tokeny enables market participants to unlock significant advancements in the management and liquidity of financial instruments. 

The post Tokeny Expands Chain Support to IOTA EVM first appeared on Tokeny.

The post Tokeny Expands Chain Support to IOTA EVM appeared first on Tokeny.


Ocean Protocol

DF96 Completes and DF97 Launches

Predictoor DF96 rewards available. DF97 runs Jul 4 – Jul 11, 2024

1. Overview

Data Farming (DF) is Ocean’s incentives program. In DF, you can earn OCEAN rewards by making predictions via Ocean Predictoor.

Data Farming Round 96 (DF96) has completed.

DF97 is live today, July 4. It concludes on July 11. For this DF round, Predictoor DF has 37,500 OCEAN rewards and 20,000 ROSE rewards.

2. DF structure

The reward structure for DF97 is comprised solely of Predictoor DF rewards.

Predictoor DF: Actively predict crypto prices by submitting a price prediction and staking OCEAN to slash competitors and earn.

3. How to Earn Rewards, and Claim Them

Predictoor DF:
To earn: submit accurate predictions via Predictoor Bots and stake OCEAN to slash incorrect Predictoors.
To claim OCEAN rewards: run the Predictoor $OCEAN payout script, linked from the Predictoor DF user guide in the Ocean docs.
To claim ROSE rewards: see the instructions in the Predictoor DF user guide in the Ocean docs.

4. Specific Parameters for DF97

Budget. Predictoor DF: 37.5K OCEAN + 20K ROSE

Networks. Predictoor DF applies to activity on Oasis Sapphire. Here is more information about Ocean deployments to networks.

Predictoor DF rewards are calculated as follows:

First, the DF Buyer agent purchases Predictoor feeds using OCEAN throughout the week to evenly distribute these rewards. Then, ROSE is distributed at the end of the week to active Predictoors that have been claiming their rewards.

Expect further evolution in DF: adding new streams and budget adjustments among streams.

Updates are always announced at the beginning of a round, if not sooner.

About Ocean, DF and Predictoor

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data. Follow Ocean on Twitter or TG, and chat in Discord. Ocean is part of the Artificial Superintelligence Alliance.

In Predictoor, people run AI-powered prediction bots or trading bots on crypto price feeds to earn $. Follow Predictoor on Twitter.

DF96 Completes and DF97 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 03. July 2024

auth0

Using Auth0 to Collect Consent for Newsletter Signups

Learn how to seamlessly integrate Auth0 into your newsletter sign-up process, ensuring compliance with GDPR regulations. This simple guide walks you through different methods to collect user consent for newsletter signups.

Safle Wallet

How AI is Impacting Web3 & Blockchain: Benefits for Developers and Safle’s Integration

The convergence of AI and blockchain technologies represents a pivotal moment in the evolution of software development. According to PwC, AI is projected to inject a staggering $15.7 trillion into the global economy by 2030, catalyzing a notable 14% increase in global GDP. This profound impact underscores AI’s role in reshaping the future of technology and driving economic growth.

As AI redefines security, automation, and data management, Web3 developers are poised to harness these advancements for groundbreaking innovations. Leading the charge is Safle, a cutting-edge non-custodial, multi-chain wallet provider, seamlessly integrating AI to revolutionize user experience and elevate blockchain technology to new heights.

AI and Blockchain — A Powerful Combination

Blockchain and AI are two transformative technologies, each enhancing the other. Blockchain provides a transparent, immutable ledger for secure data transactions, while AI uses data to mimic human problem-solving and make informed decisions. AI includes machine learning and deep learning, training algorithms for predictions and classifications.

The synergy between AI and blockchain is powerful. Blockchain’s transparency ensures reliable data for AI, and its decentralized nature supports AI operations without central servers. Blockchain’s cryptographic techniques bolster AI privacy, and decentralized computing (DePIN) supports the computational needs of AI.

In return, AI enhances blockchain by creating more secure smart contracts and processing data to improve blockchain networks. This integration boosts trust, scales AI applications, and streamlines business processes. AI-powered smart contracts can resolve disputes and drive sustainable solutions, making systems faster and more efficient.

How Safle integrates AI in its Offerings

Leading the intersection of AI and blockchain is Safle. It features a cross-resolving Web3 naming registry, multichain and cross-chain swaps, integrated on-ramp and off-ramp solutions, and DeFi access for users.

Safle’s identity wallets provide easily manageable multichain identities and seedless onboarding with full self-custody. Built-in features include NFT discovery, access to real-world assets, the security of cold wallets with smartphone simplicity, upcoming NFC recovery, an easy-to-use dashboard, and biometric access. Safle is committed to advancing the adoption of decentralized technologies with its comprehensive offerings.

Safle’s AI integration strategy is meticulously structured into three advanced phases:

1. Informational Phase

In this initial phase, Safle introduces the next-gen Portfolio Viewer, a tool that gives users a holistic view of their digital asset portfolio. The Portfolio Viewer’s AI algorithms and data analytics allow users to track performance metrics and scrutinize DeFi holdings, NFTs, and transaction histories.

2. Informational with DeFi Integration

In this phase, Safle integrates advanced AI-driven DeFi features into its wallet. Users can use AI to scan their portfolios and find yield optimization opportunities. The AI identifies inefficiencies and suggests strategies for better yields, utilizing the ERC-4626 standard for easy integration. It also creates comprehensive transaction bundles, allowing users to execute multi-step DeFi strategies with a single approval, interacting smoothly with various DApps and staking protocols.
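The ERC-4626 standard mentioned above defines a common interface for tokenized yield vaults, in which deposits mint shares pro rata to the vault's current assets, so accrued yield raises the share price. The following is a simplified illustrative model of that share accounting in Python (not Safle's implementation, and not Solidity):

```python
class Erc4626Vault:
    """Minimal model of ERC-4626-style share accounting (illustrative only)."""

    def __init__(self):
        self.total_assets = 0
        self.total_shares = 0
        self.balances = {}

    def convert_to_shares(self, assets: int) -> int:
        if self.total_shares == 0:
            return assets  # 1:1 bootstrap, as in the reference ERC-4626 formula
        return assets * self.total_shares // self.total_assets

    def deposit(self, assets: int, receiver: str) -> int:
        """Deposit assets and mint shares pro rata to the receiver."""
        shares = self.convert_to_shares(assets)
        self.total_assets += assets
        self.total_shares += shares
        self.balances[receiver] = self.balances.get(receiver, 0) + shares
        return shares

    def accrue_yield(self, assets: int) -> None:
        # Yield adds assets without minting shares, so each share is worth more.
        self.total_assets += assets
```

Because yield accrues to assets but not shares, a later depositor receives fewer shares for the same assets, which is the property an AI strategy scanner would compare across vaults.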

3. Executional — Intent-Based Phase

In this phase, Safle transforms crypto wallet interfaces with account abstraction (AA) and AI agents, making Web3 interactions more intuitive and efficient. Users can express their needs using natural language, either by text or voice. The AI agents, using advanced natural language processing (NLP) models, understand these inputs and create optimal solutions. These solutions automatically match the appropriate DApps and blockchain networks, utilizing protocols like EIP-3074 (AUTH and AUTHCALL operations) and EIP-4337 (Account Abstraction via Entry Point Contract Specification) for seamless operation.

Once users approve, their AA-enabled accounts automatically execute the solutions, managing complex DApp interactions and transactions. This process relies on smart contracts, enabling users to verify outcomes on-chain.
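The EIP-4337 flow described above packages each approved intent as a "UserOperation" that is validated and executed through the Entry Point contract. A minimal sketch of the fields that object carries (field names follow the pre-v0.7 EIP-4337 layout; in practice it is an ABI-encoded struct, shown here as illustrative Python):

```python
from dataclasses import dataclass

@dataclass
class UserOperation:
    """Fields of an EIP-4337 UserOperation (pre-v0.7 layout, for illustration)."""
    sender: str                      # the smart-contract account (AA wallet)
    nonce: int
    init_code: bytes = b""           # account deployment code; empty if already deployed
    call_data: bytes = b""           # the action the account should execute
    call_gas_limit: int = 0
    verification_gas_limit: int = 0
    pre_verification_gas: int = 0
    max_fee_per_gas: int = 0
    max_priority_fee_per_gas: int = 0
    paymaster_and_data: bytes = b""  # optional sponsor paying gas on the user's behalf
    signature: bytes = b""

# Example: an already-deployed account executing some call data.
op = UserOperation(sender="0x1234...abcd", nonce=0, call_data=b"\x01\x02")
```

An AI agent composing a multi-step strategy would emit one or more such operations for the user's account to sign and the Entry Point to execute.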

In conclusion, Safle’s phased approach to AI integration leverages cutting-edge Web3 technologies and protocols, enhancing user experience and interaction in the decentralized ecosystem.

About Safle

Safle is a non-custodial, multi-chain wallet and blockchain infrastructure provider. It offers cross-resolving Web3 naming, multichain swaps, and on-the-go off-ramp and on-ramp infrastructure. Safle supports seamless identity management, seedless onboarding, and NFT discovery. The wallet combines the security of cold wallets with smartphone simplicity, featuring NFC recovery, an easy-to-use dashboard, and biometric access.

Follow our social media to stay updated: Twitter | Discord | Telegram


PingTalk

Transforming CIAM Strategy into a Profit Center in Financial Services

Transform CIAM into a profit center in financial services to deliver seamless and secure digital experiences

In the parable, six blind men each touch a different part of an elephant and arrive at vastly different conclusions about what it is. It is a fitting metaphor for how CIAM has been treated in the past, and the picture is improving with time.


How Session Management Works and Why It’s Important

Best practices and protocols for user session management implementation.

Tuesday, 02. July 2024

Anonym

6 Ways RECLAIM App Helps You Recover Your Info from Companies that Sell it


RECLAIM, powered by MySudo, is a new personal data removal service that uses machine learning and artificial intelligence to help you reclaim control of your personal information from the companies that store and might sell it.  

RECLAIM scans your email subject lines and senders to identify which companies have your personal details, such as phone, address, and credit card details, and then gives you instructions for either switching out your personal information for Sudo information or asking the company to delete your data altogether. Sudos are secure digital profiles with phone, email, and payment cards to use instead of your own. You create your Sudos in the MySudo all-in-one privacy app, part of the same app family as RECLAIM. 

Just released in beta, RECLAIM is a great place to start reducing the online exposure of your personally identifiable information and digital footprint, and boosting your data privacy.  

RECLAIM has 6 handy tools built in:

Dashboard – Start here. See your email scan and a first look at who has your data.
Footprint – Get the complete list of companies that have your data based on your email, and where your information might have been exposed in a data breach.
Key insights – Dive into the nitty gritty of what information each company has on you and what they might be doing with it based on their privacy policy and terms of service, which RECLAIM extracts and summarizes using AI.
Protection Manager – Take the wheel by opting to either protect your data with MySudo or delete it altogether.
MySudo Protection – If you decide to protect your information with MySudo, follow the steps to switch out your personal details for your Sudo secure digital profile details.
Deletion – If you opt to ask the company to delete your information, you’d do that here.

When you start with RECLAIM, head to your dashboard to see your footprint scan and then start reclaiming control. It’s quick and easy, and definitely worth the effort. 

Sign in with Google to reclaim your Gmail account 

The beta version of RECLAIM only scans Gmail accounts. If you don’t have Gmail, there’s an option to share with us which email service you’d like our next release to support.  

Once you log in with Google, RECLAIM will read the metadata of your Gmail messages. Email metadata includes the date and time, sender and recipient email addresses, and email subject. RECLAIM does not have access to the attachments or body of your Gmail inbox messages.
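RECLAIM's internal pipeline isn't public, but metadata-only scanning of the kind described can be sketched with Python's standard email library. The `sender_company` helper below is hypothetical, purely to show that sender and subject headers suffice to identify which company is writing to you, without ever reading the message body:

```python
from email import message_from_string
from email.utils import parseaddr

def sender_company(raw_headers: str) -> str:
    """Return the sender's domain from raw RFC 5322 headers.

    Only metadata (From, Subject, Date) is inspected; the body is never read.
    """
    msg = message_from_string(raw_headers)
    _, addr = parseaddr(msg.get("From", ""))
    return addr.partition("@")[2]

headers = (
    "From: Acme Store <offers@acme.example>\n"
    "Subject: Your receipt\n"
    "Date: Tue, 02 Jul 2024 10:00:00 +0000\n\n"
)
```

Aggregating domains like `acme.example` across an inbox yields the list of companies that likely hold your details.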

RECLAIM works best with MySudo 

Once you’ve regained control of your personal information with RECLAIM, keep up the good work by using Sudos to proactively protect your online presence. Sudos are secure digital profiles with phone, email, and payment cards to use instead of your own. You can sign up for new accounts and services with your Sudo information, not your personal information. You’d simply give companies your Sudo nickname, phone number, email address and payment card details instead of your own. Then, if that Sudo information is hit in a data breach or the company sells it to data brokers for profit, your own information is safe because it’s masked. You might choose to delete the affected Sudo and create a new one based on your personal needs (you have up to nine Sudos in the app). Read: 5 Ways to Make Best Use of your 9 Sudo Profiles

So, are you ready to reclaim your information from companies that store and sell it? Try RECLAIM today. 

The MySudo family of apps features: 

MySudo all-in-one privacy app  MySudo VPN  MySudo Browser Extension  RECLAIM. 

You might also like: 

New to MySudo? Start Here

How Sudos Can Give You a Second Chance at Digital Privacy 

From Yelp to Lyft: 6 Ways to “Do Life” Without Using Your Personal Details 

4 Steps to Setting Up MySudo to Meet Your Real Life Privacy Needs 

The post 6 Ways RECLAIM App Helps You Recover Your Info from Companies that Sell it appeared first on Anonyome Labs.


Microsoft Entra (Azure AD) Blog

Evolve your CIAM strategy with External ID


Last month we announced the general availability of our next generation customer identity and access management solution, Microsoft Entra External ID. External ID makes Customer Identity & Access Management (CIAM) secure and simple by enabling you to:  

 

Secure all external identities: Managing several disparate solutions can overcomplicate your security strategy. By adopting External ID as your CIAM solution, you can secure all identity types within your Microsoft Entra admin center, safeguarding all external identities with industry-leading security, including our own conditional access engine, verifiable credentials, and built-in identity governance.

Create frictionless user experiences: The rise of fraud, GenAI, and identity attacks has increased end-user fear when it comes to security risks online. With External ID, you can build frictionless, branded, user-centric interfaces into your web and mobile applications to increase brand awareness, build user trust and drive user engagement. Check out an example in the WoodGrove Groceries demo!

Streamline secure collaboration: Collaborating with external users and ensuring they have the right access at the right time is complex. Simplify collaboration by inviting business guests with External ID and defining what internal resources they can access across SharePoint, Teams, and OneDrive.

Accelerate the development of secure applications: Integrating robust and extensive user flows into apps can take developers months. Shorten development time to minutes by leveraging External ID’s rich set of APIs, SDKs, and integrations with developer tools, such as Visual Studio Code, to build secure and branded identity experiences into external-facing web and mobile apps.

Best in class value at scale: Managing several security stacks can be costly. External ID brings innovative CIAM features at a cost-effective value for any growing customer without compromising on scalable, end-to-end security. For example, this approach helps us bring best-in-class identity verification like Face Check with Verified ID to reduce help desk costs for combatting fraud. Learn more about External ID pricing here.

 

Our goal is to provide best-in-class protection from bot attacks and sign-in and sign-up fraud, along with the ability to audit every step of external users’ journeys.

 

Ask Me Anything (AMA) on July 16 for a deep dive into External ID!  

 

Since our GA announcement, we’ve received lots of interest from customers who want to get started with External ID. Don't miss our live Ask Me Anything webinar on July 16, 2024, at 9am PST! Register online to join our product experts as they showcase live demos to show how External ID shortens the implementation of secure end-to-end identity experiences into external-facing apps from months to minutes.   

 

In our AMA event, we’ll also reserve time to address any FAQs you may have about External ID, Azure AD B2C, Azure AD B2B, and more. You can find most of these questions in public documentation and in your tenant administration portal. We also collected some here for convenience: 

 

I am currently using Azure AD B2C, how can I take advantage of the innovation in Microsoft Entra External ID?  

By building new applications with Microsoft Entra External ID, admins and developers can lean on familiar Microsoft Entra ID experiences while avoiding the overhead of building specific skills in Azure AD B2C technology. Powered by open standards, External ID is built to be interoperable with any Identity solution to provide enterprise-grade security without sacrificing end user experiences. Learn more. 

 

While Azure AD B2C is powerful in the flexibility of experiences it enables, External ID is designed for ease of adoption and speed of innovation as it’s converged into the Entra ID technical stack and organically benefits from all Entra ID innovation, extending Microsoft Entra industry-leading security and governance to external users.   

 

Will there be any changes in Azure AD B2C support and how can I migrate my existing Azure AD B2C applications to Microsoft Entra External ID? 

Current Azure AD B2C customers can continue using the Azure AD B2C with no service disruptions, including creating new tenants. You can continue to operate your existing B2C applications with confidence and we'll continue supporting you until at least May 2030. 

 

We’re currently developing a seamless migration journey so you can move your existing Azure AD B2C applications to External ID without disrupting your end users and will share more information when ready. If you’d like to participate in early previews, your account team can help enroll you. You may choose to migrate your existing applications when the next-generation platform meets your feature requirements, and migration is right for your business. Learn more in our FAQ. 

 

I am currently using Azure AD B2B collaboration and B2B direct connect, have these experiences changed? 

Azure AD B2B collaboration and B2B direct connect are now part of Microsoft Entra External ID as External ID B2B collaboration and B2B direct connect. There are no changes to your product experience: B2B collaboration features remain in the same location in the Microsoft Entra admin center within the workforce tenant, allowing you to secure all business guests, streamline collaboration, and limit access risks by extending ID Governance to external users.

 

Get started with External ID! 

 

We’re excited to share the new External ID platform with you and help you deliver seamless and secure experiences to your end-users. If you are interested in learning more about External ID and how it can help secure your applications, visit aka.ms/External_ID to get started. You can try External ID for free and only pay for what you use, learn more about pricing here.  

 

 

 

  

 

Ankur Patel runs Growth for Identity @ Microsoft. In recent times, he drove the effort for connecting LinkedIn, the world’s leading professional graph and Office 365, the world’s leading productivity graph. Currently, Ankur leads Microsoft’s efforts for Entra Verified ID & External ID to improve security and compliance without compromising on privacy.  

 

 

Read more on this topic 

Microsoft Entra External ID  Microsoft Entra External ID frequently asked questions

 

Learn more about Microsoft Entra  

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds. 

Microsoft Entra News and Insights | Microsoft Security Blog   ⁠⁠Microsoft Entra blog | Tech Community   Microsoft Entra documentation | Microsoft Learn  Microsoft Entra discussions | Microsoft Community  

KuppingerCole

NRI Secure Uni-ID Libra


by Alejandro Leal

Japan's strong push towards digital transformation in various industries, including finance, government, media, telecom, healthcare, and transportation, has elevated the importance of Consumer Identity and Access Management (CIAM) solutions. The growth of eCommerce, especially post-COVID-19, has further emphasized the need for businesses to manage customer identities securely and efficiently while improving the customer experience. Focusing on the Japanese market, NRI Secure’s Uni-ID Libra stands out as an example of how local CIAM solutions are adapting and innovating to meet these needs.

Safle Wallet

Testing Multi-chain Integration ⛓️

Weekly Safle Update! 🚀 Greetings Saflenauts,

We’ve emerged from our dark side exploration, and our systems are humming once again. It’s time to gather the crew and push forward into the cosmos. Check out the latest milestones and adventures from the Safle spaceship.

UI/UX Perspective 🎨

Wallet Connect: Simplifying tech requirements for a seamless user experience. 🛠️🔗
Portfolio Viewer: Almost synced to perfection for a smoother viewing pleasure. 👀✨

Engineering ⛓️

Chain Integrations in test:

🛠️ Polygon zkEVM 🛠️ Base Chain 🛠️ Avalanche Chain

To get an early preview of this, click below to sign up for beta release and testing.

🔗https://hgbopbs0vss.typeform.com/to/u2I3IQyG

Smart Contract and Token Management 🎬

Check out the details of the last audit by QuillAudits here

🔗https://www.quillaudits.com/leaderboard/safle

Download the Safle App Now!

Experience the power of Safle at your fingertips 🚀

🔗https://app.getsafle.com/signup

Keep an 👀 on our socials

🔗https://linktr.ee/safle


Farmer Connect

4 Steps to Take in Preparation of the Upcoming EUDR


The European Union's upcoming Deforestation Regulation (EUDR) represents a significant shift towards sustainable and transparent supply chains. It aims to combat deforestation and ensure that products imported into the EU are sourced responsibly. As businesses gear up for compliance, integrating with advanced traceability solutions can streamline operations and enhance transparency throughout the supply chain, whilst also ensuring compliance ✅


PingTalk

PingGateway 2024.6 is Here!

ForgeRock Identity Gateway is now PingGateway. The 2024.6 release offers powerful enhancements to boost your security and operational efficiency.

We are excited to announce that ForgeRock Identity Gateway has been rebranded as PingGateway. This change is part of our strategic move to unify all ForgeRock products under the Ping brand, ensuring a cohesive and integrated experience for our customers. This rebranding reflects our commitment to all our customers that no matter where you began your journey, with Ping or ForgeRock, the investments you have made are safe. We are building a roadmap to the future where we are leveraging the strengths of both companies to deliver more value to our customers and we are very excited about that future.

Monday, 01. July 2024

KuppingerCole

Distributed Identity and the Impact of AI - The Future of Digital Identity with Jacoba Sieders


In this interview series, our Analysts talked to Identity Experts about the Future of Digital Identity. This interview covers the development of reusable and trusted identities, trust frameworks, and the critical levels of assurance needed for digital identities. Discover the economic value of digital identity in cross-border transactions and the creation of independent wallets that empower users with choice and control over their credentials.




Microsoft Entra (Azure AD) Blog

What’s new in Microsoft Entra – June 2024


Have you explored the What's New in Microsoft Entra hub in the Microsoft Entra admin center? It's a centralized view of our roadmap and change announcements across the Microsoft Entra identity and network access portfolio so you can stay informed with the latest updates and actionable insights to strengthen your security posture.

 

Here in the Microsoft Entra blog, we share feature release information and change announcements every quarter. Today’s post covers April – June 2024. It’s organized by Microsoft Entra products, so you can quickly scan what’s relevant for your deployment. 

 

Microsoft Entra ID  Microsoft Entra ID Governance  Microsoft Entra External ID  Microsoft Entra Permissions Management  Microsoft Entra Verified ID 

 

New releases

 

Microsoft Entra ID Protection: remediate risks and unblock users
On-premises password reset remediates user risk
Multiple Passwordless Phone Sign-in for Android Devices
Windows Account extension is now Microsoft Single Sign On
Custom Claims Providers enable token claim augmentation from external data sources
Granular Certificate-Based Authentication Configuration in Conditional Access
New role: Organizational Branding Administrator
Microsoft Graph activity logs
$select in signIn API
Last successful sign-in date for users
Self-service password reset Admin policy expansion to include additional roles
Dynamic Groups quota increased to 15,000
Streamline your ADAL migration with updated sign-ins workbook

 

Change announcements

 

Security update to Entra ID affecting clients which are running old, unpatched builds of Windows

[Action may be required]

 

We're making a security update to Entra ID such that older, unpatched versions of Windows that still use the less secure Key Derivation Function v1 (KDFv1) will no longer be supported. Once the update is rolled out, unsupported and unpatched Windows 10 and 11 clients will no longer be able to sign in to Entra ID. Globally, more than 99% of Windows clients signing in to Entra ID have the required security patches.

 

Action required:

If your Windows devices have Security Patches after July 2021 no action is required.

 

If your Windows devices do not have security updates after July 2021, update Windows to the latest build of your currently supported Windows version to maintain access to Entra ID. 

 

All currently supported versions of Windows have the required patch. 

 

We recommend you keep Windows up to date with Security Updates.

 

Background: 

A Security Update to Windows CVE-2021-33781 was issued in July 2021 to address a vulnerability where Primary Refresh Tokens were not stored sufficiently securely in the client.  Once patched, Windows clients used the stronger KDFv2 algorithm.  All versions of Windows released since that time have the update and handle the token securely.

 

A small percentage of Windows devices have not yet been updated and are still using the older v1 key derivation function. To improve security of the system, unpatched devices using the KDFv1 algorithm will no longer be able to sign in to Entra ID using Primary Refresh Tokens.
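The exact KDFv1 and KDFv2 constructions are internal to Windows and not documented here, but the general shape of an HMAC-based key derivation (in the spirit of RFC 5869 HKDF, shown purely as an illustration of how a session key can be derived from token material, not as Microsoft's algorithm) looks like this:

```python
import hashlib
import hmac

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int) -> bytes:
    """RFC 5869 HKDF with SHA-256: extract a PRK, then expand to `length` bytes."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()  # extract step
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                            # expand step
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]
```

The relevant point for this announcement is that the strength of the derived key depends on the derivation function used, which is why clients still on the older function are being cut off.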

 

What is the user experience on unsupported Windows devices when this change is rolled out?  

Users of Windows devices that haven’t been updated with patches since July 2021 may experience sign-in failures with their Entra ID user accounts on joined or hybrid-joined Windows devices.

 

How do I diagnose this situation?

The error code, which will show in sign-in logs, is 'AADSTS5000611: Symmetric Key Derivation Function version 'KDFV1' is invalid. Update the device for the latest updates.'

 

Enhancing the security of Apple devices in the enterprise with hardware bound device identity – 2-year notice

[Action may be required]

 

Device identity is one of the fundamental Entra ID concepts that enables multiple Entra ID and MDM/MAM security features, such as device compliance policies, app protection policies, and PRT-based SSO. To enhance security, Entra ID now supports binding device identity keys to Apple’s Secure Enclave hardware, which will replace the previous Keychain-based mechanism.

 

Starting in June 2026, all new Entra ID registrations will be bound to the Secure Enclave. As a result, all customers will need to adopt the Microsoft Enterprise SSO plug-in and some of the apps may need to make code changes to adopt the new Secure Enclave based device identity.

 

Opt-in, provide feedback

Before Entra enables Secure Enclave by default for all new registrations, we encourage tenants to perform early testing using the documentation provided on learn.microsoft.com. This will help to identify any compatibility issues, where you may need to request code changes from app or MDM vendors. 

 

To report issues, raise questions, or voice concerns please open a support ticket or reach out to your Microsoft account team. 

 

Upgrade to the latest version of Microsoft Entra Connect by September 23, 2024 

[Action may be required]

 

Since September 2023, we have been auto-upgrading Microsoft Entra Connect Sync and Microsoft Entra Connect Health to an updated build as part of a precautionary security-related service change. For customers who have previously opted out of auto-upgrade or for whom auto-upgrade failed, we strongly recommend that you upgrade to the latest versions by September 23, 2024.

 

When you upgrade to the latest versions by that date, you ensure that when the service changes take effect, you avoid disruption for the following capabilities:

 

Service: Microsoft Entra Connect Sync
Recommended version: 2.3.2.0 or higher
Features impacted by service change: Auto-upgrade will stop working. Synchronization isn’t impacted.

Service: Microsoft Entra Connect Health agent for Sync
Recommended version: 4.5.2487.0 or higher
Features impacted by service change: A subset of alerts will be impacted:

·        Connection to Microsoft Entra ID failed due to authentication failure
·        High CPU usage detected
·        High Memory Consumption Detected
·        Password Hash Synchronization has stopped working
·        Export to Microsoft Entra ID was Stopped. Accidental delete threshold was reached
·        Password Hash Synchronization heartbeat was skipped in the last 120 minutes
·        Microsoft Entra Sync service cannot start due to invalid encryption keys
·        Microsoft Entra Sync service not running: Windows Service account Creds Expired

Service: Microsoft Entra Connect Health agent for ADDS
Recommended version: 4.5.2487.0 or higher
Features impacted by service change: All alerts will be impacted.

Service: Microsoft Entra Connect Health agent for ADFS
Recommended version: 4.5.2487.0 or higher
Features impacted by service change: All alerts will be impacted.

 

Note: If you cannot upgrade by September 23, 2024, you can still regain full functionality for the above features after that date. You would do so by manually upgrading to the recommended builds at your earliest convenience.

 

For upgrade-related guidance, please refer to our docs.

 

Important Update: Azure AD Graph Retirement

[Action may be required]

 

As of June 2023, the Azure AD Graph API service is in a retirement cycle and will be retired (shut down) in incremental stages. In the first stage of this retirement cycle, newly created applications will receive an error (HTTP 403) for any requests to Azure AD Graph APIs (https://graph.windows.net). We are revising the date for this first stage from June 30 to August 31, so only applications created after August 31, 2024, will be impacted. The second stage of the Azure AD Graph service retirement cycle will begin after January 31, 2025. At this point, all applications that are using Azure AD Graph APIs will receive an error when making requests to the AAD Graph service. Azure AD Graph will be completely retired (and stop working) after June 30, 2025.

 

We understand that some apps may not have fully completed migration to Microsoft Graph. We are providing an optional configuration (through the authenticationBehaviors setting) that will allow an application to continue use of Azure AD Graph APIs through March 30, 2025.  If you develop or distribute software that still uses Azure AD Graph APIs, you must act now to avoid interruption. You will either need to migrate your applications to Microsoft Graph (highly recommended) or configure the application for an extension, and ensure that your customers are prepared for the change. 
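For common resource paths, part of the move is mechanical: Azure AD Graph addresses the tenant in the URL (https://graph.windows.net/{tenant}/...), whereas Microsoft Graph infers the tenant from the token and versions the API in the path (https://graph.microsoft.com/v1.0/...). The helper below is a hypothetical sketch of that URL rewrite only; a real migration also involves schema, permission, and response-shape differences documented in the migration guides:

```python
LEGACY_HOST = "https://graph.windows.net"
MS_GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def to_microsoft_graph(url: str, tenant: str) -> str:
    """Rewrite an Azure AD Graph resource URL to its Microsoft Graph counterpart."""
    prefix = f"{LEGACY_HOST}/{tenant}/"
    if not url.startswith(prefix):
        raise ValueError("not an Azure AD Graph URL for this tenant")
    # Drop the legacy api-version query string; Microsoft Graph versions in the path.
    resource = url[len(prefix):].split("?")[0]
    return f"{MS_GRAPH_BASE}/{resource}"
```

For example, a legacy users request for a tenant maps to the tenant-less v1.0 users endpoint.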

 

To identify applications that are using Azure AD Graph APIs, we have provided two Entra recommendations with information about applications and service principals that are actively using Azure AD Graph APIs in your tenant.  

 

For more information, see the following references:  

 

June 2024 update on Azure AD Graph API retirement Migrate from Azure Active Directory (Azure AD) Graph to Microsoft Graph   Azure AD Graph app migration planning checklist   Azure AD Graph to Microsoft Graph migration FAQ  

 

Important Update: AzureAD and MSOnline PowerShell retirement 

[Action may be required]

 

As of March 30, 2024, the legacy Azure AD PowerShell, Azure AD PowerShell Preview, and MS Online modules are deprecated. These modules will continue to function through March 30, 2025, when they are retired and stop functioning. Microsoft Graph PowerShell SDK is the replacement for these modules and you should migrate your scripts to Microsoft Graph PowerShell SDK as soon as possible.  

 

Note: as indicated in our April update, MS Online with “Legacy Auth” will stop functioning in the weeks after June 30, 2024. Legacy Auth is typically associated with versions before 1.1.166.0, and involves use of MS Online PowerShell with the Microsoft Online Sign-In Assistant package installed. If you are using MS Online versions before 1.1.166.0 or MS Online with Legacy Auth, you should immediately migrate to Microsoft Graph PowerShell SDK or update the MS Online version to the latest version (1.1.183.81).  

 

To help you identify usage of Azure AD PowerShell in your tenant, you can use the Entra Recommendation titled Migrate Service Principals from the retiring Azure AD Graph APIs to Microsoft Graph. This recommendation will show vendor applications that are using Azure AD Graph APIs in your tenant, including AzureAD PowerShell.  

 

We are making substantial new and future investments in the PowerShell experience for managing Entra, with the recent Public Preview launch of the Microsoft Entra PowerShell module. This new module builds upon and is part of the Microsoft Graph PowerShell SDK. It’s fully interoperable with all cmdlets in the Microsoft Graph PowerShell SDK, enabling you to perform complex operations with simple, well documented commands. The module also offers a backward compatibility option to simplify migration from the deprecated AzureAD module. Additionally, we are aware that some of our customers were unable to fully migrate scripts that manage per-user MFA from MSOnline to Microsoft Graph PowerShell. Microsoft Graph APIs were recently made available to read and configure per-user MFA settings for users, and availability in Microsoft Graph PowerShell SDK cmdlets is soon to follow.

 

Private Preview – QR code sign-in, a new authentication method for Frontline Workers

[Action may be required]

 

We are introducing a new simple way for Frontline Workers to authenticate in Microsoft Entra ID with a QR code and PIN, eliminating the need to enter long UPNs and alphanumeric passwords multiple times during their shift.

 

With the private preview release of this feature in August 2024, all users in your tenant will see a new 'Sign in with QR code' link when navigating to https://login.microsoftonline.com > 'Sign-in options' > 'Sign in to an organization'. This link will be visible only on mobile devices (Android/iOS/iPadOS). If you are not participating in the private preview, users in your tenant will not be able to sign in through this method while we are still in private preview; they will receive an error message if they try to sign in.

 

The feature will have a ‘preview’ tag until it is generally available. Your organization needs to be enabled to test this feature. Broad testing will be available in public preview, which we will announce later.  

 

While the feature is in private preview, no technical support will be provided. Learn more about support during previews at Microsoft Entra ID preview program information - Microsoft Entra | Microsoft Learn

 

Changes to phone call settings: custom greetings and caller ID

[Action may be required]

 

Starting September 2024, phone call settings (custom greetings and caller ID) under Entra's multifactor authentication blade will be moved under the voice authentication method in the authentication method policy. Instead of accessing these settings through the Entra ID or Azure portal, they will be accessible through MS Graph API. If your organization is using custom greetings and/or caller ID, please make sure to check the public documentation once we release the new experience to learn how to manage these settings through MS Graph.

 

MS Graph API support for per-user MFA

[Action may be required]

 

Starting June 2024, we are releasing the capability to manage user status (Enforced, Enabled, Disabled) for per-user MFA through MS Graph API. This will replace the legacy MS Online PowerShell module that is being retired. Please be aware that the recommended approach to protect users with Microsoft Entra MFA is Conditional Access (for licensed organizations) and security defaults (for unlicensed organizations). The public documentation will be updated once we release the new experience.
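For illustration, the shape of a call to the new Graph surface can be sketched as below. This is a hedged sketch, not official documentation: the `/users/{id}/authentication/requirements` beta endpoint and the `perUserMfaState` values reflect this announcement, but the helper function, payload shape, and required permission are assumptions that may change before general availability.

```python
import json

# Hedged sketch of the new per-user MFA surface in Microsoft Graph (beta).
# The /users/{id}/authentication/requirements endpoint and the perUserMfaState
# values ("enabled", "enforced", "disabled") reflect this announcement; the
# helper function and auth details are assumptions for illustration only.
GRAPH = "https://graph.microsoft.com/beta"

def per_user_mfa_request(user_id, state):
    """Build the PATCH request that would set a user's per-user MFA state.

    Returns (method, url, json_body); actually sending it requires an access
    token with an appropriate Graph permission (assumed, not documented here).
    """
    assert state in {"enabled", "enforced", "disabled"}
    url = f"{GRAPH}/users/{user_id}/authentication/requirements"
    body = json.dumps({"perUserMfaState": state})
    return "PATCH", url, body

method, url, body = per_user_mfa_request("ada@contoso.com", "disabled")
print(method, url, body)
```

Remember that Conditional Access (for licensed organizations) and security defaults (for unlicensed organizations) remain the recommended way to protect users with Microsoft Entra MFA; this state-management API is primarily for migrating off the retiring MS Online module.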

 

Azure Multi-Factor Authentication Server - 3-month notice                         

[Action may be required]


Beginning September 30, 2024, Azure Multi-Factor Authentication Server deployments will no longer service MFA requests, which could cause authentications to fail for your organization. MFA Server will have a limited SLA, and the MFA Activity Report in the Azure portal will no longer be available. To ensure uninterrupted authentication services and to remain in a supported state, organizations should migrate their users' authentication data to the cloud-based Azure MFA service using the latest Migration Utility included in the most recent Azure MFA Server update. Learn more at Azure MFA Server Migration.

 

Decommissioning of Group Writeback V2 (Public Preview) in Entra Connect Sync - Reminder

[Action may be required]

 

The public preview of Group Writeback V2 (GWB V2) in Entra Connect Sync is no longer available, and Connect Sync will no longer support provisioning cloud security groups to Active Directory.

 

Similar functionality is offered in Entra Cloud Sync, called "Group Provision to AD", which may be used instead of GWB V2 for provisioning cloud security groups to AD. Enhanced functionality in Cloud Sync, along with other new features, is being developed.

 

Customers who use this preview feature in Connect Sync should switch their configuration from Connect Sync to Cloud Sync. Customers can either move all of their hybrid sync to Cloud Sync (if it supports their needs) or run Cloud Sync side by side with Connect Sync and move only cloud security group provisioning to AD onto Cloud Sync. Customers who provision Microsoft 365 groups to AD can continue using GWB V1 for this capability.

 

Visual enhancements to the per-user MFA admin configuration experience

[No action is required]

 

As part of ongoing service improvements, we are making updates to the per-user MFA admin configuration experience to align with the look and feel of Entra ID. This change does not include any changes to the core functionality and will only include visual improvements. Starting in August 2024, you will be redirected to the new experience both from the Entra admin center and Azure portal. There will be a banner presented for the first 30 days to switch back to the old experience, after which you can only use the new experience. The public documentation will be updated once we release the new experience.

 

Updates to “Target resources” in Microsoft Entra Conditional Access

[No action is required]

 

Starting in September 2024, the Microsoft Entra Conditional Access 'Target resources' assignment will consolidate the "Cloud apps" and "Global Secure Access" options under a new name "Resources".  

 

Customers will be able to target "All internet resources with Global Secure Access", "All resources" (formerly "All cloud apps"), or select specific resources (formerly "Select apps"). Some of the Global Secure Access attributes in the Conditional Access API will be deprecated. 

 

This change will start in September 2024 and will occur automatically; admins won't need to take any action. There are no changes in the behavior of existing Conditional Access policies. To learn more, click here

 

Upcoming Improvements to Entra ID device code flow

[No action is required]

 

As part of our ongoing commitment to security, we are announcing upcoming enhancements to the Entra ID device code flow. These improvements aim to provide a more secure and efficient authentication experience.

 

We've refined the messaging and included app details within the device code flow to ensure a more secure and precise user experience. Specifically, we've adjusted headers and calls to action to help your users recognize and respond to security threats more effectively. These changes are designed to help your users make more informed decisions and prevent phishing attacks.

 

These changes will be gradually introduced starting in July 2024 and are expected to be fully implemented by August 30, 2024. No action is required from you.

 

Microsoft Entra ID Governance

New releases

Microsoft Entra ID multi-tenant organization
Security group provisioning to Active Directory using cloud sync
Support for PIM approvals and activations on the Azure mobile app (iOS and Android)
Lifecycle Workflows: Export workflow history data to CSV files
B2B Sponsors as an Attribute and Approvers in Entitlement Management
Maximum workflows limit in Lifecycle Workflows is now 100
New provisioning connectors in the Microsoft Entra Application Gallery

 

Microsoft Entra External ID

New releases

Configure redemption order for B2B collaboration

 

Microsoft Entra Permissions Management

New releases

Support for PIM enabled Groups in Microsoft Entra Permissions Management

 

Microsoft Entra Verified ID

New releases

Quick Microsoft Entra Verified ID setup

 

 

Add to Favorites: What’s New in Microsoft Entra

Stay informed about Entra product updates and actionable insights with What’s New in Microsoft Entra.  This new hub in the Microsoft Entra admin center offers you a centralized view of our roadmap and change announcements across the Microsoft Entra identity and network access portfolio.

 

Learn more about Microsoft Entra

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds.

Microsoft Entra News and Insights | Microsoft Security Blog
Microsoft Entra blog | Tech Community
Microsoft Entra documentation | Microsoft Learn
Microsoft Entra discussions | Microsoft Community

 

 

 

 

 

 


June 2024 update on Azure AD Graph API retirement


One year ago, we shared an update on the completion of a three-year notice period for the deprecation of the Azure AD Graph API service. This service is now in the retirement cycle and retirement (shut down) will occur in incremental stages. In the first stage of this retirement cycle, newly created applications will receive an error (HTTP 403) for any requests to Azure AD Graph APIs. We’re revising the date for this first stage from June 30 to August 31, and only applications created after August 31, 2024 will be impacted. After January 31, 2025, all applications – both new and existing – will receive an error when making requests to Azure AD Graph APIs, unless they’re configured to allow extended Azure AD Graph access.  

 

We understand that some apps may not have fully completed migration to Microsoft Graph. We’re providing an optional configuration through the authenticationBehaviors property, which will allow an application to use Azure AD Graph APIs through June 30, 2025. Azure AD Graph will be fully retired after June 30, 2025, and no API requests will function at this point, regardless of the application’s configuration. 

 

If you develop or distribute software that still uses Azure AD Graph APIs, you must act now to avoid interruption. You’ll either need to migrate your applications to Microsoft Graph (highly recommended) or configure the application for an extension, as described below, and ensure that your customers are prepared for the change. If you’re using applications supplied by a vendor that use Azure AD Graph APIs, work with the software vendor to update to a version that has migrated to Microsoft Graph APIs.  

 

How do I find Applications in my tenant using Azure AD Graph APIs? 

 

The Microsoft Entra recommendations feature provides recommendations to ensure your tenant is in a secure and healthy state, while also helping you maximize the value of the features available in Entra ID.  

 

We’ve provided two Entra recommendations that show information about applications and service principals that are actively using Azure AD Graph APIs in your tenant. These new recommendations can support your efforts to identify and migrate the impacted applications and service principals to Microsoft Graph.

 

Figure 1: Microsoft Entra Recommendations for Azure AD Graph migration

 

For more information, reference Recommendation to migrate to Microsoft Graph API

 

Configuring an application for an extension of Azure AD Graph access

 

To grant a newly created application an extension for access to Azure AD Graph APIs through June 30, 2025, you must make a configuration change on the application after it's created. This configuration change is made through the authenticationBehaviors property. By setting the blockAzureADGraphAccess flag to false, the newly created application can continue to use Azure AD Graph APIs until later in the retirement cycle.

 

Note: In this first stage, only Applications created after August 31, 2024 will be impacted. Existing applications will be able to continue to use Azure AD Graph APIs even if the authenticationBehaviors property is not configured. Once this change is rolled out, you may also choose to set blockAzureADGraphAccess to true for testing or to prevent an existing application from using Azure AD Graph APIs. 

 

Microsoft Graph REST API examples

 

Read the authenticationBehaviors property for a single application:

GET https://graph.microsoft.com/beta/applications/afe88638-df6f-4d2a-905e-40f2a2d451bf/authenticationBehaviors 

 

Set the authenticationBehaviors property to allow extended Azure AD Graph access for a new Application:

PATCH https://graph.microsoft.com/beta/applications/afe88638-df6f-4d2a-905e-40f2a2d451bf/authenticationBehaviors 

Content-Type: application/json

{

    "blockAzureADGraphAccess": false

}

 

Microsoft Graph PowerShell examples

 

Read the authenticationBehaviors property for a single application:

Import-Module Microsoft.Graph.Beta.Applications

Connect-MgGraph -Scopes "Application.Read.All"

 

Get-MgBetaApplication -ApplicationId afe88638-df6f-4d2a-905e-40f2a2d451bf -Property "id,displayName,appId,authenticationBehaviors"

 

Set the authenticationBehaviors property to allow extended Azure AD Graph access for a new Application:

Import-Module Microsoft.Graph.Beta.Applications
Connect-MgGraph -Scopes "Application.ReadWrite.All"

$params = @{
    authenticationBehaviors = @{
        blockAzureADGraphAccess = $false
    }
}

Update-MgBetaApplication -ApplicationId afe88638-df6f-4d2a-905e-40f2a2d451bf -BodyParameter $params

 

What happens to applications using Azure AD Graph after August 31, 2024? 

 

Any existing applications that use Azure AD Graph APIs and were created before this date will not be impacted at this stage of the retirement cycle. Any applications created after August 31, 2024 will encounter errors when making requests to Azure AD Graph APIs, unless the blockAzureADGraphAccess attribute has been set to false in the authenticationBehaviors configuration for the application. 

 

What happens to applications using Azure AD Graph after January 31, 2025? 

 

After January 31, 2025, all applications – new and existing - will encounter errors when making requests to Azure AD Graph APIs, unless the blockAzureADGraphAccess attribute has been set to false in the authenticationBehaviors property for the application.

 

What happens to applications using Azure AD Graph after June 30, 2025? 

 

Azure AD Graph APIs will no longer be available to any applications after this point, and any requests to Azure AD Graph APIs will receive an error, regardless of the authenticationBehaviors configuration for the application. 
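Taken together, the three stages above amount to a simple rule set. The following is an unofficial sketch (the function is illustrative, not a Microsoft API) that encodes the dates and behavior described in this post:

```python
from datetime import date

# Unofficial sketch of the staged Azure AD Graph retirement described above.
# The dates and rules come from this announcement; the function itself is
# illustrative only.

STAGE1 = date(2024, 8, 31)   # after this, newly created apps are blocked
STAGE2 = date(2025, 1, 31)   # after this, all apps are blocked unless opted in
RETIRED = date(2025, 6, 30)  # after this, fully retired for everyone

def azure_ad_graph_allowed(app_created, today, block_azure_ad_graph_access=None):
    """Return True if an app's Azure AD Graph requests still succeed on `today`.

    `block_azure_ad_graph_access` mirrors the authenticationBehaviors flag:
    None = not configured, False = extended access opted in, True = blocked.
    """
    if today > RETIRED:
        return False  # full retirement, regardless of configuration
    if block_azure_ad_graph_access is True:
        return False  # explicitly blocked (e.g. set to true for testing)
    if today > STAGE2:
        # only apps explicitly opted in (flag set to false) keep access
        return block_azure_ad_graph_access is False
    if today > STAGE1:
        # existing apps are unaffected; new apps need the opt-in flag
        return app_created <= STAGE1 or block_azure_ad_graph_access is False
    return True

# An app created in September 2024 without the opt-in flag is already blocked:
print(azure_ad_graph_allowed(date(2024, 9, 15), date(2024, 10, 1)))  # prints False
```

The key takeaway is that the opt-in flag only buys time until June 30, 2025; migration to Microsoft Graph is the only durable path.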

 

Current support for Azure AD Graph

 

Azure AD Graph APIs are in the retirement cycle and have no SLA or maintenance commitment beyond security-related fixes.

 

About Microsoft Graph

 

Microsoft Graph represents our best-in-breed API surface. It offers a single unified endpoint to access Entra and Microsoft 365 services such as Microsoft Teams and Microsoft Intune. All new functionalities will only be available through Microsoft Graph. Microsoft Graph is also more secure and resilient than Azure AD Graph.

 

Microsoft Graph has all the capabilities that have been available in Azure AD Graph and new APIs like identity protection and authentication methods. Its client libraries offer built-in support for features like retry handling, secure redirects, transparent authentication, and payload compression.

 

What about Azure AD and Microsoft Online PowerShell modules?

 

As of March 30, 2024, AzureAD, AzureAD-Preview, and Microsoft Online (MSOL) PowerShell modules are deprecated and will only be supported for security fixes. These modules will be retired and stop working after March 30, 2025. You should migrate these to Microsoft Graph PowerShell. Please reference this update for more information. 

 

Available tools

 

Migrate from Azure Active Directory (Azure AD) Graph to Microsoft Graph
Azure AD Graph app migration planning checklist
Azure AD Graph to Microsoft Graph migration FAQ

 

Kristopher Bash 

Product Manager, Microsoft Graph 

LinkedIn 

 

 

Learn more about Microsoft Entra 

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds.

Microsoft Entra News and Insights | Microsoft Security Blog
Microsoft Entra blog | Tech Community
Microsoft Entra documentation | Microsoft Learn
Microsoft Entra discussions | Microsoft Community

Ocean Protocol

Artificial Superintelligence Alliance Unveil Token with Migration dApp Tools Now Live


Zug and Singapore — July 1, 2024 –The Artificial Superintelligence Alliance, comprising SingularityNET, Fetch.ai, and Ocean Protocol, has today announced the launch of the multi-coin merger, forming the unified token. This merger of AGIX, OCEAN, and FET tokens marks a significant step towards creating a fully decentralized AI ecosystem. By aligning incentives, the Artificial Superintelligence Alliance drives the vision of a decentralized, efficient, and transparent AI landscape, pooling resources and expertise from its members to accelerate advancements in the ethical and transparent development of AI solutions.

Today, SingularityNET’s $AGIX and Ocean Protocol’s $OCEAN tokens merge into $FET. FET trading remains uninterrupted as the project rebrands to Artificial Superintelligence Alliance across platforms including CoinMarketCap and CoinGecko. The migration platform is now open on the SingularityDAO dapp, and $AGIX and $OCEAN are starting to delist from exchanges. This phase aims to onboard exchanges and data aggregators for a smooth transition.

Phase II will focus on community onboarding and $ASI deployment, prioritizing self-custody holders and deploying ASI tokens across multiple chains. The transition includes new migration contracts for $AGIX and $OCEAN tokens not yet converted to $FET and for EVM-based FET. All FET Mainnet Tokens will automatically convert to ASI during Fetch.ai's mainnet upgrade. The migration contracts will be open for years, allowing ample time for all conversions. Detailed guides and tutorials are being provided to support a smooth transition. The timing of this phase is still being finalized, ensuring all necessary preparations are completed.

“Today’s token merger underscores our commitment to advancing safe artificial intelligence,” said Humayun Sheikh, chairman of the Artificial Superintelligence Alliance and CEO of Fetch.ai. “By merging our tokens, we aim to enhance operational efficiency and seamlessly integrate decentralized AI systems, ensuring broad access to cutting-edge AI technologies. This merger marks the beginning of an ambitious journey to foster unparalleled collaboration and openness, setting new standards for the industry.”

“We’re excited to have reached this milestone along the path to realizing our vision of an Artificial Superintelligence Alliance capable of winning the AGI and ASI race for the decentralized ecosystem,” stated Ben Goertzel, CEO of the Artificial Superintelligence Alliance and SingularityNet. “This tokenomic merger paves the way for a series of R&D and product collaborations combining Alliance technologies toward beneficial superintelligence.”

“We’re grateful to the community, exchanges and other partners for accommodating this token merge. We’re really looking forward to focusing on our users and products that increase adoption.” said Bruce Pon, Council Board Director of Artificial Superintelligence Alliance and founder of Ocean Protocol.

This milestone reflects the alliance’s commitment to accelerating AI research and development by combining Fetch.ai’s autonomous agent technology, Ocean Protocol’s data exchange frameworks, and SingularityNET’s decentralized AI services. The three technology partners will develop products leveraging each project’s diverse solutions to ensure ethical AI development. This open collaboration invites others to join the alliance, fostering an inclusive environment that promotes cutting-edge advancements while emphasizing responsible and transparent AI practices.

LINK TO ASI MIGRATION GUIDELINES

About The Artificial Superintelligence Alliance
The Artificial Superintelligence (ASI) Alliance is a collective formed by Fetch.ai, SingularityNET (SNET), and Ocean Protocol. As the largest open-sourced, independent entity in AI research and development, this alliance aims to accelerate the advancement of decentralized Artificial General Intelligence (AGI) and, ultimately, Artificial Superintelligence (ASI). For additional information on ASI, visit: superintelligence.io

About SingularityNET
SingularityNET was founded by Dr. Ben Goertzel with the mission of creating a decentralized, democratic, inclusive and beneficial Artificial General Intelligence (AGI). According to Dr. Goertzel, AGI should be independent of any central entity, open to anyone and not restricted to the narrow goals of a single corporation or a single country. The SNET team includes seasoned engineers, scientists, researchers, entrepreneurs, and marketers. The core platform and the SNET AI teams are complemented by specialized teams devoted to various application areas such as robotics, biomedical AI, finance, media, arts and entertainment. For additional information visit: singularitynet.io

About Fetch.ai
Fetch.ai, a Cambridge-based AI company, is redefining the possibilities of an intelligent and connected world through its AI agent-based technology. Fetch.ai’s infrastructure technology enables developers and businesses to build, deploy & monetize through an agent-based modular platform for the new generation of AI applications. The company’s core product, DeltaV, fuses Language Models (LLMs) and AI Agents to create an open and dynamic marketplace that connects users to services and reimagines the current search experience. For additional information visit: fetch.ai

About Ocean Protocol
Ocean was founded to level the playing field for AI and data. Ocean tools enable businesses and individuals to trade tokenized data assets seamlessly to manage data all along the AI model life cycle. Ocean-powered apps include enterprise-grade data exchanges, data science competitions and data DAOs. The Ocean Predictoor product has over $800M in monthly volume six months after launch with a roadmap to scale foundation models globally. For additional information visit: oceanprotocol.com

Artificial Superintelligence Alliance Unveil Token with Migration dApp Tools Now Live was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Aug 14, 2024: The State of the CIAM Market

The CIAM market continues to grow and change. There have been major acquisitions in this space, and new vendors are launching products and services. Security is always a driver, but deploying organizations want useful data to improve marketing effectiveness and increase revenues. New privacy regulations put more requirements for information collection and handling on customer organizations. CIAM systems must also be able to integrate with other IT, security, and enterprise IAM solutions. To capture market share, CIAM vendors have to be innovative. Fraud prevention and integrations with marketing tools are differentiators that many companies are looking for in CIAM.

Safle Wallet

Safle Update - Audited

House of Safle, Update (June 2024, Week 3) Dear Sentinels,

After a brief communication blackout while exploring the moon’s dark side, the Safle spaceship is back online and ready for action. We call all Saflenauts to assemble and continue our stellar journey. Here’s our latest weekly update as we chart our course to the stars.

UI/UX Perspective 🎨
- Safle's flagship Portfolio viewer designs are almost done; here's a sneak peek.

Public Outlook 🗣️
- Safle has been trending on DEXtools for a week! 🎉
- The Safle token has surged approximately 400% over the past week 🚀

Technical Enhancements 💻
- Conducted impact analysis of the current app upgrade in the latest React Native version.
- File storage systems migrated to GCP for better availability.
- New and improved testing blueprint for streamlined QA processes.

New Chains ⛓️
- Avalanche, Polygon zkEVM and Base on the way.

Click here to sign up for beta release and testing.

🔗https://hgbopbs0vss.typeform.com/to/u2I3IQyG

DevOps and Configuration 🌐
- Improved ties with MongoDB; deployed new clusters for improved performance.

Product Releases 📱
- Published a new build (1.4.9) on Firebase and TestFlight for thorough testing. Expect a release in the coming week 💪

Smart Contract and Token Management 🎬
- Treasury and team re-vesting contracts audited by QuillAudits; all checks passed, 0 vulnerabilities found.

Click here to get the audit report.

🔗https://www.quillaudits.com/leaderboard/safle

Experience Safle now! 🫂

Download the App :
🔗 app.getsafle.com/signup

Keep an 👀 on our socials

🔗 https://linktr.ee/safle


Ontology

Ontology Monthly Report — June

Ontology Monthly Report — June

June has been a transformative month at Ontology, marked by significant community growth, strategic partnerships, and development milestones that continue to drive our commitment to enhancing the blockchain landscape. Here’s a recap of our key activities and achievements over the past month:

Community and Web3 Influence 🌐🤝
- Invite Campaign: We launched an invite campaign to expand our vibrant community, welcoming new members to join and contribute.
- Web3 Happenings: Our discussion on "On-Chain Summer" was highly engaging, providing valuable insights into seasonal trends in blockchain.
- Australia Node Activation: The Ontology Australia node X account is now operational, enhancing our network's reach and staking capabilities.
- Insightful Articles: Don't miss our latest publications discussing critical topics like blockchain and security, as well as an updated article explaining ONT ID.
- Ontology DID Quest on Zealy: We've launched the Ontology DID quest on Zealy to engage our community in practical applications of decentralized identity technologies. This initiative aims to enhance understanding and utilization of DIDs within our ecosystem.

Development/Corporate Updates 🔧

Development Milestones 🎯
- Ontology EVM Trace Trading Function: We've achieved 96% completion, significantly enhancing our capabilities within the EVM space.
- ONT to ONTD Conversion Contract: Progress has reached 66%, streamlining the conversion process for our users.
- ONT Leverage Staking Design: Now at 46%, this development is poised to offer innovative staking options to our community.

Events and Partnerships 🤝
- Community Calls and AMAs: This month included a community call with UQUID, an AMA discussing TradFi interoperability with KIMA, and a French-language AMA with Bitget.
- New Partnerships: We've established strategic partnerships with DxSale and XWorldGames, and launched a new campaign with Torus chain.

ONTO Wallet Developments 🌐🛍️
- New Listings and Partnerships: ONTO listed Pambii and announced a fresh partnership, expanding our ecosystem.
- Exciting Giveaways: We hosted NFT giveaways with The Ninth, NFTFeed and XSTAR, engaging the community with unique digital collectibles.

On-Chain Metrics 📊
- dApp Growth: The total number of dApps on our MainNet remains strong at 177, indicating a dynamic and thriving ecosystem.
- Transaction Growth: This month saw an increase of 1,169 dApp-related transactions and 10,519 MainNet transactions, showcasing active network utilization.

Community Engagement 💬
- Vibrant Discussions: Our social platforms have been buzzing with discussions and insights, fueled by the enthusiasm and engagement of our community members.
- Recognition through NFTs: We've recognized the contributions of active community members with the issuance of NFTs, celebrating their involvement and support.

Follow us on social media 📱

Keep up with Ontology by following us on our social media channels. Your continued support and engagement are vital to our shared success in the evolving world of blockchain and decentralized technologies.

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Thank you for your unwavering support and participation this month. As we move forward, we are excited about the opportunities that lie ahead and are committed to delivering groundbreaking solutions and fostering a more inclusive and secure digital future.

Española 한국어 Türk Slovenčina русский Tagalog Français हिंदी 日本 Deutsch සිංහල tiếng Việt

Ontology Monthly Report — June was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Safle Wallet

Bitcoin is New Ethereum

House of Safle, Update (June 2024, Week 2) Dear Sentinels,

After a brief communication blackout while exploring the moon’s dark side, the Safle spaceship is back online and ready for action. We call all Saflenauts to assemble and continue our stellar journey. Here’s our latest weekly update as we chart our course to the stars.

UI/UX Perspective 🎨
Product Brainstorming:
- Engaged in a creative brainstorming session to refine and enhance our project offerings.
Social Media Strategy:
- Published posts; Safle is now trending on Quickswap.
- Planned UX improvements for our portfolio viewer within our blockchain projects, ensuring a seamless user experience. This includes enhancing the interface for easier navigation, better visualization of blockchain assets, and more intuitive interactions for users to manage and view their blockchain portfolios.
Design Initiatives:
- Updated website team profiles, ensuring a polished and professional online presence.

Technical Enhancements 💻
Deployment and Integration:
- Separated the code for deployment on the Play Store.
- Worked on stack integration and experimented with various libraries.
- Conducted impact analysis of the current app upgrade in the latest React Native version.
- Implemented and tested Stacks integration using the secp256k1 algorithm.
Backend Development:
- Code changes for the file service, migrating from IBM to GCP.
- Conducted developer testing for the new service.
- Validated file service code and conducted dev QA.
- Created a testing blueprint for streamlined QA processes.
DevOps and Configuration:
- Updated network configuration details in the keyless-js and multichain data repositories.
- Migrated dev and test databases to a new cluster for improved performance.
Project Management:
- Consolidated the backlog, including mobile code review, API data source for the data pipeline, vault discussions, and finalizing DB migration.
- Resolved relayer issues and published a new build (1.4.9) on Firebase and TestFlight after thorough testing.
- Debugged and fixed the relayer-v2 service and user login.
Mobile Application Enhancements:
- Conducted thorough testing of the mobile application to verify all functionalities.
- Reviewed mobile app code to ensure quality and efficiency.

Security Enhancements 🔒
Data Security and Privacy:
- Investigated data security issues on the Play Console.
- Debugged errors occurring in keyless after network config updates.
- Increased scope of code changes and testing to address network config changes in multiple areas.
Wallet Integration:
- Researched Xverse wallet implementation using the Stacks wallet SDK and an RN solution.
Smart Contract and Token Management:
- Completed the initial Hardhat setup and compilation of the vesting contract.
Documentation and Compliance:
- Reviewed WalletConnect docs to ensure compliance and security.
- Implemented robust testing and validation processes for all updates and new features.

Community 🌎

Join the conversation and stay updated with our latest news and announcements!

🔗 https://linktr.ee/safle


Ocean Protocol

Crypto Factor Modeling Data Challenge

Overview

The Crypto Factor Modeling Data Challenge, a collaboration between Ocean Protocol and Numerai, tests your data science skills in the evolving cryptocurrency market. Participants must create custom datasets and develop multi-factor risk models to explain cryptocurrency price variance, utilizing data from sources like Tardis, Kaiko, CCXT, and Uniswap.

Ocean Protocol is a decentralized data exchange protocol that unlocks data for AI consumption, ensuring data privacy and security while enabling the sharing and monetization of data. Numerai, on the other hand, is a hedge fund that crowdsources machine learning models to manage its portfolio, leveraging the collective intelligence of data scientists worldwide.

This challenge offers an opportunity to apply your data science skills to a real-world scenario and contribute to a broader understanding of the crypto market. You can also leverage Ocean Protocol’s secure data sharing capabilities and Numerai’s collaborative approach to machine learning.

Objectives

The primary objective is to develop multi-factor risk models that explain cryptocurrency price changes. Participants aim to enhance their understanding of the factors influencing cryptocurrency markets. The challenge promotes innovation in data collection, analysis, and modeling techniques.

Participants will identify and justify relevant factors, from simple indicators like momentum to complex frameworks like the Fama-French model. The goal is to build robust models to explain and predict price movements. This involves analyzing historical price data, trading volumes, macroeconomic indicators, and blockchain-specific metrics.
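
As a toy illustration of the kind of factor model described above, the sketch below fits a single-factor regression (asset returns against a market factor) by ordinary least squares. The data, function name, and factor choice are all illustrative assumptions, not part of the challenge specification:

```python
# Hypothetical single-factor risk model: regress an asset's daily returns
# on a market factor (e.g. BTC returns) via ordinary least squares.
# All numbers are made-up illustrative data, not real market prices.

def ols_beta(asset_returns, factor_returns):
    """Fit asset_returns = alpha + beta * factor_returns by least squares."""
    n = len(asset_returns)
    mean_a = sum(asset_returns) / n
    mean_f = sum(factor_returns) / n
    cov = sum((a - mean_a) * (f - mean_f)
              for a, f in zip(asset_returns, factor_returns)) / n
    var_f = sum((f - mean_f) ** 2 for f in factor_returns) / n
    beta = cov / var_f              # sensitivity to the market factor
    alpha = mean_a - beta * mean_f  # return not explained by the factor
    return alpha, beta

if __name__ == "__main__":
    market = [0.01, -0.02, 0.015, 0.005, -0.01]     # factor returns
    altcoin = [0.02, -0.035, 0.028, 0.012, -0.022]  # asset returns
    alpha, beta = ols_beta(altcoin, market)
    print(round(beta, 2))  # beta > 1 means the asset amplifies market moves
```

A multi-factor version replaces the single regressor with a design matrix (momentum, value, liquidity, on-chain metrics, and so on) and solves the normal equations, typically with a numerical library.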

The challenge encourages a collaborative environment where participants share insights and methodologies. This collaboration improves individual models and contributes to the collective understanding of the cryptocurrency market. The objective is to enhance data analysis skills and gain valuable experience handling complex datasets.

Data

Participants gather data from sources such as Tardis, Kaiko, CCXT, and Uniswap to create comprehensive datasets. The data includes historical price data, trading volumes, macroeconomic indicators, and blockchain-specific metrics. This rich dataset forms the foundation for analysis and model development.

The data collection process emphasizes using diverse sources to capture the multifaceted nature of the cryptocurrency market. Thorough cleaning and preprocessing are crucial to ensuring data quality. This step guarantees that the dataset is reliable and ready for analysis.

Integrating data from multiple sources into a coherent format presents a significant challenge. Participants must document their data collection and cleaning methods to ensure the dataset’s integrity. This documentation is essential for building models that provide meaningful insights into cryptocurrency price movements.
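
As a minimal sketch of that integration step, the snippet below inner-joins two per-source series on their shared timestamps so they can feed one model. The field names and values are hypothetical, not the actual challenge schema:

```python
# Hypothetical sketch: align observations from two data sources by
# timestamp before modeling. Keys and values are illustrative only.

def merge_by_timestamp(prices, volumes):
    """Inner-join two {timestamp: value} series on shared timestamps."""
    shared = sorted(set(prices) & set(volumes))
    return [{"ts": ts, "price": prices[ts], "volume": volumes[ts]}
            for ts in shared]

if __name__ == "__main__":
    prices = {"2024-06-01": 67000.0, "2024-06-02": 67550.0, "2024-06-03": 68100.0}
    volumes = {"2024-06-02": 1.2e9, "2024-06-03": 9.8e8, "2024-06-04": 1.1e9}
    rows = merge_by_timestamp(prices, volumes)
    print([r["ts"] for r in rows])  # only the dates present in both sources
```

Rows dropped by the join are worth logging rather than silently discarding, since participants must document how the dataset was assembled and cleaned.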

Mission

The mission is to identify relevant factors influencing cryptocurrency prices. Participants use statistical techniques and machine learning methods to build and validate their models. The aim is to develop models that explain past price movements and predict future trends.

Participants document their process in a comprehensive report. This report includes an abstract, detailed factor models, data collection and cleaning methods, and conclusions with key findings and suggestions for further research.

This mission involves exploring the data to uncover patterns and relationships affecting cryptocurrency prices. The goal is to create a valuable resource for further research.

Rewards

The $20,000 prize pool will be distributed among the top 10 performers:

Prize pool distribution and points earned

Participants will also earn points contributing to the 2024 championship. Accumulating points correlates with increased rewards, as seen in the 2023 Championship, where top performers received an additional $10 for each point earned throughout the year.

Opportunities

This challenge provides several opportunities to enhance your data analysis and modeling skills. Participants contribute to understanding the cryptocurrency market and apply their knowledge in a real-world, evolving market. The challenge offers exposure to advanced data science techniques and the chance to develop innovative solutions.

How to Participate

Are you ready to join us on this quest? Whether you’re a seasoned data pro or just starting, there’s a place for you in our community of data scientists. Let’s explore and discover together on Desights, our dedicated data challenge platform. The challenge runs from June 27 until August 27, 2024, at 13:00 UTC. Click here to access the challenge.

Community and Support

To engage in discussions, ask questions, or join the community conversation, connect with us on Ocean’s Discord channel #data-science-hub, the Desights support channel #data-challenge-support, and the Numerai #crypto support channel.

You can also reach us via email at dcsupport@oceanprotocol.com | support@numer.ai

About Ocean Protocol

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data.

Follow Ocean on Twitter or Telegram to keep up to date. Chat directly with the Ocean community on Discord — or track Ocean’s progress on GitHub.

Crypto Factor Modeling Data Challenge was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Jul 30, 2024: Asking Good Questions About AI Integration in Your Organization – Part II

The integration of AI poses both unprecedented opportunities and challenges for organizations. Our webinar, "Asking the Right Questions: Navigating AI Integration in Enterprise Security," addresses the pressing need for CISOs, CIOs, and other information risk management professionals to navigate the complexities of AI adoption effectively. Led by John Tolbert, Cybersecurity Director at KuppingerCole Analysts, Dr. Scott David, LL.M., Executive Director - Information Risk and Synthetic Intelligence Research Initiative at the University of Washington – APL, and Matthias Reinwarth, Director Practice IAM, Senior Analyst at KuppingerCole Analysts, this session offers a deep dive into the pivotal role of asking good questions in guiding organizations through the maze of emerging AI risks.

PingTalk

What Is YubiKey and How Do You Set it Up?

YubiKeys: what they are, how to set them up, and how to use them to enhance your online security.

TBD

Hundreds of developers across Africa learn to speak tbDEX

A recap of the tbDEX Africa Roadshow, where developers in Ghana and Kenya learned how to build payment applications with tbDEX

Following the successful launch of tbDEX 1.0 in Rio de Janeiro, TBD embarked on a roadshow across Africa, visiting Accra, Nairobi, and Cape Town. Our mission was to engage with local developers and share tbDEX, an open source protocol for global payments.

Recognizing that many African countries face significant barriers to participating in the global economy, we aimed to support the vibrant and growing developer communities in these regions. These communities are already tackling these challenges head-on, and we wanted to contribute by providing open source tools to build robust payment applications.

The roadshow featured full-day, hands-on workshops in Accra and Nairobi, where hundreds of developers gathered to learn how to utilize tbDEX. The excitement was palpable as participants built wallet applications using the tbDEX SDK. The enthusiasm and curiosity among the developers were evident, and the success of these workshops was widely shared across LinkedIn and Twitter.

Angie Jones, Head of Developer Relations, and Adewale Abati, Staff Developer Advocate, led the workshops, demonstrating the capabilities of developers eager to integrate advanced payment solutions. Co-sponsored by Yellow Card and Circle, these sessions saw developers integrating wallet applications with tbDEX in just one day.

In addition to the workshops, TBD actively participated in notable conferences across Africa. At the 3iAfrica conference in Accra, Mike Brock, CEO of TBD, delivered an insightful session titled "Trust Reimagined in the Digital Assets World," discussing the economic principles behind digital assets and their impact on trust and value exchange in business. Angie Jones hosted a roundtable discussion on tbDEX as an open protocol for global money movement.

The team also made a stop at Joy 99.7 FM radio station in Ghana, where Mike and Angie discussed how digital identity can enhance payment systems.

The journey continued to the ID4Africa conference in Cape Town, where TBD engaged with local stakeholders to gain deeper insights into the challenges surrounding digital identity.

Overall, the roadshow was a significant step in connecting with developer communities across Africa. We were inspired by the talent and determination we encountered and look forward to seeing the innovative solutions that will emerge from these collaborations.

Get Started with tbDEX

With tbDEX, anyone can dive into building innovative payment solutions without needing permission. This open source tool empowers developers to start building immediately. We welcome your feedback and contributions to make tbDEX even more robust and versatile. Join us in breaking down barriers and building the future of global payments today.

Sunday, 30. June 2024

KuppingerCole

Making it Simple and Secure - The Future of Digital Identity with Andrzej Kawalec


In this interview series, our Analysts talked to Identity Experts about the Future of Digital Identity. Andrzej Kawalec, Head of Cybersecurity at Vodafone Business, explores the importance of digital identity in the digital world and the need for user-centric solutions. The concepts of decentralized identity and self-sovereign identity are discussed as a shift towards user control and privacy. The challenges of adoption and standardization are highlighted, particularly for small businesses. The focus is on making identity management simpler and more secure for businesses and individuals.




Ontology

Ontology Mainnet 6th Anniversary Campaign

Ontology MainNet 6th Anniversary Campaign 🎉

As we celebrate the 6th anniversary of the Ontology MainNet, we reflect on an incredible journey that began in June 2018. Over the years, Ontology has transformed from a blockchain platform focused on digital identity and data solutions into a comprehensive Web3 infrastructure. With each milestone, from the initial launch to the integration of advanced technologies and strategic partnerships, we’ve continuously enhanced our network’s capabilities and expanded our ecosystem.

To commemorate this special occasion and show our appreciation to our amazing community, we are thrilled to announce a series of exciting campaigns and giveaways. These initiatives are designed to reward our dedicated supporters, foster further growth within our network, and encourage new participants to join the Ontology family. Read on to learn more about these fantastic opportunities and how you can get involved.

🎉 Stake2Earn — Giveaway to Ontology Stakers 🚀

In celebration of our 6th anniversary, we’re excited to launch the Stake2Earn campaign to reward our loyal stakers and encourage more users to participate in staking. Here’s how you can join the fun and win some ONG!

📅 Timeline

June 30th — July 14th (after round 236 ends)

💰 Rewards

Total Reward Pool: 3000 ONG
Lucky Draw Winners: Up to 30 lucky stakers

📝 Eligibility Criteria

Social Engagement:
- Follow Ontology on Twitter 🐦
- Join our Discord community 🎮
- Sign up for our Telegram group 📲

Humanity Score: Maintain a humanity score above 0 ✔️

Staking Requirements: Stake a minimum of 100 ONT 💼

🤝 Steps to Participate

1. Join and Follow: Complete the social tasks by following us on Twitter, and joining our Discord and Telegram groups.
2. Check Your Score: Ensure your humanity score is above 0. You can view your score in your profile.
3. Stake ONT: Stake at least 100 ONT through our official staking page. Follow the guide provided to ensure your staking is successful.
4. Verification: After staking, verify your participation to enter the lucky draw.
5. Win Rewards: Winners will be randomly selected from eligible participants and announced at the end of the campaign. Rewards will be distributed shortly after the winners are declared.

🎁 Rewards Distribution

Winners will be announced and contacted through our official channels. The ONG rewards will be distributed directly to the winners’ accounts.

🆘 Need Help?

For any questions, please reach out to our team or community Harbingers across all Ontology social channels.

Join the celebration and stake to earn your share of the rewards! 🌟

Join the Ontology Network

🚀 Win a Stake — Grow Your Node with Ontology 🚀

We’re excited to launch the Win a Stake campaign as part of our 6th-anniversary celebration. This campaign is designed to incentivize node operators to promote their nodes and increase total staking. Here’s how you can participate and potentially win a significant boost for your node!

📅 Timeline

Start Date: June 30th (Round 236)
End Date: July 15th (After Round 237 starts)

💸 Rewards

Maximum: 2,000,000 ONT

🤝 How to Participate

1. Submit Your Node Information:
   - Node Name
   - Operation Wallet
   - Staking Wallet Public Key
   - Contact Details (Email, Twitter, Telegram, Discord, etc.)
2. Prove Ownership:
   - Send 0.005 ONT to the address AKBvFr2AavHKYxUWGqFBdsw1WZoSnMbk9P using your operation/stake wallet to confirm ownership.
   - Provide the transaction hash in the form.
3. Fill Out the Google Form: Complete and submit your information using this form: Ontology Node Form

📸 Snapshot Dates

First Snapshot: June 30th — Total stake of all nodes
Second Snapshot: On or around July 15th (After round 237 starts)

🏆 Winning Criteria

Increase Rate Comparison: We’ll compare the increase rate of each node’s total stake amount between the two snapshots.
Top 3 Nodes: The nodes with the highest increase rates will win.

🥇 Prizes

Win a Stake of Ontology: The top three nodes will each receive a stake amounting to 10% of their initial stake.

🔍 Important Notes

Eligibility: Only registered nodes will be eligible for the competition. Ensure all submitted information is accurate and authentic.
Registration: Make sure to submit the form with all required details to participate.

🌟 Join the competition now and grow your node with Ontology! Boost your staking and potentially win a significant reward! 🌟

For any questions or further assistance, feel free to reach out to our team or community Harbingers across all Ontology social channels.

Join the Ontology Network

Ontology Mainnet 6th Anniversary Campaign 🎉 was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


YeshID

Manage application access without ever leaving Slack


I live in Slack. The rest of the team at YeshID does, too. So do a lot of our customers. That’s why we created the Slack integration we wrote about in this other post.

Here, I’m going to tell you how we’ve taken that integration to the next level.

The Problem

Imagine you’re in Slack, discussing some problem. Steve’s the guy who seems to have a handle on what to do. The best way for Steve to take the next step is with an app he doesn’t use all the time. He knows how to use it but doesn’t have access right now. So, let’s just get him access. Easy, right? But according to the customers we’ve been talking to, it’s not.

They’ve told us what they have had to do:

1. Get out of Slack and send an email (or write a JIRA ticket) to the admin for the app to request access.
2. The admin may not be the approver, so the admin has to email the approver.
3. When the approver approves, they reply with an email to the admin, who grants access.
4. The admin then has to notify the requester. However, the requester might not be the person who will use the app, so they have to notify that person.

It’s a mess. Or as it’s usually described, it’s a cluster- umm cluster-something.

Oh, and compliance? You need an audit trail for compliance. A bunch of emails is not an audit trail. You’ve got to record the decision in an organized way, like in a spreadsheet. Ugh, spreadsheets.

Of course, some customers have a Slack channel for requests instead of emails. As the organization grows with multiple IT members, they often have a generic channel for access requests (or perhaps you’re just using the general IT team chat). But now you have to hope your request isn’t missed or forgotten in the buffer.

And you still need an audit trail for compliance. And a way to make sure things get done.

The Solution: YeshID’s Access Workflow

We listened to them, and YeshID solved that problem a while ago. Anyone can go to their YeshID console and request an app, and YeshID will manage everything downstream. Anyone who’s clicked on the “Get Started Now” button on the YeshID website, that is.

Hint: click here or on the button to Get Started Now.

If you’ve signed up for YeshID, here’s how it works:

1. Go to https://app.yeshid.com
2. Click on “My Apps”
3. Click on “Request an application”
4. Pick the application, or request a different one if the one you want isn’t on the list

And you’re done.

Here’s how it looks:

 

YeshID generates a task and gets it to the right person (approver or admin) depending on how you’ve set things up. YeshID tracks actions and notifies the requester when the request has been completed.

Having YeshID manage the process and provide the artifacts needed for compliance is better than an unmanaged collection (mess) of emails or Slack messages that you have to round up and put into a spreadsheet.

Even Better: Request Directly from Slack

But why even switch focus from Slack to YeshID?

With YeshID’s new Slack integration, you make the request right in Slack—no tool switching. And even better, you can request an application for someone else. Here’s how it works.

Just type /request and YeshID will pop up a request form in Slack.

Here’s what you’d see if you were part of T’s Tangerine Organization.

And the image below shows how it would look if you wanted to make a request for Figma on behalf of a guy named Steve Jobs. (Not that Steve Jobs. That one has ceased to be, expired and gone to meet his maker, is bereft of life, rests in peace… see Monty Python’s Dead Parrot Sketch. This is another Steve Jobs.)

The request goes straight to YeshID and gets forwarded to the right approver or administrator. YeshID monitors the entire process and constructs an audit trail. When approval has been granted, YeshID notifies both the person who has been given access and the person who requested it for them. No loose ends. All loops closed.

Ready to Revolutionize Your Access Management?

Don’t let outdated processes hold you back. Embrace YeshID and experience:

- Increased productivity
- Enhanced security
- Simplified compliance
- Happier employees and IT staff

Start your journey to streamlined access management today with YeshID!

The post Manage application access without ever leaving Slack appeared first on YeshID.

Friday, 28. June 2024

Microsoft Entra (Azure AD) Blog

Introducing the Microsoft Entra PowerShell module

We’re thrilled to announce the public preview of the Microsoft Entra PowerShell module, a new high-quality and scenario-focused PowerShell module designed to streamline management and automation for the Microsoft Entra product family. In 2021, we announced that all our future PowerShell investments would be in the Microsoft Graph PowerShell SDK. Today, we’re launching the next major step on this j

We’re thrilled to announce the public preview of the Microsoft Entra PowerShell module, a new high-quality and scenario-focused PowerShell module designed to streamline management and automation for the Microsoft Entra product family. In 2021, we announced that all our future PowerShell investments would be in the Microsoft Graph PowerShell SDK. Today, we’re launching the next major step on this journey. The Microsoft Entra PowerShell module (Microsoft.Graph.Entra) is a part of our ongoing commitment and increased investment in Microsoft Graph PowerShell SDK to improve your experience and empower automation with Microsoft Entra.

 

We’re grateful for the substantial feedback we’ve heard from Microsoft Entra customers about our PowerShell experiences, and we’re excited to hear your thoughts after evaluating this preview module. We plan to build on our investment in the Microsoft Entra PowerShell module going forward and expand its coverage of resources and scenarios. 

 

What is Microsoft Entra PowerShell?

 

The Microsoft Entra PowerShell module is a command-line tool that allows administrators to manage and automate Microsoft Entra resources programmatically. This includes efficiently managing users, groups, applications, service principals, policies, and more. The module builds upon and is part of the Microsoft Graph PowerShell SDK. It’s fully interoperable with all cmdlets in the Microsoft Graph PowerShell SDK, enabling you to perform complex operations with simple, well-documented commands. The module also offers a backward compatibility option with the deprecated AzureAD module to accelerate migration. Microsoft Entra PowerShell supports PowerShell version 5.1 and version 7+. We recommend using PowerShell version 7 or higher with the Microsoft Entra PowerShell module on all platforms, including Windows, Linux, and macOS.

 

Benefits of Microsoft Entra PowerShell

 

- Focus on usability and quality: Microsoft Entra PowerShell offers human-readable parameters, deliberate parameter set specification, inline documentation, and core PowerShell fundamentals like pipelining.
- Backward compatibility with AzureAD module: Microsoft Entra PowerShell accelerates migration from the recently announced AzureAD module deprecation.
- Flexible and granular authorization: Consistent with Microsoft Graph PowerShell SDK, Microsoft Entra PowerShell enables administrative consent for the permissions you want to grant to the application and supports specifying your own application identity for maximum granularity in app permission assignment. You can also use certificate, Service Principal, or Managed Identity authentication patterns.
- Open source: The Microsoft Entra PowerShell module is open source, allowing contributions from the community to create great PowerShell experiences and share them with everyone. Open source promotes collaboration and facilitates the development of innovative business solutions. You can view Microsoft's customizations and adapt them to meet your needs.

 

Next steps

 

Installation: Install Microsoft Entra PowerShell, which uses the “/v1.0” API version to manage Microsoft Graph resources, from the PowerShell Gallery by running this command:

 

Install-Module Microsoft.Graph.Entra -AllowPrerelease -Repository PSGallery -Force

 

Or install the Beta module, which manages Microsoft Graph resources using the "/beta" API version, by running this command:

 

Install-Module Microsoft.Graph.Entra.Beta -AllowPrerelease -Repository PSGallery -Force

 

Authentication: Use the Connect-Entra command to sign in to Microsoft Entra ID with delegated access (interactive) or application-only access (noninteractive).

 

Connect-Entra -TenantId 'your-tenant-id' -Scopes 'User.Read.All'

 

To see more examples for using your own registered application, Service Principal, Managed Identity, and other authentication methods, see the Connect-Entra command documentation.

 

Find all available commands: You can list all available commands in the Microsoft Entra PowerShell module by using the command:

 

Get-Command -Module Microsoft.Graph.Entra

 

Get Help: The Get-Help command shows detailed information about specific commands, such as syntax, parameters, cmdlet description, and usage examples. For example, to learn more about the Get-EntraUser command, run:

 

Get-Help Get-EntraUser -Full

 

Migrating from AzureAD PowerShell module: You can run your existing AzureAD PowerShell scripts with minimal modifications using Microsoft Entra PowerShell by using the Enable-EntraAzureADAlias command. For example:

 

Import-Module -Name Microsoft.Graph.Entra
Connect-Entra # Replaces Connect-AzureAD for auth
Enable-EntraAzureADAlias # Enable aliasing
Get-AzureADUser -Top 1

 

Frequently Asked Questions (FAQs)

 

What is the difference between the Microsoft Graph PowerShell SDK and Microsoft Entra PowerShell modules?

 

Microsoft Entra PowerShell is a part of our increased investment in Microsoft Graph PowerShell SDK. It brings high-quality and scenario-optimized Entra resource management to the Microsoft Graph PowerShell SDK. Still, it keeps all the benefits of Microsoft Graph PowerShell SDK for authorization, connection management, error handling, and (low-level) API coverage. As Microsoft Entra PowerShell builds on the Microsoft Graph PowerShell SDK, it is completely interoperable.

 

Is the Microsoft Entra PowerShell module compatible with Microsoft Graph PowerShell?

 

Yes. You don't need to switch if you’ve already used the Microsoft Graph PowerShell module. Both modules work well together, and whether you use Entra module cmdlets or Microsoft Graph PowerShell SDK cmdlets for Entra resources is a matter of preference.

 

I need to migrate from the deprecated AzureAD or MSOnline modules. Should I wait for Microsoft Entra PowerShell?

 

No. One of our goals with Microsoft Entra PowerShell is to help you migrate from Azure AD PowerShell more quickly by setting Enable-EntraAzureADAlias. Microsoft Entra PowerShell supports simplified migration for scripts that were using AzureAD PowerShell, with over 98% compatibility. However, the legacy AzureAD and MSOnline PowerShell modules are deprecated and will be retired (stop working) after March 30, 2025. We recommend that you act now to begin migrating your MSOnline and AzureAD PowerShell scripts. 

 

Both modules use the latest Microsoft Graph APIs. For test environments and non-production systems, you can migrate to Microsoft Entra PowerShell. We recommend migrating to this module for production systems only after it reaches general availability. If you migrate scripts to Microsoft Graph PowerShell SDK now, there is no need to update them again with Microsoft Entra PowerShell, as it enhances and will not replace Microsoft Graph PowerShell SDK.

 

Should I update Microsoft Graph PowerShell scripts to Microsoft Entra PowerShell?

 

This is not necessary but a matter of preference. Microsoft Entra PowerShell is part of the Microsoft Graph PowerShell solution, and the two modules are interoperable. You can install both modules side-by-side.

 

Will Microsoft Entra PowerShell add support for more resources in the future?

 

Yes, it is a long-term investment. We will continue to expand support for more resources and scenarios over time. Expect new cmdlets for Privileged Identity Management (PIM), Entitlement Management, Tenant Configuration settings, Per-User multifactor authentication (MFA), and more. We'll also enhance existing cmdlets with additional parameters, detailed help, and intuitive names. Check out the GitHub repo for ongoing updates.

 

Will Microsoft Entra PowerShell use a pre-consented app like AzureAD or MSOnline modules?

 

No. Microsoft Entra PowerShell permissions aren't preauthorized, and users must request the specific app permissions needed. This granularity ensures that the application has only the necessary permissions, providing granular control over resource management. For maximum flexibility and granularity in application permissions, we recommend using your own application identity with Entra PowerShell. By creating different applications for different uses of PowerShell in your tenant, you can have exacting control over application permissions granted for specific scenarios. To use your own application identity with Microsoft Entra PowerShell, you can use the Connect-Entra cmdlet:

 

Connect-Entra -ClientId 'YOUR_APP_ID' -TenantId 'YOUR_TENANT_ID' 

 

I am new to Microsoft Entra PowerShell; where do I start?

 

Explore our public documentation to learn how to install the Microsoft Entra PowerShell module, authenticate, discover which cmdlet to use for a particular scenario, read how-to guides, and more. Our best practice guide will help you start on a secure foundation.

 

How can I provide feedback?

 

You can provide feedback by visiting our GitHub repository issues section. Create a new issue with your feedback, suggestions, or any problems you've encountered. Our team actively monitors and responds to feedback to improve the module. 

 

How can I contribute?

 

We welcome contributions from the community, whether it's through submitting bug reports, suggesting new features, or contributing scenario and example improvements. To get started, visit the GitHub repository, check out our contribution guidelines, and create a pull request with your changes.

 

Learn more about Microsoft Entra PowerShell module

 

Explore our public documentation to learn how to install the Microsoft Entra PowerShell module, the authentication methods available, which cmdlet to use for a particular scenario, how-to guides, and more.

 

Try It Today

 

Try out the new version and let us know what you think on GitHub! Your insights are invaluable as we continue to improve and enhance the module to better meet your needs.

 

Thank you!

 

We want to thank all the community members who helped us improve this release by reporting issues on GitHub during the private preview! Please keep them coming!

 

Steve Mutungi

Product Manager, Microsoft Entra PowerShell

 

 

Read more on this topic

Microsoft Entra PowerShell GitHub repository | GitHub
Microsoft Entra PowerShell documentation | Microsoft Learn

 

Learn more about Microsoft Entra 

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds.

Microsoft Entra News and Insights | Microsoft Security Blog
Microsoft Entra blog | Tech Community
Microsoft Entra documentation | Microsoft Learn
Microsoft Entra discussions | Microsoft Community

1Kosmos BlockID

What are Passkeys and How Do They Work?

Are you still relying on traditional passwords to authenticate users, even as cyber threats grow more sophisticated? This article will examine passkey authentication, a modern digital identity verification and security solution. You’ll learn everything about how passkey authentication works, from its basic principles to its practical applications, along with the advantages and challenges it presents. By the end, you’ll gain valuable insights to determine if passkey authentication is suitable for enhancing your organization’s digital user verification and security.

Traditional Methods vs. Passkey Authentication

While familiar and straightforward, the traditional password system presents numerous security challenges. These challenges include susceptibility to brute-force attacks, where an attacker tries numerous combinations to guess the password, and phishing attacks, where a user is tricked into revealing their password.
Passkeys mitigate these risks effectively. They provide an additional layer of security, or an alternative to passwords altogether, by issuing a new and unique code for every authentication attempt.
Unlike static passwords, which are vulnerable to reuse in multiple services, passkeys are service-specific and time-bound. This dynamic nature makes it significantly more challenging for unauthorized users to access an account.
If an attacker were to gain your static password, they could use it repeatedly until you notice and take action. With passkeys, however, each code is invalidated after use or after the expiration of a predetermined time, thus nullifying this type of attack vector.

Layers of Security: Multi-Factor Authentication

Multi-factor authentication (MFA) significantly enhances account security by requiring two or more independent credentials: what the user knows (password), what the user has (security token or phone), and what the user is (biometric verification).
The first layer is typically the password, which the user sets and knows. The second layer, a passkey, is often generated and stored on a user’s device or within a browser.
Implementing MFA means that even if one layer is compromised, the likelihood of an unauthorized individual gaining access is significantly reduced.
For instance, if your password were to be compromised, an attacker would still need to pass the subsequent layers of authentication, such as the passkey or biometric scan, to access your account.
This multi-layered approach is increasingly considered best practice in cybersecurity to protect against a broad range of attacks and vulnerabilities.
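The layered checks described above can be sketched in a few lines. This is a minimal illustration, not a production scheme: the factor names, helper functions, and the hard-coded iteration count are all assumptions made for the example.

```python
import hashlib
import hmac
import secrets

def verify_password(stored_hash: bytes, salt: bytes, attempt: str) -> bool:
    """First factor: something the user knows."""
    candidate = hashlib.pbkdf2_hmac("sha256", attempt.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_hash)

def verify_device_code(expected: str, submitted: str) -> bool:
    """Second factor: a one-time code delivered to something the user has."""
    return hmac.compare_digest(expected, submitted)

def authenticate(stored_hash, salt, password, expected_code, submitted_code) -> bool:
    # Every factor must pass independently; compromising one is not enough.
    return (verify_password(stored_hash, salt, password)
            and verify_device_code(expected_code, submitted_code))

# Example: enroll a user, then authenticate with both factors.
salt = secrets.token_bytes(16)
stored = hashlib.pbkdf2_hmac("sha256", b"correct horse", salt, 100_000)
code = secrets.token_hex(3)  # one-time code sent to the user's device

assert authenticate(stored, salt, "correct horse", code, code)
assert not authenticate(stored, salt, "wrong guess", code, code)
```

Note how a stolen password alone fails the combined check: the attacker would still need the code delivered to the user's device.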

The Authentication Sequence

When you register with a secure site, a unique passkey is created just for that site: a key pair consisting of a public key, which the site keeps, and a private key, which never leaves your device. Once the passkey is created and registered, it is stored on your device and can be used to access the associated website or app. To log in, simply select the saved passkey and authenticate using your device’s screen lock, such as a fingerprint sensor or facial recognition.
The passkey is a permanent digital credential that replaces traditional usernames and passwords, providing a seamless and secure login experience without the need to enter anything manually each time.
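The challenge-response flow behind that login can be sketched as follows. One loud caveat: real passkeys sign the challenge with an asymmetric private key (the server holds only the public key); here an HMAC secret stands in for the key pair so the sketch runs on the Python standard library alone.

```python
import hashlib
import hmac
import secrets

# Registration: the device generates a credential secret. (With real passkeys
# the server would store only the public half of an asymmetric key pair.)
device_secret = secrets.token_bytes(32)
server_copy = device_secret  # stand-in for the server-held public key

def server_issue_challenge() -> bytes:
    # A fresh random challenge per login attempt prevents replay.
    return secrets.token_bytes(16)

def device_sign(secret: bytes, challenge: bytes) -> bytes:
    # On a real device this happens only after the screen-lock check
    # (fingerprint, face, or PIN) succeeds.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def server_verify(key: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = server_issue_challenge()
response = device_sign(device_secret, challenge)
assert server_verify(server_copy, challenge, response)

# A captured response is useless against the next login's fresh challenge.
assert not server_verify(server_copy, server_issue_challenge(), response)
```

The point of the sketch is the shape of the protocol: the secret itself never travels, only a per-attempt proof derived from it.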

Types of Passkeys

Not all keys look the same, and neither do all passkeys. Some passkeys are a mix of letters and numbers. Others are just numbers but have a time limit, like a game you must beat before time runs out. Some people even use their fingerprints as a passkey.
The type of passkey you use depends on what you’re trying to keep safe. If it’s just an email account, maybe a simple passkey sent through text is enough. But if it’s something like your bank account, you might want more layers, like a fingerprint plus a time-sensitive code.

Encryption and Secure Transmission

Imagine you’re passing a secret note in class. You wouldn’t want anyone else to read it. That’s why you might write it in code or a language only you and your friend understand. This is similar to what encryption does for passkeys.
When a passkey is sent to you, it’s scrambled into a code. This scrambled version is the only thing that gets sent over the internet. Even if someone catches it, like catching your secret note, they can’t read it because it’s still in code. Only your device knows how to unscramble it back into the same passkey you can use.

Advantages of Passkey Authentication

Security Benefits

One significant advantage of using passkeys is that they add an extra layer of safety to your online activities. Passkeys are often temporary, unlike traditional passwords that remain the same until you decide to change them.
They expire after a short time or after you’ve used them, making them useless to anyone who tries to misuse them later. Another advantage over traditional passwords is that because passkeys are generated on the spot and sent to you directly, they are much harder for hackers to predict or intercept.
This makes them a sturdy barrier against unauthorized access to accounts and devices, effectively acting like a changing lock that keeps potential intruders guessing.

User Convenience

While high security is crucial, it shouldn’t come at the cost of making life difficult for the user. Passkeys are often simpler to use than they sound. Typically, they are sent directly to a device you already have on you, like your phone.
There’s no need to memorize complex passwords or carry additional hardware. Moreover, because a new passkey is generated each time, the user is under less pressure to create and remember a strong password. You don’t need to keep track of a complex string of characters; the passkey system handles the complexity.

Scalability and Compatibility

Passkey systems are highly scalable and can easily expand as a business or organization grows. They can be implemented on various platforms and are usually compatible with existing software and hardware. This makes them a flexible option for both small businesses and large enterprises.
Regarding compatibility, many passkey systems can be integrated into existing multi-factor authentication setups. Adding passkey authentication often doesn’t require an overhaul of the user’s current security system, making it a cost-effective upgrade.

Passkey Authentication in Corporate Security

In a business setting, protecting sensitive data is paramount. Passkey authentication is increasingly used to ensure only authorized employees can access crucial company information. Depending on the employee’s role, the passkey system can be customized to provide different access levels.
Moreover, since passkeys are often combined with other forms of authentication, they help create a robust security environment against remote attacks. This makes it difficult for potential intruders to gain unauthorized access to remote servers, safeguarding the company’s intellectual and financial assets.

Use in Financial Transactions

Regarding banking and financial transactions, the stakes for security are incredibly high. Any vulnerability can lead to financial loss or identity theft. This is why many financial institutions turn to passkey technology for user authentication.
Passkeys’ temporary nature and direct delivery to the user’s device make them a robust security measure for online financial activities. The use of passkeys as authentication factors can also streamline the user experience.
For instance, during a financial transaction, a passkey can be quickly sent to the user’s device, reducing the steps needed to verify the user’s identity and making the process more efficient while maintaining high-security standards.

Public and Government Sector Adoption

The need for secure passkey authentication isn’t limited to corporations and financial institutions.
Government agencies and public services also handle sensitive information and require high levels of security. Passkey authentication offers a feasible and efficient solution for these sectors.
In many cases, government systems have to be accessible to the public for services like tax filing, voting, or accessing personal records.
Passkey authentication provides a user-friendly yet secure method of facilitating these interactions, maintaining the integrity of data in the system while ensuring ease of use for the general population.

Security Vulnerabilities

While passkey authentication provides significant advantages, it is not entirely foolproof. For instance, if the device receiving the passkey is compromised, the integrity of the user’s entire system could be at risk. Malware or keyloggers on a smartphone could potentially intercept passkeys, rendering the added layer of security ineffective.
There is also the possibility of ‘man-in-the-middle’ attacks, where an unauthorized entity intercepts the passkey during transmission. While encryption methods are generally strong enough to withstand such attempts, the risk of a data breach, although low, still exists.

Usability Concerns

The effectiveness of a security system also depends on its ease of use. If users find a system cumbersome or confusing, they might avoid using it, compromising security.
For instance, older adults or people not comfortable with technology might find the login process of receiving and inputting a passkey too complex, leading to resistance to adopting this security method.
Also, the dependence on a single device, like a mobile phone, to receive a passkey can be problematic. If the phone is lost, broken, or out of battery, the user could be locked out of essential services until the issue is resolved.

Cost and Complexity of Implementation

Switching to a passkey or passwordless-only authentication system is not always straightforward. It requires investment in the technology and could necessitate updates or changes to existing systems.
This can be a significant challenge for small businesses or organizations with limited resources. A new system will often require training for the staff who manage it, as well as user education. This entails additional costs and effort, and the transition period could pose temporary but disruptive challenges.

Implementation Best Practices

Choosing the Right Passkey Tool

The first step in implementing passkey authentication is choosing the system that best fits the needs of the organization or service. Factors to consider include the level of security required, ease of use for the target audience, and compatibility with existing systems. Additionally, it’s advisable to opt for systems that offer strong encryption and reliable delivery methods for users’ passkeys.

Technical Requirements

Before deployment, organizations must ensure their infrastructure is compatible with the new passkey authentication system. This may require hardware and software upgrades or the integration of new modules into the current systems. A well-defined technical requirement will facilitate a smoother transition and mitigate potential challenges.

Training and Awareness Programs

Security measures are most effective when users understand how and why they work. Training programs should be initiated to educate users and administrators about the new system.
This can range from simple guides and FAQs to more intensive training sessions for staff responsible for system management. Ongoing awareness programs can help maintain high levels of compliance and effectiveness.

Technological Advancements on the Horizon

Passkey authentication will likely evolve with advancements in biometric and sensor technology. Future systems might incorporate additional biometric or behavioral traits as further authentication factors alongside passkeys. There’s also potential for integrating artificial intelligence to detect unauthorized activity more efficiently, making the system even more secure.

Integrating with Other Authentication Methods

As public key cryptography evolves, passkey authentication is expected to become more sophisticated and may integrate seamlessly with biometric data and other authentication methods. Combining passkeys with biometrics or hardware tokens can create an even more robust security landscape, offering both high security and user convenience.

Throughout this discussion, we have unpacked the complexities of passkey authentication, shedding light on its benefits, technical implementations, and security best practices. As digital threats evolve, there’s a growing need for sophisticated yet easy-to-use authentication methods like passkey authentication. To take your organization’s digital security to the next level, book a call with our team to discover how the 1Kosmos platform can be tailored to your needs.

The post What are Passkeys and How Do They Work? appeared first on 1Kosmos.


Northern Block

Essential Tips for Designing Decentralized Ecosystems (with Antti Kettunen)

Explore the challenges of designing decentralized ecosystems with expert Antti Kettunen. Learn strategies for balancing incentives and creating sustainable value. The post Essential Tips for Designing Decentralized Ecosystems (with Antti Kettunen) appeared first on Northern Block | Self Sovereign Identity Solution Provider.

🎥 Watch this Episode on YouTube 🎥
🎧   Listen to this Episode On Spotify   🎧
🎧   Listen to this Episode On Apple Podcasts   🎧

About Podcast Episode

Are you struggling to design and implement decentralized digital trust ecosystems that actually work?

In this episode of The SSI Orbit Podcast, host Mathieu Glaude sits down with Ecosystem Architect Antti Kettunen to unpack the challenges of adoption and value creation in decentralized systems. Antti shares his insights on moving beyond technology-focused conversations and instead building the right incentive models and business value propositions. He explains why simply selling decentralized identity as a technology solution is not enough—ecosystems need to be designed holistically with all stakeholders in mind.

Some of the valuable topics discussed include:

✨ Contrasting value chains in centralized vs decentralized ecosystems
✨ The importance of coordinated governance in decentralized systems
✨ Strategies for Mapping out ecosystem value propositions
✨ Frameworks for analyzing incentives across different participants

Whether you’re in the process of building a decentralized identity solution or embarking on a digital transformation journey for an existing ecosystem, this episode is a goldmine of practical insights. It offers essential guidance on how to create sustainable value for all parties involved. Tune in to gain actionable knowledge that will empower you to design decentralized ecosystems that drive adoption.


Key Insights:

- Decentralized ecosystems have disconnected value chains between issuance and presentation of credentials, making it challenging to create value for all parties
- Ecosystem design needs to consider business value chains and governance, not just technology
- Payments and transactions can be enablers for ecosystem adoption, but cannot be easily disrupted
- Public/open data layers may be needed to enable private transactions in an ecosystem
- Understanding and balancing incentives across all ecosystem participants is crucial for success

Strategies:

- Take a high-level view and start with those who get the most value/have the biggest problems in the ecosystem
- Map out benefits for different participants and identify win-win opportunities
- Design governance models and liability frameworks to enable trust between parties
- Use tools like Ron Adner’s “The Wide Lens” to analyze ecosystem value propositions
- Think as an “ecosystem architect” to understand business, technology and governance holistically

Chapters:

00:00 – Value chains for decentralized ecosystems
08:27 – Are Wallets the new platforms?
14:07 – Are Payments a key enabler for decentralized trust ecosystems?
25:25 – Will our identifiers ultimately be controlled by Apple/Google and Government wallets?
31:27 – Frameworks to enable ecosystems for decentralized data exchanges
44:17 – The need for a public data layer within ecosystems to build trust and enable trusted interactions
50:57 – Understanding ecosystem member incentives in order to increase success rate

Additional resources:

- Episode Transcript
- The adoption challenges of wallets & decentralized ecosystems
- EU Digital Identity Wallet (EUDI)
- PHEMS.eu
- Ron Adner, The Wide Lens & Winning the Right Game

About Guests

Antti Kettunen is a Lead Consultant and Enterprise Architect at Tietoevry, specializing in digital identity and trust. He designs ecosystem solutions for organizations and governments, actively contributes to industry standards, and shares insights through his blog, Identifinity. As a co-founder of FindyNet cooperative, Finland’s digital identity network, Antti combines technical expertise with strategic vision to drive innovation in decentralized systems.
LinkedIn

  The post Essential Tips for Designing Decentralized Ecosystems (with Antti Kettunen) appeared first on Northern Block | Self Sovereign Identity Solution Provider.



Holochain

Regenerative Investing

Finances for a Better Future

You have a problem. You see the looming effects of climate change and want to ensure a livable future for your children, but the money you are investing to support your children when they grow up is in mutual funds that likely support fossil fuel companies, arms manufacturers, and others directly responsible for the destruction of our climate. 

The money that you hope will give your children a future is also actively destroying that future.

There is a new push into ethical funds which are meant to avoid these pitfalls, but these funds are limited in their investment opportunities and the tools for you to audit their activities are in their infancy. This article seeks to expand perspectives on better investing.

What if you could invest in the environment, in efforts for climate justice, in carbon markets, and systemic change? What if you could invest in a future that you and your children want to live in? 


Let me introduce the concept of Regenerative Economics (Regen). The idea is to account for all the costs and values that are missed by traditional economics. That means both social effects, and externalities like carbon emissions.

If a marsh is drained to allow for a new property development, there is a loss of habitat for important species, there is a release of stored carbon and also a loss of future carbon sequestering and oxygen producing capacity by all of the plants and algae, and then there is a loss of the recreation space which is enjoyed by people who live in the area. And all of this before we account for the carbon burned by the construction of the shopping centers and roads and homes that replace the original habitat. These are material and social costs that are paid by local residents, future generations, and the ecosystem as a whole. There is also an economic benefit to the developers, investors, and community members who use the services and spaces of the new development. These benefits and costs both matter, but our current economy only has methods to account for the expenses and value of the development, forgetting the value of the marsh and the costs of its loss. Sometimes environmental offsets are bought to compensate, but the specificities are largely unaccounted for. Instead, those socio-ecological losses are termed “externalities” and considered outside the system.

But, you and I both know that these losses are not external to our lives. The loss of a local wetland, or our favorite childhood hike, or the salmon runs that bring life to our local rivers all impact us and our families for generations.

What we need are new ways of expressing the very real value of these resources so that we can bring them into the system, investing in their preservation and growth. What we need is for our economies to not be at odds with our ecosystems.

Case Studies

Two leading projects in the Holochain Ecosystem are attempting to solve pieces of this conundrum. One, Kwaxala, is building financial tools for investing in the environments we want to preserve; and the other, IOEN (the Internet of Energy Network), is connecting global finance to the local renewables that need to be developed. Together, they represent the beginning of a movement for ethical investment and regenerative economics.

Let’s start with IOEN. 

IOEN

Renewable energy is the future. If our civilization has a future, we will have to shift to more sustainable methods for meeting our energy needs. That much is clear. It’s also clear that fossil fuel companies are resistant to this change, and that large solar and wind farms can only be one part of the solution. Small scale local energy production will be a large portion of the global market. Minigrids and Microgrids are emerging as solutions that are more efficient and resilient as they don’t need the massive infrastructure of energy distribution on top of the infrastructure of production. Energy is generated, stored, traded, and used locally.

The pathway to fully clean, tradable energy is not immediate. On the way to full access to clean energy, organizations including large corporations are taking seriously the responsibility to offset their production of carbon emissions. A factory produces a lot of carbon, both from the energy it uses and the industrial processes themselves. They can get their electricity from renewable sources, but depending on where a particular factory is and when they are operating, they might not have access to renewables. Renewable energy generation is subject to location, geography, and access to resources like wind, enough sunshine, and the ability to store electricity.


But, just because they are incapable of directly using renewable energy doesn’t mean that they cannot support the use of renewables generally. Environmental Attribute Credits (EACs), including carbon credits and Renewable Energy Certificates (RECs), help create incentives to invest, support, and use renewable energy.

Currently, large renewable energy farms can sell carbon credits or RECs (depending on the system used), helping corporations offset their carbon emissions. But these are economies of scale. The accounting needed to get government issuance of these carbon credits only works at large scales (a single electrical meter is a lot easier to measure and monitor than many distributed meters). IOEN is providing a way for small scale energy producers who serve their local communities to aggregate their energy production and access global renewable certificate markets.

In order to ensure auditability and to account for distributed energy production by a collection of small holders, data is needed to track energy production. IOEN’s current work with partners in Asia is helping companies and home owners provide two types of data. First, photos of their solar arrays, preferably at the time of installation. Secondly, ongoing readings from their meter, showing the energy that they feed back to their local grid. IOEN’s partners then add satellite data which is used to confirm the installation of solar panels at the sites claimed. All of this data becomes available to regulators who can cross reference each source, ensuring that the meter readings match the expected generation range for the panels. 

Aggregating the data of many individual producers allows IOEN to meet the scale requirements to work with government regulators, procuring and then selling the certificates to corporations. As each individual energy producer’s contributions to this larger pool is tracked, they are able to then be compensated directly in relation to the amount of energy they produced.
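The pro-rata accounting described above is simple to sketch. All figures here are hypothetical: the producer names, meter readings, certificate size, and price are illustrative, not IOEN data.

```python
# Hypothetical aggregated meter readings from small producers (kWh).
meter_readings_kwh = {
    "rooftop-a": 420.0,
    "rooftop-b": 180.0,
    "community-c": 600.0,
}

CERT_SIZE_KWH = 1000.0   # assume one certificate per MWh generated
PRICE_PER_CERT = 50.0    # assumed sale price of one certificate

# Pooling the readings lets the group clear the scale threshold
# that no single producer could meet alone.
total_kwh = sum(meter_readings_kwh.values())
certificates = total_kwh / CERT_SIZE_KWH
revenue = certificates * PRICE_PER_CERT

# Each producer is paid in proportion to the energy they contributed.
payouts = {
    producer: revenue * kwh / total_kwh
    for producer, kwh in meter_readings_kwh.items()
}
```

With these made-up numbers the pool of 1,200 kWh yields 1.2 certificates, and each producer's payout tracks their share of the total generation.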

Key to IOEN’s value in the market is the ability for clients to store the data (meter data, photographs etc.) in private networks hosted on Holochain which provides auditors with assurance that the certificates minted are validated and true. This creates transparency and integrity in the certificate. IOEN’s blockchain solutions that sit on top of that data then allow secondary markets to emerge, along with fractionalised investment opportunities in these renewable assets, creating liquidity. Because transactions are all tracked on the blockchain, and linked to secure data in Holochain, different organizations involved in the assets all can be paid seamlessly and transparently through fractionalised payments. Already small businesses, communities, and large corporations are seeing the value in this approach for a number of energy-related environmental assets.

Kwaxala

Kwaxala is an indigenous forest regeneration cooperative taking a different approach to nature protection that goes far beyond carbon credits.

Rather than focusing on energy production, they are concerned with the preservation and stewardship of what we already have. Remember that marsh? We agreed that it has a value. And the question is: how can we invest in that value? Kwaxala is making that a possibility. Most land used for logging, oil drilling and fracking, and other resource extraction is government owned and often in contradiction with recognised Indigenous land rights in the area. Companies own or buy the permits and right to extract resources from that land, without buying the land itself. Kwaxala is working to buy those same rights, but rather than using them for extraction, they are securing the right to protect and regenerate the ecosystems threatened by extraction. This legal guarantee of preservation, in an area previously at genuine threat of extraction, is then used to generate carbon offsets each year which can be sold to corporations meeting their net zero commitments.


Kwaxala offsets not only represent sequestered carbon, but also a wider range of monetized ecoservices such as biodiversity protection and ecosystem preservation alongside positive social outcomes such as Indigenous equity and reconciliation. This enables them to command a premium market price per tonne. Additionally, by enabling capital investment directly into the organization holding the right to regenerate, they provide a new way for investment funds to flow into the asset value of the protected natural ecosystem, generating much needed upfront capital to establish these protected areas and counterbalance the extractive economic pressure to destroy the area.

But how do we ensure that the land is well taken care of? Here Kwaxala is using overlapping datasets to build a picture of the ecosystem and its health. Working with Indigenous land stewards, who are also the principal shareholders in Kwaxala itself, they are developing the metrics needed to monitor and support the wellbeing of the particular ecosystems. For some situations, that might be the water and air quality; for others, it might be the presence of indicator species; for many, it will be the conglomeration of multiple data points, collaborating to tell a story. This rich dataset then provides detailed provenance for both the particular offsets generated by the protected ecosystems and for the long term investments in the ecosystems themselves so that there is detailed visibility into the health and stewardship of the ecosystem.

To collect, manage, and maintain this data — and to share it with community members, government regulators, and investors — they are investigating the development of a distributed platform, built on Holochain. Building on Holochain in this case ensures that control of the data isn’t managed solely by the organization claiming to be protecting the ecosystem, but is rather distributed between all stakeholders. With data provenance baked in, it is also easy to track exactly what agent adds any particular piece of data. This combination of verification and distributed storage of overlapping datasets allows for a high degree of trust to be developed, and holds the organization itself accountable.
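The agent-level provenance described above can be sketched as a shared append-only log of signed observations. This is a conceptual stand-in only: Holochain's real source chains use asymmetric signatures and DHT validation, whereas HMAC keys and the field names here are assumptions made for the example.

```python
import hashlib
import hmac
import json

# Illustrative agent keys; real systems would use per-agent key pairs.
agent_keys = {
    "steward-1": b"key-steward-1",
    "regulator": b"key-regulator",
}

ledger = []  # shared, append-only record of observations

def record(agent: str, observation: dict) -> None:
    """Append an observation tagged with the recording agent's key."""
    payload = json.dumps(observation, sort_keys=True).encode()
    tag = hmac.new(agent_keys[agent], payload, hashlib.sha256).hexdigest()
    ledger.append({"agent": agent, "observation": observation, "tag": tag})

def verify(entry: dict) -> bool:
    """Anyone holding the agent keys can check who recorded what."""
    payload = json.dumps(entry["observation"], sort_keys=True).encode()
    expected = hmac.new(agent_keys[entry["agent"]], payload,
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, entry["tag"])

record("steward-1", {"site": "marsh-7", "metric": "water_ph", "value": 6.8})
assert all(verify(e) for e in ledger)

# Tampering with an observation after the fact breaks verification.
ledger[0]["observation"]["value"] = 9.9
assert not verify(ledger[0])
```

The useful property is the same one the article relies on: every data point is bound to the agent who added it, so no single stakeholder can silently rewrite the record.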

Going Long on the Climate

In his book The Ministry for the Future, Kim Stanley Robinson introduces the idea of going long on the climate — basically, investing in our futures. As popular opinion shifts to focus on the climate crisis, we can expect the economic value of climate mitigation strategies to rise. One of the biggest issues to date in the space is a lack of reliable investment opportunities. With the capacities for data provenance that Holochain provides, investors can have more transparent access to the assets they are investing in.

There isn’t a one-size-fits-all approach here. In comparison to the global replication of industry, climate mitigation is about economies of specificity, because ecosystems are specific to their place and relationships.

Holochain is designed so individual applications can be customized to their particular context, while still being compatible with each other. This gives the flexibility needed for hyper-specific economies to interact with the global market in accountable ways. Add to this the innovative accounting methods being developed on the framework, and we start to see the potential for a new regenerative economics to be developed. Based not on extraction, but rather the regenerative value of all ecosystem activities.

Early Entry into a New Economy

The projects outlined above are only first steps towards solving your original problem. You want a way to invest in a future for yourself and your children. A future not only rich in a financial sense, but also an experiential one. Quality of life hinges on the environment and the wellbeing of our society. 

There won’t be any single bulletproof investing strategy; it’s going to take a multiplicity of tactics to address climate change. Looking at these pilots, we feel that the best way to diversify and to be an early investor in the regenerative economy is to go out and build it. We are looking for more projects innovating in this space. It’s going to take new business structures, financial tools, and community initiatives to shift the landscape and to build a thriving economy. But let’s thrive together, building on real tangible things, building on relationships with the nature we cherish.


Ontology

Ontology MainNet

6th Anniversary Celebrations

Ontology Network, launched in June 2018, has evolved from a blockchain platform focused on digital identity and data solutions to a comprehensive Web3 infrastructure. Over the past six years, Ontology has achieved significant milestones, including the launch of its MainNet, integration of layer-2 technology, and the introduction of an Ethereum Virtual Machine, while expanding its ecosystem through strategic partnerships and community growth. Let’s take a journey through some of the key highlights of Ontology’s evolution.

MainNet and TestNet Launches

The Ontology MainNet has undergone several significant changes and upgrades since its inception:

· 2018: Ontology MainNet version 1.0 officially launched, marking the beginning of Ontology’s independent blockchain. This launch included the implementation of a unique dual-token system consisting of ONT and ONG, as well as tools to support developers in building dApps and smart contracts.

· 2019: Ontology introduced sharding technology and launched its cross-chain TestNet, enhancing the network’s scalability and interoperability.

· 2020: Ontology MainNet 2.0 was released, integrating layer-2 technology to improve transaction speed and efficiency.

· 2021: The Ontology Ethereum Virtual Machine (EVM) test network was launched. This significant update allowed Ontology to support four types of smart contracts — Native, NeoVM, WasmVM, and EVM — making it one of the most versatile public-chain platforms in terms of development languages and virtual machines supported.

· 2022: The project’s commitment to expansion is further evidenced by its $10 million EVM fund, aimed at supporting global developers and fostering growth within the Ontology ecosystem.

· 2023: Ontology introduced liquid staking, represented by the stONT token. This significant update allows ONT token holders to maintain liquidity while participating in network consensus. The stONT token accumulates rewards by growing in value relative to ONG, providing users with more flexibility in managing their staked assets.
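The stONT mechanics described above follow the exchange-rate model commonly used by liquid staking tokens: depositors receive shares, and rewards added to the pool raise the value of each share rather than minting new ones. Ontology's exact implementation is not shown here; a minimal illustrative sketch (all names and numbers are invented):

```python
class LiquidStakingPool:
    """Illustrative exchange-rate model for a liquid staking token:
    stakers receive pool shares, and rewards added to the pool raise
    the value of each share relative to the underlying asset."""

    def __init__(self):
        self.total_staked = 0.0   # underlying tokens held by the pool
        self.total_shares = 0.0   # liquid staking tokens in circulation

    def rate(self) -> float:
        # underlying tokens per share; starts at 1.0
        return self.total_staked / self.total_shares if self.total_shares else 1.0

    def deposit(self, amount: float) -> float:
        shares = amount / self.rate()
        self.total_staked += amount
        self.total_shares += shares
        return shares

    def add_rewards(self, amount: float):
        # rewards grow the pool without minting shares, so the rate rises
        self.total_staked += amount

    def redeem(self, shares: float) -> float:
        amount = shares * self.rate()
        self.total_staked -= amount
        self.total_shares -= shares
        return amount

pool = LiquidStakingPool()
shares = pool.deposit(100.0)   # 100 shares at an initial rate of 1.0
pool.add_rewards(10.0)         # pool grows to 110, rate becomes 1.1
assert abs(pool.redeem(shares) - 110.0) < 1e-9
```

Because the shares themselves never change hands during reward accrual, holders can trade or use them elsewhere while their staked position keeps compounding, which is the liquidity benefit the stONT design targets.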

The consistent updates demonstrate Ontology’s commitment to regular and substantial improvements to its network infrastructure. These launches have progressively enhanced Ontology’s capabilities, from basic blockchain functionality to advanced features like sharding, cross-chain compatibility, and EVM integration.

In terms of staking, Ontology implements a proof-of-stake consensus mechanism. The network has gained popularity among node operators of various sizes and maintains an active staking community. Stakers can earn ONG as rewards for participating in network consensus and governance.

Slash Mechanism for Consensus Nodes

To further enhance the security and stability of the blockchain, Ontology is introducing a Slash mechanism for Consensus Nodes. This new feature ensures that the network operates in a safer and more stable manner by penalizing nodes that act maliciously or fail to perform their duties properly. This addition strengthens Ontology’s commitment to maintaining a robust and reliable blockchain infrastructure.

TestNet Faucet: Empowering Developers

As part of our continuous effort to support developers, Bwarelabs now provides faucets for ONT and ONG test tokens on Ontology and Ontology EVM. This initiative aims to lower the barriers for developers, making it easier to test and deploy their applications on the Ontology MainNet. By offering easy access to test tokens, we hope to foster innovation and growth within our developer community.

Developer Relations: Breaking Barriers

In collaboration with StackUp, Ontology has launched a campaign focused on removing barriers for the developer community. This initiative aims to provide comprehensive support and resources, ensuring that developers have everything they need to succeed in building on the Ontology MainNet. Together, we are committed to creating an inclusive and supportive environment that empowers developers to innovate and thrive.

Sneak Preview of Our Giveaway Campaigns

To celebrate our 6th anniversary, we’re excited to announce some amazing giveaway campaigns!

Stake2Earn — Giveaway to Ontology Stakers: Win a share of 3000 ONG! Simply stake your ONT to participate and stand a chance to earn ONG rewards.

Win a Stake — Grow Your Node with Ontology: Get a boost for your node with 260,000 ONT to be staked. This is a fantastic opportunity for node operators to increase their stakes and contribute to the network’s security.

Anniversary Celebration: Win a share of 200 ONG in our special anniversary giveaway. Join the celebration and earn ONG rewards as we mark this milestone together.

Celebrating Our Community

As we celebrate our 6th anniversary, we want to extend our heartfelt gratitude to our incredible community. Your support and engagement have been instrumental in our journey. Here’s to many more years of innovation, growth, and success together!

Stay tuned for more exciting updates and participate in our giveaway campaigns to celebrate this special occasion with us!

Ontology MainNet was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Tokeny Solutions

Tokenized Securities Unaffected by MiCA, Utility Tokens and Stablecoins Face Stricter Rules

June 2024

As the EU’s Markets in Crypto Assets (MiCA) regulation is set to become partially applicable on 30th June 2024, it is crucial to understand its implications for various types of crypto-assets. While tokenized securities remain largely unaffected by MiCA, utility tokens and stablecoins will face significantly stricter regulatory requirements.

MiCA introduces comprehensive measures to regulate the crypto-asset market, providing much-needed clarity. However, it’s important to note that MiCA does not apply to crypto-assets qualifying as financial instruments, such as transferable securities. These assets continue to fall under the existing regulatory framework of MiFID II (Markets in Financial Instruments Directive II). This provides clear guidance to securities issuers.

In contrast, MiCA imposes significant new requirements on utility tokens and stablecoins, fundamentally changing how these assets can be issued and managed. Most cash equivalent tokens currently use the ERC-20 token standard, which lacks owner identification capabilities. Under MiCA, issuers might need to consider a transition to the ERC-3643 token standard to issue permissioned tokens, enabling the necessary compliance features such as owner identification and additional controls.
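ERC-3643's real Solidity interface (on-chain identity registry, compliance modules) is considerably more elaborate than this, but the core difference from a plain ERC-20 token can be sketched in a few lines: transfers succeed only between verified identities. All names below are illustrative, not the standard's actual API:

```python
class PermissionedToken:
    """Minimal sketch of a permissioned token: transfers succeed only
    when both parties appear in an identity registry, unlike a plain
    ERC-20 token, which moves funds between anonymous addresses."""

    def __init__(self, identity_registry: set):
        self.registry = identity_registry   # verified wallet addresses
        self.balances = {}

    def mint(self, to: str, amount: int):
        if to not in self.registry:
            raise PermissionError(f"{to} is not a verified identity")
        self.balances[to] = self.balances.get(to, 0) + amount

    def transfer(self, sender: str, recipient: str, amount: int):
        # Compliance check runs before any funds move
        if sender not in self.registry or recipient not in self.registry:
            raise PermissionError("both parties must be verified identities")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

registry = {"alice", "bob"}           # identities verified off-chain (e.g. KYC)
token = PermissionedToken(registry)
token.mint("alice", 100)
token.transfer("alice", "bob", 40)    # allowed: both parties verified
# token.transfer("bob", "mallory", 1) would raise PermissionError
```

The design point is that owner identification and transfer controls live inside the token contract itself, which is what lets an issuer enforce MiCA-style compliance on-chain rather than relying solely on off-chain gatekeeping.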

In addition, providers of services linked to crypto-assets may have to obtain a Crypto-Asset Service Provider (CASP) license. Obtaining such a license would involve considerable financial and time investments. To some extent, the complexity and costly legal processes could make issuing security tokens easier than issuing utility tokens and stablecoins under MiCA.

Despite these challenges, the stricter but clear regulations present opportunities for large institutions such as banks to tokenize cash, providing institutionally acceptable money onchain, which is missing today. They are likely to issue permissioned cash coins to enforce compliance, as we discussed in the previous newsletter “Institutional RWA Tokenization Needs Permissioned Cash Coins”.

At Tokeny, we are well-positioned to leverage these opportunities. Our technology enables both payment coins and securities issuers to identify token holders, track ownership, and enforce automated compliance onchain, which is increasingly crucial as regulations tighten.

This is a historic moment for Europe to position itself as the hub of institutional tokenization, and we are excited for what’s coming!

Upcoming Webinar

Discover the benefits, challenges, implementation, and future prospects of tokenized bonds.

Register Now Tokeny Spotlight

INTERVIEW

CCO, Daniel Coheur, did an interview with GDF during Digital Assets Week.

Watch Here

EU COMMISSION

COO, Mathieu Cottin, was invited to join a tokenization workshop.

Read More

NEW TALENT

We welcomed Satjapong to our team as DevSecOps Engineer.

Read More

MENTION

Deloitte mentioned Tokeny & ERC3643 in their publication on DLT & Capital Markets.

Read More

EVENT

Our Head of Marketing, Shurong Li, attended Proof of Talk in Paris.

Read More

TALENT INTERVIEW

Our Senior System Test & Certification Engineer, Cristian, shares his views.

Read More Tokeny Events

ETHCC
July  9th, 2024 | 🇧🇪 Belgium

Register Now

Token2049 
September  18th – 19th, 2024 | 🇸🇬 Singapore

Register Now

Digital Assets Week Singapore
November  11th – 12th, 2024 | 🇸🇬 Singapore

Register Now

Web3 Leaders Forum 
July  10th, 2024 | 🇧🇪 Belgium

Register Now

Apex Invest
October  9th – 10th, 2024 | 🇺🇸 USA

Register Now

Apex Invest
November  18th – 19th, 2024 | 🇦🇪 UAE

Register Now ERC3643 Association Recap

Member Perspectives 

Leo Chen, from member company Tokeny, takes us through the transformative 6-year development of the ERC-3643.

Learn more here

Feature

The World Federation of Exchanges mentions the ERC-3643 as a common standard for asset tokenization.

Read more

Meetup at Consensus 

Twelve of our members attended the Consensus conference, where they had the opportunity to meet up, share insights, and strengthen their connections.

Learn more here

Subscribe Newsletter

A monthly newsletter designed to give you an overview of the key developments across the asset tokenization industry.


The post Tokenized Securities Unaffected by MiCA, Utility Tokens and Stablecoins Face Stricter Rules first appeared on Tokeny.



Ockto

The adoption of the digital identity wallet in the Netherlands

Digital identity wallets seem poised to revolutionize how we identify ourselves and share data, both online and offline. At the same time, they raise a host of questions.


PingTalk

Be the IAM Hero: How to Optimize Your Investment with Ping

Read this blog to learn how to make the most of your current Ping IAM investments.

Hey there, IAM champion! We know your IAM ecosystem is vast and varied, with many players involved. From IBM and Microsoft to HR systems and CRMs, your Ping IAM stack is integrated with countless solutions from different vendors. Managing all these connections can be overwhelming, and the IAM landscape often feels fragmented.

Thursday, 27. June 2024

KuppingerCole

The Impact of AI and LLMs - The Future of Digital Identity with Patrick Parker


In this interview series, our Analysts talked to Identity Experts about the Future of Digital Identity. Patrick Parker focuses on the impact of AI and LLMs on identity governance and administration. He and Warwick discuss the shift from static user interfaces to conversational user interfaces and the role of autonomous agents in performing specific tasks. The implications for business operations and security are also examined, highlighting the potential for small companies to compete with larger corporations and the need for fine-grained external dynamic authorization.




Asking Good Questions About AI Integration in Your Organization


The integration of AI poses both unprecedented opportunities and challenges for organizations. Our webinar, "Asking the Right Questions: Navigating AI Integration in Enterprise Security," addresses the pressing need for CISOs, CIOs, and other information risk management professionals to navigate the complexities of AI adoption effectively.

Led by John Tolbert, Cybersecurity Director at KuppingerCole Analysts, Dr. Scott David, LL.M., Executive Director - Information Risk and Synthetic Intelligence Research Initiative at the University of Washington – APL, and Matthias Reinwarth, Director Practice IAM, Senior Analyst at KuppingerCole Analysts, this session offers a deep dive into the pivotal role of asking good questions in guiding organizations through the maze of emerging AI risks. 

This trio’s expertise sheds light on the growing responsibility of CISOs in coordinating information risk mitigation, particularly in the realm of AI-amplified risks. Through a questions-based approach, attendees will gain insights into strategies for fostering cross-departmental coordination and shaping discussions across the enterprise. The webinar series aims to equip participants with practical tools to address AI-related concerns across various organizational functions, fostering interoperability and revealing best practices amidst the current lack of established standards.  

· Understand the pivotal role of CISOs in navigating AI-driven information risks.

· Learn strategies for coordinating resources and attention across organizational divisions.

· Gain insights into asking good questions to guide discussions on AI integration with legal requirements.

· Explore the impact of AI on data protection, privacy, liability, and regulatory compliance.

· Discover practical approaches for mitigating AI-related risks and fostering cost-efficient adoption.


Shyft Network

FATF Travel Rule Compliance Guide for Gibraltar

The FATF Travel Rule in Gibraltar applies to all cryptocurrency transactions above EUR 1,000. VASPs must register with the GFSC, verify identities, and perform enhanced due diligence for high-risk transactions. VASPs must collect additional information for transactions involving unhosted wallets to ensure thorough verification and monitoring.

In Gibraltar, the FATF Travel Rule, also informally called the Crypto Travel Rule, came into effect with the implementation of the Proceeds of Crime Act 2015 (Transfer of Virtual Assets) Regulations 2021 (POCA) on March 22, 2021. However, virtual asset service providers (VASPs) were given an 18-month grace period, making full compliance mandatory only by September 22, 2022.

Key Features of the FATF Travel Rule in Gibraltar

The FATF Travel Rule in Gibraltar requires VASPs to share detailed transaction-related personal information for all virtual asset transfers above EUR 1,000. This information includes the full names, wallet addresses, and other identifying information of both the originator and the beneficiary.

Compliance Requirements

VASPs in Gibraltar must comply with several key requirements.

· Registration and Licensing: VASPs must obtain a DLT Provider License from the Gibraltar Financial Services Commission (GFSC). The GFSC provides a structured process for applying for authorization, which includes pre-application engagement and an initial application assessment.

· Verification of Identities: VASPs must verify the identities of their counterparties, especially when dealing with high-risk jurisdictions. The aim behind this measure is to ensure compliance with anti-money laundering (AML) and counter-terrorism financing (CTF) regulations.

· Enhanced Due Diligence: For transactions involving unhosted wallets, VASPs are required to collect additional information such as the originator’s name, physical address, national identity number, customer identification number, or date and place of birth. Through enhanced due diligence, the GFSC seeks to mitigate the risks associated with unhosted wallets and ensure thorough verification and monitoring.

For Originators, it includes:

· Full name

· Wallet address

· One of the following: physical address, national identity number, customer identification number, or date and place of birth

For Beneficiaries:

· Name

· Wallet address

If the required Travel Rule data is missing, incomplete, or delayed, the beneficiary VASP must implement risk-based procedures to address these gaps, potentially suspending or rejecting the transfer until compliance is achieved.
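As a sketch of how a beneficiary VASP might detect such gaps programmatically, the field requirements above can be checked against an incoming transfer. The field names are illustrative, not a formal Travel Rule message schema:

```python
# Originators must supply at least one of these identifiers (illustrative names)
REQUIRED_ORIGINATOR_ONE_OF = {
    "physical_address", "national_identity_number",
    "customer_identification_number", "date_and_place_of_birth",
}
THRESHOLD_EUR = 1000  # Travel Rule applies above EUR 1,000 in Gibraltar

def validate_transfer(payload: dict) -> list:
    """Return a list of compliance gaps for a transfer above the threshold.
    An empty list means the Travel Rule data set is complete."""
    gaps = []
    if payload.get("amount_eur", 0) <= THRESHOLD_EUR:
        return gaps  # below threshold: Travel Rule data not required
    orig = payload.get("originator", {})
    bene = payload.get("beneficiary", {})
    if not orig.get("full_name"):
        gaps.append("originator full name missing")
    if not orig.get("wallet_address"):
        gaps.append("originator wallet address missing")
    if not any(orig.get(f) for f in REQUIRED_ORIGINATOR_ONE_OF):
        gaps.append("originator needs one of: address, national ID, "
                    "customer ID, or date and place of birth")
    if not bene.get("name"):
        gaps.append("beneficiary name missing")
    if not bene.get("wallet_address"):
        gaps.append("beneficiary wallet address missing")
    return gaps

transfer = {
    "amount_eur": 2500,
    "originator": {"full_name": "A. Sender", "wallet_address": "0xabc",
                   "national_identity_number": "X123"},
    "beneficiary": {"name": "B. Receiver", "wallet_address": "0xdef"},
}
assert validate_transfer(transfer) == []  # complete: nothing to suspend over
```

A non-empty result would feed the risk-based procedures described above, for example suspending or rejecting the transfer until the missing fields arrive.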

Impact on Cryptocurrency Exchanges and Wallets

VASPs in Gibraltar, including cryptocurrency exchanges and wallet providers, must adhere to the FATF Travel Rule under the supervision of the GFSC. This regulation is crucial for preventing money laundering and terrorism financing.

VASPs must monitor all crypto transfers. Transactions involving unhosted wallets require additional information gathering, and high-risk transactions call for enhanced monitoring. VASPs must also develop risk-based policies for managing transfers involving unhosted wallets to ensure thorough verification and monitoring.

Failure to adhere to the FATF Travel Rule requirements in Gibraltar can lead to significant penalties. For a summary conviction, the penalties may include imprisonment for up to one year, a fine up to level 5 on the standard scale, or both. In the case of conviction on indictment, the penalties can be more severe, with imprisonment for up to two years, a fine, or both.

Global Context and Comparisons

Gibraltar enforces a EUR 1,000 threshold for the Travel Rule, aligning with FATF recommendations. Comparatively, Switzerland’s Crypto Travel Rule threshold is 1,000 CHF, whereas in the United Kingdom, it is 1,000 GBP, and in Japan, the threshold is $3,000.

Concluding Thoughts

Since implementing the FATF Travel Rule in March 2021, Gibraltar requires detailed verification for cryptocurrency transactions over EUR 1,000. This regulation aims to safeguard financial transactions and prevent illegal activities such as money laundering and terrorism financing.

FAQs 1. What is the minimum threshold for the Travel Rule in Gibraltar?

The Travel Rule applies to all cryptocurrency transactions above EUR 1,000.

2. What information must VASPs collect and verify under the Travel Rule in Gibraltar?

VASPs must collect and verify full names, wallet addresses, and additional identifying information such as a physical address, national identity number, customer identification number, or date and place of birth for both originators and beneficiaries in Gibraltar.

‍About Veriscope

‍Veriscope, the compliance infrastructure on Shyft Network, empowers Virtual Asset Service Providers (VASPs) with the only frictionless solution for complying with the FATF Travel Rule. Enhanced by User Signing, it enables VASPs to directly request cryptographic proof from users’ non-custodial wallets, streamlining the compliance process.

For more information, visit our website and contact our team for a discussion. To keep up-to-date on all things crypto regulations, sign up for our newsletter and follow us on X (formerly Twitter), LinkedIn, Telegram, and Medium.

FATF Travel Rule Compliance Guide for Gibraltar was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


IDnow

Age verification in transport: Moving towards a better service?

Urban mobility has undergone considerable expansion thanks to the digitalization of its services, made more accessible through a simple application. A societal topic thanks to its public service quality, urban mobility contributes not only to the inclusiveness of certain populations, but also to the reduction of carbon-based transport. Urban mobility plays a key role in […]

Urban mobility has undergone considerable expansion thanks to the digitalization of its services, made more accessible through a simple application. A societal topic thanks to its public service quality, urban mobility contributes not only to the inclusiveness of certain populations, but also to the reduction of carbon-based transport. Urban mobility plays a key role in our strategy to combat global warming, by encouraging the use of public transport through the connection of different transport offers, including micromobility and car-sharing.

To encourage the use of public transport, various pricing initiatives have been put in place. For example, reduced-price season tickets have been introduced for certain categories of the population, such as students. These age-based fares are offered by many transport networks in Germany as well as in France, such as Île de France Mobilités or SNCF, with offers reserved for those under 26 or senior citizens.

While such measures encourage the adoption of these services, they also open the door to fraudulent practices, with harmful consequences. It therefore seems necessary to provide solutions in terms of age verification of users, to guarantee a fair and efficient service and avoid attempts at fraud and identity theft.

The importance of discounted fares in urban mobility.

Discounted tickets exist on many types of platforms and services. From telecom operators to streaming or cultural services, there is a plethora of places offering this type of service. In the field of transport, it is a social policy tool, where the existence of specific offers according to age and income categories promotes access to transport for different categories of the population.

The introduction of reduced fares for young people is justified by a number of social and equity considerations: young people are often dependent on public transport (lack of driver’s license and/or vehicle).

Île de France Mobilités

These special fares are also used by transport companies as a means of building customer loyalty. However, in a tense economic climate, it is sometimes tempting for users who are not entitled to special fares to abuse the loopholes in a subscription system in order to benefit from them. As a result, mobility providers are increasingly detecting fraud in this area. This may involve the use of false identity documents to take advantage of discounts, or the theft of another person’s identity.

The impact of such fraud is twofold: not only does it adversely affect the revenues of transport operators, it also limits the scope of these offers and the fairness between users. However, it is possible to curb this damaging occurrence with online identity verification solutions, which are transparent and easy for the user.

Combating fraud through identity verification.

Traditionally, age verification on public transport has been carried out by means of identity documents presented when purchasing season tickets. The possibility of subscribing to these services online has made it easier to commit fraud and obtain discounted fares. It is now easy for anyone to forge an identity document, thanks to editing software that has become readily available.

At the same time, anti-fraud and identity verification technologies have also become more sophisticated, providing even more secure and appropriate responses.

For a transport operator, using these solutions means effectively guarding against identity fraud during the subscription process. They also ensure that users benefiting from an offer are indeed who they say they are.

IDnow’s automated identity verification is a solution that has been used for many years in banking and financial services, but is now proving its worth in the mobility and transport sector. By verifying a user’s identity document and photo, IDnow ensures that users are thoroughly screened and their age confirmed in just a few steps:

1. The user submits an identity document when subscribing to the offer.

2. A biometric capture (facial recognition or liveness detection) is then performed to obtain a correlation with the user making the subscription.

3. The document is analyzed by our APIs to verify its authenticity, and cross-referenced with the biometric capture to confirm it is the same person.

4. The account is validated in a matter of seconds, based on the results.
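IDnow's actual SDK and API interfaces are not shown here, but the decision step at the end of such a pipeline, combining the authenticity check, the face match, and the age-based eligibility rule, might be sketched as follows. All types, names, and thresholds are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical result types; a real SDK would return richer objects.

@dataclass
class DocumentCheck:
    authentic: bool    # outcome of the document forgery analysis
    holder_age: int    # age computed from the document's date of birth

@dataclass
class FaceMatch:
    score: float       # similarity between document portrait and live capture

def decide_eligibility(doc: DocumentCheck, match: FaceMatch,
                       min_age=None, max_age=None,
                       match_threshold=0.9) -> bool:
    """The account is validated only if the document is authentic, the
    biometric capture matches the portrait, and the age band fits the offer."""
    if not doc.authentic or match.score < match_threshold:
        return False
    if min_age is not None and doc.holder_age < min_age:
        return False
    if max_age is not None and doc.holder_age > max_age:
        return False
    return True

# A 22-year-old applying for an under-26 discounted fare:
assert decide_eligibility(DocumentCheck(True, 22), FaceMatch(0.95), max_age=25)
# A forged document is rejected regardless of age:
assert not decide_eligibility(DocumentCheck(False, 22), FaceMatch(0.95), max_age=25)
```

Because every rule is applied before the account is validated, a user presenting someone else's document fails the face match, and an altered document fails the authenticity check, which is how the flow blocks both discount fraud and identity theft.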

Transparent to the user, the identity verification solution is easily integrated into a mobile or web application via our SDK.

Using this solution limits the risk of identity theft, as the user cannot use someone else’s document to subscribe to an offer. Any alteration is identified and forces the user to terminate the subscription process.

Safer access to offers for both operators and users.

The constantly evolving mobility sector has undergone digital transformations that have revolutionized our habits. At the same time, fraudsters have adapted to these new operating modes, impacting the economic efficiency of various players. Age and identity verification provides transport operators with the guarantee that their users have access to the offers they are entitled to.

Identity verification solutions can be seamlessly integrated into the user journey, simplifying the user experience. In addition to contributing to the fight against fraud, identity verification solutions help improve operators’ economic efficiency. As a result, they are able to enhance their transport offering, as well as the quality of their services. In the end, all users benefit from these measures.

Want to know more about the future of mobility? Discover the major trends in the mobility industry, the innovative models and solutions available to you to design a seamless user experience. Get your free copy

By

Mallaury Marie
Content Manager at IDnow
Connect with Mallaury on LinkedIn


Caribou Digital

Sifting through the noise in the age of information overload — the power of evidence synthesis

Sifting through the noise in the age of information overload — the power of evidence synthesis

It is hard for organizations that address complex social, economic, and environmental challenges to determine what strategies may have the greatest impact. This should be easier with more organizations publishing what they learn and AI tools supporting efficient curation of insights. In reality, the vast amount of data generated is overwhelming, and impact-focused organizations struggle to determine which data can be meaningfully used to support their strategic ambitions and which is just noise.

Over the past ten years of working with foundations, government agencies, and private sector organizations, Caribou Digital has found that impact evidence synthesis is one of the more powerful tools to support strategic decisions. Evidence synthesis is the process of compiling and analyzing multiple studies to provide comprehensive impact insights on a set of interventions or products.

In this article, we share our experience of the value that evidence synthesis has catalyzed, the need to improve prevailing impact evidence synthesis norms, and how we have refined our evidence synthesis processes and products to drive impactful decisions.

The power of leveraging collective insights

Organizations that address complex social, economic, and environmental challenges are not doing it alone. Scattered across the internet and within these organizations’ own knowledge management systems, there is an abundance of insights that could be harnessed to explore the pertinent questions of what works, under what conditions, and for whom. Impact evidence synthesis brings these insights together.

Individual research efforts provide puzzle pieces that evidence synthesis assembles to create the impact picture. It adds immense value by:

Generating a holistic view: Combining varied perspectives provides a comprehensive view of an issue, allowing for inclusive assessments of what works and what is untested.

Optimizing resources: Identifying promising interventions and avoiding duplicating those with limited impact are valuable tactics in light of finite resources.

Enabling adaptive strategies: Creating an information infrastructure supports agile strategic responses to the changing social and economic landscape.

By basing actions and investment decisions on a curated body of evidence, organizations can establish a clear and justified rationale while also building an evidence ecosystem and infrastructure that encourages a collective approach to engaging in shared issues.

Challenges within the prevailing forms of impact evidence synthesis

There are two phases to unlocking the value of evidence synthesis. The first is intentionality on what is included in a given synthesis, and the second is ensuring the outputs are delivered in ways that support access and — fundamentally — the use of evidence. While there are diverse views on these phases, there are dominant practices. These dominant practices serve some sectors well; in others, they present challenges that can limit their value. Two of the main challenges are:

A bias toward experimental methods: In some sectors, an exclusive focus on one methodology can be challenging for numerous reasons, including issues of power, diversity, and representation, and discounting early insights in emerging and rapidly changing sectors — like digital — where experimental evidence may not be available when investment decisions are being made.

Static representation: Evidence synthesis products often exist as static narratives that summarize the state of evidence. The last decade has seen an increase in evidence maps: graphic representations of the landscape of impact evidence plotted against a set of interventions and outcomes. Maps vary but are usually one-off and focus on visualizing the number of studies rather than the insights gained from them. Unless they are accompanied by narrative summaries, evidence maps leave impact insights on the table. Like all knowledge products, content and medium directly affect engagement and use by different audiences.

At Caribou Digital, we have struggled with these prevailing norms and forms, which sparked our work to innovate and improve the practice of evidence synthesis.

How we’ve refined our approach to connect insights to decisions

Our approach seeks to provide timely insights with rigor, context, and diversity of perspectives. Five elements distinguish our evidence synthesis approach:

Iterative Theory of Change: Designed with thematic experts, the Theory of Change outlines impact pathways for specific issues as the basis of the synthesis.

Diverse and credible impact methodologies: Experimental and theoretical approaches each offer important views on the evidence. We need both and include both.

Impact results: We include a view of the volume of data points that initiated positive, negative, or no impact, not just the number of studies on a given topic.

Design and delivery mechanisms of interventions: Interventions or products (e.g., upskilling, e-commerce platforms) are not homogeneous. They have unique design and delivery features that can and do affect their impact. We include these features in our analysis to really drill down into what works.

Contextual insights: Who is being studied and where influences impact. We extract contextual data, leveraging localized evidence to support targeted interventions that are more likely to succeed.

Including and extracting this level of detail from individual studies supports a rapid and deeper interrogation of the impact data — many of these elements are visualized in our existing evidence maps on digital financial services and digital and data-first approaches to driving small business growth.

Transforming the way users interact with evidence synthesis

At Caribou Digital, we are proponents of interactive and ongoing evidence synthesis. Our experience delivering interactive evidence synthesis has revealed a strong demand for customized and detailed insights on demand. This entails extracting and synthesizing impact evidence from the database on specific questions, such as “What is the impact of market-place platforms on small business growth?” This recurring need has led us to further innovate with evidence outputs to support greater access to insights using generative AI. Below are three ways we share our evidence synthesis outputs:

1. An interactive impact database holds the raw coded data. Guided by a code book, we can filter, search, and access extracted summary insights and data points from the impact database. This has been a successful format for (manually) mining the database on demand to craft impact narratives on interventions/products in specific countries for specific segments. All our evidence syntheses have an underlying impact database, resulting in a significant body of evidence on digital financial services, digital approaches to driving small business growth, digital tools to support women’s economic empowerment, and geospatial data for humanitarian action (forthcoming).

2. Interactive evidence maps provide a visual overview of the entire impact evidence landscape, plotting interventions to outcomes and the direction of impact. The maps are integral for communicating a snapshot of the impact landscape and quickly discerning where evidence is concentrated. Filter buttons enable users to get more specific on other variables of interest (e.g., methodology, design and delivery mechanisms, country, segment, etc.). Alongside these interactive syntheses, Caribou Digital has crafted dozens of targeted insights for client strategy and the public (e.g., DFS and small business).

3. A chatbot-based conversational interface queries the impact evidence database using natural language. It uses large language models to interpret queries and return synthesized results from the evidence base, with cited references, allowing users to, for example, type a query such as “Outline e-commerce products that have positively impacted women’s income in India.” This interface opens up new possibilities for engaging with impact evidence, particularly when convenience and speed are prioritized and when supporting greater access to insights for those traditionally marginalized from more complex knowledge products.
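As a rough illustration of how a coded impact database supports on-demand synthesis, here is a minimal sketch in plain Python. The record fields and the `synthesize` helper are invented for this example; they are not Caribou Digital’s actual code book or tooling.

```python
# Illustrative sketch only: a hypothetical coded impact database, shown as a
# list of records. Field names (intervention, country, segment, method,
# direction) are invented and do not reflect the real code book.
studies = [
    {"intervention": "e-commerce platform", "country": "India",
     "segment": "women", "method": "experimental", "direction": "positive"},
    {"intervention": "e-commerce platform", "country": "India",
     "segment": "women", "method": "qualitative", "direction": "positive"},
    {"intervention": "upskilling", "country": "Kenya",
     "segment": "youth", "method": "experimental", "direction": "none"},
]

def synthesize(records, **filters):
    """Filter coded records on any field and tally the direction of impact."""
    matched = [r for r in records
               if all(r.get(k) == v for k, v in filters.items())]
    tally = {}
    for r in matched:
        tally[r["direction"]] = tally.get(r["direction"], 0) + 1
    return tally

# e.g. "What is the impact of e-commerce platforms on women in India?"
print(synthesize(studies, intervention="e-commerce platform",
                 country="India", segment="women"))
# → {'positive': 2}
```

The same filter-and-tally step is what an evidence map’s filter buttons, or a chat interface’s interpreted query, would ultimately drive against the underlying database.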

What’s next

This is an exciting time to be working in evidence synthesis. With innovative new tools and processes to work with, there are more possibilities than ever to advance the efficiency, accuracy, diversity, and inclusivity of evidence synthesis. In the coming months, we will share more about our iterations with Gen AI and our approaches to deepening inclusivity and equity within the evidence synthesis process and products. We are keen to connect with users and practitioners alike to refine and perfect these approaches.

Contact niamh@cariboudigital.net with any feedback or questions about our evidence synthesis approach.

Sifting through the noise in the age of information overload — the power of evidence synthesis was originally published in Caribou Digital on Medium, where people are continuing the conversation by highlighting and responding to this story.


IDnow

From VideoIdent provider to leading identity platform: IDnow celebrates its tenth anniversary

Munich-based identity pioneer looks back on ten years of revolutionizing the KYC space and doubles down on reusable identities and trust services

Munich, June 27, 2024 – IDnow, a leading platform provider for identity verification in Europe, is looking back on ten successful years of revolutionizing the German and European Know-Your-Customer (KYC) market and is now doubling down on reusable identities and trust services. Founded in 2014 by Armin Bauer, Sebastian Bärhold, Felix Haas and Dennis von Ferenczy, the Munich-based company’s flagship product at the time was the revolutionary VideoIdent, which had just been allowed by the German Federal Financial Supervisory Authority (BaFin).

A platform with automated and expert-led solutions that serves all needs

Since then, a variety of solutions have been added to the portfolio, following market needs and technological trends: the AI-supported, fully automated AutoIdent solution was introduced in 2018, followed by the electronic function of the German ID card (eID) in 2019. Just last year, IDnow introduced ShopIdent, the onsite solution for identification in German petrol stations and stores. The transaction figures prove the success of these solutions: Over 13.2 million transactions have been processed via IDnow AutoIdent, while the scale-up looks back on another 13 million successfully identified users with VideoIdent to date.

In 2019, Andreas Bodczek was appointed as CEO at IDnow: “We have made it our mission over the past five years to actively shape and lead our industry and show our customers why digital identities are essential to their business. In the last decade, we have grown into a platform provider that serves all market needs with the help of the latest technology. We have earned the trust of our customers because we focus on our core business every day without losing sight of the secular trends that are shaping the market.”

Munich-based scale-up has long been one of the largest European players

In 2021, increased expansion opportunities led IDnow to acquire the French market leader for identity verification technology, ARIADNEXT, and the German provider identity Trust Management AG, establishing IDnow as one of the largest European players in the identity industry.

In addition to its traditional financial and telecommunications applications, new growth areas for IDnow have opened up in the mobility and travel industry, online dating, gaming, human resources and social media. Although user needs and regulatory requirements differ greatly from traditional industries, the desire for increased trust in digital transactions has grown across territories, regardless of the use case.

Reusable identities and trust services determine the future

The digital identity industry is facing a paradigm shift due to regulations, such as eIDAS 2.0 at the European level or announced changes to the VideoIdent regulation in Germany. With its more than 900 trusted customers, the company aims to leverage these relationships and take them to the next level by building up its own trust service and a digital identity pool of several million users in the next few years. In doing so, the scale-up intends to lead the way in the transformation of the digital identity market heralded by eIDAS 2.0: European trust services, reusable identities and a fully integrated, broad product portfolio form the basis of the company’s long-term product strategy.

“Today, we are facing a fundamental transformation of our industry – away from the one-time verification process and towards reusable identities – and we will once again lead this revolution,” Bodczek says. “Since its founding, IDnow, with its broad and continually evolving portfolio, has shown a great deal of foresight, resiliency, and adaptability to market developments. With a team of over 450 highly skilled colleagues across ten offices in Europe, we are already one of the largest players in the identity industry. The experience, expertise, and innovation drive that we have assembled in-house make us extremely optimistic about the next ten years of our company and the industry at large,” he concludes.


Ocean Protocol

DF95 Completes and DF96 Launches

Predictoor DF95 rewards available. DF96 runs Jun 27 — Jul 4, 2024

1. Overview

Data Farming (DF) is Ocean’s incentives program. In DF, you can earn OCEAN rewards by making predictions via Ocean Predictoor.

Data Farming Round 95 (DF95) has completed.

DF96 is live today, June 27. It concludes on July 4. For this DF round, Predictoor DF has 37,500 OCEAN rewards and 20,000 ROSE rewards.

2. DF structure

The reward structure for DF96 consists solely of Predictoor DF rewards.

Predictoor DF: Actively predict crypto prices by submitting a price prediction and staking OCEAN to slash competitors and earn.

3. How to Earn Rewards, and Claim Them

Predictoor DF:

To earn: submit accurate predictions via Predictoor Bots and stake OCEAN to slash incorrect Predictoors.

To claim OCEAN rewards: run the Predictoor $OCEAN payout script, linked from the Predictoor DF user guide in Ocean docs.

To claim ROSE rewards: see instructions in the Predictoor DF user guide in Ocean docs.

4. Specific Parameters for DF96

Budget. Predictoor DF: 37.5K OCEAN + 20K ROSE

Networks. Predictoor DF applies to activity on Oasis Sapphire. Here is more information about Ocean deployments to networks.

Predictoor DF rewards are calculated as follows:

First, DF Buyer agent purchases Predictoor feeds using OCEAN throughout the week to evenly distribute these rewards. Then, ROSE is distributed at the end of the week to active Predictoors that have been claiming their rewards.
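For a sense of the arithmetic, the sketch below spreads the stated 37,500 OCEAN budget evenly across a seven-day round. The per-day split is purely illustrative and is not the DF Buyer agent’s actual purchase schedule.

```python
# Illustrative only: DF96's stated Predictoor budget (37,500 OCEAN) spread
# evenly over a 7-day round, mirroring the described "evenly distribute"
# behavior. Not the real DF Buyer agent logic.
WEEKLY_OCEAN = 37_500
DAYS = 7

def daily_spend(total, days):
    base = total // days
    remainder = total - base * days  # leftover tacked onto the last day
    return [base] * (days - 1) + [base + remainder]

spend = daily_spend(WEEKLY_OCEAN, DAYS)
print(spend)       # seven daily amounts
print(sum(spend))  # → 37500
```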

Expect further evolution in DF: adding new streams and budget adjustments among streams.

Updates are always announced at the beginning of a round, if not sooner.

About Ocean, DF and Predictoor

Ocean was founded to level the playing field for AI and data. Ocean tools enable people to privately & securely publish, exchange, and consume data. Follow Ocean on Twitter or TG, and chat in Discord. Ocean is part of the Artificial Superintelligence Alliance.

In Predictoor, people run AI-powered prediction bots or trading bots on crypto price feeds to earn $. Follow Predictoor on Twitter.

DF95 Completes and DF96 Launches was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 26. June 2024

KuppingerCole

AI and Deepfakes in Fraud - The Future of Digital Identity with John Erik Setsaas


In this interview series, our Analysts talked to Identity Experts about the Future of Digital Identity. The conversation between John Tolbert and John Erik Setsaas focuses on the topic of fraud prevention in the context of advancements in technology. They discuss the challenges posed by fraudsters using AI, deepfakes, and phishing techniques. They also explore the importance of monitoring and analyzing behavioral patterns to detect fraudulent activities.




Customer Centricity - The Future of Digital Identity with Katryna Dow


In this interview series, our Analysts talked to Identity Experts about the Future of Digital Identity. The focus in the interview with Katryna Dow is on the shift towards a customer-centric approach and the emergence of self-sovereign identity and reusable identity. The discussion also touches on the importance of interoperability and the need for a rich ecosystem of trusted issuers and verifiers.




cyberevolution 2024: Shaping the Future of Cybersecurity in an AI-Powered Digital World


Join Martin as he converses with Christopher Schütze and Berthold Kerl about the upcoming cyberevolution in Frankfurt. They discuss the key trends and topics in cybersecurity that will be covered at the event, including the latest innovations from vendors, the SAFIRE scenario for the next 5-10 years, and the importance of exchanging best practices.

Highlights include discussions on the governance of AI, the threat of deepfakes, and the impact of regulations like the AI Act, NIS2, and DORA. Additionally, they emphasize the significance of translating cybersecurity concepts for business leaders and the need for a community of experts. The conversation also touches on mental health in cybersecurity and the unique Capture the Flag competition for young talents. The first 200 tickets for end users are complimentary, promoting greater participation and knowledge sharing.




Microsoft Entra (Azure AD) Blog

Move to cloud authentication with the AD FS migration tool!


We’re excited to announce that the migration tool for Active Directory Federation Service (AD FS) customers to move their apps to Microsoft Entra ID is now generally available! Customers can begin modernizing their identity management with more extensive monitoring and security infrastructure by quickly identifying which applications can be migrated and assessing all their AD FS applications for compatibility.

In November we announced AD FS Application Migration would be moving to public preview, and the response from our partners and customers has been overwhelmingly positive. For some, transitioning to cloud-based security is a daunting task, but the tool has proven to dramatically streamline the process of moving to Microsoft Entra ID. 

A simplified workflow, reduced need for manual intervention, and minimized downtime (for applications and end users) make for low-stress, hassle-free migrations. The tool not only checks the compatibility of your applications with Entra ID, but it can also suggest how to resolve any issues. It then monitors the migration progress and reflects the latest changes in your applications. Watch the demo to see the tool in action.

Moving from AD FS to a more agile, responsive, cloud-native solution helps overcome some of the inherent limitations of the old way of managing identities.

In addition to more robust security, organizations count greater visibility and control with a centralized, intuitive admin center and reduced server costs among the transformative benefits of moving to modern identity management. Moreover, Entra ID features can help organizations achieve better security and compliance with multifactor authentication (MFA) and conditional access policies, both of which provide a critical foundation for a Zero Trust strategy.

More Entra ID features include:

Passwordless and MFA for better user experience.

A rich set of apps, APIs, SDKs, and connectors for customization and extensibility.

Granular adaptive access controls to define and monitor conditional access.

Self-service portals that allow employees to securely manage their own identity.

Want to learn more about Microsoft Entra? Get the datasheet and take a tour here. Ready to get started? Visit Microsoft Learn and explore our detailed AD FS Application Migration guide. 

Have any questions or feedback? Let us know here.  

Melanie Maynes

Director of Product Marketing

For a comprehensive overview of the migration tool and its capabilities, check out these other resources:

Overview of AD FS application migration - Microsoft Entra ID | Microsoft Learn

Use the AD FS application migration to move AD FS apps to Microsoft Entra ID - Microsoft Entra ID | Microsoft Learn

Demo: Effortless Application Migration Using Microsoft Entra ID | OD03 (youtube.com)

Best practices to migrate applications and authentication to Microsoft Entra ID - Microsoft Entra | Microsoft Learn

Customer Case Study: Microsoft Customer Story-Universidad de Las Palmas de Gran Canaria boosts accessibility with Microsoft Entra ID

Learn more about Microsoft Entra  

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds. 

Microsoft Entra News and Insights | Microsoft Security Blog

Microsoft Entra blog | Tech Community

Microsoft Entra documentation | Microsoft Learn

Microsoft Entra discussions | Microsoft Community

Ontology

Building Trust and Reputation in Web3 with Proof of Humanity


In the dynamic landscape of Web3, ensuring trust and authenticity is paramount. Proof of Humanity (PoH) emerges as a pivotal solution, combining social verification, video submission, and decentralized dispute resolution to create a Sybil-proof registry of real humans on the blockchain. This revolutionary system addresses key challenges by verifying that each registered entity is a unique individual, thus fostering a more secure and trustworthy digital environment.

Key Features of Proof of Humanity

Sybil-Proof Registry: Guarantees each registered user is a real, unique human, preventing fraudulent multiple accounts.

Social Vouching: Existing verified humans vouch for new registrants, adding a layer of community-based verification.

Challenge Mechanisms: Users can challenge suspicious submissions, which are resolved through a decentralized process.

Reverse Turing Tests: Distinguishes humans from bots using advanced behavioral analysis.

Decentralized Dispute Resolution: Utilizes systems like Kleros for unbiased resolution of disputes.

Importance in Web3

With the rise of AI and bots, distinguishing between human and machine interactions in Web3 is increasingly difficult. PoH mitigates these challenges by preventing Sybil attacks, ensuring fair resource distribution, and enhancing transparency in digital interactions. This system empowers users to interact only with verified accounts, akin to content moderation on social media, fostering authentic relationships in the digital realm.
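To make the vouching-and-challenge mechanics concrete, here is a toy, in-memory model. The `Registry` class and its vouch threshold are invented for illustration; the real Proof of Humanity protocol runs on-chain with deposits and Kleros arbitration, none of which is modeled here.

```python
# Toy sketch of a PoH-style registry: submissions start "pending", become
# "registered" once enough verified humans vouch, and can be "challenged"
# pending dispute resolution. Purely illustrative, not the on-chain protocol.
class Registry:
    VOUCHES_REQUIRED = 1  # hypothetical threshold for this sketch

    def __init__(self):
        self.status = {}   # address -> "pending" | "registered" | "challenged"
        self.vouches = {}  # address -> set of addresses that vouched

    def submit(self, addr):
        self.status[addr] = "pending"
        self.vouches[addr] = set()

    def vouch(self, voucher, addr):
        # only already-registered humans may vouch for newcomers
        if self.status.get(voucher) != "registered":
            raise ValueError("voucher must be a registered human")
        self.vouches[addr].add(voucher)
        if len(self.vouches[addr]) >= self.VOUCHES_REQUIRED:
            self.status[addr] = "registered"

    def challenge(self, addr):
        # a challenged profile awaits dispute resolution (e.g. via Kleros)
        if self.status.get(addr) == "registered":
            self.status[addr] = "challenged"

reg = Registry()
reg.status["alice"] = "registered"  # seed one verified human
reg.submit("bob")
reg.vouch("alice", "bob")
print(reg.status["bob"])  # → registered
```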

Challenges and Considerations

Despite its advantages, PoH faces several challenges:

Privacy: Balancing verification with user privacy is crucial, as PoH systems may require sensitive personal data.

Scalability: Ensuring PoH can handle large-scale adoption efficiently.

Inclusivity: Overcoming barriers such as access to technology and digital literacy to ensure widespread participation.

Evolving AI: Continual adaptation to differentiate increasingly sophisticated AI from humans.

Applications and Use Cases

PoH’s applications span various sectors. In decentralized finance (DeFi), it enhances trust in transactions and lending protocols. For governance, it ensures fair voting in decentralized autonomous organizations (DAOs) by preventing Sybil attacks. In social media, PoH verifies user identities, enabling portable reputations across platforms. Additionally, it supports fair distribution of resources in schemes like Universal Basic Income (UBI), ensuring each verified individual receives their due share.

Proof of Humanity stands at the forefront of human-centric blockchain innovations, safeguarding trust and authenticity in the evolving Web3 ecosystem.

Building Trust and Reputation in Web3 with Proof of Humanity was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Elliptic

What you can expect from Elliptic’s Typologies Report 2024

In the year since we published our 2023 Typologies Report, important developments have occurred at the nexus of cryptoassets and financial crime, putting pressure on compliance teams to remain alert and continually adapt. For law enforcement agencies and regulators, it has become vital to consistently identify new criminal behavior in order to respond effectively.



Ockto

Podcast: Digital Identity Wallets, with Vincent Janssen, INNOPAY

In a new episode of the Data Sharing Podcast, host Hidde Koning dives into the world of digital identity wallets together with Vincent Jansen, Vice President at INNOPAY. Digital identity wallets appear to be revolutionizing how we manage and share our identity online. At the same time, they raise a host of questions.



Verida

Verida Network Community Update — 26 June 2024

Following a $5M raise and recent announcement to support personalized AI with confidential compute on the Verida Network, here’s what it means for the future of Verida.

Hello, gm everyone!

Chris Were here, the CEO and Co-founder at Verida 👋

With our $5M raise and recent announcement to support personalized AI with confidential compute on the Verida Network, I wanted to provide a community update to share with you all what it means for the future of Verida.

Confidential Computation to Fuel Personalized AI

Some of you may have seen our important announcement about Verida and how we’re going to extend the Verida network to support confidential computation. This is a significant step to unlock decentralized infrastructure for personalized AI. I want to explain what this means and how it fits into the broader story of AI and the critical infrastructure and technology we’ve been building.

You can watch my community update on YouTube, where I share my ideas in a live scratchpad, or dive into this article for the details.

Homing in on Personalized AI

Until a few weeks ago, our focus at Verida was on decentralized infrastructure for private databases, particularly secure storage of personal data. We ensured that data was securely handled, encrypted by private keys, and met all necessary authentication, security, and regulatory requirements. Recently, we announced that we are extending the Verida network to support personalized AI with confidential compute.

Why Verida? Why are we well-placed to make this move?

Firstly, everyone has seen the rapid growth in AI and its evolving use cases. Verida has always focused on helping people own their data and benefit from it. The idea of having all your data and using it with AI is powerful, and I’ll elaborate on that shortly.

From Verida’s perspective, we’ve always been about protecting and adding value to your data. We’ve built infrastructure around database storage for user data, making it well-suited to extend to confidential computation leveraging that data.

For example, if you’re technical, you might build software to interact with a database. But with an API, you get computation attached to it, adding business logic and making it much more powerful and useful for developers and product builders.

This is what we’re doing — saying, “databases are great, but what if we could add business logic and confidential compute?” This extension will provide APIs for other builders, making it easy to build AI platforms using your personal data.

“Data matters more than the AI or the compute” — Emad Mostaque

To highlight the importance of this, Emad Mostaque from Stability AI, who recently founded Schelling AI, emphasized that “data matters more than the AI or the compute.” At the SuperAI conference in Singapore, he talked about the idea of self-sovereign AI.

Verida has discussed self-sovereign data, and now we’re advancing to self-sovereign AI, which needs self-sovereign identity and data as foundational pieces. Verida is well-placed to provide this infrastructure for a future of self-sovereign AI.

AI privacy issues are critical

For instance, Elon Musk highlighted concerns about Apple devices potentially sharing personal information with OpenAI. Imagine if all prompts and information you sent to an AI were exposed — this is a real risk, not a hypothetical one.

Data breaches, like the Ashley Madison hack in 2015, show the potential consequences of sensitive information leaks. Verida’s work on database infrastructure supports end-to-end privacy for AI, and we can extend this to the compute side.

It’s important to note that we are not providing decentralized computation for training AI models. Instead, we focus on personalized AI, using personal data confidentially with AI.

So, what does personalized AI mean?

We have products with AI built into them, like facial recognition and autocomplete in Gmail. Another type of AI involves prompts, like ChatGPT, which provides contextual help by accessing your data.

Examples include answering questions about your purchase history or creating itineraries based on your emails and bookings. More complex prompts might involve estimating taxes by accessing your pay slips and other financial information. AI agents, which act on your behalf, represent another aspect of personalized AI. These agents, supercharged by direct access to your personal data, can perform tasks like searching for shoes or helping advance your career by suggesting networking opportunities, online courses, and relevant connections.
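As a sketch of how such a contextual prompt might be assembled from user-held data, consider the toy example below. The `vault` contents and `build_prompt` helper are hypothetical and do not represent Verida’s actual APIs or confidential-compute design.

```python
# Illustrative only: assembling a personalized AI prompt from data a user
# holds in their own storage. Schema and helper names are invented.
vault = {
    "emails": ["Flight BA123 to Lisbon confirmed for 12 Aug"],
    "purchases": [{"item": "running shoes", "price": 120}],
}

def build_prompt(question, sources):
    """Gather only the user-authorized data sources into the prompt context."""
    context = []
    for name in sources:
        for record in vault.get(name, []):
            context.append(f"[{name}] {record}")
    return "Context:\n" + "\n".join(context) + f"\n\nQuestion: {question}"

prompt = build_prompt("Create an itinerary for my Lisbon trip",
                      sources=["emails"])
print(prompt.splitlines()[1])  # the email record pulled in as context
```

Note that only the sources the user authorizes are included; in a confidential-compute setting, this assembly would happen inside an environment where the raw data is never exposed to the AI provider.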

For personalized AI to work, it needs access to personal data and guaranteed end-to-end privacy.

This is where Verida excels. We’ve been building infrastructure for storage and a personal data bridge, allowing users to connect to web2 platforms and pull their data into Verida’s decentralized storage network.

Introducing the Verida Vault

The Verida Vault is the user interface that enables this, allowing you to connect, browse, and authorize access to your data. Here’s a preview of the Verida Vault, where you can connect to the personal data bridge and pull in data from various platforms. Once your data is in the Verida Vault, you can authorize access to different AI products and services that enhance their features with your data.

Sneak preview of the Verida Vault

The journey from web2 to web3 data unlocks personal AI

Verida’s infrastructure includes the personal data bridge and Verida Vault, ensuring secure and private access to your data. By adding confidential compute, we enable AI products and services to interact with your data securely, enhancing its usefulness.

Enabling Personalized AI with Verida Vault and Personal Data Bridge

Your data, your control, our commitment.

Verida is solving the problem of bringing your data from web2 to web3, enabling personalized AI use cases with guaranteed privacy. This infrastructure will play a critical role in the future of self-sovereign AI, leveraging confidential compute to provide secure, personalized AI experiences.

Follow us for more updates and join the conversation with me on X/Twitter and LinkedIn about the future of personalized AI.

Yours truly,

Chris Were — CEO & Co-founder, Verida

About Verida

Verida is a pioneering decentralized data network and self-custody wallet that empowers users with control over their digital identity and data. With cutting-edge technology such as zero-knowledge proofs and verifiable credentials, Verida offers secure, self-sovereign storage solutions and innovative applications for various industries. Verida’s ecosystem of KYC partners and technologies are ideally suited to help Kima expand into new markets, streamlining processes and efficiency for compliant transactions. For more information, visit Verida.

Verida Missions | X/Twitter | Discord | Telegram | LinkedIn | LinkTree

Verida Network Community Update — 26 June 2024 was originally published in Verida on Medium, where people are continuing the conversation by highlighting and responding to this story.


Tokeny Solutions

The Complete Tokenization Process

The post The Complete Tokenization Process appeared first on Tokeny.
Report download: The Complete Tokenization Process. Discover lifecycle management of tokenized securities, from onboarding and issuance to servicing and secondary trading.

Qualification and subscription features to easily onboard investors

How to deploy token smart contracts on the blockchain

Corporate actions that go beyond simple reporting functionalities

Built-in secondary market solutions and connectivity with any distributor



KuppingerCole

Jul 25, 2024: Passwordless 360°: A Game-Changing Approach to Authentication within Your Business

In today's digital landscape, traditional password-based authentication is increasingly challenged by the need for more secure, efficient, and user-friendly methods. The shift towards passwordless authentication is reshaping how users interact with digital services, promising enhanced security and improved user experiences across various sectors.

MyDEX

Connecting data ‘about me’ to the world around me

This blog is third in a series explaining how Mydex’s personal data infrastructure works. It explains how our platforms help deliver our mission of empowering individuals with their own data, enabling them to use this data to manage their lives better and assert their human rights in a practical way on a daily basis.

So far in this series we’ve talked about some absolute basics: how personal data stores work and how they enable a new approach to data sharing. But very often, to get something done, you don’t just need personal data. You also need data about the world out there — data about those aspects of the world which relate directly to your task at hand.

Ideally, what you want to be able to do is to connect your personal data to this relevant info ‘about the world’ so that the two types of data get integrated together, providing everything you need either to make a better decision or to act well on this decision, efficiently and effectively.

Today — friction, effort, risk and cost all round

Today however, this connection and integration happens only rarely. Usually, the two types of data are kept completely apart, which involves you in all sorts of to-ing and fro-ing, looking for which bits of data are relevant and working out how to bring them together.

Here’s an everyday example. You need to send something in the post but you don’t know the postcode. So you have to stop what you are doing, search for the right website, enter it, and once there, search again for the information you need. Then you need to copy the information, leave the website, and go back to what you were doing. A time-consuming, irritating interruption: a classic example of what we call FERC, which stands for Friction, Effort, Risk and Cost.

This is just a trivial example. It gets more complicated — and the FERC just grows and grows — with more detailed, sensitive information such as the medicines that a patient should be taking, where there are more things to think about such as when you should take what dose, what other medicines you can’t take with it, contra-indications and so on.

This isn’t personal information as such. It’s information about the drug — information that is very important to the person at this time in this context, because it relates and connects directly to what they need and are trying to do.

Most services are made up of these two different types of information: Information about the person themselves and information about some aspect of the world that relates to what the person needs and is trying to achieve. Both types of information are needed if a person is to receive a comprehensive, quality service.

For example, health and care service providers need to integrate detailed data about the particular individual being cared for (personal data) with data about what services they need, based on their specific health conditions and medications they are to take (data about ‘the world’). This extends to referrals to specific services the individual may want to use as part of getting the support they need.

Currently, such data is kept separate, making it difficult and time consuming for people (both people needing support and service providers providing the support) to bring it together in a way that speaks directly to that particular person’s situation at that particular time.

The UK’s NHS has developed painstakingly curated lists of the medications it enables doctors to use, and of health conditions they are likely to come across. These lists are freely available and accessible to everyone … if, that is, you know they exist (in separate websites) and have the time, will and energy to jump through all the hoops you need to access and use them.

To do this, you first have to find the URL for the website in question (a different search process just to get started); navigate your way through mountains of information that is not relevant to you; then, once you have found it, capture the information in some way (such as copy and paste); go back to what you were originally doing; and find some way to integrate it into what you were trying to do.

Very often, well-meaning service providers develop general information packs for people dealing with a particular health condition. Very often these information packs are truly informative and helpful … but to find what’s truly informative and helpful in relation to this particular individual with these particular circumstances the person then has to wade through them, looking for the few bits of information that are particularly relevant to them right now.

Many people feel overwhelmed and exhausted by such processes. Often they feel overloaded with information and become confused. Sometimes they feel frightened.

A breakthrough in service provision

We have developed our Master Reference Data service (MRD) to get rid of all this hassle so that people can access exactly the right information that they need, at the right time, from within what they are doing, seamlessly. This means they don’t have to stop what they were trying to do to go searching for it.

Instead, the individual needing the information (the person themselves or a front line team member in a service provider) can instantly look up exactly the right relevant information, sourced from a reliable, curated source as an integrated part of what they are doing — from within whatever application they are using. Individuals can link it into their own records stored in their Personal Data Store and service providers can use it to build personalised information packs and resources for the people they serve.

Take the example of a service provider arranging an appointment with a person who needs support regarding a particular condition or situation. The appointment is being arranged seamlessly online (no need for letters or emails). It mentions some medications which need to be reviewed, plus some other services the person might want to use. As soon as the medication is mentioned, a link to exactly the right place in the curated list is created, so that both sides can instantly see exactly the right information. The same goes for services and specifics relating to the health condition involved.

Or, looking at it the other way round, an individual can develop an ‘About Me’ profile that summarises all the key personal information about themselves and their condition, medications, support services etc. They can integrate this information seamlessly into their own records, with all the links stored in their personal data store.

They can then share this full profile with a new service provider as and when needed, so that the new service provider gets presented with all the key information they need straight away, without delay. This also means that the individual doesn’t keep on having to tell the same story to different people over and over again.

Result: a much better quality service and process with much lower FERC (Friction, Effort, Risk and Cost) for both the individual and the service provider.
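To make the idea concrete, here is a toy sketch of an ‘About Me’ profile that links personal data store records to curated reference entries. All names, identifiers, and link formats below are invented for illustration and are not Mydex’s actual schemas or APIs:

```python
# Illustrative sketch: combine records from a personal data store with
# links into curated reference sources (e.g. NHS content, ALISS).
# Every field name and identifier here is hypothetical.

personal_record = {
    "name": "Jo Bloggs",
    "conditions": ["asthma"],
    "medications": ["salbutamol"],
}

# Hypothetical links into curated reference sources.
reference_links = {
    "asthma": "nhs:conditions/asthma",
    "salbutamol": "nhs:medicines/salbutamol",
}

def build_about_me(record: dict, links: dict) -> dict:
    """Combine personal data with links to curated reference entries."""
    return {
        "name": record["name"],
        "conditions": [
            {"name": c, "reference": links.get(c)} for c in record["conditions"]
        ],
        "medications": [
            {"name": m, "reference": links.get(m)} for m in record["medications"]
        ],
    }

profile = build_about_me(personal_record, reference_links)
# The assembled profile can then be shared with a new service provider
# in one step, links and all.
```

Because the links point at curated sources rather than copied text, the profile stays accurate as the underlying reference data is updated.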

Getting the details right

Of course, explaining it like this makes it sound simple (as it should be). But behind this simple experience lies a great deal of technical detail. For example, a key part of the Master Reference Data Service is the detailed developer support covering issues such as authentication to access the service, how cross-service search works, and help and support services.

At a higher level, Figure One below sums up how we provide these services.

Figure One: Overview of how our Master Reference Data Services work

It starts with the curated data sources themselves (the coloured circles at the bottom of the diagram). We access the data from these curated sources and process it so that a) people using different software languages or formats can easily access the data (interoperability), and b) the data can be sliced and diced so that only those bits that are particularly relevant to a person at a particular time are presented (configuration).

This means that if you are looking for information about a particular thing that may be referred to in a range of different curated databases, you can search for this information from within our service across all these different databases, all at the same time.

For example, a patient with a particular condition could access information about the condition itself, information about medicines related to treating that condition, and about local services or community groups supporting people with that condition.
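As a rough illustration of what a cross-source search like this looks like to a developer, here is a toy sketch. The source names, record fields, and search logic are invented for the example and are not the real MRD API:

```python
# Illustrative sketch of a cross-source reference-data search.
# The sources, fields, and matching logic are hypothetical; the real
# MRD API, its authentication and its schemas will differ.

CURATED_SOURCES = {
    "health_conditions": [
        {"name": "asthma", "summary": "Symptoms, testing and treatment of asthma."},
    ],
    "medicines": [
        {"name": "salbutamol", "summary": "Reliever inhaler commonly used for asthma."},
    ],
    "local_services": [
        {"name": "Asthma support group (Glasgow)", "summary": "Peer support for people with asthma."},
    ],
}

def search_all_sources(term: str) -> dict:
    """Search every curated source at once and group hits by source."""
    term = term.lower()
    results = {}
    for source, records in CURATED_SOURCES.items():
        hits = [
            r for r in records
            if term in r["name"].lower() or term in r["summary"].lower()
        ]
        if hits:
            results[source] = hits
    return results

hits = search_all_sources("asthma")
# A single query returns condition information, related medicines and
# local support services together, instead of three separate searches.
```

The point of the sketch is the shape of the result: one query, with hits grouped by curated source, ready to be embedded in whatever application the person is already using.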

The top of the diagram then shows subscribers subscribing to any and all curated data sources that are relevant to them.

Currently these data sources include ALISS (A Local Information System for Scotland) and NHS curated content covering a wide range of areas.

ALISS is a service designed to help citizens find help and support close to them when they need it most. ALISS is maintained and published by the Health and Social Care Alliance Scotland (the ALLIANCE) and is funded by the Scottish Government.

The NHS databases cover:

- Health Conditions: information for the top 600 health conditions experienced by people, covering symptoms, assessments, testing, treatment and implications.
- Live Well: advice about healthy living, including eating a balanced diet, healthy weight, exercise, quitting smoking and drinking less alcohol, sexual health and addiction support.
- Mental Health: support covering feelings, symptoms and behaviours, mental health conditions, advice for life situations and events, self-help, mental health services, mental health for children, teenagers and young adults, talking therapies, medicine and psychiatry, social care, and mental health and your rights.
- Medicines: content about common medicines, including how and when to take them, possible side effects and answers to common questions.
- Pregnancy: modular advice and guidance about trying for a baby, pregnancy, labour and birth.

Many of these curated lists of information also link with each other. MRD makes it easy to link and pull them together into a seamless payload that can be integrated into any application. This reduces the amount of duplication involved across organisations and their applications, improves the data quality and reduces the complexity of delivering interoperability and integration.

Subscribers accessing this curated content via our MRD API can be confident that the information they are accessing is accurate and up-to-date. This is because each one of them is refreshed once a week to include any changes that may have been made in the last seven days.

The list of data services covered by our MRD service is continually growing. For example, we are in the process of adding the World Health Organisation’s International Statistical Classification of Diseases and Related Health Problems (ICD-10), and the Systematized Nomenclature of Medicine — Clinical Terms (SNOMED CT) which is another standardised, multilingual vocabulary of clinical terminology that is used by physicians and other healthcare providers for the electronic exchange of health information.

We are seeking to enable standardised lists of the sorts of adaptations people may need in their homes if they are disabled, frail or at risk of falling, postcode look-ups, and (further into the future) comprehensive lists of operational GPs and dentists.

We also include (or will soon include) data services designed to help service providers improve the quality of the personal data they hold and share. These include personal data quality relating to:

- Race and Ethnicity, and Age Range classifications
- Activity and Measurement reference tables covering the full spectrum of datasets supporting wearables (for example relating to activity types, temperature, sleep and exercise)
- Postcode and geolocation cross-references

The potential list is endless. In this blog we have only talked about health and care, but the concept applies to every aspect of life. The underlying goals and core processes remain the same: it’s all about bringing exactly the right information to people at the right time, as and when they need it.

Conclusion

Personal data lies at the heart of every service that deals with an identified individual: we need to be able to access and use it to manage our lives better. But very often, for the service to really fulfil its function it needs to relate information about some important things of ‘the world out there’ to the specific needs and context of the individual, as identified by their personal data.

Very often, the key to service quality and efficiency is the ability to access and integrate both types of data so that all the right data is made available at the right time, and so that everybody involved can be confident that the information they are using is reliable (i.e. comes from a professionally curated source).

We have spent years developing our Master Reference Data Services so that both individuals and service providers can use data in this way. Its potential is simply enormous.

Connecting data ‘about me’ to the world around me was originally published in Mydex on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

Safeguard Against Healthcare Cyberattacks With Defense In Depth

Secure your healthcare organization with a layered security strategy. Learn how identity orchestration and defense-in-depth protects against cyber threats.

In 2024, the healthcare sector experienced significant cyberattacks, notably the Change Healthcare and Ascension cyber incidents. These attacks underscored the critical need for robust and layered security measures in healthcare and other industries. The recent breaches highlight how multifaceted and coordinated security strategies, combined with advanced identity orchestration, can protect organizations from complex cyber threats such as ransomware, fraud, and insider attacks.

 


What is Embedded Finance? And How Identity Powers It

Discover how IAM enables embedded finance to function across financial services, ensuring seamless and secure transactions.

BlueSky

Introducing Bluesky Starter Packs

Create a starter pack today — personalized invites that bring friends directly into your slice of Bluesky.

Today, we’re releasing starter packs — personalized invites that allow you to bring friends directly into your slice of Bluesky!

An example starter pack.

Recommend custom feeds and users to help your community find each other. Get started in the Starter Packs tab on your Bluesky profile.

What’s in a starter pack?

- Custom feeds. On Bluesky, you can set any algorithm or topic as your home timeline. Examples include Quiet Posters (posts from your quieter mutuals), Science (posts from the science community), and Catch Up (most popular posts from the last 24 hours).
- Recommended follows. Add your favorite accounts and encourage new users to follow them.

How do I create a starter pack?

1. Click the Starter Packs tab. On your profile, next to your media and likes tabs, you’ll see a new tab.
2. Create a starter pack. Use our auto-generation tool to create a starter pack or make your own from scratch! You can create more than one starter pack. Click “Make one for me” to get a pre-populated starter pack of suggested users and custom feeds; you can add or remove items from this list. Or, click “Create” to add users and feeds to your starter pack yourself. Set your starter pack name, description, and recommended users and feeds.
3. Share your starter pack! Every starter pack comes with a link and QR code you can share. Text your starter pack to a friend, share it with your professional network, and post it to other social apps!
4. Say hi! You’ll get notified when users join Bluesky via your starter pack.

Who can use starter packs?

Anyone with a Bluesky account can create starter packs.

If you don’t have a Bluesky account yet, you can join via a friend’s starter pack and get started with their recommended customizations. Once you’re in Bluesky, you can add/remove these recommendations and further customize your experience.

If you’re already on Bluesky but want to onboard to another community or get your friend’s recommendations, you can also use their starter pack to add to your experience!

Starter Packs FAQ

How many people and feeds can I add to my starter pack?

You can recommend up to 50 people and up to 3 custom feeds. New users will automatically have Following and Discover pinned.

How can I share my starter pack with more people?

Text a link to your friends, post about it on other social networks, share it with your professional network! Every starter pack comes with an auto-generated preview image that shows the name of your starter pack and some suggested users for handy sharing.

How do I find more starter packs on Bluesky?

You can share starter packs directly on Bluesky, and you’ll see an embed preview for these links. Currently, starter packs do not show up in search, so to find a starter pack, a friend will have to send you the link or you can see the embed preview within the Bluesky app.

I was added as a recommended user in someone’s starter pack. Can I remove myself?

No. If you’d like to not be suggested as a user to follow in someone’s starter pack, you can ask them to remove you via Bluesky DMs. You can also report a starter pack to Bluesky’s moderation team (see below).

Can I report a starter pack to Bluesky’s moderation team?

Yes. You can report a starter pack by clicking the three-dot menu at the top of the starter pack. Bluesky’s moderation team will review all reports and evaluate them according to our Community Guidelines.

Can I include a labeling service in my starter pack?

We currently don’t include labeling services in starter packs — we’re working on improving in-app discovery of these services and service reliability first.

Tuesday, 25. June 2024

Anonym

5 Ways Anonyome Has Improved its Market-Leading DI Mobile Wallet SDK 

Anonyome Labs has the most advanced decentralized identity (DI) mobile wallet SDK on the market—and the newly released version 2.0 has even more features for customers to apply to their own products.  

As a market leader in innovative decentralized identity (DI) technology offerings, Anonyome Labs is proud to bring version 2.0 of the Sudo Platform DI Mobile Edge Agent SDK to market. Like its predecessor, this SDK unites industry-standard DI protocols with Sudo Platform’s modularity, reliability and security, and then adds new features that give customers more options for applying the technology to their product suite.  

An edge agent is an application that sits on an ‘edge device’, such as a user’s mobile phone. Edge agents can provide an array of DI functionality such as wallet and peer-to-peer services. Learn more

In this article we run through the technical specs of Sudo Platform Edge Agent SDK 2.0, but here are the 5 ways the version 2.0 mobile wallet SDK is more advanced than its predecessor: 
 

1. Sudo Platform Edge Agent SDK 2.0 adopts the latest v2.0 suite of Hyperledger Aries RFCs for receiving, securely storing, and verifiably presenting credentials. Learn about how the banking industry is using verifiable or reusable credentials, for example.
2. Sudo Platform Edge Agent SDK 2.0 still supports Hyperledger Anoncreds, and now also supports more credential and presentation formats, particularly W3C Verifiable Credentials (VCs) and Verifiable Presentations (VPs).
3. Sudo Platform Edge Agent SDK 2.0 adds support for Aries-based chat functionality, so users can exchange end-to-end encrypted chat messages with their DIDComm connections.
4. The Sudo Platform Edge Agent SDK continues to exist within the ecosystem of Anonyome’s DI offerings, such as the Sudo Platform Cloud Agent Service and the Sudo Platform Decentralized Identity Relay.
5. In future we will add support for OpenID4VC and more of the latest Hyperledger Aries RFCs to the SDK. This will be part of our new roadmap as we move on from the initial roadmap we outlined here.
  

Let’s now go into more detail about these 5 features: 

Sudo Platform Edge Agent SDK 2.0 adopts the latest Aries credential and presentation exchange RFCs 
 

The Sudo Platform Edge Agent SDK continues to expand its support for the Hyperledger Aries RFCs. Having already supported most of the Aries Interop Protocol (AIP) 1.0 suite of protocols, we have progressed to the latest AIP 2.0 suite. To do this, we’ve implemented these new Aries RFCs into our edge agent: 

- Issue Credential Protocol 2.0 (RFC 0453) for negotiating and receiving verifiable credentials from Aries issuers
- Present Proof Protocol 2.0 (RFC 0454) for presenting verifiable credentials to Aries verifiers.

This means version 2.0 is significantly better than version 1.0 because it can receive and present credentials in a range of different formats, such as W3C VCs, in addition to Anoncreds. We discuss these new supported formats below. 
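Concretely, the 2.0 protocols achieve this by declaring a format for each attached payload. The sketch below shows the general shape of an offer-credential message as described in Aries RFC 0453; the identifiers and attachment contents are placeholders, not output of the Sudo Platform SDK:

```python
import base64
import json

# Simplified shape of an Aries Issue Credential 2.0 offer message (RFC 0453).
# The "formats" array is what lets the 2.0 protocol carry different
# credential formats (Anoncreds, W3C VC, ...) in one flow. All ids and
# payload values below are placeholders.

offer_payload = {
    "credential_preview": {
        "attributes": [{"name": "given_name", "value": "Alice"}]
    }
}

offer_message = {
    "@type": "https://didcomm.org/issue-credential/2.0/offer-credential",
    "@id": "5b2a0f46-placeholder",
    "formats": [
        # Declares that attachment "offer-0" is an Anoncreds-style offer.
        {"attach_id": "offer-0", "format": "hlindy/cred-abstract@v2.0"}
    ],
    "offers~attach": [
        {
            "@id": "offer-0",
            "mime-type": "application/json",
            "data": {
                "base64": base64.b64encode(
                    json.dumps(offer_payload).encode()
                ).decode()
            },
        }
    ],
}

# The holder matches each attachment to its declared format before
# deciding how to process it.
declared = {f["attach_id"]: f["format"] for f in offer_message["formats"]}
```

A W3C VC offer would use the same envelope with a different `format` string and attachment body, which is exactly why one protocol can now serve multiple credential formats.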

As was the case with version 1 of the Sudo Platform Edge Agent SDK, we implement the core of these Aries RFCs by leveraging and contributing to the open-source Hyperledger project that Anonyome co-maintains: Aries-VCX.

Version 2.0 adds support for W3C credentials and presentations 
 

Using the latest Aries protocols for credential and presentation exchange, we built support for new credential and presentation formats, such as W3C VCs and VPs. This means our customers can pick and choose from a wider range of ‘DI flavours’ that best suit their business use cases. 

Anonyome has previously analysed Anoncreds and W3C VCs, both of which play a crucial purpose in the DI ecosystem. Some benefits the Edge Agent SDK’s support for W3C VCs offers include: 

- Higher interoperability from standardization and widespread adoption as a result of being formalized by the W3C
- A highly flexible data structure in comparison to Anoncreds, including support for more complex credential structures such as nested attributes and lists, suitable for a broad range of applications
- More maturity and backing by a global standard, with many implementations and growing adoption.

Where Anoncreds may beat W3C VCs is in its advanced privacy-preserving features. Anoncreds thrives in this area, with support for zero-knowledge proofs (ZKPs) of credential attributes and partial disclosure of credential attributes (revealing only the attributes that need to be revealed).  

To retain some of these privacy-preserving capabilities with W3C VCs, the Sudo Platform Edge Agent SDK adds support for presenting selectively disclosed credentials using BBS+ cryptography (as defined in Aries RFC 0646). This means edge agent users can present a proof with a subset of their BBS credential’s attributes without revealing the other attributes, which may contain sensitive data. This is not otherwise possible with other W3C VC types, such as Ed25519-based VCs. 
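To illustrate the data-shaping side of selective disclosure, without any of the BBS+ cryptography that makes it verifiable, a toy sketch might look like this; the credential and its attributes are invented for the example:

```python
# Conceptual illustration of selective disclosure: derive a presentation
# that reveals only a chosen subset of a credential's attributes. Real
# systems use BBS+ proofs so the verifier can still check the issuer's
# signature over the full credential; this toy sketch only shows which
# data leaves the wallet.

credential_attributes = {
    "given_name": "Alice",
    "date_of_birth": "1990-01-01",
    "address": "12 Example Street",
    "licence_class": "C",
}

def derive_presentation(attributes: dict, reveal: set) -> dict:
    """Keep only the attributes the holder chooses to reveal."""
    return {k: v for k, v in attributes.items() if k in reveal}

presentation = derive_presentation(credential_attributes, {"licence_class"})
# Only the licence class is disclosed; date of birth and address stay
# private, yet under BBS+ the proof still binds to the issuer's signature.
```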

Version 2.0 adds support for end-to-end encrypted chatting 
 

All Aries protocol messages transmitted between the Sudo Platform Edge Agent SDK and its peers (issuers, verifiers, etc.) are already end-to-end encrypted (E2EE), but with our added support for Aries “Basic Messages” (RFC 0095), our users can now send and receive text chat messages with their DIDComm connections. 

Anonyome has built a comprehensive API around this feature, with support for message storage and real-time subscriptions. This new suite of APIs allows our customers to build E2EE chat use cases into their applications, whether that is E2EE chat with an issuer or verifier (e.g. a bank or DMV), or with a fellow edge agent (mobile to mobile chat). The Edge Agent SDK supports these use cases. 
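For context, the basic message itself is a very small DIDComm payload. This is a rough sketch of its shape per Aries RFC 0095; the id and content are placeholders, and the end-to-end encryption is applied when the message is packed for transport, not in this plaintext structure:

```python
from datetime import datetime, timezone

# General shape of an Aries "basic message" (RFC 0095). The E2EE comes
# from the DIDComm packing layer; the plaintext message stays this simple.

basic_message = {
    "@type": "https://didcomm.org/basicmessage/1.0/message",
    "@id": "123456780-placeholder",
    "~l10n": {"locale": "en"},
    "sent_time": datetime.now(timezone.utc).isoformat(),
    "content": "Your loan application has been approved.",
}
```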

The SDK exists in Anonyome’s suite of DI offerings 
 

Anonyome continues to advance its collection of DI offerings, supporting a wide range of use cases. The Sudo Platform Edge Agent SDK continues to seamlessly integrate with our highly scalable DI message relay solution, the Sudo DI Relay SDK, allowing the edge agent to reliably receive messages from the DIDComm peers it connects with. 

What’s more, we continue to thoroughly test our edge agent against our Cloud Agent Service offering (in addition to a suite of open-source Aries agents) to make sure all our features remain interoperable in the DI ecosystem. 

We have a rich roadmap ahead for the SDK 
 

Maturing into its second stable release, the Sudo Platform Edge Agent SDK has achieved the roadmap tasks that we laid out in this article. But given the evolving nature of the DI industry, Anonyome plans to continue adding new features to every new release of the SDK so our customers’ applications can stay up to date with the latest and greatest in the DI ecosystem.  

In the future for the SDK, we plan to: 

- Implement V2 Aries protocols for connection establishment, particularly DID Exchange (RFC 0023) and the Out-of-Band protocol (RFC 0434)
- Implement parts of the OpenID4VC DI stack, allowing the Edge Agent SDK to receive and present credentials under a wider set of standards, particularly the EUDI Architecture and Reference Framework
- Expand the set of DID methods we support, such as did:cheqd, did:indy, and did:peer, again increasing the level of interoperability with more DI agents.

Sudo Platform lets you put privacy in your customers’ hands 

Anonyome Labs’ Sudo Platform is the mobile and cloud platform for decentralized identity. Use our APIs and SDKs to quickly build and deploy next-generation privacy, cybersafety, and decentralized identity apps so your customers can communicate privately, navigate online safely, and transact securely in an increasingly connected world. 

Sudo Platform combines a scalable identity foundation and menu of enterprise-ready APIs and SDKs, built for developers by developers. Quickly integrate our technology into your new or existing products. Create a custom solution or choose a pre-configured option. 

Talk to us today 

You might also like: 
 

- Verifiable Credentials – the Killer Feature of Decentralized Identity
- The Surprising Outcome of Our Comparison of Two Leading DI Governance Models
- Why More Companies are Turning to SaaS for Decentralized Identity Solutions
- Want to Monetize Privacy? Here’s How to Do it, Fast
- 17 Industries with Viable Use Cases for Decentralized Identity
- 5 Aha! Moments About Decentralized Identity from the Privacy Files Podcast
- What Our Chief Architect Said About Decentralized Identity to Delay Happy Hour

And check out our podcast, Privacy Files, to hear what your peers and experts are saying about the state of member and consumer privacy in real time. 

The post 5 Ways Anonyome Has Improved its Market-Leading DI Mobile Wallet SDK  appeared first on Anonyome Labs.


KuppingerCole

Foundational Security – the Critical Cyber Security Infrastructure

In the landscape of cybersecurity, the foundation remains unshakable, and these timeless principles continue to shape our digital defenses. Despite the rapid pace of technological advancement, there are certain aspects that demonstrate that threats persist over time.

Using modern technologies to solve these perennial problems requires a sophisticated understanding of both historical challenges and emerging threats. By using advanced tools and methods, organizations can strengthen their defenses while adapting to evolving cyber landscapes. Join us to explore how innovative solutions work with traditional principles to create a resilient cybersecurity infrastructure.

Paul Fisher, Lead Analyst at KuppingerCole, will shed light on why IT Security sometimes appears overly complex and how industry standards like NIST can streamline cybersecurity efforts. He'll also explore the future role of LLMs in achieving security basics.

Brian Chappell, VP of Product Management at One Identity, will share insights into establishing robust cybersecurity foundations and the effectiveness of layered defense strategies. Additionally, he'll discuss strategies for mitigating vulnerabilities and combating social engineering attacks.

Join this webinar to:

- Explore enduring principles that form the backbone of resilient digital defenses, emphasizing the importance of maintaining a robust foundation amidst evolving threats.
- Understand the effectiveness of a stratified approach to cybersecurity, incorporating modern tools and methodologies to fortify organizational defenses against diverse threats.
- Learn strategies to address longstanding vulnerabilities that have persisted for decades, including practical recommendations for identifying, prioritizing, and mitigating these vulnerabilities within organizational systems.
- Gain insights into the tactics employed by cyber adversaries to exploit human vulnerabilities, and receive actionable recommendations for fostering a culture of cybersecurity awareness and vigilance among employees.
- Discover how innovative solutions intersect with traditional principles to adapt and respond to emerging cyber threats, ensuring organizations remain resilient in the face of evolving cyber landscapes.


Microsoft Entra (Azure AD) Blog

User insights: Analyze customer identity data


Today, we're excited to announce the general availability of user insights in Microsoft Entra External ID.

 

User insights, which was launched in public preview in October 2023, is a powerful tool that enables admins and developers to gain deeper insights into their customers’ behavior, preferences, and challenges. It provides key metrics such as monthly active users (MAU), daily active users (DAU), new users added, requests over time, authentications over time, multi-factor authentication (MFA) usage by type, and MFA success versus failure rates. You can also filter and segment the data by time range, operating system, country, and application id. With user insights, you can:

 

- Analyze the trends and patterns of your customers’ application login and registration activity and discover new opportunities for growth and improvement.
- Optimize user experience and identity management strategies for your customers and make data-driven decisions that align with your business goals and user needs.
- Build customized dashboards in tools like Power BI using user insights from Microsoft Graph APIs, allowing more flexibility and control of your customer identity data.

 

To access user insights, you need to have a Microsoft Entra External ID external tenant. Once you have your tenant ready, you can access the dashboards on the Microsoft admin center or access raw data via Microsoft Graph APIs. The features we’re announcing today provide significant value and are based on direct feedback from our preview customers. Sign up for your free trial here.
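To sketch what working with the raw data might look like, the snippet below builds a request URL for a user-insights metric and sums active-user counts from a sample response. The endpoint path shape (`/reports/userInsights/...` under the beta Graph root) and the response field names (`value`, `activeUserCount`, `factDate`) are assumptions for illustration; token acquisition and the HTTP call itself are omitted, so check the Microsoft Graph reference for the exact endpoints.

```python
import json

GRAPH_BASE = "https://graph.microsoft.com/beta"  # user insights sits under the beta Graph root

def user_insights_url(metric: str, granularity: str = "daily") -> str:
    """Build a Graph request URL for a user-insights metric.

    The path shape below is illustrative; verify the exact segment names
    against the Microsoft Graph documentation for your tenant.
    """
    return f"{GRAPH_BASE}/reports/userInsights/{granularity}/{metric}"

def summarize_active_users(response_body: str) -> int:
    """Sum the activeUserCount values from a (sample) Graph collection response."""
    payload = json.loads(response_body)
    return sum(item.get("activeUserCount", 0) for item in payload.get("value", []))

# A fabricated sample response, shaped like a typical Graph collection:
sample = json.dumps({
    "value": [
        {"factDate": "2024-07-01", "activeUserCount": 120},
        {"factDate": "2024-07-02", "activeUserCount": 95},
    ]
})
```

In practice you would send `user_insights_url("activeUsers")` with a bearer token obtained via the Microsoft identity platform, then feed the JSON body to a summarizer like the one above or into a Power BI dataset.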

 

Export data to Excel for offline analysis

 

The ‘export to Microsoft Excel’ feature is a convenient way to access raw data from the dashboards to suit different user preferences and use cases. You will now be able to export data in comma-separated values (CSV) format to facilitate the seamless importation and manipulation of data with Excel, or any other preferred CSV editor. This allows customers to use data offline for their own customized analysis and manipulation.
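As a sketch of the kind of offline analysis the CSV export enables, the snippet below aggregates authentications per country with the standard library. The column names (`Date`, `Country`, `Authentications`) are assumptions for illustration; match them to the headers in your actual export.

```python
import csv
import io

def authentications_by_country(csv_text: str) -> dict:
    """Aggregate an exported authentications CSV by country.

    Column names are illustrative; adapt them to the real export headers.
    """
    totals: dict = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["Country"]] = totals.get(row["Country"], 0) + int(row["Authentications"])
    return totals

# A tiny fabricated export, in the same comma-separated shape:
export = """Date,Country,Authentications
2024-07-01,US,340
2024-07-01,DE,120
2024-07-02,US,310
"""
```

The same pattern works for any of the dashboard exports; for larger files, a tool like pandas or Excel pivot tables may be more convenient.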

 

Figure 1: Export authentications data to Microsoft Excel

 

Tailor and optimize your identity management solution for different user segments 

 

You can filter data by language and identity provider to get more insights into the preferences and behavior of your users. For example, you can see which languages are most popular among your users and how they vary across applications and regions. You can also see which identity providers are used the most for authentication and how they may affect the user experience and retention. These filters help you tailor and optimize your identity management solution for different user segments.

 

Figure 2: Analyze authentications data by identity provider or language customization

 

Improving user experience and security with MFA failure insights

 

The MFA failure chart shows the number of sign-in attempts that failed due to MFA issues and why they failed. You can see the breakdown by three categories:  

 

- Bad request: the sign-in request was malformed or invalid.
- MFA denied: the user entered the wrong verification code or declined the MFA request.
- MFA incomplete: the user did not complete the MFA request within the time limit.

 

This breakdown can help you understand the common causes of MFA failures so you can target your resources and improve the user experience and security of your applications. 
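A minimal sketch of that kind of analysis: tally failed sign-in events by reason and compute an overall failure rate. The event shape (`status`/`reason` fields and the camel-cased reason names) is an assumption for illustration, not the actual Graph sign-in log schema.

```python
from collections import Counter

# Illustrative reason codes mirroring the three categories above.
MFA_FAILURE_REASONS = {"badRequest", "mfaDenied", "mfaIncomplete"}

def mfa_failure_breakdown(events: list) -> dict:
    """Tally failed MFA sign-in events by reason and compute the failure rate.

    Each event is a dict with a 'status' ('success'/'failure') and, for
    failures, a 'reason' drawn from MFA_FAILURE_REASONS.
    """
    failures = Counter(
        e["reason"]
        for e in events
        if e["status"] == "failure" and e.get("reason") in MFA_FAILURE_REASONS
    )
    total = len(events)
    rate = sum(failures.values()) / total if total else 0.0
    return {"byReason": dict(failures), "failureRate": rate}

# Fabricated sample events:
events = [
    {"status": "success"},
    {"status": "failure", "reason": "mfaDenied"},
    {"status": "failure", "reason": "mfaIncomplete"},
    {"status": "success"},
]
```

A spike in one category can then drive a targeted fix, for example shortening the verification-code flow if `mfaIncomplete` dominates.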

 

Figure 3: Monthly MFA failures insights

 

Identify potential issues with user retention, engagement, and satisfaction

 

Inactive users are users who have not signed in over a certain period. You can see the number of daily and monthly inactive users in your applications, as well as the trend over time. This metric can help you monitor user engagement and identify areas where you can improve your user experience or offer incentives to re-engage your users. 

 

Figure 4: Active, inactive, and new user trends over time

 

Get started today

 

To access and view data from user insights, you must have a Microsoft Entra External ID external tenant with registered applications that have customer sign-in or sign-up data. Use our quickstart guide to create a trial tenant and access user insights on the Microsoft admin center or access the raw data via Microsoft Graph APIs. Visit our docs to learn more about how to access this new feature and how to view, query, and analyze user activity.

 

To learn more or test out other features in the Microsoft Entra portfolio, visit our developer center. Sign up for email updates on the Identity blog for more insights and to keep up with the latest on all things Identity, and follow us on YouTube for video overviews, tutorials, and deep dives.

 

Learn more about Microsoft Entra  

Prevent identity attacks, ensure least privilege access, unify access controls, and improve the experience for users with comprehensive identity and network access solutions across on-premises and clouds. 

Microsoft Entra News and Insights | Microsoft Security Blog
Microsoft Entra blog | Tech Community
Microsoft Entra documentation | Microsoft Learn
Microsoft Entra discussions | Microsoft Community

Finicity

How data-enabled income and employment verifications deliver smarter, seamless financial experiences 


To be competitive in an era of accelerating consumer expectations, deploying a seamless application experience for personal lending, mortgage, tenant screening and other use cases is a must. Slower, resource-laden manual processes invite higher consumer drop-off rates, introduce inefficiencies and can be vulnerable to fraudsters.  

Mastercard Open Banking, provided by Finicity, a Mastercard company, is equipping its customers with the capabilities to deliver smarter and more efficient digital lending experiences.  With the expansion of its Verification of Income and Employment (VOIE) solution to include credentialed payroll, Mastercard Open Banking enables consumers to permission access to their payroll account data, mitigating the need to collect income documentation.  

A more efficient digital experience within the loan application process is precisely what open banking customers need – and credentialed payroll delivers a seamless, secure income and employment verification process, and reliable data at speed. The latest VOIE innovation populates consumer-permissioned data directly into the correct fields, creating a fast, secure digital experience that can boost profitability while making life easier for your applicants and employees.  

Strong partnerships that drive industry-leading innovations 

Mastercard Open Banking teamed up with industry-leading payroll data aggregator Argyle to power this expanded VOIE solution. Argyle’s network covers more than 90% of the U.S. workforce, so financial institutions can now digitally verify an applicant’s income and employment in just moments.

“We’re excited to partner with Mastercard Open Banking to make digital verification of income and employment widely available through our trusted network of consumer-permissioned connections,” said Brian Geary, COO at Argyle. “Together, we are creating smarter lending experiences, free from time-consuming manual touchpoints while achieving faster and more accurate verifications,” he added.

The combined capabilities of Mastercard Open Banking and Argyle are driving VOIE to be an efficient, flexible solution that comes in at a much more affordable price point than other income and employment verification options, poised to benefit financial institutions and consumers alike.

“Mastercard Open Banking’s new credentialed payroll enhancement was one of the simplest product launches we have done,” said Josh Cilman, EVP at Intercoastal Mortgage. “The new experience integrated smoothly into our existing processes with no impact to loan processing workflows and an efficient borrower experience. We are excited to digitize our loan application experience with a much more cost-effective way to verify income and employment.” 

Multiple verification solutions to fit your business 

Credentialed payroll is just the latest addition to Mastercard Open Banking’s growing list of digital income and employment verification solutions for smarter and faster decisioning. 

Mastercard Open Banking’s Deposit Income verification solution analyzes direct deposit streams from the consumer’s permissioned bank account data to identify an applicant’s income.  

Freddie Mac launched Loan Product Advisor® (LPA℠) asset and income modeler (AIM) for income using direct deposits in March 2022 with Mastercard Open Banking as an initial service provider. In March 2024, Fannie Mae announced general availability of Deposit Income as part of their Desktop Underwriter (DU®) validation service. These solutions help streamline the mortgage origination process for lenders and homebuyers while expanding opportunities for home ownership. 

Mastercard Open Banking solutions are used by financial institutions and decision makers in varied use cases, including mortgage lenders, tenant screeners, personal lenders, credit card issuers and more.  Credentialed payroll and Deposit Income solutions provide financial institutions with the high conversion, affordable income, and employment solutions they seek.  

Bank account and payroll account data can be used independently or in combination to enhance the customer experience depending on the institution’s needs and workflows. 

Partner with Mastercard to create better consumer experiences 

The Mastercard Open Banking platform delivers next-generation financial experiences that delight consumers. The platform serves as a one-stop shop for digital verifications, with the capabilities to verify assets, balances, income, employment, cash flow and much more, seamlessly and at scale. Credentialed payroll and Deposit Income solutions are available today through Mastercard Open Banking’s lending solutions.

The post How data-enabled income and employment verifications deliver smarter, seamless financial experiences  appeared first on Finicity.


Elliptic

Crypto regulatory affairs: UAE Central Bank approves plan for stablecoin registration framework

The Central Bank of the United Arab Emirates has paved the way for the country to have a regulatory framework for stablecoin issuance. 



Verida

Verida Raises $5M in Seed and Strategic Funding to Solve AI data privacy problem


Verida, a pioneer in DePIN networks, announced the launch of its personal confidential compute platform to address the growing concerns over user data privacy in the age of Big Tech AI. The company’s latest funding round brings the total value raised to over $5 million, reaching a valuation of $50 million.

Fueled by the immense value of data in the AI era, the global AI market is expected to reach $10 trillion by 2025 according to Deloitte. However, this growth raises concerns about user privacy, as the platforms hosting these AI models have access to user prompts. Additionally, current AI models rely heavily on massive datasets collected by companies like Google and Apple. This raises serious privacy issues, leaving users with limited control over how their data is used or shared with third-party AI applications — an issue highlighted this week by Elon Musk concerned over Apple’s ChatGPT integration.

Verida is on a mission to ensure everyone is in control of their own identity and data, without the need to rely on third parties. The company has already proven its impact on digital security across critical industries: helping keep patient healthcare data secure, helping an entire country adopt Web3 technology, and disrupting tech monopolies.

Now, Verida’s platform will leverage its decentralized storage infrastructure to keep user data private and under the control of its creators, enabling hyper-personal AI experiences without sacrificing privacy. That infrastructure, which uses zero-knowledge credentials and blockchain technology to encrypt private information, will let users build custom AI models trained on their own data for a more tailored and relevant experience, and grant specific AI applications permission to access their data, promoting transparency and user control.

The funding round included participation from O-DE Capital Partners, ChaiTech Ventures, Simurg Labs and others who join existing investors Gate Labs, HASH CIB, Bison Capital, Amesten Capital, Evan Cheng (Mysten Labs), as well as community pre-sales of the Verida Storage Credit token.

To date, Verida has secured over 20 ecosystem partners including Polygon ID, NEAR, Partisia, Redbelly, zkPass, Kima, Nillion, cheqd and continues to grow its network within the industry.

With their recent injection of funding, Verida is poised to disrupt the AI landscape by putting users back in control of their data, unlocking more secure, privacy-preserving personalized AI for everyone.

About Verida

Verida is a pioneering decentralized data network and self-custody wallet that empowers users with control over their digital identity and data. With cutting-edge technology such as zero-knowledge proofs and verifiable credentials, Verida offers secure, self-sovereign storage solutions and innovative applications for a wide range of industries. With a thriving community and a commitment to transparency and security, Verida is leading the charge towards a more decentralized and user-centric digital future. For more information, visit www.verida.network

Verida Raises $5M in Seed and Strategic Funding to Solve AI data privacy problem was originally published in Verida on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

Improving Healthcare Identity Verification Experiences

Streamline healthcare identity verification processes to significantly enhance the overall consumer experience. Reduce costs and improve satisfaction.

As the healthcare industry continues its digitization journey, connecting a consumer's physical identity to their digital identity, while maintaining confidence that they are who they say they are, is increasingly vital.

 

Registering an account is the first step to enter a healthcare organization’s digital front door. For consumer safety, regulatory compliance, and operational efficiency, it is crucial to verify that only authorized individuals receive care and have access to health information, while upholding robust security protocols. 

Monday, 24. June 2024

Ontology

Revolutionizing Identity

Decentralized Approaches Across Industries

The urgency to innovate identity management systems has never been greater. Recent statistics paint a sobering picture of identity theft and fraud across various sectors: approximately one in three Americans have fallen victim to identity theft. Moreover, the global cost of cybercrime, including identity fraud, is projected to soar to an astonishing $9.5 trillion annually by 2024. The prevalence and sophistication of these threats are accelerating, exemplified by a tenfold increase in detected deepfakes worldwide from 2022 to 2023, posing significant security challenges across all industries.

These figures not only reflect the scale but also the evolving complexity of cyber threats. The average cost of a data breach now stands at a record $4.45 million globally, marking a 15% increase over three years. Simultaneously, the number of data breaches and their severity continue to escalate, with a 15% rise in incidents and an 11% surge in breach severity noted just within the United States from 2022 to 2023.

Such statistics underscore the critical vulnerabilities within traditional, centralized identity management systems and highlight the growing imperative for more secure, resilient solutions. Decentralized identity (DID) technology, leveraging blockchain’s inherent security and transparency, offers a promising alternative. By empowering individuals with control over their personal data through self-sovereign identity principles, DID not only enhances security but also significantly mitigates the risks of data breaches and identity theft. This technological shift is crucial for protecting personal and organizational assets against the increasingly sophisticated landscape of cyber threats and for paving the way toward a more secure digital future.

To address these pervasive challenges, this article delves into several compelling use cases of decentralized identity (DID) across diverse sectors such as healthcare, finance, education, and more. Each example illustrates how DID not only counters the threats outlined but also offers transformative benefits — enhancing privacy, streamlining operations, and fostering innovation. As we explore these use cases, it becomes evident how DID stands as a cornerstone for building a resilient, efficient, and more equitable digital landscape, empowering individuals and institutions alike to navigate the complexities of the modern cyber environment safely and effectively. This exploration aims not just to inform but to inspire action towards the widespread adoption of decentralized identity solutions that are vital for securing our digital futures.

Healthcare

The healthcare sector handles highly sensitive personal data, including medical records, insurance details, and patient identities. Remarkably, one out of every three Americans has faced identity theft in healthcare, highlighting an urgent need for robust, secure solutions. Decentralized identity (DID) systems could be the key to revolutionizing how we manage healthcare information, making processes more secure, efficient, transparent, and patient-centered.

In the realm of healthcare, DIDs are particularly pivotal. The landscape of healthcare data encompasses a vast network of providers, emergency services, and facilities. Through DIDs, patients gain complete ownership and control over their data. This empowers them to access, review, amend, and manage consent for data sharing, request deletion of their data, or even choose to be forgotten.

A prime application of DIDs in this sector is in verifying the credentials of healthcare staff and vendors. The healthcare ecosystem is a complex web of interactions requiring stringent authentication. Decentralized identity management enables individuals to create and control their digital identities securely via cryptographic, verifiable credentials stored in digital wallets. This method ensures that individuals can prove their identity and qualifications without unnecessary exposure of their information or dependence on a central authority.
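To make the idea concrete, here is a minimal sketch of what such a cryptographic, verifiable credential looks like and how a verifier might check its basic structure. The shape follows the W3C Verifiable Credentials Data Model; the DIDs, the `HealthcareStaffCredential` type, and the field values are hypothetical, and a real verifier would also validate the cryptographic proof, which is elided here.

```python
def looks_like_verifiable_credential(vc: dict) -> bool:
    """Check the minimal structure of a W3C Verifiable Credential.

    Mirrors the core required properties from the VC Data Model
    ('@context', 'type', 'issuer', 'credentialSubject'); proof
    verification is deliberately omitted in this sketch.
    """
    return (
        "https://www.w3.org/2018/credentials/v1" in vc.get("@context", [])
        and "VerifiableCredential" in vc.get("type", [])
        and "issuer" in vc
        and "credentialSubject" in vc
    )

# A hypothetical staff credential a hospital might issue to a nurse's wallet:
staff_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "HealthcareStaffCredential"],
    "issuer": "did:example:hospital-123",
    "issuanceDate": "2024-07-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:nurse-456",
        "role": "Registered Nurse",
    },
}
```

The holder presents only this credential (or a selective disclosure of it) from their wallet, so the verifier learns the qualification without pulling records from a central authority.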

Moreover, DIDs can significantly cut administrative costs. Individuals can use their verifiable credentials to explore medical benefits, compare healthcare plans, and enroll in services directly through a digital portal, eliminating the need for repetitive paperwork. These credentials can also store insurance details and provide necessary proof of attestation when required.

DIDs enhance secure communications between patients and healthcare providers as well. For example, patients can verify their eligibility for services through a mobile application before even visiting a clinic. At the healthcare facility, this same application can seamlessly integrate necessary details like names and insurance information for efficient check-in processes.

Additionally, blockchain technology can serve as a decentralized record-keeping service, storing metadata about a patient’s records primarily in the form of public keys and non-personally identifiable information (PII).

However, the integration of decentralized identity management in healthcare is not without its challenges. Issues such as interoperability, regulatory compliance, widespread adoption, and the inherent technical complexities need to be addressed. The road ahead involves extensive research, development, and rigorous testing to ensure these systems are both practical and beneficial.

Despite these hurdles, the potential for improved data security, privacy, immutability, interoperability, and enhanced patient autonomy positions decentralized identity as a transformative force in healthcare. This promising technology could very well redefine the landscape of healthcare decision-making, ushering in a new era of digital health empowerment.

Finance

In the world of finance, handling monetary transactions, investments, and safeguarding customer financial data are paramount. With the average cost of a data breach soaring to $4.45 million globally in 2023, the role of decentralized identity (DID) in bolstering security and minimizing fraud has never been more crucial.

DID is set to revolutionize the financial services industry with its secure, efficient, and user-centric approach to identity management. Leveraging cryptography, DID ensures the protection of identity data while empowering individuals to manage their personal information access and sharing. This enhanced control fosters greater ownership over one’s digital and financial identities.

A significant advantage of DID is its ability to streamline the customer onboarding process. By eliminating repetitive identity verification, DID not only enhances the user experience but also simplifies access to financial services. Additionally, the shift from traditional, insecure password systems to decentralized authentication boosts security significantly.

DID also aids financial institutions in adhering to stringent regulatory requirements, such as Know Your Customer (KYC) and Anti-Money Laundering (AML). The blockchain’s immutable ledger ensures that transactions cannot be altered, which helps prevent certificate fraud and other forms of tampering. Over time, this can lead to reduced costs associated with identity verification and management for financial organizations.

Another critical application of DID is promoting financial inclusion. Approximately one billion people globally lack official identification, which blocks their access to essential financial services. DID offers a solution where only an internet connection and a smart device are needed to establish a self-sovereign identity on the blockchain. This capability is invaluable, particularly in regions affected by political instability where paper records may be compromised. By enabling the unbanked to establish an economic identity, DID opens doors to participation in the global financial ecosystem.

Despite its potential, the adoption of DID within financial services encounters several hurdles. The absence of universally accepted standards poses challenges in system interoperability. Moreover, many institutions remain cautious, sticking with traditional identification methods due to security, compliance fears, or simple unfamiliarity with new technology. The deployment of DID systems also demands specialized technical knowledge and resources.

To navigate these challenges, the industry must collaborate on setting standards, educating stakeholders, and initiating pilot projects to showcase the practical advantages of DID. As investment in digital identity verification technologies grows, particularly within banking and fintech sectors, DID is increasingly recognized as a forward-thinking solution poised to reshape the future of financial services.

E-Government

E-government harnesses digital technologies to provide public services and foster citizen engagement. In the U.S., approximately 29 million adults lack a valid driver’s license, and another 7 million are without any form of government-issued photo ID, underscoring the potential for decentralized identity (DID) to broaden equitable access to government services.

DID technology is poised to transform e-government by offering a secure, efficient, and user-centric approach to identity management. It enables governments to enhance service delivery and mitigate the risks linked to centralized data repositories.

A notable advantage of DID within e-government is its ability to expedite information validation. Credentials in a DID system are verified once upon issuance and remain trusted, removing the need for continuous paperwork or manual verification. This efficiency allows government entities to quickly and accurately engage with citizens, accessing verified information almost instantaneously.

Moreover, DID significantly bolsters security against fraud. The use of a distributed ledger for verification means that identity theft becomes nearly impossible, thus elevating the protection of sensitive personal data. Under DID, individuals own and control their data — information that is not stored by government agencies or DID providers, thereby enhancing privacy.

Cost reduction is another critical benefit. Governments can lower expenses linked to identity verification, data security enhancements, and the management of personally identifiable information (PII). By eliminating the need for extensive data storage, governments reduce the attractiveness of their systems to cyber attackers, mitigating risks such as costly ransomware attacks.

DID also unlocks new possibilities for e-government services. It could, for instance, enable secure online voting, allowing citizens to authenticate their identities confidently and cast their votes electronically. Additionally, DID can facilitate the creation of secure online communities for more interactive government engagement and collaboration.

However, the implementation of DID in e-government is not without challenges. Issues such as regulatory compliance, system complexity, and interoperability must be addressed. The integration of DID with existing legal and industry standards remains unclear, and DID systems may present a steeper learning curve compared to traditional centralized systems. Achieving interoperability among various DID solutions is essential yet challenging.

Despite these hurdles, governments globally are increasingly experimenting with and adopting DID to enhance public service delivery. For instance, the Canadian province of British Columbia has initiated a DID pilot project to establish a reliable digital identity for its residents, facilitating online access to government services. As the technology matures, DID holds the potential to make e-government more secure, efficient, and user-friendly, setting the stage for a revolution in how public services are accessed and delivered.

Education

The education sector, characterized by its vast array of learning institutions, academic records, and student identities, faces significant challenges from the estimated $7 billion market for fraudulent degrees and transcripts. Instances like thousands of nurses caught practicing with fake credentials underscore the crucial role decentralized identity (DID) could play in combating credential fraud.

DID technology promises a transformation in education through a more secure, efficient, and student-centric approach to managing data and credentials. It enables institutions to digitize and issue verifiable credentials like transcripts, grades, and diplomas as digital badges or certificates. Students can then store these in their digital wallets, sharing them as needed with colleges or prospective employers, streamlining the authentication of academic achievements.

A primary advantage of DID is the facilitation of a streamlined verification process for student records. Organizations equipped with the appropriate verification tools can instantly confirm the authenticity of credentials without contacting the issuing institution. This is particularly advantageous for environments like college campuses, where a robust identity and access management system is essential for accessing dorms, dining halls, and other facilities.

Moreover, the immutability of blockchain-stored records via DID makes credential falsification nearly impossible. This fortifies the integrity of academic credentials and directly addresses the rampant issue of fraudulent educational qualifications.
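The tamper-evidence behind this claim can be sketched with a hash-anchoring pattern: at issuance the credential's digest is recorded on a ledger, and a verifier later re-hashes the presented document and compares. The diploma fields are hypothetical, `json.dumps(..., sort_keys=True)` stands in for a proper canonicalization scheme, and a production system would sign the digest rather than merely hash it.

```python
import hashlib
import json

def credential_digest(credential: dict) -> str:
    """Canonical SHA-256 digest of a credential document.

    Sorted-key JSON serialization is a stand-in for real canonicalization;
    any change to the document changes the digest.
    """
    canonical = json.dumps(credential, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Hypothetical diploma issued to a student's wallet:
diploma = {"holder": "did:example:student-789", "degree": "BSc Computer Science"}

# At issuance, this digest would be anchored on the ledger:
anchored = credential_digest(diploma)

# Any altered copy no longer matches the anchored digest:
tampered = dict(diploma, degree="PhD Computer Science")
```

Because the anchored digest is immutable on-chain, a forged or edited credential fails the comparison even though the document itself never touches the ledger.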

DID further empowers students by giving them control over their academic data. They can manage and share their credentials independently, without reliance on the administrative systems of educational institutions. This autonomy is especially beneficial when the issuing institution ceases to exist, ensuring that academic achievements are preserved indefinitely, supported by standards like Open Badges 3.0.

Additionally, DID can streamline administrative processes, such as college applications and the hiring process. Students can share verified credentials directly with admissions offices or potential employers, simplifying the submission of required documents and speeding up the verification process.

However, the adoption of DID in education is not without challenges. Issues such as interoperability between different DID systems, user adoption, and compliance with privacy regulations like FERPA in the US are critical hurdles to overcome. Ensuring seamless data exchange and educating stakeholders about the benefits and operation of DID are essential for its widespread integration.

Despite these obstacles, DID’s potential to enhance security, efficiency, and student empowerment presents a promising outlook for education’s future. Initiatives like the Digital Credentials Consortium are actively working to establish standards and best practices for implementing DID in educational settings. As the technology matures, DID is poised to revolutionize the management and sharing of educational records, heralding a new era of transparency and trust in academic credentials.

Marketplaces

Marketplaces are central hubs where goods and services are exchanged between buyers and sellers. Decentralized identity (DID) is revolutionizing these platforms by enabling peer-to-peer interactions and bridging the trust gap that often exists between transacting parties.

Traditionally, marketplaces relied on centralized intermediaries to facilitate transactions and establish trust among parties who typically have no personal contact. DID, however, utilizes blockchain technology and the principles of self-sovereign identity, allowing individuals to control their own digital identities. This shift enables users to engage in direct, trusted interactions without the need for centralized authorities.

A primary advantage of DID in marketplaces is the enhanced privacy and control over personal data it offers users. With DID, individuals create and manage their identities using decentralized identifiers and verifiable credentials, which can be stored in digital wallets. These credentials can then be selectively shared to prove identity, qualifications, or other attributes as needed, empowering users with full sovereignty over their personal information and reducing the risk of unauthorized access or misuse.
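One way to make "selectively shared" concrete is salted hash commitments, the idea behind formats such as SD-JWT: the issuer vouches only for per-claim digests, so the holder can later reveal any subset of claims without exposing the rest. The claim names and values below are illustrative, and this is a sketch of the mechanism, not a standards-compliant implementation.

```python
import hashlib
import secrets

def commit(claims: dict) -> tuple[dict, dict]:
    """Return (digests the issuer would sign, the holder's private disclosures)."""
    digests, disclosures = {}, {}
    for name, value in claims.items():
        salt = secrets.token_hex(8)                  # fresh salt per claim
        disclosures[name] = (salt, value)            # kept by the holder
        digests[name] = hashlib.sha256(f"{salt}:{value}".encode()).hexdigest()
    return digests, disclosures

def verify_disclosure(digests: dict, name: str, salt: str, value) -> bool:
    """A verifier checks one revealed claim against the signed digest."""
    return digests[name] == hashlib.sha256(f"{salt}:{value}".encode()).hexdigest()

digests, disclosures = commit({"name": "Alice", "age_over_18": True, "address": "…"})
salt, value = disclosures["age_over_18"]             # reveal only one claim
print(verify_disclosure(digests, "age_over_18", salt, value))
```

The salts prevent a verifier from brute-forcing the hidden claims, which is why each digest gets its own random salt.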

DID also simplifies the identity verification process within marketplaces. Instead of submitting sensitive information for each transaction, users can present their verified credentials directly to service providers or transaction partners, akin to showing a passport or driver’s license. This streamlined approach facilitates more efficient and secure onboarding, reducing friction and enhancing trust between parties.

Furthermore, DID plays a critical role in preventing fraud and reinforcing marketplace integrity. The immutability and transparency of blockchain records make it challenging to falsify or tamper with identity data. By requiring users to verify their unique humanness, DID helps mitigate Sybil attacks — where one individual creates multiple fake identities — thus ensuring that marketplace participants are genuine and trustworthy.

DID also opens the door to innovative decentralized marketplaces and business models. For example, it can enable peer-to-peer marketplaces for data, allowing individuals to monetize their personal information while maintaining control over its use. Additionally, DID supports the development of decentralized sharing economy platforms, where users can directly rent out assets or services without intermediaries.

However, the implementation of DID in marketplaces is not without challenges. Issues such as interoperability, user experience, and regulatory compliance must be addressed. Ensuring seamless data exchange and verification across different DID systems is essential for broad adoption. Moreover, creating user-friendly interfaces and educating users about the benefits and functionalities of DID are crucial for encouraging uptake. Compliance with regulations like GDPR and PSD2 is also necessary to ensure legal conformity.

Despite these hurdles, the benefits of DID — including user empowerment, increased efficiency, and enhanced trust — position it as a transformative technology for the future of marketplaces. As standards and infrastructure around DID continue to evolve, we can anticipate an increase in marketplaces leveraging this technology to facilitate secure, peer-to-peer transactions and create new opportunities for value exchange.

Real Estate

The real estate industry, which encompasses property ownership, sales, and management, stands to gain significantly from the integration of decentralized identity (DID) technology. By employing blockchain and self-sovereign identity principles, DID offers a more secure, efficient, and transparent framework for managing property data and transactions, potentially revolutionizing the sector.

A primary advantage of DID in real estate is the secure and efficient management of property data. Homeowners can store essential documents like titles, deeds, and inspection reports in a secure digital wallet, allowing for easy access and verification by authorized parties such as lenders, insurers, or prospective buyers. This capability not only preserves the privacy and control of the homeowner but also reduces the need for repetitive paperwork and manual verification processes, thereby saving time and minimizing errors in transactions.

DID also plays a critical role in preventing fraud and building trust within the real estate market. The immutable nature of blockchain records makes altering or falsifying property ownership data nearly impossible. Sellers can use DID to authenticate their identity and ownership, enhancing trust with buyers and mitigating risks associated with title fraud or counterfeit listings. Conversely, buyers gain increased assurance in the legitimacy of their property purchases.

Another innovative application of DID is in the tokenization of real estate assets. By converting property ownership into digital tokens on a blockchain, DID facilitates fractional ownership, simplifying the trading of real estate investments on decentralized exchanges. This approach lowers the barriers to real estate investment, enhances market liquidity, and provides more nuanced control over property rights, such as the division of ownership among several parties.

DID can further streamline the financing and mortgage processes within real estate. Lenders can efficiently verify borrowers’ identities, credit histories, and property details using DID, expediting the loan underwriting process. Additionally, smart contracts can automate key aspects of the mortgage workflow, such as the disbursement of funds upon fulfillment of specific conditions, thereby accelerating the financing process for buyers.

However, the implementation of DID in real estate is not without its challenges. The industry is governed by complex legal and financial regulations that vary by jurisdiction, all of which must be navigated carefully by DID solutions. Ensuring interoperability among different DID systems is essential for their widespread adoption in the industry. Moreover, educating property owners, investors, and other stakeholders about the benefits and functionalities of DID is crucial to encourage its use.

Despite these hurdles, the potential advantages of DID — such as enhanced efficiency, security, and accessibility — render it a promising technology for the future of real estate. As DID standards and infrastructure continue to evolve, it is expected that an increasing number of real estate entities will adopt this technology to streamline transactions, reduce costs, and unlock new investment and ownership opportunities.

Travel

The travel industry, encompassing transportation, accommodation, and tourism services, faces significant security challenges, with over 50% of consumers across 18 countries reporting fraud attempts in 2023. Decentralized identity (DID) technology offers a transformative solution by enhancing security and convenience for travelers through a more secure, efficient, and user-centric approach to managing traveler data and credentials.

The Decentralized Identity Foundation (DIF) Hospitality and Travel Special Interest Group, with experts from over 35 global companies, is dedicated to developing technical specifications and fostering DID adoption within the sector. This collaborative effort highlights the industry’s commitment to leveraging DID for improving travel experiences.

One of the primary benefits of DID in travel is the streamlined guest experience. Travelers can store verified credentials, such as passports, visas, health records, and payment information, in a digital wallet on their smartphone. This integration allows seamless data sharing with airlines, hotels, and other service providers, eliminating the need for repetitive paperwork or physical document handling. For instance, hotels can issue reservation credentials directly to guests’ wallets, enabling contactless check-ins and room access without manual reception verification.

DID further enhances traveler privacy and control over personal data. Through mechanisms like zero-knowledge proofs, travelers can selectively disclose only the necessary data for each interaction, significantly reducing the risk of unauthorized data access or misuse. This approach ensures that travelers retain full sovereignty over their data, with the ability to grant or revoke access permissions at any time.

Moreover, DID enables efficient and secure data sharing among partners within the travel ecosystem. Utilizing open standards like W3C Verifiable Credentials allows different service providers to instantly verify traveler information without needing direct integration or creating data silos. This capability improves interoperability and reduces friction throughout the travel process, enabling scenarios such as airlines recognizing hotel loyalty status for upgrade eligibility or destinations offering personalized experiences based on verified traveler preferences.
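The "verify instantly, without direct integration" step amounts to recomputing a proof over the presented credential using key material the verifier already trusts. In the sketch below, an HMAC shared secret stands in for the issuer's public key; real verifiable-credential verification checks a digital signature and revocation status instead, and all identifiers here are made up.

```python
import hashlib
import hmac
import json

# Illustrative stand-in: a real verifier resolves the issuer's DID to a public key.
ISSUER_KEY = b"demo-issuer-key"

def verify_credential(credential: dict) -> bool:
    """Recompute the proof over everything except the proof itself."""
    body = {k: v for k, v in credential.items() if k != "proof"}
    expected = hmac.new(ISSUER_KEY,
                        json.dumps(body, sort_keys=True).encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential.get("proof", ""))

# e.g. a hotel checking an airline-issued loyalty credential, no API call needed
cred = {"issuer": "did:example:airline", "credentialSubject": {"status": "gold"}}
cred["proof"] = hmac.new(ISSUER_KEY, json.dumps(cred, sort_keys=True).encode(),
                         hashlib.sha256).hexdigest()
print(verify_credential(cred))  # True
```

Because verification only needs the credential and trusted key material, no data silo or point-to-point integration between the travel partners is required.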

In the post-pandemic landscape, DID is particularly valuable for navigating health and safety requirements. Travelers can securely store and present health credentials, such as vaccination records or COVID-19 test results, using their digital wallets. This system supports efficient and privacy-preserving health checks at borders or entry points, with initiatives like the IATA Travel Pass and CommonPass utilizing DID principles to facilitate safe international travel resumption.

However, implementing DID in the travel and hospitality industry presents challenges, including fragmented infrastructure, varying regulations, and the need for user education. Achieving interoperability and consistent standards across different DID systems is essential for ensuring seamless traveler experiences. Compliance with data protection laws like GDPR is also critical. Furthermore, educating travelers about the benefits and functionality of DID is necessary to encourage widespread adoption.

Despite these obstacles, the potential of DID to significantly enhance traveler experiences, protect privacy, and enable personalized services positions it as a promising technology for the future of travel and hospitality. As DID standards and solutions continue to evolve, more industry stakeholders are expected to adopt this technology, creating a more secure, efficient, and traveler-centric ecosystem.

Social Media

Social media platforms facilitate online communication and content sharing, but often at the cost of user privacy and data control. Decentralized identity (DID) is set to transform this dynamic by empowering users with greater control over their personal data and online interactions, utilizing blockchain technology and self-sovereign identity principles to allow users to manage their digital identities independently from centralized platforms.

A major advantage of DID in social media is significantly enhanced privacy and security. With DID, users can securely store their personal information and credentials in digital wallets, sharing only what is necessary with each platform or community. This minimizes the risk of unauthorized access, data breaches, and misuse by centralized entities. Users maintain full ownership and control over their data, with the flexibility to grant or revoke access permissions at any time.

DID also promotes censorship resistance and free speech within online communities. Decentralized social media platforms like Mastodon and Minds operate on distributed networks and consensus mechanisms that prevent any single entity from controlling the narrative or silencing users. This encourages a more open and inclusive environment where individuals can express themselves freely without fear of arbitrary censorship.

Another key benefit of DID is the facilitation of user-controlled networks and community governance. Decentralized social media platforms typically employ community-driven decision-making processes, allowing users to collectively set rules and policies. This participatory governance model ensures that the platform evolves in a way that aligns with the interests and values of its community, fostering diversity and free expression.

Additionally, DID can innovate social interaction and content monetization. For instance, decentralized social media platforms can tokenize user interactions, enabling creators to earn rewards for their content and engagement. This creates a more equitable ecosystem where community contribution and value are directly incentivized. DID also supports decentralized content sharing, utilizing distributed storage solutions like IPFS to enhance availability and resist censorship.

However, integrating DID into social media and online communities presents challenges such as user adoption, technical complexity, and content moderation. Educating users about the benefits and functionalities of DID is crucial for encouraging widespread adoption. Decentralized platforms often present a steeper learning curve than their centralized counterparts. Moreover, balancing free speech with the need to protect users from harmful content continues to be a complex issue for decentralized communities.

Despite these challenges, the potential of DID to empower users, protect privacy, and cultivate dynamic online communities positions it as a highly promising technology for the future of social media. As DID standards and infrastructure continue to develop, we can anticipate an increase in platforms and communities adopting this technology to create a more user-centric and equitable social media landscape.

Telecommunications

The telecommunications sector, which facilitates the transmission of voice, video, and data communications, faced a staggering $29 billion loss due to fraud in 2018 alone. This alarming statistic underscores the vital role decentralized identity (DID) technology could play in enhancing security and reducing fraud within this industry.

DID is set to revolutionize telecommunications by providing a more secure, efficient, and user-centric approach to managing customer data and access to services. As telecom operators are frequent targets for fraudulent activities, DID introduces a robust solution by employing blockchain and self-sovereign identity principles. This approach enables users to have greater control over their personal information, simultaneously streamlining critical processes like customer onboarding, authentication, and service provisioning.

A key advantage of DID in telecommunications is the significant enhancement of security and reduction in fraud risk. Storing user identities on a decentralized blockchain, as opposed to centralized databases, makes them much harder to compromise. The decentralized nature of DID systems inherently disincentivizes malicious attacks, and by eliminating traditional login IDs and passwords — which are common exploitation points — DID offers a more secure and privacy-preserving authentication method through cryptographic verifiable credentials.
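Password-free authentication with credentials usually reduces to a challenge-response exchange: the operator sends a fresh random challenge and the subscriber's wallet answers with a cryptographic proof over it, so no reusable secret ever crosses the wire. The sketch below simplifies this with a shared HMAC key; actual DID authentication uses the wallet's private key and a public-key signature.

```python
import hashlib
import hmac
import secrets

def make_challenge() -> bytes:
    """Operator issues a fresh random nonce for each login attempt."""
    return secrets.token_bytes(16)

def wallet_respond(key: bytes, challenge: bytes) -> str:
    """Wallet proves key possession by MACing the challenge (never the key itself)."""
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()

def operator_verify(key: bytes, challenge: bytes, response: str) -> bool:
    return hmac.compare_digest(wallet_respond(key, challenge), response)

key = secrets.token_bytes(32)      # illustrative; real wallets hold a private key
challenge = make_challenge()
assert operator_verify(key, challenge, wallet_respond(key, challenge))
```

Because each challenge is single-use, a captured response is useless for replay, which is the property that removes the classic password attack surface.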

DID also optimizes the customer onboarding and identity verification process for telecom providers. Rather than collecting and verifying customer information repeatedly, telecom operators can utilize DID to allow customers to verify their identity once with a trusted issuer and reuse those credentials to access various services. This not only saves time and resources but also enhances the customer experience. For instance, a customer could quickly sign up for a mobile plan by presenting a verified government-issued ID from their digital wallet, bypassing the need for filling out extensive forms or undergoing manual checks.

Additionally, DID facilitates seamless and secure access to digital services. Users can store various credentials — such as identity documents, payment methods, and service entitlements — in a digital wallet on their smartphone, simplifying authentication and access to services like mobile banking or e-commerce. In this model, telecom operators can serve as trusted identity providers, issuing digital credentials to customers and enabling access to a broader range of online services.

Moreover, DID can assist telecom providers in complying with stringent regulatory requirements concerning customer data privacy and security. By allowing users to control their personal data and reducing the amount of information stored by operators, DID supports compliance with regulations like GDPR. Features such as zero-knowledge proofs and selective disclosure enable users to share only the essential information for each transaction, further preserving privacy.

However, the adoption of DID in telecommunications is not without challenges. Issues such as interoperability, user adoption, and integration with existing legacy systems must be addressed. Ensuring that different DID systems can effectively exchange and verify credentials across operators and service providers is crucial for broad adoption. Moreover, educating users on the benefits and functionality of DID is essential to foster uptake. Telecom providers also face the technical complexities of integrating DID technology into their existing infrastructure and processes.

Despite these obstacles, the potential of DID to enhance security, streamline operations, and unlock new business opportunities positions it as a promising technology for the future of telecommunications. Initiatives like the Decentralized Identity Foundation (DIF) Telco Working Group are instrumental in bringing together industry stakeholders to develop standards and promote the adoption of DID in the sector. As DID continues to evolve, we can anticipate more telecom operators leveraging this innovative technology to transform identity management and service delivery, ultimately benefiting both customers and their bottom line.

Web3 Gaming

Web3 gaming, which involves decentralized video games built on blockchain technology, is a burgeoning sector that can greatly benefit from the integration of decentralized identity (DID). By incorporating blockchain technology and the principles of self-sovereign identity, DID equips players with enhanced control over their digital identities and in-game assets, while boosting security, privacy, and interoperability across various gaming platforms.

A primary advantage of DID in Web3 gaming is empowering players with genuine ownership of their in-game items and progress. Through DID, players can securely store their gaming credentials, achievements, and digital assets in personal wallets, independent of any specific gaming platform. This autonomy allows players to transfer their gaming history and possessions seamlessly across different games and metaverses, so they are no longer confined to a single ecosystem. For instance, a player could use their DID to effortlessly move their avatar, skins, and rankings from one game to another, preserving their progress and investments.

Additionally, DID enhances security and helps prevent fraud within Web3 gaming. The decentralized nature of DID systems makes it extremely difficult for hackers to compromise player accounts or steal in-game assets. By utilizing cryptographic verifiable credentials, DID facilitates secure player authentication without the vulnerabilities associated with traditional passwords or centralized servers prone to attacks. This robust security framework protects players from prevalent threats such as account takeovers, item duplication, or unauthorized trading.

DID also introduces new models of gaming governance and community participation. Many decentralized gaming platforms employ player-driven decision-making processes, allowing the community to collectively set rules, features, and the developmental trajectory of the game. DID enables players to securely engage in these governance activities, such as voting on proposals or earning rewards for contributions, based on their verified identities and involvement within the game. This participatory approach fosters a more democratic and inclusive gaming environment, where players have a direct impact on shaping their experiences.

Furthermore, DID can drive innovative monetization and reward systems in Web3 gaming. Decentralized platforms can utilize DID to facilitate player-to-player trading of in-game items or currencies without intermediaries. Players can also accumulate tokens or reputation points based on their achievements and contributions, which are recognized and rewarded across various games via DID. This approach opens up new avenues for players to monetize their skills and engagement, encouraging a more equitable and motivating gaming economy.

However, the implementation of DID in Web3 gaming is not without its challenges. Issues such as scalability, user experience, and regulatory uncertainty need to be addressed. As the number of players and transactions increases, maintaining the performance and cost-effectiveness of DID systems becomes crucial. Moreover, creating intuitive user interfaces and straightforward onboarding processes is essential to encourage widespread adoption among mainstream gamers. Additionally, the evolving legal and regulatory frameworks around digital ownership, cryptocurrencies, and decentralized gaming present complex compliance challenges that need careful navigation.

Despite these obstacles, the potential of DID to transform player identity, ownership, and governance in Web3 gaming marks it as a significant technology to watch. As the standards and infrastructure around DID continue to develop, we can anticipate an increase in gaming platforms and developers leveraging this technology to create immersive, player-centric experiences that surpass traditional gaming models.

Decentralized Identity: The Future

Decentralized identity (DID) technology is poised to revolutionize various industries by providing a more secure, efficient, and user-centric approach to managing digital identities and personal data. As we have seen, DID has the potential to transform sectors such as healthcare, finance, e-government, education, marketplaces, real estate, travel, social media, telecommunications, and Web3 gaming.

By leveraging blockchain technology and self-sovereign identity principles, DID enables individuals to have greater control over their personal information while enhancing privacy, security, and interoperability across different platforms and services. DID allows users to store their verified credentials in secure digital wallets, selectively sharing only the necessary data for each interaction using zero-knowledge proofs and cryptographic techniques.

The benefits of DID are numerous and far-reaching. In healthcare, DID can streamline patient data management, improve interoperability between providers, and enhance patient privacy. In finance, DID can facilitate faster and more secure customer onboarding, reduce fraud, and enable financial inclusion for the unbanked. In e-government, DID can simplify service delivery, prevent identity theft, and reduce administrative costs.

DID can also transform education by enabling secure and verifiable digital credentials, preventing diploma fraud, and empowering students with control over their academic records. In marketplaces and real estate, DID can enable peer-to-peer transactions, enhance trust between parties, and unlock new opportunities for fractional ownership and investment.

In the travel industry, DID can streamline the guest experience, protect traveler privacy, and facilitate seamless data sharing among ecosystem partners. In social media and online communities, DID can empower users with greater control over their data, enable censorship resistance, and foster user-driven governance models.

In telecommunications, DID can enhance security, reduce fraud, streamline customer onboarding, and enable new business opportunities for operators. And in the rapidly evolving world of Web3 gaming, DID is set to play a crucial role in empowering players with true ownership of their digital assets, enabling secure and interoperable gaming experiences, and facilitating new models of community participation and governance.

However, the adoption of DID also faces several challenges across industries. These include technical complexities, interoperability issues, regulatory uncertainties, and the need for user education and adoption. Overcoming these challenges will require collaboration among stakeholders, the development of common standards, and the creation of user-friendly solutions that demonstrate the tangible benefits of DID.

Despite these challenges, the transformative potential of DID cannot be ignored. As the digital landscape continues to evolve, the need for secure, privacy-preserving, and user-centric identity solutions will only grow. By embracing DID, industries can unlock new opportunities for innovation, efficiency, and user empowerment while building a more trustworthy and inclusive digital future.

Revolutionizing Identity was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


IDnow

Guess Who? Unmasking the face behind the device with Duke Golden.

We sit down with our internal subject matter expert in the telecommunications industry to learn what trends this sector is seeing, predictions on where it is headed as well as advice for the future.

Telecommunications is becoming an ever-growing market, not only because we live in a connected world but also because the number of devices and network connections just keeps increasing. One recent study predicts that 5G IoT (Internet of Things) connections will reach 116 million globally by 2026. This industry is therefore confronted with many challenges as the number of devices and connections continues to rise.

According to an identity verification survey conducted by Regula in 2023, 9 out of 10 organizations in the telecommunications industry consider identity verification (IDV) a crucial tool in combating the increasing threat of identity fraud. Furthermore, according to Gartner, CIOs will increase investments in integration technologies, APIs, and API architecture by 83% in the coming years. But fraud isn’t the only challenge this industry faces. Telecommunications needs to stay up-to-date and meet the demands that modern technology brings to the table, or shall I say device, if it is to have any future. Since smart devices are as common and habitual as brushing one’s teeth, this industry needs to stay ahead of the curve or risk being left behind in the pile of outdated technology.

We spoke with Duke Golden, Director Sales at IDnow, who brings almost 30 years of industry experience from companies including 1NCE, Huawei, and Vodafone, to learn which topics in the telecommunications sector are trending, his predictions on where the industry is headed, and his advice for telco operators.

As an industry that is perhaps overlooked because it is so embedded in our everyday lives, and taken for granted by the generations that grew up during this technology boom, what major trends would you say the telecommunications industry is seeing?

Telecommunications in our daily lives is almost invisible these days – but it is truly the foundation for our digital lives. It’s just something that you think should be there. I’ve been in the telecommunications industry for about 25 years now. But you’re right, that comment is very much what you get from a generation that grew up with the Internet. But there is a lot that goes into this. Everything we enjoy today from dating apps to ordering groceries, telecommunications is the driving force behind all of these innovations. Nothing happens without a network connection.

To describe it in a different way, network connectivity has very much become a commodity, especially over the last 10-15 years. Connectivity is like bread, milk, eggs, oil, gas—it’s just something we need for our everyday life. However, the struggle right now within the telecommunications industry, especially with regards to Tier-1 operators, is that they need to find a way to make that connectivity into something like a value add for their customers. It can’t just be about the connectivity. The question now is what do telecommunications operators do with that connectivity? They have a great opportunity to do something really special, being the owner of the network connectivity.

So right now we’re at a point where telecommunications providers really need to provide new solutions, incorporating their connectivity to bring additional value add to customers.

Duke Golden, Director Sales

A big part of that value add for customers is the IoT and API-driven economies. What they’re doing is basically providing the connectivity and then a platform for API integration for a million different types of services that are intrinsically connected to their connectivity.

But how do you monetize connectivity? You have to create an ecosystem where you can generate revenue based on new ideas using that connectivity. Telecommunications providers still want connectivity to be at the center of their proposition but the really interesting and attractive part for customers is the ability to monetize this in a variety of forms. And this is where the IoT starts to come in.

Yes, IoT (Internet of Things) seems to be one of those trends on the radar within telecom, why is this the case?

IoT is evolving to become one of the most disruptive and enabling technologies for end users in recent memory. It’s revolutionizing the way we approach network connectivity and digitalization in our daily lives. In the beginning it was just machine-to-machine, which was basically the first iteration of the IoT.

IoT is basically a blank canvas for an artist to use connectivity to paint a picture which didn’t exist before, and that’s what’s happening with the IoT, especially with telco-driven IoT initiatives. Because it’s not just the telcos who are doing IoT. It is a hugely competitive and largely proprietary space where the only limitation to business cases is the creator’s imagination. This is where only the strongest telcos will prosper. Deutsche Telekom, for instance, has a very strong expert approach to IoT which can accommodate entire ecosystems to create business value for their customers.

Now, the pressure on the telecom world is really to bring the knowledge that they have, not just about networks but about industry verticals themselves, into play to develop something meaningful for individual customers.

What tools and resources will help these initiatives in the telco sector?

What we do at IDnow is potentially one of the most important enabling solutions within the IoT. One of my goals is to connect IDnow into this API economy so customers of a telco service have the option to conduct an ID verification process. To date, IoT has been more concerned with locating individual devices. A lot of people think IoT is a device-driven world where you don’t need to know the person behind it. But that’s changing, especially with AI and deepfakes. You can really manipulate a traditional device-based IoT network in a big way unless you verify who is working on the IoT platform. You really need to know who has access and who is accessing the platform. Maybe it doesn’t have to be a deep dive, but the key with IoT is trust.

Trust is the number one factor that will either enable a customer to deploy a telco-based IoT solution or will stop it in its tracks.

Duke Golden, Director Sales

If there are doubts about the authenticity of the platform, or about its ability to control unauthorized usage, then IoT initiatives often never make it off the ground, especially in Germany. In many cases, 1% risk means 100% opposition. Germany takes a very strict view on these things: zero risk is the only acceptable risk.

Since trust is the number one factor within telco, the term Zero Trust with regards to network connections and security has been used a lot recently. Is this something widespread in the telecommunications industry?

Zero Trust is widely used. That is a term that you will hear everywhere from network engineers to security operations technicians. Zero trust—trust nothing. The reason that it has become popular is because security has become completely endangered, especially over the last five to 10 years. In the network engineering world, there are different points of ingress into the network. In the past before IoT, before cloud, before all these other innovative ways to connect to the Internet, the Internet protocol basically protected these environments. They were standardized environments where you could only enter them with a specific network protocol and it was much easier to protect them. Zero Trust wasn’t really needed at that point because you knew that if someone broke into the network, it was an inside job.

Now, anyone has the opportunity to interface with any layer of the network from a completely unverified position. So Zero Trust was born, and it became probably the strongest terminology used especially in the telco world. You need to verify every single user to a different degree. For example, if you’re talking about an online dating platform, you don’t need to verify everything about this person’s character. You just need to verify they are over 18. Zero Trust doesn’t mean you have to conduct a complete background check on every single individual person, but you need to know who you’re dealing with.
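The "verify to a different degree" idea above can be sketched as a tiered verification policy: each service declares the minimum set of checks it requires, and nothing more. This is a hypothetical illustration; the service names and check levels are assumptions, not an IDnow or telco API.

```python
from dataclasses import dataclass

# Hypothetical policy table: each service demands only the checks it needs.
REQUIRED_CHECKS = {
    "dating_platform": {"age_over_18"},                # minimal disclosure
    "iot_admin_console": {"document_id", "liveness"},  # stronger assurance
    "payment_service": {"document_id", "liveness", "address"},
}

@dataclass
class User:
    completed_checks: set  # checks this user has already passed

def may_access(service: str, user: User) -> bool:
    """Grant access only if every check the service demands has been passed."""
    required = REQUIRED_CHECKS.get(service, {"document_id"})  # default: ID check
    return required.issubset(user.completed_checks)
```

An age-verified user gets into the dating platform but not the IoT admin console; Zero Trust here means every request is checked against the policy, not that every user undergoes a full background check.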

With the need to verify a person and bind them to the device they are using, will there be hesitation from users with concerns to data privacy?

Enterprises want to know more about their customers because the more they know the better they can serve their customers. Nothing is for free. A well-known CEO of a cybersecurity company once told me, if you are getting something for free, then you are the product. Thus, we’re not dealing in a cash-based economy anymore. True value lies in information these days.

And with regards to the telecommunications industry, there is a huge push right now to monetize the connectivity they’re already providing in different ways. With the price of connectivity sinking daily, telcos are pushing to recover that depreciation through extra value-added services like IoT, cloud services, 5G campus networks and other digitalization initiatives. And the key to all these value-added services is verifying who those services are going to, not only for safety and security, but to gain the knowledge to serve customers better. For telecommunications providers, that knowledge is essential to designing or tailoring services to fit the widest possible scope of business cases.

As with all new initiatives, especially from my experience in Europe, there will be hesitation. But it depends on the country and on the culture. Every society globally is very different when it comes to their aversion to risk. At the end of the day, it’s all about the economy and people’s ability to pay for things, either with money or information.

While there might be pushback in conservative countries like Germany, the vast majority of the world will have no problem signing up for services and giving them a certain amount of information. But it needs to be tied to identity verification solutions, because the only way to really increase security in a very complicated online services world is to start with a single point of truth. And the only way to truly verify who a person is, is by verifying their identity. Verification can’t happen only once, though; it needs to be an ongoing activity of verifying and reverifying, especially with regards to critical infrastructure. And this is where the telecommunications industry and IDV is headed.

How regulated would you say is the IoT industry? For example, in Germany, there is the Telecommunications Act (TKG) but are there any other regulations?

With regards to the IoT specifically, IoT is still very much wild, wild West. There is very little regulation.

I think there is a serious risk that the IoT becomes too dangerous in some ways if we don’t regulate ourselves using best practice instead of waiting for regulatory agencies to act. The difference between traditional telecommunications networks and IoT networks is that telecom networks are completely standardized internally, while IoT solutions are all proprietary. There is no standardization within the IoT industry. That means our ability to secure the IoT is much weaker than on the traditional network side, where well-thought-out, highly developed security protocols are built into the industry standards.

What better place to secure the IoT than at the point of ingress for all users, both internal and external? That means performing a comprehensive ID verification process for admins and users, depending on what type of IoT platform it is. We just need to change our view on what is an acceptable level of risk.

With the number of smart devices continuing to increase and experts predicting that by 2030 the world population will be around 8.3 billion and the number of devices will have reached an estimated 43.2 billion, what are the main challenges this industry faces when considering these numbers?

Traditionally, in the IDV sector it’s been about verifying users before they buy their prepaid mobile or before they activate it and then they’re good to go.

Now, verification needs to become an integral part of the entire value chain for technology, not just at the point of purchase, but at the point of usage.

Duke Golden, Director Sales

The ID verification process is something that needs to be always going on in the background. And this is where the eIDAS 2.0 regulation would come into play with a digital wallet. But we’re just not there yet. Regulations will always be behind technology.

At IDnow and as an ID verification solutions provider, it’s our responsibility to fill the gap between regulation and everyday business risks that aren’t being addressed by the regulatory environment. Building this bridge of trust for our customers is our core mission which we take very seriously.

If you’re interested in more insights from industry insiders and thought leaders, check out one of our other interviews from our Spotlight Interview series below:

The importance of data-first fraud investigations, with Peter Taylor
Sébastien Marcel on generative AI and deepfakes
Jinisha Bhatt, financial crime investigator
Paul Stratton, ex-police officer and financial crime trainer

By

Kristen Walter
Jr. Content Marketing Manager
Connect with Kristen on LinkedIn


Shyft Network

A Guide to FATF Travel Rule Compliance in Liechtenstein

The FATF Travel Rule applies to all crypto transactions above 1 CHF in Liechtenstein. To operate legally in Liechtenstein, VASPs must register with the FMA and verify transaction parties’ identities. VASPs must use blockchain analytics to verify and monitor transactions, especially with unhosted wallets in Liechtenstein.

Since January 1, 2020, Liechtenstein has enforced the Blockchain Act (TVTG), creating a comprehensive legal structure for cryptocurrencies and related services.

Subsequent amendments to the Due Diligence Act (SPG) and the Due Diligence Ordinance (SPV) have integrated the Financial Action Task Force (FATF)’s crypto Travel Rule.

Key Features of the Travel Rule

In Liechtenstein, the Travel Rule obliges VASPs to exchange detailed transaction-related personal information for all virtual asset transfers above the minimum threshold of 1 Swiss Franc (CHF).

This includes the full names, wallet addresses, and other identifying information of the transaction’s originator and beneficiary.

Compliance Requirements

VASPs in Liechtenstein must register as outlined in the TVTG, with the Financial Market Authority (FMA) having three months to respond to registration applications.

All VASPs must also verify the identities of counterpart VASPs, particularly when dealing with high-risk jurisdictions, and implement enhanced due diligence measures. These requirements apply to both domestic and cross-border transfers.

For Originators:

Full name
Wallet address
One of the following: physical address, ID document number, customer identification number, or date and place of birth

For Beneficiaries:

Name
Wallet address

If the required Travel Rule data is missing, incomplete, or delayed, the beneficiary VASP must implement risk-based procedures to identify and rectify gaps, potentially suspending or rejecting the transfer until compliance is achieved.
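Putting the 1 CHF threshold and the field lists together, a beneficiary VASP's completeness check might look roughly like this sketch. The field names and the workflow are illustrative assumptions, not a specific compliance product's API.

```python
# Threshold above which Travel Rule data must accompany the transfer.
THRESHOLD_CHF = 1.0

# The originator must supply full name, wallet address, and at least one of these.
ORIGINATOR_ONE_OF = {
    "physical_address",
    "id_document_number",
    "customer_id",
    "date_and_place_of_birth",
}

def travel_rule_complete(amount_chf: float, originator: dict, beneficiary: dict) -> bool:
    """Return True if the transfer carries all the data the rule requires."""
    if amount_chf <= THRESHOLD_CHF:
        return True  # at or below 1 CHF, no data exchange is required
    has_originator = (
        "full_name" in originator
        and "wallet_address" in originator
        and ORIGINATOR_ONE_OF & originator.keys()  # at least one extra identifier
    )
    has_beneficiary = "name" in beneficiary and "wallet_address" in beneficiary
    return bool(has_originator and has_beneficiary)
```

A transfer failing this check is exactly the case described above: the beneficiary VASP would fall back to its risk-based procedures, potentially suspending or rejecting it.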

Impact on Cryptocurrency Exchanges and Wallets

In Liechtenstein, VASPs like cryptocurrency exchanges and wallet services must adhere to the Travel Rule, supervised by the Financial Market Authority (FMA). This regulation is crucial for preventing money laundering and terrorism financing. Here’s what they need to do:

Registration and Licensing

VASPs must register and get licenses as per the Trustworthy Technologies Act (TVTG), with the FMA reviewing and approving these applications. This ensures that all VASPs meet high regulatory standards and contribute to a secure financial system.

Monitoring Transfers

Exchanges must monitor all transactions, especially those involving unhosted wallets that let owners directly control their private keys. This includes collecting extra information on these wallets during high-risk transactions to prevent illegal activities.

For this, VASPs must conduct enhanced due diligence and use blockchain analytics tools to verify the ownership of the wallets involved in transactions and to ensure they do not facilitate money laundering or terrorist financing.

Risk-Based Policies

Both sending and receiving VASPs need to develop policies tailored for managing transfers involving unhosted wallets. These policies must focus on thorough verification and monitoring to align with local and international regulations.

Global Context

Liechtenstein enforces a low threshold for the Travel Rule, requiring compliance for transactions as small as 1 CHF. This is in contrast to places like the United States, where the threshold is $3,000, and Singapore, where it’s S$1,500. The European Union, on the other hand, is exploring the idea of applying the FATF Travel Rule to all crypto transactions, regardless of amount.

Concluding Thoughts

Since implementing the FATF Travel Rule in 2020, Liechtenstein requires detailed verification for cryptocurrency transactions over 1 CHF. This is in sharp contrast to the higher thresholds in the United States and Singapore and the potential for no minimum in the European Union. Liechtenstein’s strict approach aims to safeguard financial transactions and prevent illegal activities like money laundering and terrorism financing within its borders.

FAQs

1. What is the minimum threshold for the Travel Rule in Liechtenstein?

In Liechtenstein, the FATF Travel Rule applies to all cryptocurrency transactions above 1 Swiss Franc (CHF).

2. What information must VASPs collect and verify under the Travel Rule in Liechtenstein?

VASPs must collect and verify full names, wallet addresses, and additional identifying information such as a physical address, ID document number, customer identification number, or date and place of birth for both originators and beneficiaries in Liechtenstein.

3. How must VASPs handle transactions involving unhosted wallets?

VASPs must use blockchain analytics tools to verify the ownership of unhosted wallets and conduct enhanced due diligence to prevent illegal activities in Liechtenstein.

About Veriscope

Veriscope, the compliance infrastructure on Shyft Network, empowers Virtual Asset Service Providers (VASPs) with the only frictionless solution for complying with the FATF Travel Rule. Enhanced by User Signing, it enables VASPs to directly request cryptographic proof from users’ non-custodial wallets, streamlining the compliance process.

For more information, visit our website and contact our team for a discussion. To keep up to date on all things crypto regulations, sign up for our newsletter and follow us on X (formerly Twitter), LinkedIn, Telegram, and Medium.

A Guide to FATF Travel Rule Compliance in Liechtenstein was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


PingTalk

DigiKey Saves $500K, Better Serves & Protects Users

DigiKey secures its workforce and customers while saving hundreds of thousands with Ping Identity.

Saturday, 22. June 2024

Finema

4 Criteria for Great vLEI Use Cases


Authors: Yanisa Sunanchaiyakarn & Nuttawut Kongsuwan, Finema Co. Ltd.

As we enter the era of generative AI, the risk of identity theft for both individuals and organizations is escalating. A recent case in Hong Kong highlights this threat, where an office was deceived into transferring $25 million after a video call with a deepfake CFO and colleagues. As advances in AI erode trust, it becomes apparent that a robust digital identity solution for organizations is urgently needed.

In the past few years, the verifiable Legal Entity Identifier (vLEI) framework has emerged as one of the most promising solutions to this global crisis. The vLEI framework offers one of the most secure and trustworthy digital organization identity management to date and potentially revolutionizes digital business transactions worldwide.

Since vLEI is also still in its infancy, there is a significant challenge for the community to overcome its adoption hurdle. In this article, we propose 4 pragmatic criteria for identifying fair, good, and great use cases for vLEI from the adoptability perspective. These criteria are by no means rigid rules set in stone but serve as a useful mental model for pioneers and early adopters exploring vLEI use cases.

The 4 Criteria

We have identified 4 criteria for good and great vLEI use cases:

Use cases that involve organization-to-organization transactions
Use cases that involve cross-border transactions
Use cases that are highly regulated
Use cases that have open ecosystems

1. Org-to-Org

The first criterion for viable vLEI use cases is that they involve organization-to-organization (Org-to-Org) transactions. This includes business-to-business (B2B), business-to-government (B2G), and government-to-government (G2G) use cases. This is because org-to-org transactions often involve significant monetary value, which far outweighs the initial friction associated with adopting vLEI.

2. Cross-Border

Cross-border transactions often face challenges in identifying and verifying clients, suppliers, or partners across different countries, leading to significant perceived risks. The vLEI framework is ideal for addressing these challenges as it is specifically designed for international use. It can potentially streamline the cross-border identification and verification processes, mitigating risks and improving efficiency.

3. Highly Regulated

Highly regulated use cases are subject to strict requirements and standards, where non-compliance can result in severe penalties or legal consequences. As a result, these use cases often require extensive due diligence of business partners, including Know Your Customer (KYC), Anti-Money Laundering (AML), and Countering the Financing of Terrorism (CFT) checks.

The process of obtaining vLEI for organizations and their representatives involves robust identity verification that is more stringent than those often conducted in the financial sector. As a result, vLEI, which is built on the global LEI system, can help streamline the due diligence process, significantly reducing compliance costs.

4. Open Ecosystems

The final criterion for viable vLEI use cases is that they operate within open ecosystems, allowing the addition of organizations whose identities are not known in advance. Onboarding a new organization to the ecosystem often incurs significant cost, time, and effort. vLEI allows these onboarding costs to be offloaded to a third party, specifically a qualified vLEI issuer (QVI), which ensures that the organization and its representatives have undergone strict identity verification.

Fair, Good, and Great Use Cases

Great Use Cases

Great use cases are those that satisfy all 4 criteria, making them ideal candidates that could benefit from integrating vLEI into their workflows.

A prime example of a great use case is trade finance. It typically involves organizations (criterion 1) which are often located in different countries (criterion 2). Trade finance is also highly regulated, with financiers subject to AML and CFT requirements (criterion 3). The trade finance ecosystem is open, allowing any company worldwide to initiate trades and apply for, e.g., a letter of credit (criterion 4).

Good Use Cases

Good use cases are those that satisfy 3 out of the 4 criteria. We consider these use cases worth pursuing.

An example of a good use case is the financial reporting of banking institutions in Europe to the European Banking Authority (EBA). This exemplifies a B2G scenario (criterion 1), encompassing banks across multiple European countries (criterion 2) that operate within a highly regulated environment (criterion 3). What makes this use case simply good rather than great in our criteria is that European banks are already known entities within the closed ecosystem supervised by the EBA.

Fair Use Cases

Fair use cases are those that satisfy two or fewer criteria. While they may be feasible use cases for vLEI, we consider them low-potential candidates. We argue that fair use cases will become viable once vLEI has achieved widespread adoption. For instance, organizations that already possess vLEI for stronger use cases might contemplate applying it to these fair use cases.

An example of a use case we consider “fair” for vLEI is an HR platform. Such a platform is used internally between employers and employees (not org-to-org) and is also not highly regulated. Typically, employees and employers have alternative means to verify each other, diminishing the necessity for vLEI in this context.
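The tiering above reduces to counting how many of the four criteria a candidate satisfies. A minimal sketch (the scoring function is our own illustration, not part of the vLEI framework):

```python
# The four criteria from the article, as boolean flags on a use case.
CRITERIA = ("org_to_org", "cross_border", "highly_regulated", "open_ecosystem")

def vlei_tier(use_case: dict) -> str:
    """Classify a use case as great (4/4), good (3/4), or fair (<=2/4)."""
    score = sum(bool(use_case.get(c)) for c in CRITERIA)
    if score == 4:
        return "great"
    if score == 3:
        return "good"
    return "fair"

# The article's examples, encoded against the criteria.
trade_finance = dict(org_to_org=True, cross_border=True,
                     highly_regulated=True, open_ecosystem=True)
eba_reporting = dict(org_to_org=True, cross_border=True,
                     highly_regulated=True, open_ecosystem=False)
hr_platform = dict(org_to_org=False, cross_border=False,
                   highly_regulated=False, open_ecosystem=False)
```

Trade finance scores 4/4 (great), EBA reporting 3/4 (good, since the ecosystem is closed), and the HR platform falls into the fair tier.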

Conclusion

While vLEI holds significant potential to transform global business operations, it currently faces a challenge known as the “cold start problem,” where there are not enough holders and use cases to foster exponential growth in the ecosystem. Pioneers and early adopters are encouraged to prioritize exploring high-potential use cases.

Do you agree with our criteria? Are there additional criteria we should consider adding to the list? We welcome your feedback and invite you to contact us at contact@enauthn.id.

4 Criteria for Great vLEI Use Cases was originally published in Finema on Medium, where people are continuing the conversation by highlighting and responding to this story.

Friday, 21. June 2024

Transmute TechTalk

Yet another AI

Beyond code, Copilot understands shipping

I know, we all drowned in “OMG” posts last year as people shared their AI experiences. We’re beyond that now; we’re looking for use cases, value adding application and bundling.

Nonetheless, I can’t hide from the fact that I had one of those OMG experiences yesterday. I was designing a JSON Schema for a Mill Test Report (MTR)— a document used in the Steel industry to describe the type and quality of steel products. MTRs can contain a large number of tests, and I had modeled a few of them already.

As I got to Surface Condition tests, I typed “surface” — and look what GitHub Copilot suggested…

Stupid amount of code generated.

As my colleague stated, it generated “stupid amount of code” for me. Not rocket science, mainly repeating a pattern of the previous tests — but still a lot I didn’t have to type.

But then I moved to generating an example JSON instance. Now look…

Copiloting on both horizontal and vertical domains.

For the Microstructure Analysis test standard, it suggests “ASTM E112–13”. That is indeed a standard for microstructure analysis! Copilot pulls industry vertical knowledge into my Visual Studio Code environment. The fact that it “copilots” on both horizontal (the technical coding domain) and vertical (industry domain) axes is just… Cool!

In all fairness, Copilot subsequently suggested “ASTM E112–13” again for Hydrostatic Tests, which (to my knowledge) does not define such test standards. I guess it got carried away a bit there. Human vigilance is still required. But that hardly takes away from the coolness of this AI experience.
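For readers curious what such a test entry might look like, here is a minimal, hypothetical sketch of a JSON Schema fragment for one test, together with a tiny stand-in validator; the real Transmute MTR schema is considerably richer, and the property names here are assumptions.

```python
# Hypothetical JSON Schema fragment for a single MTR test entry.
microstructure_test_schema = {
    "type": "object",
    "properties": {
        "testStandard": {"type": "string"},  # e.g. "ASTM E112-13"
        "result": {"type": "string"},
    },
    "required": ["testStandard", "result"],
}

def matches(instance, schema) -> bool:
    """Tiny stand-in for a JSON Schema validator: required keys plus string types."""
    if not isinstance(instance, dict):
        return False
    if any(key not in instance for key in schema["required"]):
        return False
    return all(isinstance(instance[k], str)
               for k in schema["properties"] if k in instance)

example = {"testStandard": "ASTM E112-13", "result": "Grain size no. 8"}
```

In practice one would use a full validator library against the published schema; the point of the sketch is the shape Copilot was pattern-matching on, one test object per property, each carrying its standard and result.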

You can find the resulting Mill Test Report schema and many more common, standardized shipping documents on https://platform.transmute.industries/marketplace/templates.

Transmute is committed to digitizing global supply chains, applying modern cryptography standards for more efficient, automated, and safer cross-border trade.
Sign up for free now on https://platform.transmute.industries.

Nis Jespersen
Solution Architect
Transmute
https://platform.transmute.industries

Yet another AI 🤯 was originally published in Transmute on Medium, where people are continuing the conversation by highlighting and responding to this story.


Dock

Credential Management: 8 Best Practices for Improving Your Process


Credential management is becoming increasingly critical for product professionals.

Full article: https://www.dock.io/post/credential-management


IDnow

Connected vehicles and driver identification: Fewer keys for greater safety?

In the age of the Internet of Things, autonomous and connected cars are revolutionizing the way we drive. As these vehicles become increasingly sophisticated, verifying user identity is becoming a crucial issue in ensuring safety and preventing fraud. Let’s explore the challenges and solutions associated with this mobility revolution.

By 2025, it is estimated that 50 billion objects will be connected worldwide, covering everything from household appliances to urban infrastructure. Among these objects, cars occupy a prominent place, with functionalities multiplying at an impressive rate. According to Geotab, the smart device is what has changed the game the most in recent years. As technology advances, connected vehicles become more sophisticated, offering advanced functionality that goes far beyond simple driving. These advances are not just limited to improving mechanical and safety aspects, but also extend to features that enhance the overall experience for drivers and passengers.

As the Belgian Automobile Federation (FEBIAC) explains: “There are two types of connected cars. The first category has a 4G/5G card that vehicles use to be constantly connected to the Internet, and more specifically to the manufacturer’s online services. The second category connects via your smartphone, whose screen will be replicated on that of the vehicle.”

Connected cars are increasingly integrating smartphones into their ecosystems, thanks in particular to the democratization of Apple CarPlay and Android Auto. As Apple points out, it is now possible to store your car’s digital key on your iPhone to open and start your vehicle. Users can now link their phone to their vehicle, enabling them to benefit from functions such as remote opening and starting, or purchasing options directly from their phone.

However, such functionalities require a high level of user verification. With the risk of hacking on the rise, it is essential to protect against fraud and identity theft by adopting appropriate solutions.

Connectivity: the new spearhead of the automotive sector.

Connected cars have opened up a whole new world of opportunities for automakers and end-users alike. Connectivity has made a major contribution to enhancing the driving experience, through the deployment of new functionalities. These include:

Predictive maintenance, thanks to remote diagnostic modules, or maintenance notifications based on real-time analysis of vehicle data; Personalized services, such as notification of available parking spaces nearby, or suggestions for restaurants and services based on location; Entertainment and connectivity, including smartphones accessing streaming services, or compatible applications from stores; Remote control and command, enabling the vehicle to be locked, unlocked or started remotely, or the vehicle temperature to be preset.

Some of these advantages require a connection with a smartphone to operate. More and more cars no longer need a key to start, for example. Cards are gradually replacing ignition keys. They can also be dematerialized on a smartphone, for even greater convenience. This is one of the reasons behind the success of car-sharing services such as Getaround or Zity, which rely on remote unlocking of vehicles, thanks to strong user identification.

As Thales points out, another crucial point in terms of security for connected vehicles lies in web applications. These enable “the exchange of information between the vehicle and/or driver and the manufacturer’s IT system. If the authentication process is not secure enough, an attacker can enter a user’s account and take control of the vehicle.”

For all these reasons, strong identification measures are essential to anticipate these new uses.

Securing new uses through identification.

While car-sharing is one of the benchmark use cases, there are other situations in which it may be useful to identify the driver of a vehicle. Lending a vehicle to an employee, rental between private individuals or even self-service cars are all examples of uses that justify better identification of the driver.

These new uses require strong authentication of users using their smartphone, as well as identification of the smartphone by the vehicle. This begins as soon as the vehicle is delivered, where an automated identity verification check and biometric verification can be carried out, in order to associate a user with his or her account.

When a key is exchanged digitally with another user, it is necessary for the user to prove his or her identity. By going through the same identification and authentication process as the owner, both parties can optimize the security of vehicle access.

It is also possible to request an additional biometric element to reinforce authentication. For example, the use of biometric features such as facial recognition can add an extra layer of security, allowing the user to access their account in complete safety. Many modern smartphones are equipped with biometric sensors that can be used to authenticate the user before allowing them to unlock the vehicle. Even if an ill-intentioned person has the smartphone of the vehicle’s rightful owner, it will not be possible for him or her to gain access, or even worse, to start it.
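The unlock flow described above (a biometric gate on the phone plus verification by the vehicle) can be illustrated with a toy challenge-response sketch. Production digital-key systems, for example those following the Car Connectivity Consortium's Digital Key specification, use asymmetric keys held in secure hardware; the shared-secret HMAC below only illustrates the flow, not a real protocol.

```python
import hashlib
import hmac
import os

def phone_sign(shared_key: bytes, challenge: bytes) -> bytes:
    """Phone side: sign the vehicle's challenge.

    In a real system, this operation is released only after the user passes
    the phone's local biometric check (face or fingerprint)."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def vehicle_unlock(shared_key: bytes, challenge: bytes, response: bytes) -> bool:
    """Vehicle side: unlock only if the response matches the expected MAC."""
    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Key provisioned at pairing time; a fresh challenge per unlock attempt
# prevents replaying an old response.
key = os.urandom(32)
challenge = os.urandom(16)
```

Because the signing step sits behind the phone's biometric unlock, a stolen phone alone does not yield a valid response, which is exactly the property the paragraph above describes.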

At IDnow, we’re helping to enhance the security of connected vehicles by offering ID document verification via our APIs, capable of handling over 3,000 ID documents from 195 countries. By leveraging the latest technologies in facial recognition and liveness detection, we are able to confirm the physical presence of the person during the verification process. By integrating these technologies and strategies, manufacturers and users of connected vehicles can ensure secure identification of users via their smartphones, minimizing the risk of fraud.

Anticipating future uses.

As vehicles become increasingly autonomous and integrated into our daily lives, reliable and secure identification systems are essential to protect drivers and their vehicles. Faced with these new uses, we need to think about security and identification, both on the user side and on the component side. Proactive measures will have a positive impact on users’ peace of mind and their confidence in connected cars.

Want to know more about the future of mobility? Discover the major trends in the mobility industry, the innovative models and solutions available to you to design a seamless user experience. Get your free copy

By

Mallaury Marie
Content Manager at IDnow
Connect with Mallaury on LinkedIn


KuppingerCole

Customer Identity and Access Management (CIAM)


by John Tolbert

CIAM systems allow users to register for an account, associate their device and other digital identities, authenticate, authorize, collect, and store information about consumers from across many domains. This Buyer's Compass provides insight into how these systems work, what companies should consider when selecting a solution, and the use cases for them.

Inverid ReadID

by Anne Bailey

This KuppingerCole Executive View report looks at ReadID, an NFC-based solution from Inverid for identity document verification, covering use cases in the financial sector, eGovernment, border control, trust service providers, travel, gaming, HR, and as a Personal Identity Data issuer for digital wallets.

uquodo

Mobile SDK 3.1.2 Updates

The post Mobile SDK 3.1.2 Updates appeared first on uqudo.

Thursday, 20. June 2024

KuppingerCole

Mastering Identity Cybersecurity: The Power Trio of Zero Trust, Identity-First Security, and ITDR

Dive into the intricate world of identity cybersecurity, where the convergence of Zero Trust, Identity-First Security, and Identity Threat Detection and Response (ITDR) presents both opportunities and challenges. With escalating cyber threats targeting identity assets, organizations face the daunting task of safeguarding sensitive data and systems while ensuring seamless operations.

The application of modern technology offers a beacon of hope in the face of these challenges. Discover how organizations can leverage advanced solutions like AI-driven threat detection, behavioral analytics, and adaptive access controls to strengthen their cybersecurity posture. By embracing innovative approaches, organizations can effectively mitigate risks and enhance resilience against evolving cyber threats, all while enabling secure and frictionless user experiences. 

Paul Fisher, Lead Analyst at KuppingerCole, will dissect the intricacies of Zero Trust Maturity, shedding light on its assessment and the imperative role of Identity-First Security and ITDR. He'll explore strategies for managing exceptions and tackling sophisticated social engineering attacks. 

Andre Priebe, Chief Technology Officer at iC Consult, will lead an in-depth discussion on the necessity and benefits of ITDR in modern cybersecurity landscapes. He'll provide actionable insights into mitigating risks, navigating data privacy concerns, and tackling challenges posed by Shadow SaaS. 

Join this webinar to: 

- Gain insights into gaps within Zero Trust deployments.
- Understand synergies between ITDR and Zero Trust, fortifying security frameworks.
- Learn critical considerations for ITDR implementation, including risk mitigation strategies.
- Manage exceptions and sophisticated social engineering attacks.
- Gain insights into navigating data privacy concerns and Shadow SaaS challenges.


Authorization with AuthZEN - The Future of Digital Identity with David Brossard and Allan Foster

In this interview series, our Analysts talked to Identity Experts about the Future of Digital Identity. David Brossard and Allan Foster discuss the future of authorization and its impact on digital identity and business. The main focus is on AuthZEN, a new approach to authorization aimed at addressing the challenges of managing authorization in complex environments.




Proximax

Sirius Chain Mainnet 1.7.3 Upgrade

Dear Sirius Chain Node Owners,

We are pleased to inform you that the crucial software upgrade for our Sirius Chain network was successfully completed on June 20th, 2024, at approximately 14:35 UTC, at upgrade height 9485216. This upgrade has enhanced the performance, security, and overall efficiency of the network.

Key Features of the Upgrade:

Consensus Protocol Improvements:
• Disabled committees
• Implemented block confirmation using the DBRB protocol
• Implemented DBRB sharding

Storage Improvements:
• Added receipts for the Storage plugin to validate storage tokenomics
• Improved replicator onboarding procedure

Bug Fixes:
• Fixed transaction cache deadlock
• Fixed node hanging issue caused by transaction spamming
• Fixed Mac build issues
• Fixed automatic replicator start

Action Required:
To ensure your nodes are compatible with the upgraded software version, we kindly request that network participants and node owners follow the instructions outlined in the upgrade guide (https://github.com/proximax-storage/xpx-mainnet-chain-onboarding/blob/master/upgrade/README.md).

Thank you for your ongoing support and cooperation.

Best regards,
The Sirius Chain Team


Verida

Extending the Verida Network for Personalized AI with Confidential Compute

Extending the Verida Network for Personalized AI with Confidential Compute

The AI revolution is undeniable, with a market poised to hit $10 trillion by 2025. But there’s a catch: the data fueling this growth is often centralized, leaving it vulnerable and raising serious privacy concerns.

However, alongside its huge impact lies a looming concern: data privacy. Traditional AI models typically rely on centralized data storage and centralized computation, raising concerns about ownership, control, and potential misuse. Apple’s recent data-sharing practices and HuggingFace’s security breach underscore these risks.

Verida’s Vision: Your Data, Your AI

Verida’s mission has always been clear: empower individuals to own and control their data. Now, we’re taking it further.

The Verida Network has begun extending its platform to support not just decentralized, privacy-preserving databases, but also decentralized, privacy-preserving compute suitable for handling personal data.

We’re integrating personal data with leading AI models, ensuring end-to-end privacy. This safeguards your data and unlocks a new era of hyper-personalized AI experiences.

AI is only as good as the data it has access to.

Learn more about the data privacy problems in AI in the previous series we published.

Part 1: Top Three Data Privacy Issues Facing AI Today

Part 2: How web3 and DePIN solves AI’s data privacy problems.

Part 3: Verida is enabling the privacy preserving AI tech stack

Confidential Compute Unlocks New Capabilities

Adding confidential compute to the Verida Network opens up new capabilities to developers and enables powerful new use cases for personal data.

It enables “Personalized APIs” that provide advanced capabilities beyond data storage, such as giving existing AI prompts and third-party applications easy access to personal data via traditional web2 integration methods.

Adding “Personal Data Search” enables hyper-personalization to existing applications, while also enabling existing AI models to leverage private user data — without leaking this to third parties.

For builders on the Verida Network, this new pathway extends capabilities, allowing developers to build and offer custom “user agent” code that securely operates on behalf of a user: for example, automating outbound messages (email, Telegram, etc.), or a background agent that pulls the user's latest data from centralized APIs every hour, schedules calendar invites, and manages personal tasks and requests. All of this operates in a highly secure, decentralized confidential compute environment, protected by the user's private keys.

Join Verida in building a user-owned AI future

The Verida network is designed for storing private, personal data. It is a highly performant, low-cost, regulatory-compliant solution for storing structured database data for any type of application. Data stored on the network is protected with the user's private key, ensuring theirs is the only account that can access and decrypt it.

Verida has always been about allowing individuals to own and be in control of their personal data, enabling their data to benefit them. When Verida was first established, AI was barely a blip, but has now become a huge force and demonstrated great utility across almost every industry sector.

Privacy-preserving AI will need not only access to private data, but also privacy-preserving computation to train AI models and respond to user prompts.

Our core values and objectives remain: to provide the underlying infrastructure for personal data ownership, now extended to support the development of user-owned, privacy-preserving AI products.

Through the extension of the Verida Network to support personal confidential compute, there is a very clear pathway to enable personalized, privacy preserving AI leveraging a 100% DePIN technology stack.

This is incredibly exciting, as it will provide a more secure, privacy preserving solution as an alternative to giving all our data to large centralized technology companies.

If you’re a developer looking to future proof your application with private self-sovereign data storage and compute capabilities, get in touch with our experts in our Discord server or register your project for the Verida Ecosystem here.

About Verida

Verida is a pioneering decentralized data network and self-custody wallet that empowers users with control over their digital identity and data. With cutting-edge technology such as zero-knowledge proofs and verifiable credentials, Verida offers secure, self-sovereign storage solutions and innovative applications for a wide range of industries. With a thriving community and a commitment to transparency and security, Verida is leading the charge towards a more decentralized and user-centric digital future. For more information, visit www.verida.network

Extending the Verida Network for Personalized AI with Confidential Compute was originally published in Verida on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 19. June 2024

KuppingerCole

Biometrics is Key - The Future of Digital Identity with Annet Steenbergen

In this interview series, our Analysts talked to Identity Experts about the Future of Digital Identity. Annet Steenbergen sets focus on the future of digital identity in the travel industry, especially digital travel credentials and their security implications. The conversation highlights the potential for biometrics to enhance the passenger experience, innovations in travel identity solutions, and the importance of data privacy and selective data sharing in the travel context.




UbiSecure

IDS 2024.1, patches for SSO 9.4.1 and CustomerID 6.4.1

In the first release of 2024 we have updated the SSO Swedish BankID adaptor to enable the core Authentication Flow, utilising animated QR codes for all two-device authentications. This modern specification helps limit potential phishing attacks through the use of continuously changing QR codes.

We have altered our CustomerID application to help reduce the potential of misconfigurations which could result in user invitations being sent repetitively if your PostgreSQL service were misconfigured or became unavailable.

We have also updated Tomcat, Redis, OpenLDAP and PostgreSQL to the latest possible versions for use with Identity Platform. Additionally, as RedHat7/CentOS 7 becomes end of life during 2024, we have updated our support to RedHat9/Rocky9.

For full details over the IDS 2024.1 release, please review the Release Notes and System Recommendations pages found on our Developer Portal.

Patch update:

SSO 9.4.1 – within this patch release we have updated the Mobile PKI (MPKI) service to fulfil current Traficom requirements. Additionally, we have updated Tomcat to 9.0.87.

CustomerID 6.4.1 – within this patch release we correct a coding error present when using SSO’s TOTP along with CustomerID’s ability to move users between Organisations. Prior to this patch, moving a user would result in errors and impact both LDAP and PostgreSQL databases. If you utilise TOTP and have CustomerID in active use, we encourage you to update to this patch version of CustomerID.

As always, we encourage all our customers to update to the latest available versions of software. For full details over our latest releases, please review the Release Notes and System Recommendations pages found on our Developer Portal.

The post IDS 2024.1, patches for SSO 9.4.1 and CustomerID 6.4.1 appeared first on Ubisecure Digital Identity Management.


Ontology

Ontology Weekly Report (June 11th — June 17th, 2024)

Ontology Weekly Report (June 11th — June 17th, 2024)

This week at Ontology has been bustling with exciting developments, partnerships, and community engagements. Here’s a detailed update on our progress and activities:

Latest Developments

- Ontology Australia Node: We’re excited to announce that the Ontology Australia node now has its own channel! Follow for updates and learn more about node staking on Ontology.
- AMA with Bitget — Ontology French: The French community enjoyed an engaging AMA session with Bitget, discussing future collaborations and community questions.
- Web3 Happenings: Did you join our Web3 happenings this week? Stay tuned for more insightful sessions!
- New Article on Blockchain and Digital Identity: A fresh article discussing blockchain’s role in digital identity has been published. Check it out to understand how blockchain technology is shaping secure digital ecosystems.

Community Updates

- Coming Up: Tune in to our upcoming community update for details on exciting events on the horizon.
- Partnership with DxSale: We have partnered with DxSale to revolutionize the Web3 landscape, aiming to enhance platform integrations and user experiences.
- Future DID Quests: Expect more DID-related quests as we continue to expand our focus on decentralized identity solutions.

Development Progress

- Ontology EVM Trace Trading Function: Now at 93%, we’re nearing completion, enhancing our trading capabilities within the EVM space.
- ONT to ONTD Conversion Contract: Progress has reached 60%, making the conversion process more streamlined and user-friendly.
- ONT Leverage Staking Design: Currently at 43%, this development promises innovative staking options for the community.

Product Development

- ONTO x NFTFeed Giveaway: ONTO is having a giveaway with NFTFeed! Don’t miss your chance to win exclusive NFTs.
- ONTO x XSTAR Giveaway Winners: Congratulations to the winners of the ONTO x XSTAR giveaway! Rewards have been distributed.

On-Chain Activity

- DApp Ecosystem Stability: The total number of dApps on our MainNet holds steady at 177.
- Transaction Growth: This week saw an increase of 570 dApp-related transactions, totaling 7,769,159, and a total transaction increase on MainNet of 2,111, reaching 19,462,273.

Community Growth

- Vibrant Community Discussions: Our platforms on Twitter and Telegram are buzzing with the latest developments and interactions. Join us to stay connected and involved.
- Telegram Discussion on Interoperable DID Solutions: This week, led by Ontology Loyal Members, we explored “Interoperable DID Solutions: Web2, Web3, and Beyond,” discussing the implications for KYC, login systems, and peer-to-peer interactions.

Stay Connected 📱

Engage with us across our social media platforms to keep up with the latest from Ontology. Your involvement and feedback drive our progress and innovation.

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Thank you for your ongoing support and participation. Let’s continue to push the boundaries of blockchain technology and decentralized identity together!

Ontology Weekly Report (June 11th — June 17th, 2024) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


This week in identity

E55 - Identiverse, Identity Week Europe and Gartner SRM

Summary

In this episode, Simon and David discuss the recent identity conferences they attended, including Identiverse and Identity Week. They highlight the growing interest in identity across various industries and the need for resilience and security in identity management. They also delve into the topics of decentralized identity and generative AI, emphasizing the importance of tying security investment to business outcomes and altering the way we think about data and technology. They conclude by mentioning future episodes dedicated to decentralized identity and generative AI.


Keywords

identity conferences, Identiverse, Identity Week, resilience, security, decentralized identity, generative AI, security investment, business outcomes

Takeaways

- Identity conferences have seen a surge in interest from various industries, indicating the growing importance of identity management.
- Resilience and security are crucial in identity management, especially in the face of evolving threats and attacks.
- Decentralized identity and generative AI are emerging topics that require careful consideration and alignment with business goals.
- Security investment should be tied to business outcomes and the specific needs of the organization.
- The identity and security industry is still relatively young and evolving, requiring a shift in thinking and approach.


Links

- Identiverse
- Identity Week Europe
- Gartner Security & Risk

Shyft Network

Veriscope Regulatory Recap — 1st to 16th June

Veriscope Regulatory Recap — 1st to 16th June

Welcome to the latest issue of the Veriscope Regulatory Recap. In this edition, we analyze the recent regulatory developments in Taiwan and South Korea.

Taiwan’s Proactive Stance with the VASP Association

Taiwan has seen a unique development with the creation of the Taiwan Virtual Asset Service Provider (VASP) Association.

Formed by 24 local crypto firms, the association is a response to the Justice Ministry’s proposed amendments to Anti-Money Laundering (AML) rules for crypto operations. These changes could introduce penalties for non-compliance, including fines and prison time.

The association’s goal is to work with the government to ensure fair regulations that balance growth with fraud prevention.

This development has both pros and cons. On the upside, this collaborative effort between the industry and government could foster a more stable and growth-oriented market environment, enhancing Taiwan’s position as a forward-thinking hub for digital assets.

The association also faces challenges, particularly in aligning the diverse interests of its members with the strict requirements proposed by the Justice Ministry. The severity of the penalties for non-compliance, ranging from heavy fines to imprisonment, places significant pressure on the association to not only ensure adherence to these rules but also advocate effectively for regulations that do not stifle innovation.

So, will other countries follow in Taiwan’s footsteps?

South Korea’s New Guidelines on NFTs

On the other side of the strait, South Korea has taken a different path, focusing on the classification of non-fungible tokens (NFTs).

The Financial Services Commission (FSC) of South Korea recently announced that certain NFTs, which are mass-produced and used similarly to cryptocurrencies, will now be regulated under existing cryptocurrency laws.

This is a significant move that can impact the global NFT landscape, as South Korea is one of the largest markets globally for NFTs. According to Statista, NFT sales are projected to reach $58M by 2028 from $38M in 2024.

Through this move, South Korean authorities aim to bring clarity and stability to the NFT market by treating NFTs similarly to other digital assets.

This move provides clearer legal boundaries and increased protection for investors, potentially making South Korea a more attractive market for institutional investment in NFTs and enhancing the overall stability of the digital asset sector. On the other hand, these regulations might also bring challenges for NFT creators and platforms, who will now face stricter compliance requirements.

What’s Next?

As both countries refine their regulatory frameworks, the global crypto market watches closely. Taiwan’s and South Korea’s different approaches reflect their unique market dynamics and regulatory philosophies. It’s essential for crypto industry stakeholders to stay informed about these changes as they develop.

Interesting Reads

FATF Crypto Travel Rule Adoption: 6-Month Status Update

A Guide to FATF Travel Rule Compliance in the United States

A Guide to Crypto Travel Rule Compliance in Japan

A Guide to FATF Travel Rule Compliance in the Philippines

A Guide to FATF Travel Rule Compliance in Estonia

‍About Veriscope

‍Veriscope, the compliance infrastructure on Shyft Network, empowers Virtual Asset Service Providers (VASPs) with the only frictionless solution for complying with the FATF Travel Rule. Enhanced by User Signing, it enables VASPs to directly request cryptographic proof from users’ non-custodial wallets, streamlining the compliance process.

For more information, visit our website and contact our team for a discussion. To keep up-to-date on all things crypto regulations, sign up for our newsletter and follow us on X (Formerly Twitter), LinkedIn, Telegram, and Medium.

Veriscope Regulatory Recap — 1st to 16th June was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Aergo

Aergo Bridge User Guide

* IMPORTANT NOTICE: Bridging Aergo tokens MUST be conducted using private wallets, including Aergo Connect and Metamask, and NOT through exchanges. The process can take up to 24 hours. Transferring tokens to old Aergo Swap wallet addresses will result in the IRREVERSIBLE loss of all tokens.

1. Choose Bridge Direction (Aergo → Eth) or (Eth → Aergo)

2. Connect to Aergo Connect

If Aergo Connect (Chrome) is not installed:
1) Download and install Aergo Connect from the Chrome Web Store, then refresh the page to restart.
2) Create or import a wallet to enable connection.

If Aergo Connect is already installed:
1) Click the Aergo Connect button.
2) In the Aergo wallet connection popup, set the network to Mainnet (AERGO.IO) and click Confirm.
3) Verify successful connection (check address and balance).

3. Connect to Ethereum Wallets (Metamask recommended)

In the Metamask wallet connection popup, switch to the Ethereum network and connect. Check the ERC-20 Aergo token balance and verify the address.

4. Enter the Amount to Exchange

5. Sign and Send

Ensure Aergo Connect and Metamask are connected before proceeding.

a. Signing with Aergo Connect (Aergo → Ethereum)
1) Complete the signature within 1 minute. If this time is exceeded, a request to re-sign will be necessary.
2) After the signature is verified, the Aergo Bridge address and confirmation of the exchange amount will be provided, accounting for any fees deducted.
3) Transfer Aergo to the provided Bridge address using Aergo Connect.
4) Verify the transaction on Aergo Scan using the provided link upon successful transfer.
5) Confirm the issued Ethereum Aergo Bridge address and the exchange amount.

b. Signing with Metamask (Ethereum → Aergo)
1) Complete the signature within 1 minute. If this time is exceeded, a request to re-sign will be necessary.
2) Transfer Ethereum Aergo tokens to the designated Bridge address using Metamask.
3) Verify the transaction hash and access detailed information on Etherscan using the provided link.

Aergo Bridge User Guide was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 18. June 2024

KuppingerCole

Trustworthy and Efficient Payments - The Future of Digital Identity with Marie Austenaa

In this interview series, our Analysts talked to Identity Experts about the Future of Digital Identity. The conversation with Marie Austenaa discusses the role of digital identity in payments and the potential for enhanced security and convenience. The discussion touches on topics such as the evolution of digital identity, the use of digital identity wallets, and the potential for business process optimization.




Securing the Digital Frontier: Exploring HP's Business Solutions for Endpoint Security

In the ever-evolving landscape of enterprise and personal computing, maintaining security and integrity across endpoints is paramount. This webinar explores the realm of Business Solutions for Endpoint Security, spotlighting the transformative potential of HP's cybersecurity innovations in safeguarding organizations against modern cyber threats.

As technology advances, so do the challenges in endpoint management. Join experts from KuppingerCole Analysts and HP as they explore how HP's Business Solutions tackle these challenges head-on, from thwarting sophisticated attacks at the BIOS and firmware levels to facilitating remote management of devices even when offline or out of organizational control.

John Tolbert, Director of Cybersecurity Research and Lead Analyst at KuppingerCole Analysts will discuss the importance of endpoint security, look at some common threats, and describe the features of popular endpoint security tools such as Endpoint Protection, Detection and Response (EPDR) solutions. He will also look at specialized Unified Endpoint Management (UEM) tools and how these tools all fit into an overall cybersecurity architecture.

Lars Faustmann, Leading Digital Services for Central Europe at HP and Oliver Pfaff, Business Development Manager at HP will provide an in-depth overview of HP's Business Solutions for Endpoint Security and their constituent components. They will highlight the key features and functionalities of HP's cybersecurity innovations, including Sure Start, Sure Admin, and Tamper Lock. Additionally, they will address the use cases and target market for HP's Business Solutions, emphasizing their suitability for SMBs and MSSPs. They will also discuss HP's contributions to the cybersecurity community and the value they bring to organizations looking to enhance their security posture.

Join this webinar to:

- Gain insights into the evolving landscape of endpoint security and management.
- Discover how HP's Business Solutions can enhance endpoint integrity and mitigate cyber threats.
- Learn about the comprehensive features and functionalities of cybersecurity innovations.
- Understand the strengths and challenges of implementing HP's Business Solutions within your organization.
- Equip yourself with actionable strategies to fortify your organization's endpoint security posture.


1Kosmos BlockID

Identiverse 2024: Key Takeaways from the Conference

A few weeks ago, we had the pleasure of attending Identiverse 2024, the premier conference dedicated to identity and security. Now that we’ve had time to reflect, we’re excited to share our key takeaways from the event. Packed with insightful sessions, innovative workshops, and inspiring keynotes, Identiverse 2024 offered a wealth of knowledge and forward-thinking ideas that are shaping the future of digital identity.

Fundamentals of Passkeys

One of the standout sessions was on passkeys, which provided a comprehensive overview of FIDO passkeys, their flow, multi-device capabilities, and the evolution of the standard to support new functionalities. The demonstration of transferring 1Password passkeys to Dashlane showcased significant progress in passkey management flexibility. This interoperability is a game-changer, enabling users to move seamlessly between keychains and enhancing their overall experience. Additionally, Air New Zealand’s rapid adoption of passkeys was notable, with 25% of sign-ins using passkeys and a 30% opt-in rate within the first 24 hours. Such success stories underline the potential of phishing-resistant multi-factor authentication (MFA) in real-world applications.

Innovation Workshop: Overcoming Obstacles to Empower Innovators

This workshop addressed key challenges and innovative solutions in advancing identity and security. Inspired by a 2015 data breach affecting 21.5 million federal workers’ fingerprint data, the session emphasized the need for technologies that securely handle biometric data while preserving privacy. The concept of open innovation was also explored, particularly in industries with tight IP restrictions like healthcare and defense. Crowdsourcing platforms like InnoCentive were highlighted as tools to drastically reduce innovation time and effectively solve complex problems.

Opening Keynote: Andre Durand, Ping Identity

Andre Durand’s keynote was both engaging and thought-provoking, setting the tone for the conference. Durand highlighted the emerging crisis of deepfakes and the critical need for reliable identity verification solutions. He also emphasized the issue of repeatedly proving one’s identity across multiple systems and the necessity to address this fragmentation. The merging of verification and authentication processes was discussed, aligning well with identity-based authentication solutions. Durand stressed that protecting consumer reputation should be a top priority for identity professionals.

Keynote: Digital Identity at United Airlines

United Airlines provided insights into their advancements in integrating digital identity into their customer experience. One of the highlights was the Information Security team reporting to the Customer Success team, demonstrating the importance of combining security with a delightful user experience. The transition from security questions to Email & SMS OTP for second-factor authentication was noted as a step forward, though it presents its own set of challenges. United Airlines is leading the way in embedding identity into every customer touchpoint, from personalized in-flight entertainment to recommending nearby flights during delays.

Conclusion

Identiverse 2024 was a resounding success, offering valuable insights and innovative solutions that will undoubtedly shape the future of digital identity. At 1Kosmos, we’re excited to implement these learnings and continue pushing the boundaries of what’s possible in identity and security. For more information about Identiverse 2024 and our key takeaways, check out our IBA Friday episode.

The post Identiverse 2024: Key Takeaways from the Conference appeared first on 1Kosmos.


liminal (was OWI)

Link Index for Customer Authentication

The post Link Index for Customer Authentication appeared first on Liminal.co.

IDnow

Players ready: IDnow sees 8.5 times more verification requests amid busy betting season


London, June 18, 2024 – Ahead of the busy summer betting period, IDnow, a leading identity verification provider in Europe, has seen a significant spike in the number of gambling customers requesting identity verification.

Between 14 and 16 June, IDnow saw a 750% increase in the number of players being verified with a European gambling operator [1]. The busiest sign-up day was 14 June, the day of the opening match of UEFA EURO 2024, which saw a much higher volume of verification requests than was seen during the 2022 World Cup.

This latest spike coincided with the lead-up to the opening game of UEFA EURO 2024 on 14 June. IDnow expects verification figures to remain high during each country's first game, bringing with them an increased likelihood of bonus abuse, multi-accounting, underage gambling, money laundering and fake documentation.

Effective, compliant verification processes to onboard vast numbers of players

“The challenges operators face during major sporting tournaments are not necessarily any different to their regular daily compliance hurdles, but the sheer volume of new players from all around the world, coupled with increased activity is likely to cause issues for unprepared operators,” said Roger Redfearn-Tyrzyk, Vice President of Global Gaming at IDnow.

“EURO 2024 will be a huge opportunity for gambling platforms, new and old, so it's essential they capitalize on the event while safeguarding their players and their reputation. It's important they have effective, compliant verification processes in place to onboard vast numbers of players as smoothly and safely as possible. Only when operators are confident they can verify accounts thoroughly, and onboard the right players during such a busy period, will they be effective in fighting fraud.”

As global sporting events have such broad appeal, they often attract both casual and regular bettors. Combined with the convenience and accessibility that online platforms now offer, betting during these periods is getting increasingly popular. Indeed, betting during the 2022 World Cup increased by 13 percent compared with the 2018 FIFA World Cup.

Worryingly, the surge in gambling activity was seen on both regulated and unregulated sites. For example, in the UK alone, 250,000 people visited unregulated, black-market sites during the last World Cup, compared with just 80,000 during the same time frame the previous year.

Other key security concerns that are exacerbated by major sporting events, such as EURO 2024:

Increase in site traffic is often used by fraudsters as a smoke screen for nefarious activities. They may conduct identity theft, or attempt account takeovers by launching phishing attacks and cyberattacks on unsuspecting users.
Unsecured public Wi-Fi networks can pose a security risk when players log in to place bets. These networks are susceptible to interception by hackers, leading to unauthorized access to user accounts, exposure of sensitive personal information and potential financial losses for both users and operators.
Rising rates of gambling fraud, with chargeback fraud a particularly common issue. While identity verification solutions can help verify the identity of users and detect fraudulent activity, preventing chargeback fraud requires additional measures, such as transaction monitoring and collaboration with payment processors to identify and block suspicious transactions in real time.

Although all global gambling platforms are likely to see an uptick in player onboarding, it will likely be European nations that share the same time zone as the tournament that will see the biggest increase in usage.

“Gaming operators in these particular regions must offer secure player onboarding, deposits and withdrawals, and conduct seamless Anti-Money-Laundering and age verification checks to help fight the expected proliferation of fraudulent activity in both the build up to EURO 2024 and throughout the tournament,” concluded Roger Redfearn-Tyrzyk.

[1] Compared to the same time frame last year.


HYPR

Key Takeaways From Horizon3.ai’s Analysis of an Entra ID Compromise


As enterprises shift from on-premises to cloud systems, hybrid cloud solutions have become essential for optimizing performance, scalability, and user ease. However, risks arise when poorly configured environments connect to the cloud. A compromised Microsoft Active Directory can fully compromise a synchronized Microsoft Entra ID tenant, undermining the integrity and trust of connected services.

Researchers at Horizon3.ai recently published a fascinating analysis of how on-premises misconfigurations in hybrid Microsoft environments can be exploited by attackers using well-documented techniques. In this case, the attack chain can lead to full compromise of the Entra tenant. In complex enterprise environments, such misconfigurations are all too common, so it's critical for security teams to understand the tactics attackers use and the strategies that close these points of vulnerability. Here we take a closer look at the attack chain and offer some additional mitigation strategies.

The Attack Chain

Using the NodeZero™ tool, the attack proceeded through the following steps (MITRE ATT&CK techniques in parentheses):

Step 1 (T1557.001): NBT-NS traffic from Host 1 is poisoned to relay a netNTLM credential to Host 2, an SMB server that doesn't require signing.

Step 2 (T1003.002, T1078.003): A SAM database dump on Host 2 exposes a local administrator credential that is reused on Host 3 and Host 4.

Step 3 (T1003.001, T1078.002, T1078.003, T1219): The shared local admin credential is used to run a remote access trojan (RAT) on Host 3 and perform an LSASS dump, discovering a domain administrator credential (HOST3$).

Step 4 (T1003.004, T1078.002, T1078.003): The shared local administrator credential is used to remotely dump LSA on Host 4, revealing another domain administrator credential (Admin2).

Step 5 (T1087.004, T1003): Admin2's credentials are used to query AD, determining that the domain uses Entra Connect; credential dumping techniques harvest the cloud credential for Entra Connect.

Step 6 (T1003.003, T1558): HOST3$'s credentials are used to perform an NTDS dump on another domain controller (DC2), discovering the credential used to sign Kerberos tickets for Azure cloud services when Seamless SSO is enabled.

Step 7 (T1528): The Entra Connect credential is used to log into the Entra tenant; a refresh token is obtained for easier long-term access.

Step 8 (T1087): Analysis of AzureHound data reveals an on-premises user holding the Global Administrator role (EntraAdmin) within the Entra tenant.

Step 9 (T1558.002): A silver ticket attack is used to forge a Kerberos service ticket for EntraAdmin.

Step 10 (T1098): Access is granted to the Microsoft Graph cloud service, without an MFA prompt, with Global Administrator privileges.

Fortunately, this hacking exercise was carried out by white hat pentesters. The researchers at Horizon3.ai noted that, with absolutely no prior knowledge of the company’s environment, it took the NodeZero tool only an hour to compromise the on-premises AD domain, and the associated Entra ID tenant was compromised in less than two hours.

Prevention Strategies

The team at Horizon3.ai included a set of solid initial mitigation recommendations. These include:

Prevent NTLM relay: Disabling NBT-NS and enforcing SMB signing would have prevented the initial access technique used, although other vectors can be used for initial domain access.
Use LAPS: Reuse of credentials for local administrators enabled key lateral movements that led to the discovery of domain administrator credentials.
Treat Entra Connect as a Tier-0 resource: Install Entra Connect on a non-DC server (with LAPS enabled), adequately protected with an EDR solution.
Do not use on-premises accounts for Entra administrator roles: Microsoft recommends limiting the number of Entra administrators and their level of privilege.

Further Critical Recommendations

In addition to those, we recommend specific strategies to close common security gaps in Microsoft Entra ID environments.

Use HYPR as a complement to LAPS to ensure administrators access their devices and systems using a phishing-resistant authentication method.
Review and revise your PAM (Privileged Access Management) program: the static domain admin password should be rotated after it is used or checked out, and the time a domain admin credential remains viable in cache should be reduced.
Use secure passwordless SSO methods such as HYPRspeed that don't rely on shared secrets and instead leverage public key cryptography.
Enforce the use of phishing-resistant passwordless MFA methods, such as HYPR, for privileged Entra user access.
Begin migrating from legacy on-premises technology to Entra ID. While not a small project, it will reduce the exposure of older protocols that rely on password hashes.
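The layered logic behind these recommendations can be illustrated with a small, hypothetical Python sketch (not part of the Horizon3.ai report). It maps each attack-chain stage to the mitigations that would break it, making it easy to see which stages a given deployment leaves open; the stage groupings and mitigation-to-stage mapping are illustrative assumptions.

```python
# Hypothetical model of the attack chain described above.
# Each stage lists the recommended mitigations (assumed) that would break it.
ATTACK_CHAIN = {
    "NTLM relay for initial access (T1557.001)": {"disable NBT-NS", "enforce SMB signing"},
    "lateral movement via shared local admin (T1003, T1078)": {"use LAPS"},
    "harvest Entra Connect cloud credential (T1003)": {"treat Entra Connect as Tier-0"},
    "cloud pivot to Global Administrator (T1558.002, T1098)": {"no on-prem accounts in Entra admin roles"},
}

def uncovered_stages(deployed: set[str]) -> list[str]:
    """Return attack-chain stages not broken by any deployed mitigation."""
    return [stage for stage, fixes in ATTACK_CHAIN.items() if not fixes & deployed]

# A single mitigation (LAPS) stops the lateral-movement stage,
# but initial access and the cloud pivot remain open:
print(uncovered_stages({"use LAPS"}))
```

The point of the exercise: no single control severs the whole chain, which is why the recommendations above are meant to be deployed together.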

Monday, 17. June 2024

KuppingerCole

Data Privacy - The Future of Digital Identity with Max Schrems

In this interview series, some of our Analysts talked to Identity Experts about the Future of Digital Identity. Max Schrems and Marina Iantorno talk about the future of privacy, particularly in Europe, and the challenges of enforcing privacy laws. They discuss the use of user data by Meta and the need for consent and proper anonymization.





Spruce Systems

Meet the SpruceID Team: Libby Brown

We're thrilled to have recently welcomed Libby, our new product manager, to the team!
Name: Libby Brown
Team: Product Management
Based in: Seattle area

About Libby

I’ve been working in technology and product management for longer than I care to admit, the past six years of which were in identity at Microsoft, first as product manager of the Microsoft Authenticator app and then of all the passwordless credentials for Microsoft Entra (formerly known as Azure Active Directory). I really enjoyed working at that intersection of building new technologies in the identity and security ecosystems and driving the adoption of those technologies by creating user experiences that were compelling on both an individual and organizational level.

Coming to SpruceID was just a “shift left” in my career, moving upstream in the identity lifecycle, moving to concepts that are earlier in their definition and implementation stages, while at a company still in its launch phase.   

Can you tell us about your role at SpruceID?

As Product Manager, I see my role as setting the longer-term direction of the products we are building while staying grounded in the day-to-day details of what, and how, we deliver on customer promises today. I want to define who SpruceID’s customers are, what they need, and how we’ll meet those needs, then get everyone on board to deliver.

What do you find most rewarding about your job? 

Startup life is amazing! I love the hustle, the energy, the pressure to deliver because everyone on the team has to play their part to make the whole company successful - it’s invigorating! 

What are some of the most important qualities for someone in your role to have, in your opinion?

Product (and program) managers need to be able to explore and understand a subject's big picture and then dive down to the nitty-gritty details to be successful. Problem-solving and collaboration are also key skills. My current mantra is, “Yes, and…,” as in, “Let’s say yes to the idea or challenge, and what else do we need to do to make it happen?” It’s in those discussions of how to do something that good ideas rise up and consensus is gained; that doesn’t happen if the first response is a No.

What are you currently learning, or what do you hope to learn?

First, I’m still learning non-Microsoft tools - I doubt I will ever come to love Google Calendar – but after 2 decades of muscle memory, I’m beginning to leave Outlook behind. 

I was reading the AAMVA Implementation Guide yesterday afternoon; digging into the W3C VCDM email digests earlier; and last week was refreshing my knowledge of NIST IAL levels. Learning is just a constant in this space, which is one of the reasons I love working on the bleeding edge of identity and security. If I ever find myself NOT learning, that’s when I know it’d be time to move on.  

What has been the most memorable moment for you at SpruceID so far?

I was in Rio de Janeiro for our all-company meeting, my second week on the job. It was definitely one of my best onboarding experiences ever! It was a great way to meet everyone at once and learn folks’ roles and responsibilities. Also, being a part of big, foundational conversations we were having as a team helped me jump right into product management tasks when I returned home. 

What's the best piece of advice you’ve received since starting your job here?

“Meet Everyone.” Each person at SpruceID has unique skill sets, so it’s been great to learn what those are and jointly figure out how to combine those capabilities to deliver great results. 

How do you define success in your role, and how do you measure it?

I have done my job successfully when: 

customers know what SpruceID can deliver through our digital identity capabilities with the Credible platform;
customers can realize the business value to their organization from digital credentials; and
SpruceID has a long and lustrous roadmap to continue delivering features of value to our customers, existing and new.

What is your favorite part about working at SpruceID?

I love #startuplife! After a long time working in and around Microsoft, the direct control I have for driving product definition is wonderfully both terrifying and invigorating. 

Fun Facts

What do you enjoy doing in your free time? I enjoy spending time with my kids. They’re at a fun age where they still want me to be a part of their activities.

What is your favorite coding language (and why?): HAHAHAHAHA—I am not a programmer, though I’ve spent my life around computers and technology. My first language was Basic, written on a Timex Sinclair computer and saved on an audio cassette. I built my first website via Notepad and FTP. I really didn’t advance in computer science much beyond that until I took an Intro to Programming class taught in Java this spring.

If you could be any tree, what tree would you be and why?: A weeping willow - they love being near water and make perfect playgrounds and climbing gyms for children under their long boughs. 

About SpruceID: SpruceID is building a future where users control their identity and data across all digital interactions.

Sunday, 16. June 2024

KuppingerCole

Identity Standards - The Future of Digital Identity with Joni Brennan

In this interview series, some of our Analysts talked to Identity Experts about the Future of Digital Identity. Joni Brennan and Martin Kuppinger focus on the evolution of decentralized identity and the need for interoperability among different standards. The discussion also touches on topics like consent management, trust, and provenance.




Friday, 14. June 2024

KuppingerCole

Identity Defense in Depth - The Future of Digital Identity with Ian Glazer


In this interview series, some of our Analysts talked to Identity Experts about the Future of Digital Identity. John Tolbert and Ian Glazer focus on the concerns and changes in digital identity. They discuss the need for new approaches in access management and access governance, as well as the importance of recognizing users rather than just authenticating them. They also explore the concept of identity defense in depth and the application of fraud reduction technologies in the enterprise. Overall, the conversation highlights the need for continuous and dynamic decision-making in digital identity.




Transmute TechTalk

Introducing Our New AI-Powered Drag-and-Drop PDF Feature for USMCA Certificates of Origin

Convert an existing USMCA PDF into a Verifiable Credential

We are excited to announce a new functionality in our product that leverages the power of AI to streamline the process of digitizing USMCA Certificates of Origin. This latest update allows users to easily drag and drop existing PDF files of their certificates into our web application. Our advanced AI technology reads the PDF and converts it into a digital form that can be seamlessly completed online.

Why This Feature Matters

The United States-Mexico-Canada Agreement (USMCA) requires businesses to certify the origin of their products, which traditionally involves handling paper documents. This process can be time-consuming, error-prone, and cumbersome.

In the world of international trade, a large number of important documents, including USMCA Certificates of Origin, exist as PDFs or scans of paper documents. These files are often manually processed, which is not only inefficient but also prone to errors. Our new AI-powered drag-and-drop feature addresses these challenges by:

Simplifying the Workflow: No more manual entry or scanning. Just drag your existing PDF into our application.
Saving Time: The AI quickly reads and converts your PDF into a digital form, ready for completion.
Reducing Errors: Automated data extraction minimizes the risk of human error.
Enhancing Accessibility: Access your digital documents anytime, anywhere, from any device.

How It Works

Drag and drop existing USMCA files to convert them:

Drag and Drop: Simply drag your existing USMCA Certificate of Origin PDF into the designated area in our web application.
AI Processing: Our advanced AI technology reads the PDF and extracts the necessary information.
Digital Conversion: The extracted data is then populated into a digital form.
Complete and Save: Review and complete the digital form, making any necessary adjustments, and save it securely within the application.

Benefits for Your Business
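The extraction step of this workflow can be sketched in miniature. The snippet below is a hypothetical illustration, not Transmute's actual implementation: a few regular expressions over already-extracted PDF text stand in for the AI step, and the field names are invented for the example.

```python
import re

# Illustrative USMCA certificate fields (hypothetical names, not Transmute's schema),
# each paired with a pattern that pulls its value out of raw certificate text.
USMCA_FIELDS = {
    "exporter": re.compile(r"Exporter:\s*(.+)"),
    "producer": re.compile(r"Producer:\s*(.+)"),
    "hs_code": re.compile(r"HS Tariff Classification:\s*([\d.]+)"),
    "origin_criterion": re.compile(r"Origin Criterion:\s*([A-D])"),
}

def extract_certificate_fields(pdf_text: str) -> dict[str, str]:
    """Populate a digital form dict from extracted certificate text.
    Fields that cannot be found are left empty for the user to complete."""
    form = {}
    for field, pattern in USMCA_FIELDS.items():
        match = pattern.search(pdf_text)
        form[field] = match.group(1).strip() if match else ""
    return form

sample = """Exporter: Acme Widgets Inc.
Producer: Acme Widgets Inc.
HS Tariff Classification: 8471.30
Origin Criterion: B"""
print(extract_certificate_fields(sample))
```

The "review and complete" step then amounts to showing this pre-filled dict as a web form, with empty values flagged for the user.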

This new feature is designed to make the digitizing process as straightforward as possible, ensuring you can focus on what matters most — your business. Here are some key benefits:

Efficiency: Drastically reduce the time spent on document processing.
Accuracy: Ensure the information in your certificates is accurate and up-to-date.
Convenience: Access and complete your forms from anywhere, without the need for physical paperwork.
Compliance: Stay compliant with USMCA regulations with ease.

We believe this new functionality will greatly enhance your experience and efficiency in managing USMCA Certificates of Origin. Our team is committed to continuously improving our product to meet your needs and help your business thrive in an increasingly digital world.

Get Started Today

Ready to try out the new AI-powered drag-and-drop feature? Log in to your account and start digitizing your USMCA Certificates of Origin with just a few simple steps. For more information, visit our guide page.

Looking Ahead

We’re thrilled to introduce this feature for USMCA Certificates of Origin, but we’re not stopping there. Stay tuned as we roll out this powerful AI-driven functionality to support more document types, making it easier than ever to manage all your important paperwork digitally.

Thank you for choosing our product. We look forward to helping you streamline your document management processes and enhance your business operations.

Introducing Our New AI-Powered Drag-and-Drop PDF Feature for USMCA Certificates of Origin was originally published in Transmute on Medium, where people are continuing the conversation by highlighting and responding to this story.


Northern Block

Germany’s Wallet Strategy, Interop Profiles, Big Tech (with Kristina Yasuda & Niels Klomp)

Explore Germany’s digital identity strategy, interoperability challenges, and big tech’s influence in our latest SSI Orbit Podcast episode with Kristina Yasuda and Niels Klomp.

🎥 Watch this Episode on YouTube 🎥
🎧   Listen to this Episode On Spotify   🎧
🎧   Listen to this Episode On Apple Podcasts   🎧

About Podcast Episode

Are you curious about how Germany is navigating the complex landscape of digital identity wallets and interoperability?

In this episode of The SSI Orbit Podcast, host Mathieu Glaude sits down with guests Kristina Yasuda and Niels Klomp to explore Germany’s ambitious wallet strategy, the development of interoperability profiles, and the influence of big tech on digital identity.

Kristina and Niels bring a wealth of knowledge and firsthand experience to the discussion, making this episode a must-listen for anyone interested in the future of digital identity. They delve into Germany’s approach to creating a secure and user-friendly wallet infrastructure, the challenges of achieving interoperability across different systems, and how major technology companies are shaping the digital identity space.

Get ready to uncover:

✨ The strategic objectives behind Germany’s wallet initiative
✨ Key considerations for developing and implementing interoperability profiles
✨ The role of big tech in driving innovation and potential risks
✨ Insights into the regulatory landscape and its impact on digital identity projects

Join us for an engaging and informative conversation highlighting the complexities and opportunities within the digital identity ecosystem.

Tune in to The SSI Orbit Podcast to stay ahead of the curve and gain valuable insights into the future of digital identity.

 

Key Insights:

Germany’s approach to driving wallet adoption includes an innovation competition (“Funke”) and a consultation process with civil society.
Interoperability profiles, such as OpenID4VC and the Dutch Decentralized Identity Interoperability Profile (DDIP), aim to promote adoption and innovation by providing a common subset of technologies and specifications.
The role of big tech companies in providing mobile operating systems and browsers is a double-edged sword, enabling better user experiences but also raising concerns about control and data monetization.
Governments have a crucial role in setting regulations and fostering an equal playing field for all stakeholders in the digital identity ecosystem.

Strategies:

Governments can promote innovation and adoption by creating testbeds, innovation competitions, and consultation processes.
Interoperability profiles, updated on a regular cadence, can help organizations and vendors align on a common set of specifications and technologies, enabling interoperability.
Separating the roles of big tech companies as infrastructure providers (mobile operating systems, browsers) and product providers (wallets) can help address concerns about control and data monetization.

Chapters:

00:00 – What wallet solutions the German government is funding, and how they fit into eIDAS 2.0
11:58 – Governments x Wallets x Governance Frameworks
18:02 – Technical interop profiles from an adoption cycle lens
27:08 – eIDAS 2.0 Personal ID adoption can happen quicker because of Org ID
30:00 – More on technical interoperability profiles: HAIP, DIIP, and the future of interop profiles
41:05 – Japanese government national ID, implications of governments issuing into Big Tech wallets
46:00 – The Browser API for credential presentation
50:31 – Is there anything missing on top of eIDAS 2.0?
55:48 – About Presentation Exchange and its current/future roadmap in OpenID

Additional resources:

Episode Transcript
The Federal Agency for Disruptive Innovation SPRIND
Funke
Decentralized Identity Interoperability Profile
My Number Card in Japan
Internet Engineering Task Force (IETF)

About Guests

Kristina Yasuda is an Identity System Architect at SPRIND, the German Federal Agency of Disruptive Innovation. With a strong background in decentralized identity, she previously worked at Microsoft as an Identity Standards Architect. Kristina is renowned for her contributions to identity standards, serving as an editor for OpenID for Verifiable Credentials, Selective Disclosure for JWTs, and JWT-VC Presentation Profile. She also chaired the Verifiable Credentials Working Group at W3C and contributed to the mobile driving licence standard at ISO/IEC. Kristina’s achievements have been recognized with Forbes Japan 30Under30 and MITTR Japan Innovators Under 35 awards.
LinkedIn: https://www.linkedin.com/in/kristinayasuda/

Niels Klomp is an expert in decentralized identities, architectures, and specifications. As the CTO of Sphereon, he leads the development of a cutting-edge Verifiable Data and Decentralized Identity Platform, available both in the cloud and on-premise. Sphereon offers advanced identity services and products, including SDKs, APIs, and wallets, leveraging microservices, decentralized identity, and blockchain technologies. With over 20 years of experience, Niels has a strong background in cloud and microservices architectures, continuous integration, blockchain development, and enterprise integration.
LinkedIn: https://www.linkedin.com/in/niels-klomp/

  The post Germany’s Wallet Strategy, Interop Profiles, Big Tech (with Kristina Yasuda & Niels Klomp) appeared first on Northern Block | Self Sovereign Identity Solution Provider.



Metadium

Announcement — New Director Appointed

Announcement — New Director Appointed

Dear Community,

We are excited to announce the addition of a new member to our board of directors at Metadium. As of June 6, 2024, Francisco E. Filho has joined the board.

With a distinguished career spanning over 15 years in computer hardware and technology, Francisco is a powerhouse of knowledge and expertise, now joining our team to revolutionize the field of blockchain technology. His extensive background equips him with the unique ability to seamlessly integrate hardware insights with the latest advancements in blockchain, driving unprecedented innovation and development. Francisco’s profound understanding of hardware, combined with his exceptional communication skills and fervor for cutting-edge technology, makes him an invaluable asset to our team. We are thrilled to welcome him on board as we push the boundaries of what’s possible in the tech world. Please join us in welcoming Francisco to the Metadium family.

Metadium New Director Appointment Notice

Hello Metadium community,

We are pleased to announce that a new member has joined the Metadium board of directors. As of June 6, 2024, Francisco E. Filho has joined as a director.

With a distinguished career of more than 15 years in computer hardware and technology, Francisco brings outstanding knowledge and expertise, and has now joined our team to revolutionize the field of blockchain technology. His extensive background will allow him to seamlessly integrate hardware insights with the latest advances in blockchain, driving unprecedented innovation and development. Francisco's deep understanding of hardware, combined with his excellent communication skills and passion for cutting-edge technology, makes him a valuable asset to our team. We are delighted to welcome him as we push the boundaries of what is possible in the world of technology. Please join us in welcoming Francisco to the Metadium family.

- The Metadium Team

Website | https://metadium.com

Discord | https://discord.gg/ZnaCfYbXw2

Telegram(EN) | http://t.me/metadiumofficial

Twitter | https://twitter.com/MetadiumK

Medium | https://medium.com/metadium

Announcement — New Director Appointed was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


IDnow

IDnow team explains why invitation to European Blockchain Sandbox will take Web3 identity verification to the next level.

We sit down with our Director of Global Regulatory and Government Affairs, Rayissa Armata and our Senior Architect, Sebastian Elfors to discuss collaboration, consortiums and the honor of being invited to join the European Blockchain Sandbox.

Earlier this year, we revealed that we had joined the IOTA Foundation consortium, alongside walt.id, HAVN Network and Bloom Labs with the aim of creating a Web3-ready KYC and identity verification solution.  

This initiative will enable Crypto Asset Service Providers (CASPs) and self-hosted wallets to become compliant with the Transfer in Funds Regulation (TFR), which is part of the greater EU Anti Money Laundering Package. 

Sounds relatively straightforward, right? Well, yes and no. New regulations mandate that all crypto transactions a) carry identifying data of the sender and the receiver and b) comply with similar AML rules as other financial institutions. The issue lies with CASPs also being required to comply with new rules regarding GDPR compliance, which stipulate that personally identifiable information (PII) should not be stored on blockchains or Distributed Ledger Technologies (DLT). 

Enter the consortium; each partner bringing its own strengths and areas of expertise. 

Here, Rayissa and Sebastian fill us in on how the project has been going since it was announced in February 2024.

As Web3 is still just an idea for a ‘new internet’, why did we decide to join the consortium to try and solve this ‘theoretical problem?’ Will this project be important for the future of payments, in particular in the crypto space?

Rayissa: Great question. Yes, there were several reasons why IDnow joined the consortium. For one, it is important to collaborate and to be able to adapt to new environments be it from a regulatory or technological perspective. Although concepts can be developed quickly, building the frameworks and ultimately achieving a usable product or service does not happen in a vacuum.

Rayissa: As you know, Europe established the crypto regulatory framework and licensing regime, Markets in Crypto Assets Regulation (MiCA), and the companion rule known as TFR, which applies to transactions between cryptocurrency businesses and personal wallets and is due to be applied by the end of the year. For example, with transactions, TFR requires CASPs to store certain PII so it can be cross-matched to verify information on the originator of the transferred funds and the beneficiary of the transfer. The consortium took on the challenge of how this information can be transferred on and off chain. These solutions not only address privacy concerns, KYC and interoperability but will enhance trust in a relatively new technology ecosystem.

Sebastian: One key objective with this project is to design and implement a KYC solution that can be used when issuing credentials that are stored as SoulBound Tokens (SBTs) on a blockchain. Technically speaking, the consortium is using Ethereum Virtual Machine-compatible IOTA Smart Contract Chain. Having said that, the SBTs must also be compliant with GDPR regulations, so the SBTs contain anonymous identifiers instead of PII in clear text. If there is a need for an authority to verify the user’s identity, this can be done by requesting the initial KYC user information from walt.id in conjunction with IDnow.
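The GDPR constraint Sebastian describes, an on-chain token carrying an anonymous identifier while the raw PII stays off-chain with the KYC provider, can be sketched as a salted hash. This is a simplified stand-in, not the consortium's actual scheme; the DID and field names below are hypothetical:

```python
import hashlib
import secrets

def anonymous_identifier(pii: str, salt: bytes) -> str:
    """Derive a GDPR-friendly identifier: the SBT stores this hash on
    chain, while the raw PII stays off-chain with the KYC provider."""
    return hashlib.sha256(salt + pii.encode("utf-8")).hexdigest()

# Off-chain KYC record kept by the provider (illustrative values)
salt = secrets.token_bytes(16)
kyc_record = {"name": "Alice Example", "document": "passport-123"}
pii_blob = f'{kyc_record["name"]}|{kyc_record["document"]}'

sbt_payload = {
    "holder_id": anonymous_identifier(pii_blob, salt),  # goes on chain
    "issuer": "did:example:kyc-provider",               # hypothetical DID
}

# An authorized authority can later re-verify the holder's identity by
# requesting (pii, salt) from the KYC provider and recomputing the hash:
assert sbt_payload["holder_id"] == anonymous_identifier(pii_blob, salt)
```

The on-chain token thus contains no PII in clear text, yet remains linkable to the KYC record by parties the provider chooses to answer.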

What has the consortium been working on since February? Do you have any updates?

Sebastian: The consortium has spent the past four months testing the solution and refining the implementation with scalability for production purposes in mind. Long term archiving and retrieval of identity documents is one such topic. We have also presented and demonstrated the solution to several stakeholders, with very positive feedback.

Now onto the good news. The consortium’s Web3 Identification Solution (which will enable KYC and identity verification in the Web3 environment) has been selected to take part in the European Blockchain Sandbox. What is it and why is it important?

Rayissa: Let me share a little about the European Blockchain Sandbox initiative since we have already covered a little about the consortium’s aims and activities above. The European Commission provides various opportunities to introduce new solutions that meet European regulatory requirements, such as data sharing in the TFR. This European Blockchain Sandbox initiative is geared so that new use cases that involve DLT can be developed and tested. This Blockchain initiative began in 2023 and will continue until 2026.

Sebastian: We are proud and happy that our consortium has been selected to participate in the Sandbox. This is a way for IDnow to demonstrate blockchain-based solutions to EU legislators, and pinpoint how they can meet the relevant EU regulations for cryptocurrencies. In particular, the IOTA solution is targeted to comply with the upcoming TFR regulation and we are keen on proving how the product can meet those requirements.

Why is cross-collaboration so important for such a project?

Rayissa: The European Commission recognizes the need for collaboration and cooperation between public and private sectors. Part of the regulatory framework is to set requirements that address interoperability, innovation, security and user accessibility. Collaboration is key to enabling this set of requirements and we have many legal experts and technology-savvy developers guiding our project to build something new. IDnow has been at the forefront of good management, regulatory compliance and smart growth since we began and collaboration is a big part of our success.

Sebastian: In addition to the cross-collaboration with the EU Commission, we are also very proud of working with the IOTA-founded consortium. All parties contribute with cutting edge technologies:

IOTA provides the underlying network based on the IOTA Smart Contract Chain
HAVN Network integrates hybrid blockchain technologies
IDnow delivers the ID verification, prior to the tokenization into a SBT
walt.id has developed the trusted witness service for creating and verifying SBTs
Bloom provides the digital wallet for storing and presenting the SBTs

How will our existing and future crypto clients, who use our solutions to onboard customers and process transactions, be able to benefit from the research and development that is going into the Web3 Identification Solution?

Rayissa: From a compliance perspective, CASPs will be able to use this solution to comply with national and European KYC and TFR requirements to share personal data between the originator and beneficiary of a specific transaction. This will build trust in the crypto ecosystem and beyond, across Web3 and its use cases. There may not yet be a regulatory framework with set requirements for Web3 or Web4 applications, but trust brings growth, and this is a great opportunity for these environments to become more mainstream.

Sebastian: The solution developed by the IOTA consortium has the potential to go beyond the realm of cryptocurrency transactions, which is the current scope of the project. The SBTs that are issued could also be used for digital identities in the Web3 metaverse, for example for identifying avatars purchasing NFTs such as digital paintings.

Will the Web3 Identification Solution be commercially available in the future? If so, which companies and industries will the product appeal to?

Rayissa: Yes, certainly that is a goal. We are already in discussions with our consortium members and external CASPs and companies specialized in IoT services.

Sebastian: The term Web3 was initially coined in 2014 by the Ethereum co-founder Gavin Wood when he created the vision of a “decentralized online ecosystem based on blockchain.” So yes, the Web3 Identification Solution can be used by all industries that have a need for digital identification. In addition to authentication for CASPs, totally different verticals such as the gaming industry could benefit from Web3 identification. Only the imagination sets the limits.

By

Jody Houton
Senior Content Manager at IDnow
Connect with Jody on LinkedIn

Thursday, 13. June 2024

KuppingerCole

AI-Ready Cloud Infrastructure - The Future of Digital Identity with Jianchao Wang

Marina Iantorno and Jianchao Wang discuss the key considerations when designing cloud infrastructure for AI applications, successful integration of AI into cloud services, the evolving relationship between edge devices and cloud infrastructure, emerging trends in AI and cloud computing, and the future of deploying AI models locally versus using cloud or hybrid methods.





Unveiling 2024's Identity Landscape: Real-World Tactics, Top Risks, and Cybersecurity Best Practices Revealed


As cyberattacks increase in volume and sophistication it is not a matter of if, but when, your organization will have an incident. Threat actors target accounts, users, and identities via lateral movement and poor identity security hygiene. Detecting and defending against this MUST be the basis of a modern cybersecurity initiative.

Join identity and security experts from KuppingerCole Analysts and BeyondTrust as they share the risks associated with poor identity security disciplines, techniques that external threat actors and insiders leverage, and operational best practices to protect against identity theft, and to develop an effective identity security strategy.

Paul Fisher, Lead Analyst at KuppingerCole Analysts, and Morey J. Haber, Chief Security Advisor at BeyondTrust, will discuss the central role of identity security in zero trust models, and look at why cyber hygiene, specifically with respect to identity, is still not at the required level, for example why MFA adoption in enterprises is still not at 100%. They will also discuss how identity-based attacks are used as the starting point of more elevated attacks.

Hear examples of tactics used in real-world identity attacks.
Discover the top five identity risks in 2024.
Learn about operational best practices to protect identities and access from cyberthreats.


Spruce Systems

What is Decentralized Identity?

Decentralized identity is gaining traction globally through diverse implementations and aligns with SpruceID's mission to empower users with control over their data.

Decentralized identity is rapidly gaining momentum as an industry concept, with a growing number of live implementations. Its early roots were in pioneering projects like uPort, appearances at IIW, Microsoft’s subsequent bet, and its emergence as a hot topic at industry conferences like Identiverse and EIC. We see it in practice globally in the California DMV Wallet, the European Union Digital Identity initiative, and even digital pioneers like Bhutan.

It’s easy to get confused because “decentralization” has become a buzzword for many new technologies. While its source of decentralization doesn’t need to come from blockchains or cryptocurrencies, decentralized identity is ultimately aligned with the objective of returning control to end users.  

At SpruceID, we think about decentralized identity as systems built using the issuer-holder-verifier model, which achieves decentralization when open protocols are used such that any party can assume the role(s) of the issuer, holder, or verifier. These roles come with a clean separation of concerns that mitigate undesirable side effects, such as issuer “phone home” during holder usage. These systems can support either functional (“attributes”) or foundational identity. So, it’s decentralized in that we’re not forcing all digital identity-related interactions through one central control point; instead, peer-to-peer interactions are the default.
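The issuer-holder-verifier separation described above can be sketched in a few lines. This toy uses an HMAC with a shared verification key purely to stay standard-library-only; real verifiable-credential systems use asymmetric signatures (e.g. Ed25519), and the function names here are illustrative. The point is the clean separation of concerns: the verifier checks the credential locally, with no issuer "phone home" at presentation time:

```python
import hashlib
import hmac
import json

ISSUER_KEY = b"issuer-signing-key"  # hypothetical; a real issuer holds a private key

def issue(claims: dict) -> dict:
    """Issuer role: bind a signature to a set of claims."""
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": sig}

def verify(credential: dict) -> bool:
    """Verifier role: check the issuer's signature locally.
    The issuer is never contacted during presentation."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

# Holder role: store the credential and present it to any verifier.
cred = issue({"subject": "did:example:alice", "over_18": True})
assert verify(cred)
```

Because any party running open protocols can take on any of the three roles, no single control point mediates the peer-to-peer interaction.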

Starting with this definition, it’s easier to evaluate technical standards based on their core ability to support these interoperable capabilities, rather than indexing on affiliation with certain brands or use cases. For example, all the digital credential technologies mentioned in the EU ARF, including W3C Verifiable Credentials, SD-JWTs, and ISO/IEC 18013-5 mobile driver’s license (mDL), can fit this definition of supporting decentralized identity depending on the implementation details. These can all also violate the definition depending on implementation details.

It’s worth noting that identity and credentialing systems are already more decentralized than you might realize. Even in the United States, DMVs do not have a monopoly on all forms of identification: other documents, such as passports or permanent residency cards, are widely accepted. Even utility bills and bank statements often play a role in verifying identity. 

If we start with a decentralized landscape, a decentralized approach to the technology seems to make the most sense. Open digital standards for sharing this already broad plethora of credentials largely amount to bringing existing credentials together under one umbrella and opening the playing field for other issuers, whom verifiers can choose to trust or not, on a case-by-case basis.

Already, verifiable digital credentials issued across state and federal bodies fall under different administrations, many of which don’t see eye-to-eye, even within the same agency. This doesn’t even scratch the surface of the complexities within the private sector—and an open system is the best way to accommodate that complexity. 

Here at SpruceID, we believe the use of decentralized identity, under this definition, furthers our mission to let users control their data across the web (and beyond). We will continue to evaluate technologies and technical standards to ensure alignment with these capabilities and our organization’s values. 

To learn more about how we are working to build decentralized identity, visit our knowledge base.

Learn More

About SpruceID: SpruceID is building a future where users control their identity and data across all digital interactions.


Matterium

Stablecoins — Inherently, Unfixably, Problematic

…And Why Gold on the Blockchain Makes Them Obsolete

Mattereum’s partnership with Sempsa JP, one of the world’s oldest and most respected producers and distributors of precious metals, is enabling us to tokenize gold in quantity through our German regulated exchange partner Swarm. Investors receive immutable proof of ownership and provenance: every token has a traceable, legally enforceable connection to a specific gold bar in a named vault, plus warranties relating to its provenance and other qualities, legally valid in 170 countries worldwide. To make it even easier to put gold, or any other high-value asset, on chain, the Mattereum group, through its German subsidiary Mattereum GmbH, has a discount token, MATR*, on sale through Swarm. This gives anyone onboarding assets a discount of up to 50% on their onboarding fees, which is significant when you are talking about gold bars.

With other forms of tokenized gold, the value of the token is backed by gold that remains in the ownership of the token issuer, and you just have to trust that it’s there. With Sempsa gold as an NFT, backed by a Mattereum Asset Passport (MAP), the token buyer indisputably owns the gold to which the token is bound, and their claim to it is enforceable by international law in over 170 jurisdictions. This combines the solidity and value of owning a physical gold bar with the convenience, speed and flexibility of selling NFTs. No more needing to have bars lugged from the seller’s vault to the purchaser’s vault, no more complex paperwork, no more hassle. This is new, this is revolutionary, and it has the potential to kill stablecoins stone dead.

It has the potential to kill stablecoins stone dead

How so?

Stablecoins exist because they claim to offer the convenience of being blockchain tokens, but with the kind of stability that Bitcoin can’t provide, through being pegged to a currency, usually the US dollar, so that the stablecoin always has a set dollar value that doesn’t fluctuate — pegging (an unfortunate term [nsfw!], I know, but given the nature of stablecoins, maybe not inappropriate). As a result, they are meant to act as stable repositories of value against Bitcoin’s wildly fluctuating price. It is not uncommon for speculators who have reaped vast profit from Bitcoin fluctuations to cash out and put the earnings into stablecoin to hold onto the profits instead of trading it in for fiat and putting it in a bank.

So why is tokenized gold better? Why does it make stablecoins dead men walking? Well, for a start, stablecoins are not stable. We’ve already seen what can happen when stablecoins go wrong and depeg, and it was ugly. When TerraUSD and LUNA did it in 2022, it wiped out billions of dollars in value, and there were more than 600 fiat-backed stablecoin depegs in 2023 alone. Not so stable.

There were more than 600 fiat-backed stablecoin depegs in 2023 alone
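In mechanical terms, a depeg is just the market price straying beyond some tolerance of the peg. A minimal monitor might look like this (the 1% tolerance is an assumed threshold for illustration, not an industry standard):

```python
def is_depegged(price: float, peg: float = 1.0, tolerance: float = 0.01) -> bool:
    """Flag a stablecoin as depegged when its market price strays more
    than `tolerance` (here an assumed 1%) from its peg."""
    return abs(price - peg) / peg > tolerance

# TerraUSD-style collapse vs. ordinary trading noise
assert is_depegged(0.10)       # catastrophic depeg
assert not is_depegged(0.998)  # within normal noise around $1
```

The 600-plus events cited above are exactly this condition firing, at whatever threshold each observer chose.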

Tokenized gold done through Mattereum, so that it legally bonds the token to a named gold bar (one bar, one token), can’t depeg; it creates a StableNFT that is massively more solid than any stablecoin. Faced with this innovation, stablecoins are already dead, they just haven’t realized it yet.
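The one-bar-one-token binding can be sketched as a registry that refuses to bind a bar or a token twice. Class and field names here are illustrative, not Mattereum's actual schema:

```python
class GoldRegistry:
    """Toy one-bar-one-token registry: each token id is bound to exactly
    one named gold bar in a named vault, so a token can never point at
    backing that is shared, substituted, or missing."""

    def __init__(self) -> None:
        self._bar_for_token: dict[str, tuple[str, str]] = {}
        self._token_for_bar: dict[str, str] = {}

    def bind(self, token_id: str, bar_serial: str, vault: str) -> None:
        # Enforce the invariant: neither side may be bound twice.
        if token_id in self._bar_for_token or bar_serial in self._token_for_bar:
            raise ValueError("each bar and each token may be bound only once")
        self._bar_for_token[token_id] = (bar_serial, vault)
        self._token_for_bar[bar_serial] = token_id

    def backing(self, token_id: str) -> tuple[str, str]:
        """Return the (bar serial, vault) a token is bound to."""
        return self._bar_for_token[token_id]

registry = GoldRegistry()
registry.bind("token-42", "BAR-0001", "Named Vault, Madrid")
assert registry.backing("token-42") == ("BAR-0001", "Named Vault, Madrid")
```

In the real system the binding is legal (via the Mattereum Asset Passport), not merely a data structure, but the invariant is the same: the mapping is bijective, so there is nothing to depeg.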

Let’s dig into that a bit more. So how do you make a stablecoin? Somebody gives you $1. You put the dollar in the bank. They ask for the dollar back; you take the dollar out. Okay, great. That’s fine. How do you make a living?

Option one is the bank pays interest, which means it’s lending that money out, and you hope the bank is stable and doesn’t lend the money to the wrong people. Maybe that’s fine. Maybe it’s not.

Option two is to charge transaction fees, like Mattereum does on the gold. You give us the gold; we charge a fee. That means we’re not doing anything stupid with the gold. In the meantime, we just hold on to it, we don’t move it, and that costs us some money, which we charge to you.

Option three is you give me that dollar and I go and put it into the lowest-risk legitimate investment instrument — one of these index fund type things. Now the thing that I told you is worth $1 is correlated to the stock market, and if the stock market crashes we’re both toast, but if the stock market booms you get your dollar back and I get to keep all the upside.

Then imagine option three, but with shitcoins, and that’s a stablecoin. The very, very large providers of stablecoins have problems, because that model is inherently problematic, inherently unfixably problematic. Tether has horrible, horrible lockbox issues; nobody really knows what’s going on in there. Then USDC had a bunch of money in Silicon Valley Bank. If Silicon Valley Bank had not been bailed out by the US government, they might have had really serious problems — it turns out that they had about 8% of their reserves in that single financial institution, but no one owning USDC knew that until the bank hit trouble. So stablecoins are highly opaque. They’ve proven that they have interdependence with the global financial system and so are inherently volatile. Plus, there may or may not be issues of mismanagement that can be discovered at any time.

So, OK, you could say, that’s why we have central bank digital currencies (CBDCs). They are stablecoins, but unlike the privately issued stablecoins I’ve been talking about above, they are issued by a central bank, so they are no more likely to depeg than the money in your pocket (I am assuming some of you still use actual cash there). Well, yes, but these still have problems that make gold-bound StableNFTs a better option. Like the dollar or pound, or whatever currency you choose, any CBDC linked to those is going to suffer the same inflation that steadily erodes their value. At 2–3% inflation a year, which is less than recent rates, it doesn’t take long for a CBDC to have a fraction of the buying power it once did. Not so with gold: gold’s value, as I said, is steadily going up, and a StableNFT for a gold bar is going to do the same, so it’s still better than a CBDC stablecoin.
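The inflation arithmetic is easy to check: at a constant annual rate r, purchasing power decays as 1/(1+r)^n. A quick sketch:

```python
def purchasing_power(years: int, inflation: float) -> float:
    """Real value of one unit of currency after `years` of constant
    annual `inflation`, compounded."""
    return 1.0 / (1.0 + inflation) ** years

# At 3% inflation, a currency-pegged token loses over a quarter of its
# buying power in a decade, and more than half over 25 years.
assert purchasing_power(10, 0.03) < 0.75
assert purchasing_power(25, 0.03) < 0.5
```

Any CBDC pegged to such a currency inherits exactly this decay curve.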

The core problem with privately issued stablecoins is a lack of transparency because you cannot see exactly what your stablecoin is backed by — that is proprietary information to these guys. But this proprietary information is what you would need to have full insight into a stablecoin and be able to do a full analysis and be sure what you were getting for your dollar. It’s not like it’s a small pool of these things — there’s $140 billion in them. What I’m suggesting is that the first criteria that you want for a stable instrument is staggering, bright side of the moon, black and white photography levels of clarity. You want a zero atmosphere full daylight shot of where the underlying asset is. With an on chain gold StableNFT backed by a MAP, that’s what you get. It’s this chunk of solid gold; it’s got this number; it’s in this building here. You own it.

You cannot see exactly what your stablecoin is backed by — that is proprietary information to these guys

In theory, that could be a dollar in a central reserve bank account instead. You can have an account with the US Federal Reserve Bank; you could just say there’s a million dollars in the Fed and a million tokens, one to one, here’s your deal. You could have a situation where you have an absolutely transparent portfolio. We can say we own a single thing and it is backed with a BlackRock index fund, and for every dollar you give us we buy BlackRock index fund; here’s an audit report that says this is exactly what we own. We own 151,000 shares and our current market value is more than your reserve. Have a nice day. You could use gold, and then it would be dollar-priced gold: you give us $1, we buy gold, and we can show you that there is enough money in the portfolio. We like the gold option because anything even remotely tied to dollars loses value due to inflation. Gold doesn’t — in 1929, ten gold bars would buy the average house, and today, ten gold bars will still buy you the average house.

Anything even remotely tied to dollars loses value due to inflation. Gold doesn’t

There’s no reason that you couldn’t use any single appreciating asset in a very simplistic way. If your asset value goes up, your investment goes up, and if it goes down, so does your investment. What you wouldn’t get with that is any of the fancy schmancy obfuscated hedge fund stuff, where if that asset went down, you are meant to have something else in the portfolio that would go up. Everything balances and you aren’t supposed to lose money, and, hopefully, you gain some.

But as soon as we go down that path, we wind up with a highly opaque hedge fund that sells portfolios, which are being used to theoretically secure value in an asset that is not supposed to go up and down. So basically, the suggestion here is that you either pick one thing and your stablecoin goes up and down with that one thing, or you pick a basket of things connected by super complicated maths and you have to pray that the basket is stable, and that’s what the grown-ups are supposed to do. However, those grown-ups are called hedge fund operators. And hedge funds implode all the time.

So if you’re going to pick one thing and anchor to that thing, it should either be something like $1 in a Federal Reserve Bank account, or it could be an index fund object, or it could be something like US Treasury bonds, or it could be a gold brick — the kinds of assets which have extraordinary ease of audit. At that point, you either have a token attached to a very large, very non-transparent bucket of things, or you have a token which is directly attached to a thing which is extremely audited. These are your two choices.

Mattereum has made it easy to have a token directly attached to an asset which is extremely audited. The MAP enables a firm legal binding between the asset and the token, so that ownership is solid and provable. The asset we believe is the absolute best to bind to is gold, so that is what we have done, and we are looking to grow the gold ecosystem we’re building, increasing opportunities for token buyers to own something that really is stable and acts as a repository of value. Gold is steadily going up in value, while dollars are getting eaten by inflation, even without all the underlying horrors lurking in stablecoins. The whole sector is just awaiting a deeply horrible, value-incinerating implosion; it’s only a matter of time before something blows away the whole house of cards and reveals the ‘stable’ in stablecoin to be wishful thinking. With the alternative that Mattereum’s gold-bound StableNFTs offer, why take that risk? Do you want to hang around and get pegged by your stablecoins?

Stablecoins are truly dead in the water for those who have eyes to see. Their days are numbered.

Find out more about Mattereum GmbH’s token sale*.

*The Mattereum Discount Token (MATR) is available for purchase through Mattereum GmbH’s fully regulated German crypto exchange partner, Swarm. Buying MATR is subject to terms and conditions in eligible jurisdictions — in particular, residents of the United Kingdom and the United States of America are excluded from the public sale of MATR.

Stablecoins — Inherently, Unfixably, Problematic was originally published in Mattereum - Humanizing the Singularity on Medium, where people are continuing the conversation by highlighting and responding to this story.


liminal (was OWI)

Behavioral Biometrics: Leveraging Sophisticated Signals for Risk Prevention

Outside-In Report

Behavioral biometrics, the study and analysis of an individual’s unique behavioral patterns such as typing rhythm, mouse movement, and touchscreen behavior, has emerged as a cutting-edge method to authenticate identity and detect anomalies. By comparing current and past behavior, behavioral biometrics can effectively identify risks, offering a robust solution against various threat vectors. To view our recent Outside-In Report, exclusive content available in Link™, current customers can sign in here. If you are new to Link, explore the platform, request a demo or self-subscribe to gain immediate access to the industry’s only actionable market intelligence and competitive enablement platform.

Market Overview

The behavioral biometrics market is experiencing rapid growth. Estimated at $3.2 billion in 2024, it is projected to reach $7.5 billion by 2030, growing at a 15.1% CAGR. This surge is driven by the increasing demand for low-friction security solutions suitable for multiple steps in the customer journey. Behavioral biometrics is particularly appealing to buyers looking to protect against threats such as phishing, credential stuffing, and remote access attacks. While large financial institutions continue to adopt these sophisticated risk detection capabilities, the market faces challenges in capturing small and medium-sized businesses (SMBs) due to their constrained engineering resources. The next 1-3 years will be pivotal in determining the market’s ability to attract a broader spectrum of customers beyond large institutions.

Challenges and Opportunities

Adoption Challenges for SMBs: Implementing behavioral biometric solutions requires robust engineering resources, making it challenging for SMBs to adopt. Solution providers need to innovate to bridge this gap.

Best Suited for Large Financial Institutions: Large financial institutions benefit the most from behavioral biometrics due to their prolonged customer retention periods, allowing for effective utilization of these solutions.

Threat-Agnostic Power: Behavioral biometrics can defend against a wide range of threats, including those posed by emerging technologies like generative AI, offering comprehensive protection.

Maximizing User Experience (UX): Users demand frictionless experiences across all sectors. Behavioral biometrics ensure robust risk detection without compromising user experience, meeting high standards set by leading UX examples in various industries.

How It Works

Behavioral biometrics differs from behavioral analytics by tracking individual users across multiple sessions to inform a robust risk profile. This creates a unique footprint, offering more advanced threat detection by utilizing longitudinal data. The process involves capturing behavioral signals such as gait analysis, mouse dynamics, keystroke dynamics, touchscreen behavior, and device movement. These signals are then compared to the user’s established pattern to detect anomalies and add extra security measures or terminate sessions if necessary. View our detailed diagram for a visual representation of this process and its effectiveness.
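The compare-against-established-pattern step described above can be sketched as a simple z-score test on a single signal, here keystroke timing. A production system fuses many signals (mouse dynamics, gait, device movement) with far richer models, so this is purely illustrative:

```python
import statistics

def is_anomalous(session: list[float], history: list[float],
                 z_threshold: float = 3.0) -> bool:
    """Compare a session's mean keystroke interval (ms) against the
    user's historical profile; flag it when the deviation exceeds
    `z_threshold` standard deviations (an assumed cutoff)."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    z = abs(statistics.mean(session) - mu) / sigma
    return z > z_threshold

# Longitudinal profile built from the user's past sessions
history = [120, 125, 118, 130, 122, 127, 119, 124]

assert not is_anomalous([121, 126, 123], history)  # matches the profile
assert is_anomalous([200, 210, 205], history)      # likely a different person
```

On an anomaly, a real system would step up authentication or terminate the session, as the paragraph above describes.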

Notable Players

Key players in the behavioral biometrics market include:

BioCatch
BehavioSec
Bureau
NuData Security (Mastercard)
SecureAuth
Zighra

Threat Protection Against

Behavioral biometrics provide protection against:

Credential Stuffing: Attackers use stolen credentials to gain unauthorized access to accounts with reused passwords.

Phishing: Deceptive attempts to acquire sensitive information by impersonating legitimate entities.

Remote Access Attacks: Unauthorized attempts to access systems or networks from remote locations, exploiting vulnerabilities in remote access protocols.

Future Outlook

The future of behavioral biometrics is promising, with generative AI, demand for low friction, and an appetite for advanced solutions acting as significant tailwinds. However, the requirement of engineering resources, potential in-house solutions, and data tightening by Big Tech pose challenges. The market’s success will hinge on overcoming these obstacles and expanding adoption beyond large financial institutions to include SMBs. For a detailed analysis of these dynamics and strategic insights, read the full report.

Behavioral biometrics represents a sophisticated solution for risk prevention, offering significant advantages in user authentication and anomaly detection. As the market continues to grow, understanding the dynamics and challenges will be crucial for stakeholders. To gain in-depth insights and stay ahead in this rapidly evolving field, access our full report behind a paywall.

Subscribe to Link for access to exclusive content that will give you a competitive edge in leveraging behavioral biometrics for risk prevention.

What is Behavioral Biometrics?

Behavioral biometrics is the study and analysis of an individual’s unique behavioral patterns to authenticate their identity and detect anomalies. Unlike traditional biometric methods such as fingerprints or facial recognition, behavioral biometrics examines how a person interacts with devices. This includes their typing rhythm, mouse movements, touchscreen behavior, gait, and other subtle actions. By continuously monitoring and comparing current behavior to historical patterns, behavioral biometrics can identify whether the person interacting with a system is who they claim to be, providing a robust and passive security measure against various cyber threats.

Related Content:

Outside-In Report: Behavioral Biometrics

Losing Sight: How Limits on Big Tech Data Impact Bank Account Takeover Defenses

Article: Facial Biometrics Trends and Outlooks

Market and Buyer’s Guide for Customer Authentication

The post Behavioral Biometrics: Leveraging Sophisticated Signals for Risk Prevention appeared first on Liminal.co.


Shyft Network

FATF Crypto Travel Rule Adoption: 6-Month Status Update


It’s been about five years since the Financial Action Task Force (FATF) laid down its requirements on virtual assets. Since then, the implementation of these rules by countries has increased significantly. Today, we’ll look at all that has gone down in the realm of Crypto Travel Rule this year, so far.

First, a bit of history.

In 2019, the FATF extended its recommendations, including the Travel Rule, to cover crypto assets and Virtual Asset Service Providers (VASPs).

Updated in 2021, the Crypto Travel Rule applies to crypto exchanges, brokers, and custodians that handle crypto transactions. It mandates that these businesses share transaction details, including sender and receiver information, above a certain threshold.
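In practice, the mandate means an originating VASP must check each transfer against a threshold and, if exceeded, transmit counterparty details to the beneficiary VASP. The sketch below is illustrative only: the field names loosely echo the IVMS 101 data model, the `buildTravelRulePayload` helper is hypothetical, and the FATF-recommended USD/EUR 1,000 de minimis threshold varies by jurisdiction:

```javascript
// Threshold above which originator/beneficiary details must be shared
// (FATF recommends USD/EUR 1,000; actual values differ by jurisdiction).
const TRAVEL_RULE_THRESHOLD_USD = 1000;

function buildTravelRulePayload(tx) {
  if (tx.amountUsd < TRAVEL_RULE_THRESHOLD_USD) {
    return null; // below threshold: no counterparty data exchange required
  }
  // Above threshold: the originating VASP transmits sender and receiver
  // information to the counterparty VASP alongside the transfer itself.
  return {
    originator: { name: tx.senderName, accountId: tx.senderWallet },
    beneficiary: { name: tx.receiverName, accountId: tx.receiverWallet },
    amountUsd: tx.amountUsd,
  };
}

const payload = buildTravelRulePayload({
  senderName: 'Alice', senderWallet: '0xabc...',
  receiverName: 'Bob', receiverWallet: '0xdef...',
  amountUsd: 2500,
});
console.log(payload !== null); // true: details must accompany this transfer
```

Real implementations must also handle identity verification, secure transmission between VASPs, and the unhosted-wallet cases discussed later in this article.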

In February, at a FATF plenary, the group agreed to publish a summary of the steps jurisdictions have taken to regulate VASPs.

Then, after a year-long process of collecting and evaluating information, late in March this year, the FATF released a report detailing the implementation of the Crypto Travel Rule across 58 jurisdictions.

These jurisdictions included official FATF members as well as those that host VASPs accounting for more than 0.25% of global virtual asset trading volume or that have 1 million or more active users. Combined, they account for the vast majority (97%) of global crypto activity.

Based on the self-reported survey, the results showed that all 38 assessed members and 20 other jurisdictions have either completed or begun a virtual asset risk assessment.

91% of them have either passed a law or are in the process of doing so to establish licensing or registration requirements and AML/CTF obligations for crypto businesses.

88% of the jurisdictions had completed or started administrative inspections.

84% had enacted the Travel Rule.

79% had registered VASPs.

According to the report, Saudi Arabia, China, and Egypt have explicitly prohibited the use of crypto and VASPs.

Gradual Alignment with International Standards

For several years, the FATF has urged jurisdictions to implement its recommendations. Although not mandatory, failure to adopt them may result in being placed on the organization’s watchlist.

The FATF president stated that the agency does not require jurisdictions to pass laws for successful recommendation implementation; notifications from authorities could suffice.

So, against this backdrop, the South African Financial Intelligence Centre (FIC) issued a draft directive in May. Under the draft, VASPs are required to implement the FATF Travel Rule.

The directive sets out the obligations of crypto transfer originators and defines the policies and procedures to follow when dealing with unhosted wallets, which must also be incorporated into exchanges’ risk management and compliance programs. Those who fail to comply will be “subject to an administrative sanction.”

The same month, the Turkish ruling party submitted a draft crypto bill to the parliament that focuses on licensing and registration for VASPs and aligns with international standards.

Back in 2021, Turkey was put on the FATF’s “gray list” due to its failure to implement AML measures in the banking industry. As such, it has taken steps to address the organization’s concerns and align with its standards.

The bill aims to comprehensively govern the crypto market. It will update existing laws, with a key focus on platform transparency, compliance with financial regulations, and consumer protection.

Under the proposed legislation, VASPs and other entities will have to obtain licenses from the Capital Markets Board (CMB), foreign crypto brokers will be banned, and enhanced CMB oversight will ensure effective dispute resolution.

Then, last week, the UK crypto industry’s self-regulatory trade association, CryptoUK, released a detailed guide on the Crypto Travel Rule to help market participants better understand the rule, navigate the complications of its application to FCA-registered companies, and handle unhosted wallet transfers.

In the European Union (EU), meanwhile, the final Travel Rule guidelines are expected soon now that the deadline (Feb. 2024) to submit a response to the European Banking Authority’s (EBA) public consultation on it has passed. Once released, authorities will have two months to comply with the region’s crypto regulator or explain their non-compliance.

The implementation of FATF’s Crypto Travel Rule is overseen by the Transfer of Funds Regulation (TFR) in the EU, which was approved in April last year and will take effect on Dec. 30, 2024.

Concluding Thoughts

As crypto adoption continues to rise, regulators around the world have increased their scrutiny of the sector and are introducing regulatory measures for crypto businesses to follow, including FATF’s Crypto Travel Rule.

With this, the crypto regulatory scene across nations has been aligning with global standards. However, full alignment is far from achieved: many countries still lack a robust framework for regulating the crypto space, and transaction reporting thresholds and information requirements continue to differ across jurisdictions.

This highlights the need for superior solutions like Shyft Veriscope to efficiently meet the unique blockchain sector requirements without burdening users.

Interesting Reads

Guide to FATF Travel Rule Compliance in the United States

Guide to Crypto Travel Rule Compliance in Japan

Guide to FATF Travel Rule Compliance in the Philippines

Guide to FATF Travel Rule Compliance in Estonia

Guide to Travel Rule Compliance in Malaysia

Guide to FATF Travel Rule Compliance in South Korea

Guide to FATF Travel Rule Compliance in India

Guide to FATF Travel Rule Compliance in Canada

Guide to FATF Travel Rule Compliance in Indonesia

FATF Travel Rule Compliance Guide for South Africa

Guide to FATF Travel Rule Compliance in Switzerland

FATF Travel Rule Compliance in Germany

Guide to FATF Travel Rule Compliance in Mexico

The Visual Guide on Global Crypto Regulatory Outlook 2024

About Veriscope

Veriscope, the compliance infrastructure on Shyft Network, empowers Virtual Asset Service Providers (VASPs) with the only frictionless solution for complying with the FATF Travel Rule. Enhanced by User Signing, it enables VASPs to directly request cryptographic proof from users’ non-custodial wallets, streamlining the compliance process.

For more information, visit our website and contact our team for a discussion. To keep up-to-date on all things crypto regulations, sign up for our newsletter and follow us on X (Formerly Twitter), LinkedIn, Telegram, and Medium.

FATF Crypto Travel Rule Adoption: 6-Month Status Update was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Extrimian

Zero Knowledge Proofs in Digital Identity

A Deep Dive to ZKPs and Decentralized Technology with Extrimian

In the rapidly evolving landscape of digital identity management, privacy and security are paramount. Zero Knowledge Proofs (ZKPs) stand out as a revolutionary technology, enabling the verification of claims without revealing underlying data. Extrimian is at the forefront, integrating ZKPs into decentralized digital identity systems to offer unmatched security and privacy.

Understanding Zero Knowledge Proofs

Zero Knowledge Proofs are a form of cryptographic protocol that allows one party (the prover) to prove to another party (the verifier) that a statement is true, without revealing any information beyond the validity of the statement itself. This technology is particularly potent for privacy-preserving digital interactions.

Key Concepts of ZKP:

Completeness: If the statement is true, an honest verifier will be convinced by an honest prover.

Soundness: If the statement is false, no cheating prover can convince the honest verifier that it is true, except with some small probability.

Zero-knowledge: If the statement is true, no verifier learns anything other than the fact that the statement is true.

Extrimian’s Application of ZKPs in Digital Identities

Extrimian harnesses the power of ZKPs to enhance the security and privacy of digital identities. By using ZKPs, Extrimian allows users to authenticate and share their credentials without exposing any sensitive personal information.

Example of ZKP in Action:

```javascript
const { generateProof, verifyProof } = require('extrimian-zkp-toolkit');

// Setup: a simple example where a user proves they are over 18
// without revealing their age.
const { proof, publicSignals } = generateProof({ age: 21, lowerBound: 18 });

// Verification: the verifier checks the proof without ever
// knowing the user's exact age.
const isValid = verifyProof({ proof, publicSignals });
console.log(isValid); // Output: true
```

This simple code snippet demonstrates how a user can prove they meet the age requirement without revealing their exact age, using Extrimian’s ZKP toolkit.

Practical Use Cases of ZKP in Decentralized Digital Identity

ZKPs empower several practical applications in decentralized identity systems:

Privacy-Enhanced Authentication: Users can authenticate themselves to services without revealing any unnecessary personal information, thus enhancing privacy.

Credential Verification: Verify the authenticity of credentials without exposing the underlying data, useful in scenarios like employment verification or accessing restricted services.

Fraud Prevention: Secure systems against fraudulent claims and transactions by enabling entities to prove the legitimacy of their actions without revealing sensitive data.

Extrimian’s Integration and Benefits

At Extrimian, we integrate ZKPs within our digital identity solutions to provide:

Enhanced Privacy: Users have complete control over their personal information.

Increased Trust: Entities can trust the verification process without accessing private data.

Scalability: ZKPs allow for complex operations to be conducted without a significant increase in transaction costs or delays.

Read more about how we use ZKPs on our Zero Knowledge Proofs Wiki and explore various applications on our Use Cases Page.

Zero Knowledge Proofs Circuits and Process

Further Reading and Resources

For those interested in diving deeper into the technicalities of ZKPs and their applications in decentralized digital identity, the following resources are invaluable:

zkSync zkEVM | Ethereum Layer 2 Blockchain | Scaling the Ethos and technology of Ethereum: zkSync is the layer 2 protocol that scales Ethereum’s security and values through zero-knowledge cryptography.

Zero Knowledge Proofs: Introduction, History, and Applications

Read this blog post to understand how SSI (Self-Sovereign Identity) can enhance the privacy layer with ZKPs: Zero Knowledge Proof and Self Sovereign Identity

Extrimian continues to push the boundaries of what’s possible in digital identity management. By leveraging advanced cryptographic techniques like ZKPs, we are setting new standards for privacy and security in the digital world.

Join us on this journey at Extrimian, where we redefine the future of digital identity. Explore our platform, tools, and the vast possibilities that our technology unlocks in the era of decentralized digital identities.

The post Zero Knowledge Proofs in Digital Identity first appeared on Extrimian.


Tokeny Solutions

Tokeny’s Talent | Cristian

Cristian Salinas is Senior System Test & Certification Engineer at Tokeny.

Tell us about yourself!

I am an enthusiast for life, cherishing small details and tangible experiences. Life is a gift, and I try to live it fully, enjoy a good environment, and be grateful for what I have.

In my free time, I enjoy engaging in family activities or playing sports such as soccer, basketball, or volleyball.

What were you doing before Tokeny and what inspired you to join the team?

Before joining Tokeny, I worked at a company called Motive TV, which focused on developing streaming apps. This experience fueled my curiosity about creating final products where customer satisfaction is the ultimate reward. Testing code goes beyond functionalities; it is about refining the quality of the product, proposing solutions, designing test scenarios, and making the product as robust as possible.

Joining a technology company like Tokeny is always inspiring because of its capacity for innovation. The focus on developing cutting-edge solutions and the transparency and security inherent in blockchain technology particularly caught my attention. I believe this is highly relevant in today’s world.

How would you describe working at Tokeny?

It is a unique experience without a doubt. I appreciate the company’s philosophy and the human quality that radiates among co-workers. There’s a strong culture of communication, with everyone willing to teach and learn from one another as we work together towards our common goals.

What are you most passionate about in life?

After my family, my biggest passion is technology—specifically, I am fascinated by advancements in video games where AI and augmented reality can now be used. It’s simply incredible.

What is your ultimate dream?

My long-time dream is to collaborate on or develop a solution that helps children and young people with language problems develop their skills. I envision a device that can help awaken and stimulate active neurons, aiding in skill development.

What advice would you give to future Tokeny employees?

Enjoy working at Tokeny, a diverse company where people from various nationalities come together to achieve a common goal: advancing Tokeny.

What gets you excited about Tokeny’s future?

I like the company’s clear project vision. Tokeny knows where it is heading and how each employee can contribute. Being part of a pioneering and innovative project translates into great growth opportunities.

He prefers:

Coffee ✓ | Tea ✓
Movie | Book
Work from the office ✓ | Work from home ✓
Dogs | Cats
Call ✓ | Text ✓
Burger | Salad ✓
Mountains | Ocean
Wine ✓ | Beer
Countryside ✓ | City ✓
Slack | Emails ✓
Casual | Formal ✓
Crypto | Fiat
Night ✓ | Morning



The post Tokeny’s Talent | Cristian appeared first on Tokeny.


IdRamp

Instant Onboarding with Automated Identity Verification


Manual onboarding processes are time-consuming and expensive, while also being vulnerable to identity theft. Virtual onboarding, while faster, introduces new security challenges like AI and Deep Fake attacks that can bypass traditional verification methods.

The post Instant Onboarding with Automated Identity Verification first appeared on Identity Verification Orchestration.

Wednesday, 12. June 2024

KuppingerCole

The Impact of AI - The Future of Digital Identity with Joseph Carson


In this interview series, some of our Analysts talked to Identity Experts about the Future of Digital Identity. This time, Joseph Carson and Martin focus on the impact of artificial intelligence (AI). They discuss how AI is currently being used in defense and attack strategies, with a focus on the defensive capabilities of AI in improving security. The conversation also touches on the concept of augmented intelligence and the potential for autonomous and interactive AI agents. It highlights the intersection of decentralized identity and the continuous influence of personal preferences on AI models.




Spruce Systems

SpruceID On Stage at the European Identity and Cloud Conference

Last week, SpruceID CEO Wayne Chang spoke on three panels at EIC.

Last week, SpruceID attended the European Identity and Cloud Conference (EIC) in Berlin, an event that brings together industry leaders, innovators, and experts to discuss and explore the latest advancements in digital identity and cloud security.

This year, SpruceID CEO Wayne Chang was among the panelists contributing to three in-depth discussions on the evolving landscape of decentralized identity. These panels provided a comprehensive overview of how decentralized identity can help mitigate modern threats and enhance trust in digital ecosystems in today’s world.

Panel: Building Trust in AI through Decentralized Identity Photo courtesy of the Decentralized Identity Foundation

One of our most anticipated sessions was the panel "Building Trust in AI through Decentralized Identity." This session focused on the pressing issues posed by AI technologies and brought together Wayne Chang, Kim Hamilton Duffy from the Decentralized Identity Foundation (DIF), and Linda Jeng, founder of Digital Self Labs.

In this talk, Wayne highlighted the increasing risks of traditional identification methods in the face of sophisticated AI technologies. He pointed out, “Until now, we’ve been able to get by holding a driver's license up to a webcam, but with new AI tech, you can fool these systems — this is already showing up at the edges.” He emphasized the rise in phishing attacks utilizing AI-generated voices that can convincingly mimic trusted individuals, making it easier for scammers to deceive victims.

The panelists discussed how decentralized identity could offer solutions to these emerging challenges. Wayne explained, “We need to add authenticity to communication. We don’t want to present a strong ID every time we want to use a chat app, so it makes sense to embed decentralized identity into comms channels to prove I’m real.” Decentralized identity is not only about identity but also about establishing data governance and creating chains of trust to mitigate risks from synthetic data. This approach empowers individuals with greater control over their data.

They touched on the promising developments in automated compliance with data regulations and the potential for expanding trust to content authenticity. They emphasized the need for ongoing education and the role of user demand in driving the adoption of decentralized identity technologies. In Linda Jeng’s words, “We need to spend time educating policymakers and the public, but in the end, it comes down to end-user demand for solutions.”

Read more about the discussions that took place during this panel in the blog post published by the Decentralized Identity Foundation.

Panel: Expert/Digital Wallet & Verifiers Q&A Photo courtesy of the Decentralized Identity Foundation

In the Expert/Digital Wallet & Verifiers Q&A session, Anil John, Technical Director of the DHS S&T Silicon Valley Innovation Program (SVIP) at the U.S. Department of Homeland Security, introduced the newest cohort members of SVIP, including SpruceID. Wayne Chang, SpruceID CEO, engaged in a Q&A with digital wallet solution providers and industry experts from other SVIP cohort members, including Hushmesh, Procivis AG, Ubiqu, Netis, and Credence ID. The discussion focused on the practical aspects of integrating digital identity into digital wallets and verifiable credential systems. It covered a range of topics, including the challenges and opportunities in implementing decentralized identity, the role of verifiers in ensuring trust and security, and the potential for digital wallets to enhance user privacy and data control. 

Panelists emphasized the importance of interoperability and user-centered design in developing digital wallet solutions, which are crucial for securely and conveniently managing digital identities and credentials.

Panel: Decentralized Identity in Production

Our final panel, "Decentralized Identity in Production," explored the practical applications of verifiable credentials and the integration of decentralized identity into production environments. 

Wayne, along with panelists from Condatis, Adobe, Spherity, and Accenture, shared valuable insights and experiences on the business case for decentralized identity. They highlighted key success factors and lessons learned from recent projects, and discussed how verifiable credentials have been effectively integrated into workforce certifications, supply chain documentation, and educational qualifications, demonstrating the enhanced trust achieved through decentralized identity.

The panelists emphasized key success factors in project learnings, product roadmaps, and the future direction of decentralized identity, and they encouraged attendees to explore practical applications for decentralized identity technologies within their organizations.

Continuing the Conversations on Decentralized Identity

We're grateful to have been invited to speak on several panels alongside other experts in the field at this year's event.

SpruceID is excited to be at the forefront of these important discussions and committed to advancing the development and adoption of decentralized identity solutions. We look forward to continuing these discussions and working with partners and stakeholders from EIC!

Check out our website to learn more about how SpruceID uses decentralized identity to build a more secure and trustworthy digital future.

Learn More

About SpruceID: SpruceID is building a future where users control their identity and data across all digital interactions.