Last Update 5:24 PM September 27, 2022 (UTC)

Company Feeds | Identosphere Blogcatcher

Brought to you by Identity Woman and Infominer.
Support this collaboration on Patreon!

Tuesday, 27. September 2022

Infocert (IT)

Forum Banca 2022: meet InfoCert at the leading Italian event for the banking world


The 15th edition of the reference event for the Italian banking world is underway: over 600 industry managers will meet to decide what the bank of the future will look like. The appointment is set for September 29 in the congress hall of the NH Milano Congress Centre.

Forum Banca is Italy's only in-person meeting hub on the development of strategies and technologies for the financial sector. The focus of this edition of the event, conceived and created by IKN Italy, will be "Sustainability, the metaverse, cryptocurrencies, and blockchain: the challenges of the financial sector." More than 700 participants are expected, around 100 speakers from the most important players in Italian and international industry will contribute, and over 15 innovative fintechs will take part.

The event will kick off on September 28 with the customary Executive Summit, an invitation-only meeting for C-level executives that will address two macro themes: "ESG, the Environment and Beyond: the Social Role of Financial Institutions" and "Digital Trends and Regulation: How to Keep in Step."

On September 29, the Main Conference will bring together over 100 speakers, among the best innovators in the sector. The day will open with the Plenary Session, where CEOs and general managers will discuss the challenges of 2022: new macroeconomic balances, the role of crypto, the opportunities offered by the PNRR, and change management. Four parallel thematic sessions will follow, covering Customer, Tech & Innovation; Data, Security & Identity; Operations; and Governance & Compliance.

This year the event will once again host the Fintech Smart Village, an area entirely dedicated to presenting, discussing, and sharing the new fintechs that, through innovation and technology, are challenging traditional banking services.

IKN Italy, a leader in the creation and development of B2B content, events, and training courses, has been an established player for 35 years. This year, too, InfoCert is proud to sponsor an event that has long been one of the most important occasions for meeting and discussion among the leading players in the Italian and international banking world.

Sign up and join Forum Banca 2022!

The post Forum Banca 2022: meet InfoCert at the leading Italian event for the banking world appeared first on InfoCert.


KuppingerCole

Modernizing Legacy IAM Systems


by Osman Celik

Legacy IAM systems can no longer meet the requirements of Digital Transformation. They often have a negative impact on business efficiency and customer experience, are too costly to maintain, and are approaching end-of-life. Today, organizations can be expected to manage billions of digital identities with their IAM systems. Organizations with legacy IAM systems must modernize them and upgrade their capabilities, a transition that requires a comprehensive approach. Organizations must not only adapt to today's standards but also become future-proof while complying with regulations. In this paper, we consider the necessities for modernizing legacy IAM systems and how Curity aligns with them.

Dock

Circom Language Integration: Anonymous Credentials Protocol Update

Our anonymous credentials protocol update integrates Circom to enable developers to specify zero-knowledge "predicates" on credential attributes.

TL;DR

Use Circom to specify zero-knowledge "predicates" on credential attributes. Circom is a DSL (Domain Specific Language) used to express computations for which zk-SNARKs need to be created. Predicates are written as programs in the Circom language. The programs are compiled with the Circom compiler and the artifacts are used to create zk-SNARKs. The proof protocol is LegoGroth16, an adaptation of Groth16 (used by ZCash).

We have updated our anonymous credentials protocol so that developers can express "predicates" (checks that return true or false) on credential attributes using the Circom language and then prove, in zero-knowledge, that these predicates are satisfied. This means that credential verifiers can check that certain credential attributes satisfy the required conditions without learning those attributes. For example, a holder can convince a verifier that they are (or are not) a resident of certain cities without revealing the city, or that their blood group is not AB- without revealing the blood group. In both of these examples, the city and blood group are attributes of a credential, and the holder proves facts about them without revealing them. Another example is a holder proving that their yearly income is less than $25,000 in order to qualify for social care, where the yearly income is calculated from each month's pay slip (adding each month's income and checking that the sum is below the threshold). Similarly, a holder could prove that their yearly income is more than a certain amount to get access to a high-risk, high-reward investment fund.

Code and Examples

For folks wishing to jump right into the code, the support for Circom in LegoGroth16 is here. It has been integrated into the composite proof system here, and the Typescript support is added here. The Circom circuits (.circom) and artifacts (.r1cs, .wasm) are here and here. The .r1cs and .wasm files are generated for the BLS12-381 curve, and the wasm optimizer (wasm-opt -O3) has been run over the .wasm files. Also, we only support Circom 2.

Here are some examples that demonstrate the use of Circom along with credentials:

1. The yearly income calculated from monthly payslips is less/greater than a certain amount

2. The sum of assets is greater than the sum of liabilities where assets and liabilities are calculated from several credentials

3. The blood group is not AB-

4. The grade is either A+, A, B+, B, or C but nothing else

5. Either vaccinated less than 30 days ago OR last checked negative less than 2 days ago

6. Certain attribute is the preimage of an MiMC hash
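As a sketch of example 1, a Circom 2 predicate for "yearly income below a threshold" might look like the following. The template name, signal names, and the circomlib include path are our own illustration, not Dock's shipped circuits; it assumes circomlib's LessThan comparator and income values that fit in 64 bits:

```circom
pragma circom 2.0.0;

// Hypothetical predicate: the sum of 12 monthly incomes is below a
// public threshold. Assumes circomlib is available at this path.
include "circomlib/circuits/comparators.circom";

template YearlyIncomeBelow() {
    signal input monthly[12];   // private: attributes from the credential
    signal input threshold;     // public: limit chosen by the verifier

    // Sum the monthly incomes into a single signal.
    var acc = 0;
    for (var i = 0; i < 12; i++) {
        acc += monthly[i];
    }
    signal total;
    total <== acc;

    // Constrain total < threshold (values assumed to fit in 64 bits).
    component lt = LessThan(64);
    lt.in[0] <== total;
    lt.in[1] <== threshold;
    lt.out === 1;
}

component main {public [threshold]} = YearlyIncomeBelow();
```

Compiling a circuit like this with the Circom compiler yields the .r1cs and .wasm artifacts that feed the LegoGroth16 setup and proving steps described under Developer Workflow below.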

Motivation

Previously, the expressivity of our anonymous credential proving system was limited. The holder could prove things like "I have these credentials", "certain attributes across my credentials are the same", or "I can verifiably encrypt attributes for traceability". But verifiers could not request proofs of more complicated conditions, like the sum of certain attributes being less/more than a certain value, or the average of certain values being less/more than a certain value. These simple arithmetic operations might not always be sufficient, and the holder might want to prove that a credential attribute is the preimage of a hash, say for use in a Merkle tree. Moreover, as library developers, we cannot imagine all the possible logic an application can require. One solution to this problem is to use a more powerful tool to express such conditions: a programming language. We wanted a developer-friendly programming language that could also be used with our proving system, and we found Circom, developed by the folks at iden3, to be the most popular such language, with sufficient documentation and community around it.

Developer Workflow

1. Express the predicates/arbitrary computation as a Circom program.

2. Compile the above program to get the constraints (R1CS file) and witness generator (WASM file, takes input wires and calculates all the intermediate wires).

3. Use the constraints from step 2 to generate a zk-SNARK proving and verification key of LegoGroth16.

4. Use the R1CS and WASM files from step 2 and the proving key from step 3 to create a LegoGroth16 proof.

5. Use the verification key from step 3 to verify the LegoGroth16 proof.

Steps 1-3 are done by the verifier, and the results of these steps, i.e. the R1CS file, WASM file, and proving and verification keys, are shared with any potential prover (published or shared P2P). Step 4 is done by the prover and step 5 again by the verifier.

Cryptographic Details

As mentioned in our previous post, we use LegoGroth16 protocol for creating ZK-SNARKs. LegoGroth16 is similar to Groth16 (used by ZCash), but in addition to the zero-knowledge proof, it provides a cryptographic commitment (Pedersen) to the private data (credential attributes in our case). This commitment allows us to prove that the private inputs to the proof protocol are the same as the credentials attributes using the Schnorr proof of knowledge protocol.
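Schematically, and with notation of our own choosing rather than taken from the LegoGroth16 code, the commitment has the familiar Pedersen form over the private witnesses, which is what makes the linking proof possible:

```latex
% Pedersen-style commitment to the private witnesses w_1,...,w_k
% (the credential attributes) with blinding randomness \nu:
D = H^{\nu} \prod_{i=1}^{k} G_i^{w_i}
% LegoGroth16 outputs D alongside the zk-SNARK proof. A Schnorr
% proof of knowledge then shows that the values committed in D are
% the same attributes proven signed under BBS+, linking the
% credential to the circuit's private inputs.
```

This is only a sketch; the exact generators and which subset of the witness is committed differ in detail in the actual protocol.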

At a high level, the approach is to use:

BBS+ signatures to prove possession of a credential (knowledge of a signature); Circom to express arbitrary logic; LegoGroth16 to obtain a proof of the correctness of the Circom program; and the Pedersen commitment to the witnesses in a further zero-knowledge proof, in which we prove that the specific attributes signed under BBS+ are equal to the witnesses in the LegoGroth16 proof.

The Circom tooling however generates a Groth16 proof whereas we needed a LegoGroth16 proof. Fortunately, the Circom compiler outputs the R1CS (description of the computation) file containing the constraints and the WASM file to generate all the values in the circuit given the inputs. We use the R1CS file to do the SNARK setup, i.e. proving and verification key. This is a trusted setup and is done by the verifier. The prover uses the R1CS and WASM files and the proving key to then generate the zk-SNARK proof using our composite proof system. Here, the prover can specify which of the Circom program inputs correspond to which of his credential attributes.

We acknowledge that the verifier can create fake proofs (proof of untrue statements) as he does the trusted setup but in most cases, the verifier has no incentive to cheat itself. When this is a possibility, like in a large public system deployed by a government, care must be taken to destroy the toxic waste (trapdoor) from the trusted setup.

Finally, we would like developers to try this new feature to build more expressive applications that are also private. We would love to hear which use cases this tool enables for you. If you need any support writing a Circom program for your use case, please reach out to us on GitHub.



OWI - State of Identity

The Dry Powder Conundrum

In this month's Investing in Identity series, we discuss the latest movers and shakers in fraud and fintech and take an analytical look at the digital identity trends that are best positioned for deal activity this fall. The agenda includes:

- Sardine, a leading provider of fraud, compliance, and instant settlement solutions, raises a $51.5MM Series B led by Andreessen Horowitz
- Alloy, an ID verification platform for banks and fintech companies, receives $52MM in additional funding to accelerate growth and global expansion

We're seeing record levels of accumulated dry powder. Although there's been a recent slowdown in deployment, once the market resets, how will VCs put their money to work?

FindBiometrics

Mastercard to Discuss Biometric Checkout in FindBiometrics Identity Summit Opening Session


Mastercard’s Nili Klenoff will be kicking off Wednesday’s Financial Biometrics Virtual Identity Summit, participating in a fireside chat with FindBiometrics Editor in Chief Peter Counter on the topic of Biometric […]

The post Mastercard to Discuss Biometric Checkout in FindBiometrics Identity Summit Opening Session appeared first on FindBiometrics.


KuppingerCole

Noname API Security Platform


by Alexei Balaganski

The Noname API Security Platform is a unified API security solution that combines proactive API discovery and classification, runtime protection, and API security testing to ensure consistent security coverage for all types of APIs across all on-prem or cloud environments. The platform’s deployment flexibility makes it suitable for enterprise customers even from highly regulated industries.

SelfKey

Incorporations Marketplace


SelfKey wallet users can now start a business overseas with complete transparency on costs and KYC procedures in their preferred jurisdiction.

The post Incorporations Marketplace appeared first on SelfKey.


KuppingerCole

The HeatWave is Spreading


by Alexei Balaganski

Just over a month ago, I wrote about the partnership between Oracle Cloud and Microsoft Azure that has finally enabled their customers to create “properly multi-cloud” applications without any hidden costs or limitations of earlier architectures. Well, unfortunately, announcements like that aren’t heard often, simply because the very idea of such partnerships goes against cloud service providers’ traditional business interests. This has obviously worked remarkably well for Oracle and Microsoft, at least in part because there is no rivalry between them in the areas where each is traditionally strong. Connecting a business application running on Azure to a database backend on OCI is a win-win situation for both companies, and their customers can reap the benefits of that, too.

What if multi-cloud is not an option?

Unfortunately, this approach won’t necessarily be as successful with other potential partners. Take AWS, for example. The company offers a massive range of database services: over 15 purpose-built database engines for different data models and business use cases. Some of them compete directly with Oracle’s core products like the Oracle Database and MySQL. The rivalry between the companies has a long and quite bitter history. I can only imagine that establishing a mutually beneficial partnership here might take a few more years…

However, what are the ordinary customers supposed to do? In my last blog post I casually mentioned that “you can run SAP on AWS or Oracle on Azure,” as if it was somehow an inferior alternative to the real multi-cloud. Yes, it is usually not a fully managed service and is more difficult to set up, maintain, and operate than a native cloud service from the appropriate vendor. However, when no other options are available, this still can be a very sensible solution for many potential use cases. Provided, of course, that the vendor hasn’t cut any corners when migrating their service to the competitor’s cloud and there is some kind of feature parity guarantee between different deployment options.

Some vendors might go even further and design their products in an entirely cloud-agnostic way – those can be deployed to any cloud, and their customers can just choose where their tenant will be spun up. This approach has obvious advantages, but also one big downside: by design, it is created for the lowest common denominator across all supported clouds, not optimized for any particular cloud, and with no support for native services each CSP offers – such as identity, observability, or security, for example. This might work for some scenarios (Kubernetes is an example of a great success) but will fall short in others. I firmly believe that databases belong to the latter – ensuring the highest degree of performance, data protection, and regulatory compliance is not possible without strong optimization and native service integrations for a specific cloud provider.

What is MySQL HeatWave?

MySQL HeatWave is a managed database service created by Oracle that extends the standard MySQL engine with support for in-database high-performance transactional workloads, analytics, machine learning, and automation. Simply put, it can transparently make an existing MySQL database 1000x faster without any effort from the customer and avoid ETLs across databases. We reviewed the service about a year ago, when it was announced publicly, and the company has continued to add new features and performance optimizations ever since, providing customers with a scalable, robust offering.

For example, with support for real-time elasticity and compression, MySQL HeatWave is now suitable for even larger (or smaller) data volumes. And auto-reload and auto-recovery capabilities ensure consistent availability even after upgrades or failures. But most importantly, MySQL HeatWave is now also available on AWS in addition to OCI.

If the mountain will not come to Muhammad…

How exactly does it work? Well, as an old proverb goes, “If the mountain will not come to Muhammad, then Muhammad must go to the mountain”. While some AWS customers do migrate to OCI to run MySQL HeatWave, for many this can be expensive due to high AWS egress fees. To accommodate such customers, the MySQL HeatWave service is now offered as a native AWS service. The data plane, control plane, and interactive console are optimized for and deployed on AWS infrastructure as a fully managed service supported by the MySQL engineering team. Through the new interactive console, users can issue queries, monitor performance and resource utilization, manage Autopilot, and access their AWS account.

Customers can provision a new MySQL HeatWave instance on AWS with a few clicks with a similar user experience as on OCI before, but at all times, their data and network traffic will be confined to the AWS cloud, with the expected low latency for any application running there as well. Oracle refers to this as a “native experience.” I’m not sure if I can agree with this statement, since HeatWave isn’t (yet?) integrated into the AWS console, and there is no consolidated support and billing like Oracle offers with Azure. However, it works with AWS CloudWatch, S3, PrivateLink and we can expect it to be integrated with a growing number of other native AWS services in the future.

Oracle claims that MySQL HeatWave runs 20x faster than Amazon Redshift (AWS’s own managed data warehouse service, running on AWS’s own infrastructure). However, as an analyst focused on cybersecurity and data protection, I’m more interested in the fact that HeatWave is a fully self-contained database service – there is no need to move data between different database engines to support analytics or machine-learning workloads. This feature alone has massive security and compliance benefits. In addition, Oracle has brought advanced security features to MySQL HeatWave on AWS such as data masking, de-identification, asymmetric encryption support, and MySQL firewall, making sensitive data even more secure—all within the database.

The availability of MySQL HeatWave on AWS is great news for everyone dreaming of the future where cloud providers eliminate walls between each other and make their services available wherever customers want them. HeatWave offers users a unique combination of the familiar MySQL experience with Oracle’s latest developments in machine learning, automation, and security. Now all this is available for AWS customers not only without the latency and data egress overhead and fees of a multi-cloud architecture, but also with the ability for customers to tightly integrate with existing AWS services. The HeatWave is spreading; it’s available on OCI, AWS, and will next be available on Azure. With so many cloud choices—as well as an on-premises cloud deployment option with Dedicated Region – what’s not to like?

Commissioned by Oracle


SelfKey

Metaproof Platform


Through identity verifications, a decentralized identity platform like the Metaproof Platform, controlled by active user participation and owned by its users, might successfully sort bots from humans.

The post Metaproof Platform appeared first on SelfKey.


Metadium

Metadium continues expanding its ecosystem in SE Asia


Dear Metadium community,

Our ecosystem continues expanding in SE Asia. This time we are proud to announce our new partnership with a leading mobility software development company: DRIMAES.

With this cooperation, Metadium will enter the field of mass produced cars and smart electric vehicles in SE Asia, with special focus on Singapore, Vietnam, and Indonesia. The two companies are planning to join forces to enter the SE Asian market and improve drivers’ convenience by providing blockchain-based identity authentication and payment services within the web-based In-Vehicle Infotainment (IVI) platform.

The IVI platform enables faster and more efficient O2O services, making it easier to expand the ecosystem, while blockchain-based identity authentication and payment services provide a security layer for users’ ease of mind.

Metadium is moving ahead with a full-scale entry into the ASEAN market and will continue bringing new partners into its ecosystem. Last month we announced that META is now listed on Indodax, Indonesia’s largest digital asset exchange.

Thank you for your continuous support.

— Metadium Team

About DRIMAES

DRIMAES is a Korean developer and producer of hardware (In-Vehicle Infotainment or IVI) and software in the mobility industry. With the evolution from traditional cars into smart cars, IVI has become an essential part of in-vehicle experiences. DRIMAES has developed IVI commercial projects based on the AGL (Automotive Grade Linux) platform and applied its technology to a number of commercial vehicles, earning worldwide recognition for its technology.

Website | https://metadium.com

Telegram(EN) | http://t.me/metadiumofficial

Facebook | https://www.facebook.com/Metadium

Twitter | https://twitter.com/MetadiumK

Medium | https://medium.com/metadium

Brunch | https://brunch.co.kr/@metadium

Metadium continues expanding its ecosystem in SE Asia was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.

Monday, 26. September 2022

Holochain

Capture-Resistance: From the Ground Up

What is capture and how to avoid it.

Anticapture? Capture-Resistance? Uncapturable?

We’ll get to these in a moment, but first: What is Capture?

“In politics, regulatory capture … is a form of corruption of authority that occurs when a political entity, policymaker, or regulator is co-opted to serve the commercial, ideological, or political interests of a minor constituency, such as a particular geographic area, industry, profession, or ideological group.”
Wikipedia, Regulatory capture

More broadly we can think of capture as a type of control or corruption where a small group gains inordinate power over others and their interests. Whether that is capture of a resource, a process, or the flow of communication, a powerful minority takes control of the benefits of something that should be held in common.

In what follows, we will look at a few different forms of capture and methods for avoiding them.

Anticapture

“The four phases of action in the Anticapture framework. Note how each phase is itself composed of smaller instances of all the phases.” — Spencer Graham, Anticapture

The Anticapture framework proposed by Spencer Graham outlines governance practices for managing shared resources collectively without a fraction of that collective capturing control of the resources. For this he proposes DAOs as an ideally capture-resistant structure, but in doing so he redefines a DAO as “a network of agents who share a common purpose, and are the only ones who hold the power to execute actions that manage a set of shared resources.” The core of Graham’s work focuses on the key stages of governance, which he outlines as follows.


Anticapture describes four stages of decision making:

Propose: delineate the options on the table
Decide: select the preferred option
Execute: put the selected option into action
Evaluate: analyze the impact of the executed action

Of these, Graham sees the greatest security risk in the Execute stage. If the ability to execute action is captured then the resource is also captured. While he is not completely clear about the mechanics, he presents DAOs as addressing this by decentralizing action and giving the community the ability to act autonomously without outside influence (thus avoiding capture from regulators, for instance).

Assuming that the final execution of any action in a DAO is carried out by smart contracts, then the design of these contracts themselves and the infrastructures that they are built on are risk vectors for capture.

Capture Resistance Across the Web

Anticapture speaks to the level of the organization, but what of the infrastructures the organization is built on?

“For the longest time, word was that the Web would free the world; now, it seems as if most of the world wants to liberate the Web.” - Robin Berjon, Capture Resistance

The internet itself is a commons, a shared resource and infrastructure we all rely on, but due to its design we have seen many forms of capture overtake it, from data mining to market manipulation. Robin Berjon’s article Capture Resistance calls for new design standards to ensure that software and the underlying infrastructure of our web protocols do not allow for capture.

Berjon outlines four main attack vectors:

Capability to observe the behavior of users on another business’s products: data mining, trackers, etc.

Defaults that project power laterally between markets: default search engines, browsers, and similar.

Intermediation, which happens when an entity has power over relations between other entities: platforms that act as gatekeepers and extract rent from peer transactions.

Opaque mechanisms used in place of open markets: lack of clear rulesets that are mutually consented to.

Berjon calls into question the institutions and infrastructures of the web, suggesting the need for explicitly capture-resistant designs. These can be brought about through technical shifts like new security standards as well as through regulatory practices like antitrust laws. Above all he suggests that these infrastructures must be “subject to either exit or voice (or both).” That is to say that users need to be co-governors in the systems they interact with and need to be able to leave those systems for better alternatives when they no longer serve them.

The internet is not only digital but also infrastructural and institutional, and we need to think about capture across every aspect of the system.

We can see from this how even anti-institutional projects, like many within the blockchain movement, are subject to capture. For instance, PoW and PoS systems open up opportunities for capture by a minority of miners or stakers. Because a large amount of capital is needed to have real influence and make participation profitable, the governance of these systems can quickly become exclusionary. Here, what is captured might not be what gets written to the ledger but the ledger itself, and with it the associated rents, such as gas fees, which are set by these parties and control who has access to the infrastructure.

Uncapturable (or Unenclosable)

Anticapture is an organizational response to capture, and Capture Resistance aims to open a conversation between regulators and software architects. Adding to these, we now introduce an architectural response to capture at a foundational layer of communication.

After all, capture can also happen through the manipulation of the medium of communication itself. This overlaps heavily with Berjon’s concerns as we’ll see, but is also a vulnerability for the Anticapture framework. If you can’t trust the system you are communicating over, then you can’t trust the decisions you make as an organization.


We term any communications medium that avoids capture by its very nature an uncapturable carrier. For instance, the air through which sound travels is uncapturable (so long as no one invents a cone of silence). Conversely, messages sent by intermediary or through writing can be contorted, lost, or otherwise captured in transit, opening up vulnerabilities in the system. Scaling communication through the air is vastly limited; you can only speak so loudly. So creating a form of scalable communication that is uncapturable is key to addressing the forms of capture introduced by large-scale systems and societies.

In order to make an uncapturable form of writing, we developed Holochain as an open source data integrity engine for building peer-to-peer apps. The technical details of how Holochain secures the communication architecture will be described for a general audience in forthcoming materials, but in the meantime we refer you to an alpha draft of our whitepaper. It is also worth noting that our implementation doesn’t require any centralized servers or a shared global ledger, and thus avoids the forms of capture that those systems open up, whether in governance and security or in regulation.

Capture here is about more than just the security of communications. Holochain also ensures that users always possess their own data and can take it into new spaces they create or join. This combats the capture of privatization.

The anti-enclosure movement has a lot to teach modern activists fighting against the privatization of public goods like the internet. The image is a depiction of commoners storming London’s Richmond Park.

We previously termed this architectural intervention an “unenclosable carrier” in reference to the enclosure of the commons, a term originally referring to the privatization of the public land on which the commoners of England relied. However, it can also refer to any privatization of public resources, from literal enclosure with fences to the paywalling of publicly funded research or the patenting of plant genomes. More generally, enclosure is a form of capture, and so we have shifted our terminology to speak of Holochain as an uncapturable carrier, recognizing how it serves as a base-layer architecture that can support a broader capture-resistant framework.

Holochain and Capture Resistance

The introduction of an uncapturable carrier provides several tools that can help address the four security threats that Berjon describes. Along with several other intrinsic privacy benefits, Holochain’s peer-to-peer data structures break cloud services’ monopoly on data and privacy. This brings its own set of challenges: for instance, Holochain’s accountability mechanisms force certain data into the public sphere, but unlike blockchains with their global visibility, the required data transparency only extends to the group level. Similarly, defaults and intermediation are at least in part solved by easy forking of both platform and data in what we call walkaway (a new article on this is coming soon). And opaque mechanism design is limited by shared, openly accessible, consented-to rulesets that can be audited by anyone who can read source code. Without offering one totalizing solution, these tools work together to create capture-resistant ecosystems from the ground up. No organization building on Holochain can capture the market or underlying infrastructure because we make exit easy and thus provide users the opportunity to build in voice.

Holochain and Anticapture

This brings us back to governance: users need to be able to govern their shared platforms, communities, and resources. Holochain and uncapturable carriers can support the Anticapture framework in a number of ways. For one, because anyone can host the software and act directly peer-to-peer without a central server or ledger, the community gains a high level of autonomy. Beyond that, having clear, mutually shared rulesets helps to maintain security and ensure clarity through the varied governance stages. Holochain also follows almost the same process as laid out by Anticapture for changing the internal ruleset of a community, with the execute stage managed by each agent autonomously from the rest. Finally, because every action within a Holochain app is signed, it is easy to trace any form of capture that does happen, no matter the stage of governance. This means communities can address any exploits that arise and continually become more capture-resistant. We explored several of these concepts further in a recent article if you are interested in learning more.
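The tamper evidence that signed, hash-chained actions provide can be illustrated with a minimal sketch. This is not Holochain’s actual data structure or API; it is an illustrative toy in which a keyed HMAC stands in for an agent’s public-key signature, showing why any rewrite of a recorded action is detectable:

```python
import hashlib
import hmac
import json

AGENT_KEY = b"demo-agent-key"  # stand-in for an agent's private signing key

def append_action(chain, payload):
    """Append an action whose signature covers the payload and the previous hash."""
    prev_hash = chain[-1]["hash"] if chain else "genesis"
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    sig = hmac.new(AGENT_KEY, body.encode(), hashlib.sha256).hexdigest()
    chain.append({"payload": payload, "prev": prev_hash, "sig": sig,
                  "hash": hashlib.sha256((body + sig).encode()).hexdigest()})

def verify_chain(chain):
    """Recompute every signature and hash link; any edit breaks the chain."""
    prev_hash = "genesis"
    for entry in chain:
        body = json.dumps({"payload": entry["payload"], "prev": prev_hash},
                          sort_keys=True)
        expected_sig = hmac.new(AGENT_KEY, body.encode(), hashlib.sha256).hexdigest()
        if entry["prev"] != prev_hash or not hmac.compare_digest(entry["sig"], expected_sig):
            return False
        prev_hash = hashlib.sha256((body + entry["sig"]).encode()).hexdigest()
        if entry["hash"] != prev_hash:
            return False
    return True

chain = []
append_action(chain, "join group")
append_action(chain, "post message")
assert verify_chain(chain)

chain[0]["payload"] = "rewrite history"  # attempted capture of the record
assert not verify_chain(chain)
```

Because each entry’s signature also covers the previous entry’s hash, rewriting any action invalidates everything after it, which is what makes capture of the record traceable.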

Conclusion

While there are several definitions and arenas of capture presented above, they all share a concern for the ways small, powerful groups can take control of a resource that should be held in common by the larger group, community, or society.

There is a tendency to avoid capture in one area by shifting power to someone or something that is tasked with protecting against that capture. However, that opens up a new opportunity for capture by centralizing power in a new place. Thus we need capture-resistant design across all layers, starting with a strong foundation in the medium of communication.

There is a lot of work still to be done in capture-resistance. We hope this broad overview of capture and capture-resistance frameworks helps you to think through these issues from multiple angles, holding them all in mind to see where they differ, conflict, and benefit each other.

Subscribe to be the first to read our upcoming blog on the power of Walkaway.


Indicio

All Aboard a New Era of Travel with the Digital Travel Credential

The post All Aboard a New Era of Travel with the Digital Travel Credential appeared first on Indicio.
Decentralized identity will improve the passenger experience at every step of the travel ribbon, creating new opportunities for airlines, airports, and the hospitality and tourism sectors

By Heather Dahl

For those of us who remember the pre-smartphone era of travel, with long lines and paper tickets, being able to get tickets and boarding passes and manage check-in from a smartphone has been a transformational way to save time and alleviate stress.

But that was only a first step toward a seamless, digitized travel experience. A second, much bigger leap is now possible, thanks to a new method of using Digital Travel Credentials that was successfully showcased at the 2022 ICAO Trip Symposium in September.

Using technology developed with Indicio, SITA, the world’s leading IT provider for the air transport industry, demonstrated how passenger identity information can be ingested into privacy-preserving, tamper-proof verifiable credentials based on the Digital Travel Credential (DTC) standards set out by the International Civil Aviation Organization (ICAO) for creating a full digital equivalent to a paper passport.

A DTC verifiable credential means that passenger data can be verified as authentic instantly and repeatedly without complex integrations into databases, using third parties to manage verification, or relying on manual checking. With SITA and Indicio’s technology, any government entity can have the power to verify DTC verifiable credentials in a way that enables passenger data to become immediately actionable while preserving passenger privacy.

The verifiable credential solution created by SITA and Indicio will transform check in, boarding, border control—and all aspects of travel that require assured identity verification.

DTCs move digital identity beyond government use

The DTC verifiable credential from SITA and Indicio advances the adoption of automated, biometric, and self-service solutions. DTC verifiable credentials allow travelers to control their personal information in a way that can be digitally verified by both commercial and government entities.

Intended to be as widely accepted as existing passports and usable for purposes beyond the ticket counter, the DTC VC reduces friction at lines for the gate, border crossings, public transportation facilities like airports and cruise terminals, and hospitality sites such as amusement parks and hotels.

The ICAO standard for the DTC is structured as a phased approach, with three types of DTC. These types include the simplest and quickest way to create a digital version of an existing document (Type 1, holder derived via smartphone), a more enhanced and verified digital equivalent of the physical passport (Type 2, state derived via appointment), and a full virtual replacement for the passport (Type 3, state issued).

SITA demonstrated the first use of a Type 1 DTC in a verifiable credential.

DTCs lay a path to business growth

For those already established in the aviation and travel industry, the DTC provides a way to transform passenger experience by making travel seamless. This simplifies travel infrastructure while providing a next generation capacity to combat fraud and enhance data privacy and security.

The interoperable nature of verifiable credentials based on DTCs opens up new opportunities to develop experiences for “trusted” travelers, and to integrate with governments and other industries across different dimensions of travel, tourism, and transport without having to create complex direct integrations.

With immediately actionable data, companies, governments, and organizations of all kinds can reduce friction associated with manual verification of paper documents, improve customer experiences, and offer new services.

DTC verifiable credentials are a simple way to mitigate fraud, comply with data privacy regulations, and integrate with government, airline, and partner travel applications without direct database integrations.

The Digital Travel Credential’s time is now

Digital Travel Credential verifiable credentials can be integrated into existing systems in weeks rather than months. They provide a simplicity of mechanism that existing and alternative solutions lack. Built using open source and on open standards, and supported on both iOS and Android, the DTC verifiable credentials are a sustainable technology supported by a diverse and experienced developer community and a growing list of adoptions and implementations across multiple sectors.

Together with SITA, we see DTC verifiable credentials as providing the simplest, most effective, and robust way to advance the digital transformation of travel and significantly enhance the passenger experience. Our goal with SITA is to make it as easy as possible to enact DTC standards through a full, open source technical solution.

Learn more about the underlying technology of a Digital Travel Credential here.



FindBiometrics

TikTok Faces Big ICO Fine: Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: TikTok The UK’s Information Commissioner’s Office has […] The post TikTok Faces Big ICO Fine: Identity News Digest appeared first on FindBiometrics.



SC Media - Identity and Access

Modern multi-factor authentication: A primer

Organizations need to add extra steps of authentication to ensure users are who they say they are. Here’s what that looks like.



auth0

Micro Frontends for Java Microservices

This tutorial explains what micro frontends are and how you can use them with Java microservices.

Shyft Network

Veriscope Weekly Regulatory Recap:


September 19 to September 26

Welcome to another edition of Veriscope Regulatory Recap! A lot to cover this week, from Indonesia’s plan to turn on the heat against crypto exchanges to the US Treasury’s open call for public comments on crypto regulations. So, without further ado, let’s dive straight into it.

Indonesia Introduces New Stricter Rules for Crypto Exchanges

Indonesia is preparing to tighten the regulation of digital asset exchanges, officials from the country’s Ministry of Trade and the Commodity Futures Trading Regulatory Agency (Bappebti) said at a parliamentary hearing in Jakarta.

Under the revised regulatory framework, two-thirds of directors on digital trading platforms must be Indonesian citizens residing in the country.

“That way, at least we can prevent the top management from fleeing the country if any problem arises,” Didid Noordiatmoko, acting head of Bappebti, told parliament.

Crypto exchanges will also be required to use a third party to store client funds and prohibited from re-investing stored crypto assets.

Deputy Minister of Trade Jerry Sambuaga said Bappebti would be issuing new rules soon but didn’t give a specific timeframe.

He also confirmed that the government is still planning to launch a crypto bourse, which has been delayed several times this year.

EU Finalizes Legal Text for Crypto Regulations, Covering Stablecoins and NFTs

The European Union has reportedly finalized the full text of its Markets in Crypto Assets (MiCA) legislation, although it is still open to comments.

Once passed into law, MiCA will require crypto issuers to publish white papers, including technical roadmaps, stablecoin issuers to hold capital and be prudently managed, and platforms to register with the authorities.

The latest draft of the bill also covers algorithmic stablecoins, which were previously excluded from MiCA’s scope when it was first introduced in 2020.

The reason behind it is pretty apparent. Algorithmic stablecoins have gained notoriety since TerraUSD (UST) lost its peg to the US dollar, bringing Terra’s entire ecosystem to its knees, with Luna’s value dropping by almost 90%.

This is why the draft highlights that algorithmic stablecoins should fall within the scope of regulation “irrespective of how the issuer intends to design the crypto asset, including the mechanism to maintain a stable value.”

When it comes to NFTs, the draft says, the issuance of “non-fungible tokens in a large series or collection should be considered as an indicator of their fungibility.”

The UK Introduces New Law to Seize, Freeze, and Recover Crypto Assets

The UK has introduced the Economic Crime and Corporate Transparency Bill to make it easier and quicker for law enforcement agencies to seize, freeze, and recover crypto assets when used for criminal activities.

The 250-page bill, first promised in May, builds on an earlier Economic Crime Act that helped regulators place sanctions on Russia and covers more than just crypto.

“Domestic and international criminals have for years laundered the proceeds of their crime and corruption by abusing U.K. company structures and are increasingly using cryptocurrencies,” said director general of the National Crime Agency, Graeme Biggar, in a statement.

“These reforms — long-awaited and much welcomed — will help us crack down on both.”

US Treasury Seeks Public Comment on Crypto Regulation

The US Treasury Department has invited the public to comment on how digital assets are used in illegal activities and how the government should respond to them.

It listed over 20 questions covering CBDC, the risks posed by NFTs, and what the government can do to prevent crimes related to DeFi.

“The growing use of digital assets in financial activity heightens risks of crimes such as money laundering, terrorist and proliferation financing, fraud and theft schemes, and corruption,” states the document.

“These illicit activities highlight the need for ongoing scrutiny of the use of digital assets, the extent to which technological innovation may impact such activities, and exploration of opportunities to mitigate these risks through regulation, supervision, public-private engagement, oversight, and law enforcement.”

Important Announcement: 10,000 SHFT on Offer!

After Shyft DAO approved the Veriscope VASP grant proposal, an aggregate of 10,000 SHFT has been granted for Virtual Asset Service Providers (VASPs) that integrate into the Veriscope mainnet by September 30, 2022.

The fund will enable VASPs to pay the Shyft Network gas fees while using Veriscope. This offer will remain valid until December 31, 2022, or until the VASP exhausts its SHFT grant.

Read more here: https://medium.com/shyft-network/shyft-dao-approves-veriscope-vasp-grant-proposal-to-enable-free-fatf-travel-rule-transactions-77e51d735cd6

Meet us at Token2049!

Shyft Network’s Head of Strategy, Malcolm Wright, will be attending Token2049 on September 29 at 9:15 AM. Come talk to us on all things crypto regulations, particularly the Travel Rule. Drop us a line at https://www.veriscope.network/contact to set up a time.

Interesting Reads:

#1. The White House Launches a Comprehensive Framework for Crypto Assets

#2. Crypto Bill Introduced in Uruguay’s Parliament: What You Should Know?

#3. Australia’s Financial Watchdog Expands Team to Better Regulate Crypto

______________________

VASPs need a Travel Rule Solution to begin complying with the FATF Travel Rule. So, have you zeroed in on one yet? We have the best solution to suggest: Veriscope! Veriscope is the only frictionless Crypto Travel Rule compliance solution.

Visit our website to read more: https://www.veriscope.network/ and contact our BizDev team for a discussion: https://www.veriscope.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations.

Veriscope Weekly Regulatory Recap: was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Auth0 Code Repository Archives From 2020 and Earlier

Details around a recent security event

Tokeny Solutions

Tokeny and Ownera Partner to Unlock Global Liquidity for Tokenized Assets

The post Tokeny and Ownera Partner to Unlock Global Liquidity for Tokenized Assets appeared first on Tokeny.

26th September 2022, Luxembourg – Tokeny, the Luxembourg-based tokenization platform, today announced a partnership with Ownera, provider of a global inter-trading network based on the open-source FinP2P protocol, to distribute tokenized assets over the Ownera FinP2P network.

The digital securities industry is growing fast but has lacked a common global distribution network for connecting issuers, investors, exchanges, and other market participants. As a result, the rate of institutional adoption and investor access to high-quality digital assets has been limited up until now.

Ownera brought the industry together to develop the FinP2P open-source routing protocol to solve this problem by orchestrating the instant exchange of digital assets held on any blockchain platform, for digital cash held on any ledger. It supports primary issuance, secondary trading and DeFi-style instant borrowing against assets pledged as collateral.  This routing network has the power to open up digital distribution for the private markets and unlock global liquidity in a way that no single institution or exchange can achieve on its own.

As the market leader in tokenization solutions, Tokeny’s technology has enabled direct transfers and settlements of tokenized assets in a compliant manner, by default. Its infrastructure uses the market standard ERC-3643 to ensure each transfer can only be triggered between eligible investors through ONCHAINID, a digital identity system, enabling the transferability of tokenized assets. The integration of Ownera FinP2P distribution network enables its clients to further improve the liquidity of their tokenized assets by reaching a broader investor base across the globe.

“Tokenization is the first step to unlock the full potential of assets using blockchain technology. When assets are brought onchain, they can be transferred and managed by their owners, at a lower cost, and in real time. As compliance is built in at the token level, visibility and interoperability are key to maximizing asset liquidity. By integrating Ownera’s FinP2P, we allow our clients and their investors to connect to additional distribution networks and use tokenized assets for collateralization.”

Daniel Coheur, Chief Strategy Officer of Tokeny

“Institutions have been faced with the challenge of deploying multiple blockchain technologies if they want to access different isolated pools of assets and investors. The adoption of FinP2P will result in higher liquidity and better access to capital and assets by providing regulated firms with one secure point of connection to multiple digital asset networks across the globe. Any institution deploying a Tokeny solution now has the potential to access the added distribution capabilities of the Ownera FinP2P network.”

Anthony Woolley, Head of Business Development and Marketing at Ownera

About Tokeny

Tokeny provides an enterprise-grade infrastructure to allow companies and financial actors to compliantly issue, transfer, and manage digital assets on blockchains, enabling them to apply control and compliance on the decentralized infrastructure without technical hurdles. Tokeny is recognized by CB Insights and KPMG as a Top 50 Blockchain and Top 100 Global Fintech company. The company is backed by Euronext Group, Inveniam, Apex Group, and K20 Funds.

About Ownera

Ownera is a digital assets software company building the institutional rails for a new multi-trillion-dollar digital securities market.  The company led the creation of the open-source specifications of the FinP2P protocol and delivers FinP2P based network nodes and digital assets solutions to the financial industry, thus enabling global distribution and liquidity for digital securities.



KuppingerCole

Cyber Hygiene: Common Problems & Best Practices


by Osman Celik

To maintain their health and well-being, people are practicing personal hygiene routines on a regular basis. These routines are continuous and never completed. By taking proactive measures, people aim to protect their health against potential diseases and disorders.

Analogically, organizations must also define a routine of proactive cybersecurity practices to identify and eliminate critical vulnerabilities and protect sensitive data. Cyber hygiene is a set of regular practices that intend to keep systems, networks, sensitive data, and users secure against cyberattacks, data breaches, and data loss.

Good cyber hygiene practices help organizations locate unmanaged assets, gain visibility into the software installed, safeguard against ransomware and malware, avoid phishing attempts, audit administrative privileges, protect customer data, and achieve regulatory compliance. As a concept, cyber hygiene has increased in significance since the mass adoption of remote and home office practices, or “WfA” (Work from Anywhere), after the Covid-19 pandemic.

Embracing a cyber hygiene strategy resembles a person building new habits to work safer. Much like personal habits, organizations have cultures. Cyber hygiene routines must be a shared responsibility that all departments and users take part in. Thus, good cyber hygiene practice requires every stakeholder’s participation. This can be done by incorporating cyber hygiene strategies into organizations’ cybersecurity culture. Such strategies can include practices like using the right cybersecurity tools, keeping applications and software up to date, using MFA (Multi-Factor Authentication), implementing ZTNA (Zero-Trust Network Access), creating organization-wide password policies, and developing a backup strategy.
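As one concrete illustration of such practices, a password policy ultimately depends on how credentials are stored. The sketch below shows salted, deliberately slow password hashing using only Python’s standard library; the iteration count and key length are illustrative placeholders, not a vetted policy, so follow current guidance (e.g., OWASP) in production:

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative; tune per current guidance and hardware

def hash_password(password: str, salt: bytes = None):
    """Derive a slow, salted digest; store the salt alongside the digest."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("password123", salt, digest)
```

The random per-user salt defeats precomputed rainbow tables, and the slow derivation makes offline guessing expensive even after a breach.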

Common Problems

Despite the increase in cybersecurity spending, organizations continue to suffer from cyberattacks. With varying attack vectors targeting different components of organizations’ IT (Information Technology) environments, such as hardware, software, and applications, a lack of cyber hygiene can result in various problems. Some of the common cyber hygiene problems include:

Security Breaches: Data is one of the most valuable assets of modern organizations. When organizations fail to protect sensitive data, they often end up with data theft and expensive ransomware payouts. Poor vulnerability management and weak security policies can expose organizations to security threats like phishing, malware, and viruses.

Data Loss: When local and online storage are not regularly backed up and maintained, important data can be lost through hardware failure, data corruption, improper configuration, and theft.

Software Vulnerabilities: Software and applications should be updated regularly, ensuring that the latest security patches and the latest versions are in use across the organization for all kinds of applications and software. Otherwise, out-of-date programs may have vulnerabilities that attackers can exploit. Poor patch management and old or out-of-date software are a common cause of data breaches at organizations of all sizes.


Best Practices

By following cyber hygiene best practices, organizations can strengthen their cybersecurity culture. While these practices provide guidelines to security teams, they must be implemented across all users. After identifying the cybersecurity gaps in an organization, a security awareness program can be implemented to support all stakeholders in building their security skills.

Utilizing the right Cybersecurity Tools:

Finding the right tool is an essential part of cyber hygiene to ensure network and data security. Some of the cybersecurity tools include:

Endpoint Detection, Protection, and Response (EPDR): Organizations must protect not only their internal networks, but also all users at all connected endpoints. Endpoint security has evolved from traditional antivirus solutions to delivering comprehensive detection and prevention of different forms of malware and zero-day threats. Apart from malware detection and protection, EPDR solutions often ship with capabilities such as endpoint firewall, URL filtering, and allowlisting/blocklisting, as well as alerting and reporting mechanisms.

Secure Remote Access: Secure Remote Access solutions are designed to prevent unauthorized access to resources and data loss. A variety of solutions like VPNs (Virtual Private Networks), CASBs (Cloud Access Security Brokers), and ZTNA can help organizations facilitate secure network connections for users regardless of their physical location.

Encryption: Using cryptographic methods can protect sensitive data both in transit and at rest.

Stepping up to Secure Authentication:

Requiring MFA for all logins can reduce the risks arising from compromised credentials. MFA offers an extra protection layer to organizations’ cybersecurity. Biometric authentication methods like facial recognition and fingerprint scanners can also provide secure and robust authentication. As password-based threats continue to rise, organizations should go beyond username/password and consider strong passwordless multi-factor authentication solutions.
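As a concrete illustration of one common MFA factor, the time-based one-time password (TOTP) algorithm specified in RFC 6238, which most authenticator apps implement, can be sketched with the standard library alone. This is a minimal sketch for understanding the mechanism, not a production implementation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, at: int = None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1, 30-second window)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // step)
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    mac = hmac.new(key, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                          # dynamic truncation
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890", T = 59
secret = base64.b32encode(b"12345678901234567890").decode()
assert totp(secret, at=59, digits=8) == "94287082"
```

Because the code is derived from a shared secret and the current time window, a stolen password alone is not enough to log in, which is the extra layer MFA provides.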

Performing Regular Backups:

Backing up data to a secondary location regularly can protect organizations against accidental mistakes, natural disasters, and cyber incidents such as malware, as well as physical and logical damage to storage devices. To limit the impact of such events, organizations must develop a data backup strategy and ensure that the data backups are protected against unauthorized access through air gaps, immutable storage, and encryption. To explore more, read our Market Compass on Cloud Backup and Disaster Recovery by KuppingerCole analyst Mike Small.
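An often-overlooked part of a backup strategy is verifying the copies themselves. The illustrative sketch below (the directory layout and file names are hypothetical) copies a directory tree and then recomputes checksums, so corruption is caught at backup time rather than discovered during a restore:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_and_verify(source: Path, dest: Path) -> bool:
    """Copy a directory tree, then confirm every copied file matches its original."""
    shutil.copytree(source, dest)
    for src_file in source.rglob("*"):
        if src_file.is_file():
            copy = dest / src_file.relative_to(source)
            if not copy.exists() or sha256_of(copy) != sha256_of(src_file):
                return False  # corruption or a missed file: alert and re-run
    return True

# Demo on a throwaway directory
work = Path(tempfile.mkdtemp())
(work / "data").mkdir()
(work / "data" / "records.csv").write_text("id,value\n1,42\n")
assert backup_and_verify(work / "data", work / "backup")
```

In a real deployment the checksums would also be stored and re-checked periodically, since silent bit rot can appear long after the initial copy.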


KuppingerCole – Cybersecurity Leadership Summit 2022

If you are looking for more specific and comprehensive guidance on cyber hygiene and other cybersecurity topics, or want to meet the real experts in the fields of identity and cybersecurity, you should not miss the Cybersecurity Leadership Summit taking place in Berlin from 8 to 10 November 2022.

Join the panel session “Cyber Hygiene Best Practices: Why does it Matter?” with Warwick Ashford (KuppingerCole Analysts), Boris Beuster (E.ON), Noam Green (Cyolo), Fabian Libeau (Axonius), and Manual Garat Loureiro (Booking.com).

Understand why “Cyber Hygiene is the backbone of an IAM strategy” in Manual Garat Loureiro’s (Booking.com) cyber hygiene and awareness session.

Drs. Jacoba C. Sieders (Independent Expert) will deep dive into “Exploring the Impact of Cybersecurity Regulations in the Digital World” in a cyber resilience session.

CSLS 2022 brings together cybersecurity executives, analysts and top CISOs to help delegates drive decision-making within their organization and to better understand fundamental issues such as buzzword-hunting, process complexity and cyber-threat mitigation.

Please see our webpage to see our agenda, and register here.


Affinidi

How to Use Affinidi Console — A Sample User Journey


Affinidi Console is a one-stop shop that provides a suite of tools that make it easy for builders to create personalized and privacy-preserving applications. It provides data control and ownership to end-users, and empowers you to leverage the advantages of a decentralized data ecosystem.

With Affinidi Console, you can access many services that work on top of Affinidi’s APIs. Also, you can unlock and verify fully portable Verifiable Credentials (VCs) — tamper-proof and W3C-standard digital credentials that can be verified cryptographically.

Here’s an example of how you can use the Affinidi Console to build privacy-preserving apps.

Meet Kate — a rockstar software engineer who wants to build an app called MedScore that rewards people for healthy behavior. She envisages users entering their medical records into the app and qualifying for special services and discounts based on certain conditions.

As her app deals with sensitive patient data, she wants to protect their privacy and comply with standards like GDPR. Based on these business goals, she lists her requirements.

Requirements

MedScore’s requirements include:

Signup: At the time of registration, she wants to gather customers’ informed consent.
Verification of patient records: Once the patient uploads their health records and documents, she wants her app to verify their authenticity.
Qualification for rewards: Next, her app must compute a health score based on the entered records and decide if a patient qualifies for rewards or discounts.
Storage: Kate wants to give users the option to store the digitized version of their patient records securely.
Sharing: She wants users to be able to safely and securely share their sensitive data with healthcare providers.

As you can see, data management and privacy are some of the core tenets of MedScore, so she wants to understand how Affinidi Console can help.

Using Affinidi Console

Affinidi Console is the perfect no-dev tooling platform for Kate’s requirements. Let’s see how.

Signup: Affinidi’s Consent Manager displays your terms and conditions and gathers user consent at signup. All she has to do is create a privacy policy and enter it on the Consent Manager’s admin widget.
Verification: Digital Check and Paper Check can verify the data on both digital and paper records, respectively.
Assessing Qualification: Kate can leverage the superpowers of Rules Builder to set the criteria for qualification.
Storage: Edge and cloud wallets securely store patients’ data and put users in control of their information.
Sharing: Bulk Issuance and Schema Manager enable her customers to share their data in a standardized format with healthcare providers.

By leveraging Affinidi Console, Kate skips weeks of custom development and builds her dream project in just a few hours.

Building privacy-preserving apps doesn’t get easier than this.

Sign up for early access to Affinidi Console today. For further questions, reach out to our Dev Evangelist, Marco Podien.

Join our community today

Get conversations going with #teamaffinidi and the community on Discord
Follow us on LinkedIn, Twitter, and Facebook
Get the latest updates on what’s new at Affinidi; join our mailing list
Interested in joining our team? Start building with the Affinidi tech stack now
For media inquiries, please contact Affinidi’s PR team via pr@affinidi.com

How to Use Affinidi Console — A Sample User Journey was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 25. September 2022

KuppingerCole

Analyst Chat #142: Cyber Resilience: What It Is, How to Get There and Where to Start - CSLS Special


A key issue for many companies beyond technical cybersecurity is cyber resilience. This refers to the ability to protect data and systems in organizations from cyber attacks and to quickly resume business operations in the event of a successful attack. Martin Kuppinger, Mike Small, and John Tolbert will explore this important topic at the Cybersecurity Leadership Summit in Berlin.

For this special episode of Analyst Chat, they join Matthias for a virtual panel discussion to identify key actions on the path to a cyber resilient enterprise.




Defining Modern Cybersecurity Leadership


by Matthias Reinwarth

Today's Leader Spotlight is one of our recent Analyst Chats in which host Matthias Reinwarth and KuppingerCole CEO Berthold Kerl talk about what makes a good cybersecurity leader.

Saturday, 24. September 2022

SelfKey

SelfKey offshore bank accounts Marketplace


You can instantly open a foreign bank account from crypto using the SelfKey Bank Accounts Marketplace.

The post SelfKey offshore bank accounts Marketplace appeared first on SelfKey.

Friday, 23. September 2022

Holochain

Gossip Performance Improvements

Dev Pulse 125

In the wake of DWeb Camp, some internal dogfooding and Holo testing, and other chances to test Holochain in the real world, the Holochain core dev team has been troubleshooting and fixing performance issues in Holochain’s network protocol.

Holochain release notes

Holochain 0.0.159: Gossip rate limiting bump

Release date: 6 September 2022
HDI compatibility: 0.0.20, 0.0.21
HDK compatibility: 0.0.148, 0.0.149

This release optimises a hard-coded rate limit on gossip, raising it from 0.5 Mbps for recent data and 0.1 Mbps for historic data to 10 Mbps for all data, with a burst limit of 1 GB in a 10-second window. The burst limit is also now tunable from the conductor config. (#1543)

You may ask, why was the limit so tiny? Gossip mainly supports the resilience of data, a sort of price you pay for the privilege of participating in a network. We didn’t want it to be too painful, especially for people on slow connections. However, 0.1 Mbps was a bit too gentle and was preventing large entries from ever being gossiped. We recognise that most people have more spare bandwidth than that, and can even accept occasional spikes in traffic. This change has already improved performance of data-heavy hApps like DevHub.
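As an aside, the bandwidth cap described above behaves like a classic token-bucket limiter. Here’s a minimal illustrative Python sketch (not Holochain’s actual implementation) of a 10 Mbps sustained rate with roughly 1 GB of burst headroom:

```python
class TokenBucket:
    """Illustrative token bucket: `rate` bytes/sec sustained, `burst` bytes of headroom."""

    def __init__(self, rate, burst):
        self.rate = rate
        self.burst = burst
        self.tokens = burst  # start with a full bucket
        self.last = 0.0

    def allow(self, nbytes, now):
        # Refill for the elapsed time, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False

# 10 Mbps sustained (expressed as bytes/sec) with a 1 GB burst, echoing the new limits.
bucket = TokenBucket(rate=10e6 / 8, burst=1e9)
print(bucket.allow(500_000_000, now=0.0))  # a large burst fits within the headroom → True
print(bucket.allow(600_000_000, now=1.0))  # the sustained rate can't refill fast enough → False
```

With a big burst allowance, occasional spikes (like gossiping a large entry) go through immediately, while the refill rate still bounds long-run bandwidth use.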

There are some other small changes:

Bugfix: Too-large gossip payloads could put the rate limiter into an infinite loop, preventing historical data from being gossiped even with higher bandwidth limits.
The conductor now tries to reconnect to the Lair keystore process, with backoff. (#1529)
The Lair keystore gives out separate TLS certificates to different conductors running on the same machine. (#1519)
All of the HDI and HDK types related to validation have had their members made public. This doesn’t break compatibility with previous HDIs and HDKs; it simply improves autogenerated documentation. (#1450, #1452, #1457, #1458)

Read the full changelog.

Holochain 0.0.160: HDI 0.1.0

Release date: 7 September 2022
HDI compatibility: 0.0.20 to 0.1.0
HDK compatibility: 0.0.148 to 0.0.150

Nothing has changed in this release except a version bump: the hdi, or ‘Holochain Deterministic Integrity’ crate, has gone from version 0.0.x to 0.1.0. This signals that we think it’s made “significant progress towards stabilizing the deterministic integrity layer’s API”, in the words of Stefan, Holochain’s release manager. (#1550)

This brings us one step closer to putting the 0.1 stamp on the HDK itself (which includes but is broader than the HDI) and calling it stable for coding and validation. Expect a little bit of API churn yet before we hit that milestone :)

Holochain 0.0.161: Gossip bug fix

Release date: 8 September 2022
HDI compatibility: 0.0.20 to 0.1.1
HDK compatibility: 0.0.148 to 0.0.151

Bugfix: This release continues the gossip reliability work, fixing a bug in which a node could prematurely end a gossip round even when there was more to be gossiped. This bug led to persistent timeouts between nodes. (#1553)

Holochain 0.0.162: Cell cloning part 1

Release date: 13 September 2022
HDI compatibility: 0.1.2
HDK compatibility: 0.0.152

This release brings a long-awaited feature: the ability to ‘clone’ a cell. In Holochain, a DNA can be cloned, which results in an agent running multiple cells within the same hApp, each spawned from the DNA. Each of these cells has its own DHT, thanks to a different network seed which changes the DNA hash. This lets you create many separate public or private spaces that share the same functionality. (#1547)

The new conductor API call that supports this is called create_clone_cell, found in the app API, and it takes the installed app ID, the role of the cell to be cloned, a new network seed, properties, and origin time, and optionally a membrane proof and cell name. (Note: the network seed, properties, and origin time are all optional and are contained in a struct called DnaPhenotypeOpt. However, these properties become part of the parent object in their serialised form, making DnaPhenotypeOpt disappear. We realise this is surprising and undocumented, and we’re working on a clearer approach. So be aware that there will probably be breaking changes soon.)
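To make the flattened serialisation concrete, here is a hypothetical sketch of what a create_clone_cell payload might look like. Every field name below is an assumption inferred from the description above, not a verified API reference; consult the conductor API docs before relying on any of them:

```python
# Hypothetical payload sketch — field names are assumptions, not the verified
# Holochain app API. Note how the DnaPhenotypeOpt fields appear flat in the
# parent object rather than nested, as described above.
create_clone_cell_request = {
    "app_id": "my-happ",                 # installed app ID (illustrative value)
    "role_id": "chat",                   # role of the cell to be cloned (illustrative)
    # DnaPhenotypeOpt fields, serialised flat into the parent object:
    "network_seed": "private-room-1",
    "properties": {"topic": "general"},
    "origin_time": "2022-09-13T00:00:00Z",
    # Optional:
    "membrane_proof": None,
    "name": "private-room-1",
}
```

Since breaking changes to this shape are expected, treat the above purely as a mental model of the current flattening behaviour.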

Here are the rest of the changes:

Breaking (host API): You can now order results in a chain query descending (newest to oldest); the default is still ascending (oldest to newest). ChainQueryFilter has .ascending() and .descending() convenience functions in its builder API. (#1539)
Breaking (conductor admin API): The add_records API endpoint has been changed to graft_records, with modified functionality. You probably won’t have used this endpoint unless you’re working on a source chain backup or replication service, but just in case! (#1538)
New (app API): The app_info endpoint now includes cloned cells. This is a non-breaking change; they just show up in the existing cell_data field of the response. (#1547)
Removed (conductor config): Legacy Lair keystore config options have been removed. These options have been non-functional for a while. (#1571)

Read the full changelog.

Holochain 0.0.163: Maintenance

Release date: 21 September 2022
HDI compatibility: 0.1.2
HDK compatibility: 0.0.152, 0.0.153

This is a small release:

Bugfix: A rare ‘Arc is not quantizable’ panic is now a warning. (#1577)
Breaking (Sweettest): If you use Sweettest, the Rust-based testing framework, some of the functionality that deals with inline zomes has been changed. (#1575)

Read the full changelog.

Known issues

There are still some reliability and performance issues with gossiping, especially for large entries.
Very rarely, the WebSocket for the conductor admin API will close.
The JavaScript and Rust clients haven’t been updated for the changes in the conductor APIs.

Future breaking change warning

Holochain’s capability-based security model means that all zome calls need to be authenticated by a valid capability. So far, the conductor has been giving a free pass to zome calls made by clients via the WebSocket app API, treating them as equivalent to the ‘author’ capability. But in the future, these zome calls will all need to supply their own capability.

For the most part, this won’t be a hassle. If you mean for your hApp to be hosted in the Launcher or the Holo hosting network, it’ll all be taken care of behind the scenes. The only remaining scenario is UIs that access Holochain over WebSocket ports, which will probably use some sort of UI password to authenticate. We’ll give details about refactoring your hApp when the time comes.

Holochain Launcher release notes

Launcher 0.6.0: Large hApps in DevHub

Release date: 14 September 2022

This release adds support for Holochain 0.0.162, which includes the cumulative bugfixes for gossip of large entries. This means that hApps in the DevHub are more likely to be accessible when you try to download and install them.

Here are the rest of the changes:

Support for everything before Holochain 0.0.162 / HDI 0.1.1 / HDK 0.0.152 has been dropped.
The database is now encrypted at rest. This means that databases created with older versions of the Launcher are incompatible with the new one; you’ll need to create a new key and reinstall all your hApps.
Compatibility with Ubuntu Linux 22.04 has been improved.
There have been various bugfixes and UX improvements.

Read the full changelog.

Launcher 0.6.1: Debugging improvements

Release date: 21 September 2022

This release adds niceties for people who are debugging Holochain or hApps; custom conductors are now distinguished from built-in ones in the log, and other error messages and pieces of info are clearer.

Read the full changelog and download the latest installer!

hREA hits major milestone release

hREA, the project that implements the ValueFlows economic vocabulary as a set of Holochain libraries, has hit its Sapling release, version 0.1.0-beta. Connor, Pegah, and Wes at Sprillow, along with longtime lead developer Pospi, have completed the work that they believe was necessary to help hREA get to a state that developers can start hacking hApps with, submitting feedback on, and building a community around. Check out the code, read the excellent documentation, and join the Discord server to get developing. We’ll be sharing more details about this project and the ValueFlows economic vocabulary in a few weeks.

[Ed: I previously failed to credit Pospi for their massive contribution to hREA as its sole developer for many years. Sorry Pospi!!!]

Holochain In Action: IOEN and energy gamification

In this week’s Holochain in Action video, Philip Beadle of IOEN demos how he’s using the Unity 3d game engine to run IOEN-powered community microgrid simulations. His tool lets you see how energy-independent your community may or may not be, given a certain amount of solar and power storage. You can also tweak settings and see how your community fares better, get a breakdown of individual houses’ revenue and costs, and even deploy an actual Holochain-powered IOEN network for all of the houses.

Cover photo by Nguyễn Phúc on Unsplash


Radiant Logic

It’s in the Playbook: Delivering the Master User Record for FICAM

Creating rich representation for each and every user is much easier said than done. The post It’s in the Playbook: Delivering the Master User Record for FICAM appeared first on Radiant Logic.

Shyft Network

The White House Launches a Comprehensive Framework for Crypto Assets

With its comprehensive framework for crypto assets, the White House called for a “more aggressive push by regulators to take on fraud in the sector.”
The White House also urged authorities to double down on their efforts to develop the “Digital Dollar,” as it will significantly benefit the country’s financial sector.
The Secretary of the US Treasury, Janet Yellen, said that if all the risks that digital assets pose are dealt with, the industry will offer significant opportunities.

The White House has released a first-of-its-kind framework in the US for the “responsible development” of digital assets as 16% of adult Americans own crypto, and its global market capitalization reached $3 trillion at its peak.

Although user numbers may have come down from the peak, the market remains resilient despite the crypto winter, ready to surge again with the growing adoption of Web3, DeFi, and NFTs.

The framework calls for a “more aggressive push by regulators to take on fraud in the sector,” based on its nine reports that make recommendations on protecting consumers, investors, financial stability of businesses, the environment, and national security.
The White House’s Concerns

Amid the Russia-Ukraine war, the White House was particularly concerned about sanctioned individuals and entities turning to digital assets to circumvent the sanctions. On top of that, an increasing number of crypto hacks and rug pulls and a considerable drop in the global digital-asset market cap added fuel to the fire.

Speaking of hacks, the most notable one that irked US regulators was the $625 million Ronin Bridge attack, reportedly carried out by a North Korean group known as Lazarus.

A Series of Actions

What followed was a series of strict measures by the US government and authorities, the most notable being the US Department of the Treasury’s Office of Foreign Assets Control (OFAC) sanctions against Tornado Cash, a popular crypto-mixing service. Recently, OFAC also released FAQs in relation to its sanctions against Tornado Cash.

Despite these tough measures, the White House has indicated that although they are concerned about the misuse of cryptocurrencies by malicious elements, they want to “foster responsible digital asset innovation” with the first-of-its-kind digital assets framework. This is also significant as it comes after President Biden issued an executive order in March this year on the oversight of crypto assets.
Covering NFTs Too

The report urges the Securities and Exchange Commission (SEC) and the Commodity Futures Trading Commission (CFTC) to “aggressively pursue” investigations as well as enforcement actions against unlawful practices in the crypto space. In addition, it has asked the Federal Trade Commission (FTC) and Consumer Financial Protection Bureau (CFPB) to double their efforts to monitor consumer complaints and enforce against unfair practices.

Moreover, there are plans to call upon Congress to amend the Bank Secrecy Act to explicitly apply to digital asset service providers, including exchanges and NFT platforms.

The US Treasury is also set to complete an illicit finance risk assessment on DeFi by February 2023 and an evaluation on NFTs by July 2023.

Providing Guidance

The Office of Science and Technology Policy and NSF will be tasked with developing a Digital Assets Research and Development Agenda.

The framework meanwhile encourages the Department of Treasury (DOT) and regulators to provide entities in the crypto space with guidance, best-practice information, and technical assistance.

Janet Yellen, the US Secretary of the Treasury, said that the reports clearly identify the risks and challenges associated with the usage of digital assets for financial services.

So, “if these risks are mitigated, digital assets and other emerging technologies could offer significant opportunities,” she added.

According to her, all the reports on digital assets and their recommendations provide a “strong foundation for policymakers” to realize their potential benefits while minimizing the risks.

Advocating for a Digital Dollar

As for a digital dollar, the Biden administration said a central bank digital currency (CBDC) could “offer significant benefits” to the US financial system. Besides being environmentally sustainable, facilitating faster cross-border transactions, promoting financial inclusion, and fostering economic growth, the White House said, a US CBDC could also help bolster its status as a global leader and support the effectiveness of sanctions.

The assessment was part of a wide-ranging report on digital assets following the executive order issued earlier this year.

A Treasury Department report published last week on the future of money and payments also advocated for a CBDC, noting a “natural use case” for them.
Industry concerns

Although many in the industry welcomed the White House’s comprehensive framework on digital assets, some criticized its overt focus on the risks of digital assets while largely ignoring the benefits they offer.

For instance, Binance’s CZ appreciated the framework, tweeting, “It’s great to see the US moving towards a proposed crypto framework. Getting it right will help protect consumers, markets and spark responsible innovation.”


He believes federal regulation will benefit the industry compared to the “current patchwork of state laws and regulations that govern this space.”

Cameron Winklevoss, on the other hand, believes that the framework fails to capture the potential and opportunities that the digital assets landscape presents.


He believes that it focuses too much on the risks that cryptocurrencies pose. Instead, he believes, the focus should have been on providing clear “Rules of the Road” over the current “Regulation by Enforcement” approach.


Some experts also called out the framework over its anti-Proof-of-Work stance and naming all things crypto as a scam or a threat.

So, what do you think about this comprehensive framework for digital assets? Let us know on our social media channels.

______________________

VASPs need a Travel Rule Solution to begin complying with the FATF Travel Rule. So, have you zeroed in on one yet? We have the best solution to suggest: Veriscope! Veriscope is the only frictionless Crypto Travel Rule compliance solution.

Visit our website to read more: https://www.veriscope.network/ and contact our BizDev team for a discussion: https://www.veriscope.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations.

The White House Launches a Comprehensive Framework for Crypto Assets was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


FindBiometrics

Refugee Biometrics, Safe City Surveillance, and Massive Cyberattack: Identity News Digest


Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Guidelines and Standards The United Kingdom’s National […]

The post Refugee Biometrics, Safe City Surveillance, and Massive Cyberattack: Identity News Digest appeared first on FindBiometrics.


This week in identity

E10 - Uber MFA Breach Discussion / Authentication / Why Are We Not Using Passwordless?


This week Simon and David do a deep-dive riff on that old chestnut... authentication! Uber has recently been in the news regarding a data breach... one seemingly executed using an MFA bombing attack technique. Could it have been stopped? What options are available? They then discuss a recent LinkedIn poll run by The Cyber Hut asking why we are not using passwordless authentication... tune in to hear the midweek poll results.


Ocean Protocol

veOCEAN is Launching, Data Farming is Resuming

Earn by locking OCEAN and by Curating Quality Data. 482M OCEAN Available in Rewards

1. Overview

veOCEAN launches Mon Sep 26. Data Farming (DF) will resume immediately after.

You can lock OCEAN, to get veOCEAN, for as little as one week or as long as 4 years. The longer you lock your OCEAN, the more veOCEAN you get. The amount of OCEAN you receive when the lock ends will always be equal to the amount you locked; plus there will be rewards in the meantime.

⚠️ veOCEAN cannot be unlocked before the pre-set time. If you’ve locked some OCEAN for a year, you can’t unlock it during that time, nor can you decrease the lock time.⚠️

As a veOCEAN holder, you get passive rewards by default. And if you actively curate data by allocating veOCEAN towards data assets with high Data Consume Volume (DCV), you can earn more. This is the DF aspect.

DF now has 481.8M OCEAN allocated. This is an increase from before, via an OceanDAO evolution towards automation.

Typical APYs will likely be 5–10%, but may be as high as 125% and as low as 1.5%. APYs will vary week to week. The value depends on total OCEAN staked and other factors.

veOCEAN uses Curve’s veCRV contracts, which have been battle-tested over two years.

The rest of this post is organized as follows. Section 2 has key dates. Section 3 describes veOCEAN. Section 4 describes Data Farming. Section 5 describes the evolution of OceanDAO. Section 6 is a walk-through of OCEAN release schedule and possibly APYs. Section 7 discusses security. Section 8 concludes.

2. Key dates

Mon Sep 26: veOCEAN goes live. You can lock your OCEAN for veOCEAN at df.oceandao.org (linked from Ocean homepage)
Thu Sep 29: Counting for DF resumes. As the previous DF round was DF4 (week 4), counting will start for DF5.
Thu Oct 6: First ve rewards distribution to veOCEAN holders available. Rewards = DF5 payout + cut of Ocean fees. DF6 counting starts.
Thu Oct 13: Next ve distribution: DF6 payout + cut of fees; DF7 counting starts.
Thu Oct 20: Next ve distribution: DF7 payout + cut of fees; DF8 counting starts.
And so on, for each week.

DF will follow phases as before. When it resumes, it will be with a new “DF/VE Alpha” phase for 4 weeks. “DF Beta” and “DF Main” phases will follow, as previously targeted. Details are below. As the first phase is alpha, the process may be rough at the beginning; we will iteratively improve this over weeks and months as we evolve from alpha to beta to main.

veOCEAN holders can claim their rewards distribution via the webapp, or smart contracts directly.

3. Details: veOCEAN

3.1 veOCEAN Overview

veOCEAN will operate as described in “Introducing veOCEAN”. This section gives the highlights.

ve tokens have been introduced by several projects such as Curve and Balancer. These tokens require users to lock project tokens in return for ve<project tokens>.

In exchange for locking tokens, users can earn rewards. The amount of reward depends on how long the tokens are locked for. Furthermore, veTokens can be used for voting in grants DAOs or for asset curation.

We are rolling out veOCEAN to give token holders the ability to lock OCEAN to earn yield, curate data, and [eventually] vote on OceanDAO proposals.

People can lock their OCEAN for up to 4 years to get veOCEAN. If someone locks 1,000 OCEAN, they get 1,000 OCEAN back at the end, plus rewards along the way.

veOCEAN supports passive locking of OCEAN by default. Users can get higher yield by active curation of data assets in the DF setting.

3.2 veOCEAN Core Idea

The core idea is: lock OCEAN for longer for higher rewards and more voting power. A locker can be passive, though they earn more if active.

You receive proportionally more veOCEAN for longer lock times, as follows:

lock 1 OCEAN for 1 week → get 0.0048 veOCEAN = 1 / (4 * 52)
…
lock 1 OCEAN for 1 year → get 0.25 veOCEAN
lock 1 OCEAN for 2 years → get 0.50 veOCEAN
lock 1 OCEAN for 3 years → get 0.75 veOCEAN
lock 1 OCEAN for 4 years → get 1.0 veOCEAN

Critically, veOCEAN cannot be unlocked before the pre-set time. If you’ve locked some OCEAN for a year, you can’t unlock it during that time. Your veOCEAN balance decays linearly over time; the amount of OCEAN you’ll receive when the lock ends will always be equal to the amount you locked.
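The multiplier table and the linear decay can be sketched as simple arithmetic (illustrative only, not the veOCEAN contract):

```python
MAX_WEEKS = 4 * 52  # maximum lock: 4 years

def veocean_at_lock(ocean, lock_weeks):
    """veOCEAN received for locking `ocean` tokens for `lock_weeks` (linear multiplier)."""
    return ocean * lock_weeks / MAX_WEEKS

def veocean_balance(ocean, lock_weeks, weeks_elapsed):
    """Balance decays linearly to zero as the lock expires; the locked OCEAN itself is unchanged."""
    remaining = max(0, lock_weeks - weeks_elapsed)
    return ocean * remaining / MAX_WEEKS

print(veocean_at_lock(1, 52))             # 1 OCEAN locked for 1 year → 0.25 veOCEAN
print(veocean_balance(1000, 2 * 52, 52))  # halfway through a 2-year lock → 250.0
```

Note that only the veOCEAN balance decays; the OCEAN returned at unlock is always the full amount locked.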

You can always extend your lock time or increase the lock amount. But lock time cannot be decreased.

veOCEAN is non-transferable. You can’t send it to others.

3.3 veOCEAN Earnings

veOCEAN holders have earnings from two sources:

Community fees. Every transaction in Ocean Market and the Ocean backend generates transaction fees, some of which go to the community. 50% of the community fees will go to veOCEAN holders; the rest will go to OceanDAO grants, etc. All earnings here are passive.
Data Farming: veOCEAN holders will get each weekly DF rewards allocation, except a small carveout for DF Crunch. For DF rewards, veOCEAN holders can be passive, though they will earn more if active. “Being active” means allocating veOCEAN to promising data assets (data NFTs). Then, rewards follow the usual DF formula: DCV * stake. But now, stake is the amount of veOCEAN allocated to the data asset, rather than liquidity in a datatoken pool. (And this stake is safe: you can’t lose your OCEAN as it is merely locked.)

All earnings for veOCEAN holders are claimable on Ethereum mainnet. (Data assets for DF may be published on any network where Ocean is deployed in production: Eth mainnet, Polygon, etc.)

There’s a new DF round every week; in line with this, there’s a new ve distribution “epoch” every week. This affects when you can first claim rewards. Specifically, if you lock OCEAN on day x, you’ll be able to claim rewards on the first ve epoch that begins after day x+7. Put another way, from the time you lock OCEAN, you must wait at least a week, and up to two weeks, to be able to claim rewards. (This behavior is inherited from veCRV. Here’s the code. )
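Under stated assumptions (weekly epochs beginning at day 0, and “begins after day x+7” meaning the first weekly boundary at or after x+7), the earliest claim day can be sketched as:

```python
import math

def first_claim_day(lock_day, epoch_length=7):
    """First weekly epoch boundary that begins on or after lock_day + epoch_length.

    Illustrative model of the veCRV-inherited behaviour described above,
    not the actual contract logic.
    """
    return math.ceil((lock_day + epoch_length) / epoch_length) * epoch_length

print(first_claim_day(0))  # lock exactly at an epoch boundary → claim one week later, day 7
print(first_claim_day(1))  # lock mid-epoch → wait until the boundary at day 14
```

This reproduces the stated range: the wait is at least one week (locking right at a boundary) and just under two weeks (locking just after one).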

3.6 Flow of Value

The image below illustrates the flow of value. On the left, at time 0, the user locks their OCEAN into the veOCEAN contract, and receives veOCEAN. In the middle, the veOCEAN holder receives OCEAN rewards every time there’s revenue to the Ocean Protocol Community (top), and also as part of DF rewards (bottom). On the right, when the lock expires (e.g. 4 years) then the user is able to move all their OCEAN around again.

Flow of value

4. Details: Data Farming

4.1 DF Overview

DF incentivizes growth of data consume volume in the Ocean ecosystem. It rewards OCEAN to veOCEAN holders who curate towards data assets with high consume volume. DF’s aim is to achieve a minimum supply of data for network effects to kick in and, once the network flywheel is spinning, to increase its growth rate.

DF will operate as described in “Ocean Data Farming is Launching”, with a key change: “staking” now means allocating veOCEAN to data assets, rather than LPing into data pools. And veOCEAN means OCEAN is locked for up to four years. Since it doesn’t involve pools (AMMs), there is no risk of impermanent loss. DF now supports fixed-price assets, and free assets (where “price” = gas cost to consume).

The rest of this section gives the highlights from the post.

4.2 DF Schedule

DF has these phases:

[completed] DF Alpha. Counting started Thu June 16, 2022. 10K OCEAN rewards were budgeted per week. Rewards were distributed at the end of every week, for the activity of the previous week. It ran 4 weeks.
[new] DF/VE Alpha. Counting starts Thu Sep 29, 2022. 10K OCEAN rewards are budgeted per week. Rewards are distributed at the end of every week, for the activity of the previous week. It runs 4 weeks. The aim is to test technology, learn, and onboard data publishers.
DF Beta. Counting starts Thu Oct 27, 2022. Rewards are up to 100K OCEAN per week. It runs up to 20 weeks. The aim is to test the effect of larger incentives, learn, and refine the technology.
DF Main. Immediately follows DF Beta. Rewards are up to 1.6M OCEAN per week. It runs for decades; at least 481.8M OCEAN total is committed.

4.3 DF Reward Function

The reward going to a veOCEAN holder for a given Ocean data asset depends on the amount of veOCEAN they allocated to the asset, and how much that asset is being consumed ($ volume). Reward is:

RFij = Sij * Cj

where

Sij is the amount of veOCEAN that user i has allocated to asset j
Cj is the data consume volume for asset j, in $
veOCEAN holders receive rewards pro-rata according to RFij

For priced data, consume volume = Cj = ($ price of asset) x (number of consumes of the asset in the week). DF now handles free data as well, where gas fees act as “price”: Cj = ($ gas paid at time of consume) x (number of consumes of the asset).

The higher the consume volume, the more the rewards. The more veOCEAN allocated, the more the rewards.

For a given week, OCEAN rewarded is bounded by the OCEAN budget, and keeping APY ≤ 125%.
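A minimal illustrative sketch (not Ocean’s production code; the user and asset names are made up) of the pro-rata payout implied by RFij = Sij * Cj:

```python
def df_rewards(stakes, consume_volume, weekly_budget):
    """Split `weekly_budget` OCEAN pro-rata by RF_ij = S_ij * C_j.

    stakes: {user: {asset: veOCEAN allocated}}
    consume_volume: {asset: data consume volume in $}
    """
    rf = {
        user: sum(s * consume_volume.get(asset, 0) for asset, s in allocs.items())
        for user, allocs in stakes.items()
    }
    total = sum(rf.values())
    if total == 0:
        return {user: 0.0 for user in rf}
    return {user: weekly_budget * r / total for user, r in rf.items()}

# Hypothetical example: two curators, two assets.
stakes = {"alice": {"dataset1": 100}, "bob": {"dataset1": 100, "algo2": 200}}
dcv = {"dataset1": 50.0, "algo2": 25.0}  # $ consume volume per asset this week
print(df_rewards(stakes, dcv, weekly_budget=10_000))
```

The weekly OCEAN budget bound and the 125% APY cap are deliberately omitted here for brevity.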

This is the reward function for the next DF round (DF5). Based on learnings, we can expect it to evolve in the rounds that follow (DF6, DF7, etc.).

4.3 Data Assets that Qualify

Data assets that have veOCEAN allocated towards them get DF rewards.

The data asset may be of any type — dataset (for static URIs) or algorithm for Compute-to-Data. The data asset may be fixed-price or free. If fixed-price, any token of exchange is fine (OCEAN, H2O, USDC, etc.).

To qualify for DF, a data asset must also:

Have been created by Ocean smart contracts, deployed by OPF to production networks
Be visible on Ocean Market
Not be in purgatory

4.4 Data Farming Q&A

Q: I staked for just one day. What rewards might I expect?

At least 50 snapshots are randomly taken throughout the week. If you’ve staked for just one day, all else being equal, you should expect about 1/7 of the rewards compared to the full 7 days.
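A quick simulation (assuming, for illustration, that snapshot times are uniform over the week) shows why one staked day should average about 1/7 of the full-week rewards:

```python
import random

random.seed(0)

def staked_fraction(staked_days, n_snapshots=50):
    """Fraction of random weekly snapshots that land while the stake was active."""
    hits = sum(random.uniform(0, 7) < staked_days for _ in range(n_snapshots))
    return hits / n_snapshots

# Averaged over many weeks, the fraction converges to staked_days / 7.
avg = sum(staked_fraction(1) for _ in range(2000)) / 2000
print(round(avg, 3))
```

Any single week of 50 snapshots is noisy, but the expected reward share is simply the fraction of the week the stake was active.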

Q: The datatoken price may change throughout the week. What price is taken in the DCV calculation?

The price is taken at the same time as each consume. E.g. if a data asset has three consumes, where price was 1 OCEAN when the first consume happened, and the price was 10 OCEAN when the other consumes happened, then the total DCV for the asset is 1 + 10 + 10 = 21.

Q: Can the rewards function change during a given week?

No. At the beginning of a new DF round (DF1, DF2, etc.), the rules are laid out, either implicitly (if unchanged from the previous round) or explicitly in a blog post (if there are new rules). This covers the rewards function, bounds, etc. Then teams stake, buy data, consume, etc., and LPs are given DF rewards based on staking, DCV, etc. at the end of the week. The overall cycle time is one week.

Caveat: that’s the theory, at least! Sometimes there may be tweaks, if there is community consensus or a bug.

5. Evolution of OceanDAO

With veOCEAN, OceanDAO evolves to be more like CurveDAO:

- ve is at the heart, with v = voting (in OceanDAO grants) and e = escrowed (locked) OCEAN. The longer the lockup, the more voting power and rewards, which reconciles near- and long-term DAO incentives.
- The DAO has an increased bias to automation, and to minimizing the governance attack surface.
- The 143.8M OCEAN that was originally earmarked for a DAO treasury will go into DF instead. And 122.3M OCEAN earmarked for grants will go to DF (>21.5M OCEAN remains for grants). This is on top of 215.7M OCEAN previously allocated. Therefore DF now has 481.8M OCEAN allocated; this is 34.2% of total OCEAN supply (1.41B OCEAN).

6. Walk-Through Numbers

This section walks through example numbers. It’s basically the same as in the previous DF launch post, because DF/VE Alpha phase has the same parameters as the previous DF Alpha phase.

The first subsection describes OCEAN disbursement with the most aggressive possible schedule; the subsection after describes a conservative schedule. The likely scenario is somewhere in between. A third subsection describes possible APYs.

6.1 OCEAN Schedule with Aggressive Ramp

Here’s the scenario that ramps through DF most aggressively: DF Beta takes 10 weeks and emits 100K OCEAN per week, and each multiplier in DF Main takes 0 time (goes to the 100% multiplier immediately). All plots are computed from this Google Sheet.

The image below shows the first half-year for the aggressive-ramp scenario. The y-axis is OCEAN released each week. It’s log-scaled to easily see the differences. The x-axis is time, measured in weeks. Week 0 is Oct 6, 2022. We can see the distinct phases for DF/VE Alpha, DF Beta, then DF Main.

OCEAN released to DF per week — first 0.5 years, aggressive ramp

The image below is like the previous one, but now shows the first five years. DF Main starts at week 14 with full-blown OCEAN emissions. Recall that DF Main emission follows a Bitcoin-style curve where the rewards decrease according to an exponential with a 4-year half-life. With each DF Main multiplier taking 0 time, half of all the DF Main rewards are distributed in its first 4 years.
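A minimal sketch of such a half-life schedule at weekly granularity (the function and parameter names are illustrative, not the actual DF emission code):

```python
def weekly_emission(week, initial_weekly, half_life_weeks=4 * 52):
    """OCEAN emitted in a given week under a smooth exponential-decay
    ("Bitcoin-style") schedule with a 4-year half-life."""
    return initial_weekly * 0.5 ** (week / half_life_weeks)
```

By construction, the emission at week 208 (4 years in) is exactly half the week-0 emission, so half of the total rewards fall in the first 4 years.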

OCEAN released to DF per week — first 5 years, aggressive ramp

The image below shows the total OCEAN released by DF for aggressive-ramp scenario. The y-axis is log-scaled to capture both the small initial rewards and exponentially larger values later on. The x-axis is also log-scaled so that we can more readily see how the curve converges over time.

Total OCEAN released to DF — long term, aggressive ramp

6.2 OCEAN Schedule with Conservative Ramp

Here is the opposite-extreme scenario, with a highly conservative ramp of DF rewards. DF/VE Alpha runs at 10K OCEAN per week, over four weeks. DF Beta runs at 10K OCEAN per week, over 20 weeks. DF Main starts at the 10% multiplier, transitions to 25% at 12 months, to 50% at 18 months, and to 100% at 24 months. The image below shows the first five years.

OCEAN released to DF per week — first 5 years, conservative ramp

The image below shows the total OCEAN emitted for DF. The y-axis is log-scaled to capture both the small initial rewards and exponentially larger values later on. The x-axis is also log-scaled so that we can more readily see how the curve converges over time.

DF Beta starts at week 8, where the curve gets steeper. DF Main starts at week 20, where the curve is again more gradual. The curve gets steeper with each multiplier ratchet at weeks 48, 72, and 98, finally arriving at a shape just like the Bitcoin exponential curve.

Total OCEAN released to DF — long term, conservative ramp

6.3 Example APYs

The two tables below show estimated possible APYs, from an upper bound (first table) to lower bound (second table). They are estimates because they have to make assumptions about the amount of OCEAN staked.

- Upper bound APYs (approximate). This would result from a combination of an aggressive OCEAN emissions schedule (parameters from above) and a lower community rate of OCEAN staking. For the latter: it assumes 2M OCEAN staked in week 0, OCEAN staking growth initially 20% per week, and a growth rate that reduces by 1% relative per week.
- Lower bound APYs (approximate). This would result from a combination of a conservative OCEAN emissions schedule (parameters from above) and a higher community rate of OCEAN staking. For the latter: it assumes 5M OCEAN staked in week 0, OCEAN staking growth initially 30% per week, a growth rate that reduces by 1% relative per week, and a maximum OCEAN ever staked of 75% of OCEAN emitted.
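The staking-growth assumption behind these tables can be sketched as follows (an illustrative model only, not the actual spreadsheet's formulas):

```python
def projected_staked(weeks, initial, growth0, rel_decay=0.01):
    """Week-by-week OCEAN staked: growth starts at `growth0` per week and
    shrinks by `rel_decay` (relative) each week."""
    series = [float(initial)]
    g = growth0
    for _ in range(weeks):
        series.append(series[-1] * (1.0 + g))
        g *= (1.0 - rel_decay)
    return series

# Upper-bound-APY scenario: 2M OCEAN staked in week 0, 20% initial weekly growth.
upper = projected_staked(2, 2_000_000, 0.20)
```

Week 1 is 2M × 1.20 = 2.4M OCEAN; week 2 grows by 20% × 0.99 = 19.8%, i.e. 2.4M × 1.198.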

The tables are calculated from this Google Sheet.

The numbers assume that OCEAN is locked for 4 years, so that 1 OCEAN → 1 veOCEAN. If you lock for 2 years, divide the APY by 2; if you lock for 1 year, divide it by 4. The numbers assume active veOCEAN allocation towards data assets. For passive locking, divide the APY by 2 (approximate).
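Those divisors can be collected into one helper (an illustrative sketch; the actual reward math lives in the DF contracts and scripts):

```python
def effective_apy(active_4yr_apy, lock_years=4, active=True):
    """Scale the headline APY: a 4-year lock gives 1 OCEAN -> 1 veOCEAN,
    a 2-year lock gives half, a 1-year lock a quarter; passive locking
    (no allocation to data assets) roughly halves it again."""
    apy = active_4yr_apy * lock_years / 4
    if not active:
        apy /= 2
    return apy
```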

7. On Security

veOCEAN core contracts use veCRV contracts with zero changes, on purpose: the veCRV contracts have been battle-tested for two years and have not had security issues.

We have built a new contract for users to point their veOCEAN towards given data assets (“allocate veOCEAN”). These new contracts do not control the veOCEAN core contracts at all. In the event of a breach, the only funds at risk would be the rewards distributed for a single week; and we would be able to redirect future funds to a different contract.

We have an ongoing bug bounty via Immunefi for Ocean software, including veOCEAN and DF components. If you identify an issue, please report it there and get rewarded.

8. Conclusion

veOCEAN goes live on Mon Sep 26, and Data Farming resumes three days later. veOCEAN (with DF) has passive rewards for locking OCEAN for veOCEAN, and higher rewards for veOCEAN holders who actively allocate veOCEAN towards data with high Data Consume Volume (DCV).

Appendix: Contract Deployments

The veOCEAN & DF contracts are deployed to Ethereum mainnet, alongside other Ocean contract deployments. Full list.

{
  "veOCEAN": "0xE86Bf3B0D3a20444DE7c78932ACe6e5EfFE92379",
  "veAllocate": "0x55567E038b0a50283084ae773FA433a5029822d3",
  "veDelegation": "0xc768eDF2d21fe00ef5804A7Caa775E877e65A70E",
  "veFeeDistributor": "0x256c54219816603BB8327F9019533B020a76e936",
  "veDelegationProxy": "0x45E3BEc7D139Cd8Ed7FeB161F3B094688ddB0c20",
  "veFeeEstimate": "0xe97a787420eD263583689Bd35B7Db1952A94710d",
  "SmartWalletChecker": "0xd7ddf62257A41cc6cdAd7A3d36e4f1d925fD142a",
  "DFRewards": "0xFe27534EA0c016634b2DaA97Ae3eF43fEe71EEB0",
  "DFStrategyV1": "0x545138e8D76C304C916B1261B3f6c446fe4f63e3"
}

veFeeDistributor has a start_time of 1663804800 (Thu Sep 22 2022 00:00:00 UTC)
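As a sketch of how a client might consume this list, here is a basic shape check on the addresses (illustrative; it is not EIP-55 checksum validation):

```python
import json

# Two entries copied from the deployment list above; the shape check below
# verifies only the address format, not on-chain existence.
DEPLOYMENTS = json.loads("""
{
  "veOCEAN": "0xE86Bf3B0D3a20444DE7c78932ACe6e5EfFE92379",
  "DFRewards": "0xFe27534EA0c016634b2DaA97Ae3eF43fEe71EEB0"
}
""")

def looks_like_eth_address(addr):
    """0x prefix followed by exactly 40 hex characters."""
    hex_digits = set("0123456789abcdefABCDEF")
    return addr.startswith("0x") and len(addr) == 42 and \
        all(c in hex_digits for c in addr[2:])

all_ok = all(looks_like_eth_address(a) for a in DEPLOYMENTS.values())
```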

Follow Ocean Protocol on Twitter, Telegram, or GitHub. And chat directly with the Ocean community on Discord.

veOCEAN is Launching, Data Farming is Resuming was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


SC Media - Identity and Access

Passwordless Authentication: Getting Started on Your Passwordless Journey: Part 2

Dumping passwords and going fully passwordless may seem like an impossible task, but there are tried-and-true ways it can be done. Here's how to get your organization to move beyond passwords for good.



KYC Chain

Key pillars of an AML & KYC strategy for fund managers

Private fund managers need to balance a complex range of considerations in their operations – from risk and return on investment (ROI) to long-term investment strategies and compliance with financial regulations. In this article, we outline six key pillars of an AML & KYC strategy that can help fund managers realize their compliance goals. The post Key pillars of an AML & KYC strategy for fund managers appeared first on KYC Chain.

Infocert (IT)

GoSign Desktop calculates your ecological footprint: go greener thanks to digital signatures

How much do our behaviors affect the environment?

Making an exact estimate is not simple: each person’s “carbon footprint” can be more or less significant depending on the actions they take day to day, both in their private life and at work.

Through its Digital Trust solutions, InfoCert helps its customers innovate their business processes, integrating the environmental, social, and governance objectives that feed into their ESG rating.

Starting today, thanks to the latest release of GoSign Desktop, it is even easier to calculate and monitor how much using digital signatures reduces environmental impact in terms of paper and CO2 saved.

GoSign Desktop records your ecological footprint!

With the latest version of GoSign Desktop you can measure your own “green contribution”: every signature transaction helps reduce your carbon footprint. Based on the number of signatures applied, users can reach different “sustainability levels”, starting from friend and going up to warrior.

GoSign Desktop also shows how much the InfoCert community is helping to contain CO2 emissions and the waste of resources.

With GoSign Desktop you can calculate what the InfoCert community has helped save in terms of:

- Number of sheets of paper
- Liters of water
- Kg of CO2

Download GoSign Desktop now. Our digital trust, your sustainability – InfoCert

The post GoSign Desktop calculates your ecological footprint: go greener thanks to digital signatures appeared first on InfoCert.


SelfKey

Soulbound NFTs

SoulBound NFT’s significant characteristics show how crucial non-transferable NFTs will be for the growth of Web3. The post Soulbound NFTs appeared first on SelfKey.



Spark Plus Tech

Crypto banking & payment ecosystem for the future.


There are many people who are familiar with the idea of cryptocurrency. Over time, cryptocurrency has succeeded in drawing attention from all across the world, and cryptocurrency payments are rapidly being accepted by retailers, individuals, and enterprises. While cash must be handed over manually, online payments make it simple to accept cryptocurrency and save time. As society and digital money have progressed well beyond Bitcoin, the single-currency wallet is becoming increasingly unviable. At the same time, many investors have found these digital currencies a disappointment: people have lost their crypto as a result of inadequate security.

In addition, with over 150 other cryptocurrencies in circulation, one would have to get familiar with the technical lingo of each currency’s wallet in order to keep them in separate single-cryptocurrency wallets, which would be a hassle. In these circumstances, multi-crypto wallets are really helpful, because they offer a wide range of services in addition to being safe and secure. Since multicurrency wallets are vital to sending, receiving, and storing crypto assets, it is essential that they be secure and easy to use.

As the payment industry evolves, SPARK+ is utilizing genuine and distinctive innovation from blockchain technology to help develop smarter firms. The SPARK+ multicurrency wallet solution enables one to manage money wisely and makes it easier to handle different currencies in one place. This multi-currency wallet solution upholds the strictest security protocols to protect your information. It is an adaptable and trustworthy multi-cryptocurrency wallet solution that accepts Bitcoin, major altcoins, and other digital assets. Based on the client’s demands, our wallet may be adjusted to increase and improve trading opportunities.

The platform supports a wide range of cryptocurrencies from different blockchain networks such as Ethereum, Binance Smart Chain, Tron, Bitcoin, Polygon, Avalanche, Harmony, and so on. It enables users to check balances and transaction history, send and receive coins, and exchange coins with ease, all within a few clicks and on one platform. It hides the complexities of decentralized systems from users and enables smooth management of your crypto and fiat currencies.

Instead of relying just on your local population, you may expand your business beyond international borders. Because blockchain transactions are rapid and have no geographical limitations, they broaden the reach of your business.

Many cryptocurrency investors may be looking for a means to put their cryptocurrencies to use. By implementing a multicurrency wallet in your company or business, you can simply request cryptocurrency from clients and have your payment in a flash! Businesses can sell services and goods to everyone who holds cryptocurrency, boosting your company’s reach, prospects, and profitability.

This Spark+ multi-currency wallet solution, with best-in-class security features and an easy-to-use interface, can be white-labeled by businesses to stay ahead of the competition and broaden the reach of their business. For any more details feel free to reach out to us on hello@sparkplustech.com. We would love to hear from you.


SC Media - Identity and Access

Credential stuffing attacks escalate

BleepingComputer reports that more than 10 billion credential stuffing attempts have been recorded by Okta during the first three months of 2022, accounting for 34% of overall authentication traffic.



Trinsic (was streetcred)

What’s the Difference Between an NFT and a Verifiable Credential?

An In-depth Guide Comparing Non-Fungible Tokens (NFTs) and Verifiable Credentials (VCs)

Why We Need a Better System for Digital Identity (Tomislav’s Story)

When my co-founder Tomislav applied to move to a new apartment, they requested his personal information to prove he would be eligible and able to pay rent. He sent in photographs of his government ID, marriage certificate, bank statements, and more. Someone at the rental agency tried to take his information and withdraw his life’s savings from his bank account.

At Trinsic we’re obsessed with creating a new paradigm that allows people to share personal information in a convenient and privacy-preserving manner. Verifiable credentials make this possible, but people are just beginning to understand what VCs are and what they enable. This post will examine the world of verifiable credentials and answer a number of common questions that people have when exploring the space of decentralized identity.

A Quick Definition of Digital Identity

Both NFTs and verifiable credentials have attracted people who are interested in the future of digital identity, so we’re doing a deep dive on the benefits and drawbacks of both technologies.

First, a definition for Identity:

Identity = identifiers + attributes


Here’s what this equation has looked like over time:

- Physical example: name (identifier) + hair color, height, occupation (attributes)
- Web2 example: username or email (identifier) + profile picture, bio, posts (attributes)
- Web3 example: public cryptocurrency wallet address (identifier) + NFTs and tokens you hold (attributes)
- Decentralized identity example: DID address (identifier) + verifiable credentials (attributes)

What Are Verifiable Credentials?

Verifiable credentials are data containers for personal information. They are JSON (JavaScript Object Notation) documents with semantic meaning and a structured schema, signed by a public-private key pair. Timothy Ruff has compared them to shipping containers: shipping containers revolutionized the world, yet they don’t seem very revolutionary to the average person. The same goes for how our data flows online.

Verifiable credentials standardize the format of information allowing you to own your data and selectively share it with relevant parties across the internet. Instead of hosting information in siloes that only interact through API integrations, VCs put the power back in your own hands.
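To make the "data container" idea concrete, here is a minimal sketch of a credential in the W3C VC data-model shape (the DIDs and the rent claim are invented for illustration, and the proof is a placeholder where a real issuer signature would go):

```python
import json

# A minimal credential skeleton in the W3C VC data-model shape.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:bank123",          # hypothetical issuer DID
    "issuanceDate": "2022-09-27T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder456",        # hypothetical holder DID
        "canPayMonthlyRent": True,
    },
    "proof": {
        "type": "Ed25519Signature2020",
        "proofValue": "<issuer signature over the payload>",  # placeholder
    },
}

serialized = json.dumps(credential)
```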

What Are Non-Fungible Tokens?

Non-Fungible Tokens are unique representations of an asset on a blockchain, like Ethereum or Solana. NFTs can be bought and sold in marketplaces like OpenSea and Magic Eden. Thus far, NFTs have frequently been speculative “assets” used to represent community memberships, profile pictures, art, metaverse land, and more. NFTs are blockchain-specific, so a token that is on the Ethereum blockchain cannot be transferred to the Solana blockchain. 

Implications of Verifiable Credentials

Verifiable credentials allow you to own your data, making the process of sharing your data easier and more privacy-preserving. For example, sharing verifiable credentials could make the process of applying for an apartment easier. The purpose of checking documentation like bank statements and identity documents is to prove that you are who you say you are, and you can be trusted to pay rent. Currently, you have to show documentation and identification to each individual vendor you’re applying to. Instead, imagine that you held credentials in a digital wallet that could provide assurances about your identity and trustworthiness to pay rent.

Rather than sharing all of your personal information with each apartment vendor, you could share a cryptographic proof that you can pay rent. Concretely, what would be needed to make this a reality would be a trusted issuer, like a bank, who already has access to your financial history and could issue you a credential verifying that you’re able to pay monthly rent up to a certain amount each month. Now, instead of sharing documentation with the apartment vendor, you can instead share the credential and the apartment vendor can verify its legitimacy. The result is a more privacy-preserving system for sharing personal information.

Implications of Non-Fungible Tokens

We’re seeing NFTs used primarily as a form of purchasable membership pass, allowing people special access to channels in a Discord server or to exclusive parties at events like NFT NYC. People also use NFTs as profile pictures and social signals on platforms like Twitter. NFTs could also be used in video games, allowing people to own in-game assets and monetize in new ways. Additionally, there are people selling their digital art as NFTs, giving them access to a worldwide audience of potential collectors.

NFTs facilitate the seamless buying and selling of digital assets. Because they are closely tied with blockchains, and often associated with speculative value, NFTs are inherently financial. Anyone can make an “offer” on an NFT that you own. You pay fees to buy or even transfer NFTs on most blockchains.

Technical Differences Between NFTs and VCs

Public Wallet Addresses vs. Decentralized Identifiers (DIDs)

NFTs are associated with static, public wallet addresses. So if I were to share my wallet address, you would be able to view every NFT that I own and have ever owned. This is beneficial for displaying your collection but has obvious downsides for privacy. Public wallet addresses are not a great place to store your private data.

Verifiable credentials are associated with decentralized identifiers (DIDs). VCs are not publicly viewable but instead are selectively shared with verifying parties as needed. You can have a VC issued to your wallet without the whole internet knowing. While you can make your DID public, which is very useful if you are issuing and signing credentials as a trusted authority, you can also rotate DIDs as a user interacting with different vendors.

The Role of Cryptographic Signatures

Since NFTs are associated with public addresses, you can easily check if someone’s wallet address holds a certain asset. The importance of cryptographic signatures is to prove that you are the person who is in control of the wallet. For example, if you go to an event and you are only allowed in if you hold a certain NFT, you will have to cryptographically sign a message from the wallet that holds the NFT to prove you’re the real owner.

When dealing with verifiable credentials, it’s not publicly known which VCs are associated with which DIDs. If you needed to hold a certain credential to enter an event, the vendor would request a cryptographic proof that you hold the required credential. You would then have the option to share the credential with that vendor, and they can check the legitimacy of the proof. Your information would only be shared with the event host and would otherwise remain encrypted and private to you.
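The issue/present/verify flow can be sketched end to end. This is a toy model: HMAC over a shared secret stands in for the issuer's real public-key signature, and every name here is illustrative:

```python
import hashlib
import hmac
import json

ISSUER_KEY = b"issuer-signing-secret"  # stand-in for the issuer's private key

def issue(claims):
    """Issuer signs the claims and hands the credential to the holder."""
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": sig}

def verify(credential):
    """Verifier recomputes the signature to check the credential's legitimacy."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

vc = issue({"holder": "did:example:alice", "over21": True})
tampered = {"claims": {**vc["claims"], "over21": False},
            "signature": vc["signature"]}
```

A genuine credential verifies; a tampered claim fails, which is what lets the event host trust the presentation without seeing anything else about the holder.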

How Blockchain Relates to NFTs and VCs

NFTs are minted on specific blockchains, establishing a unique identifying address and a link to the digital asset. Transferring an NFT requires paying ledger fees on its specific blockchain. NFTs are moved from public address to public address in easily trackable transactions. A blockchain is required for any minting and transferring of an NFT.

In comparison, verifiable credentials are issued to digital wallets, where the credentials are stored and can only be accessed by the wallet owner. Verifiable credentials are not tracked by their movement on a blockchain, and you would have no way of knowing another user received a verifiable credential, or if they presented a credential to a third party. Blockchains are not required for exchanging verifiable credentials, but layer-2 networks like Ion, built on top of Bitcoin, can provide an additional level of security when creating DIDs.

How NFTs and VCs Could Impact Our Digital Identities

Key Benefits of NFTs in Digital Identity

NFTs are great for public and transferrable digital assets. Blockchains allow you to show that a specific wallet address owns a digital item. For example, we know that the owner of Cryptopunk #5822 is the Ethereum address 0xe648730be51893acd0045522a0830ec5399ae74c. This is great for showcasing a collection of things like art and membership passes. For this reason, NFTs can help to establish your reputation publicly as a collector, investor, community member, or donor.

 

NFTs also provide a monetization strategy for artists, designers, and community builders, allowing them to tap into a global, decentralized market. The combination of public display and financial speculation has created viral marketing loops, leading to a large interest in NFTs over the past two years.

Biggest Drawbacks of NFTs in Digital Identity

You would not want to represent identifying documents as NFTs. You cannot “prove” claims about yourself such as “I am 21 years old” using an NFT. Even if the US government issued everyone an ID card NFT, you could sell it.


Note: It is possible to turn off the “transferability” function in a non-fungible token contract. These assets are often referred to as “Soul Bound Tokens” (SBTs) which means they are forever tied to a specific public wallet address. SBTs have more potential for use in an identity context but risk making private information like your date of birth and address public.

Since NFTs are tied to blockchains, you incur fees when transferring them. Thus, if you were to issue tickets to an event as NFTs, you would have an additional cost of paying fees to send the tickets to every attendee’s address. You’re also tied to the speed of that blockchain to complete the transfer of the ticket.

Key Benefits of VCs in Digital Identity

Verifiable credentials are stored privately in a way that doesn’t save any personal information to a public blockchain. You can selectively share information with relying parties without leaking it to the entire internet. VCs do not have to be tied to a blockchain and can be faster and more scalable to implement in the real world. Thanks to DIDs and World Wide Web Consortium (W3C) specifications, it’s possible for VCs to be interoperable across different blockchains and even web2 environments. Lastly, VCs are neither transferable nor purchasable, so you will not run into the issue of someone selling their government-issued ID.

Biggest Drawbacks of VCs in Digital Identity

While privacy and a lack of financial speculation are two huge benefits of using verifiable credentials in digital identity, both factors have limited the social reach and virality of the concept. Search volumes for NFTs are several orders of magnitude higher than for verifiable credentials. When you hear about someone selling an NFT for $10,000 profit, you are likely to look into what’s going on and spend some time understanding the technology. 

Monthly searches for terms related to NFTs and VCs

Verifiable credentials are the more practical technology to digitize our identities and finally allow people to prove who they are on the internet. The challenge of user adoption is on everyone’s mind, and Trinsic is here to make it easy for developers to issue, manage, and verify credentials. If you’re interested, you can get started in our Studio for free right away.

The Future of Digital Identity Is in Our Hands

Both NFTs and VCs provide new approaches to trust digital information. NFTs allow you to publicly prove that you are the unique owner of a digital asset, whereas VCs allow you to privately prove the legitimacy of a claim made over the internet. At their best, both technologies create new avenues for digital coordination that are more humane, efficient, and user-centric.

The future of digital identity will be determined by builders, developers, designers, community builders, and dreamers. If you want to continue the conversation about the future of digital identity, join our Slack community!

The post What’s the Difference Between an NFT and a Verifiable Credential? appeared first on Trinsic.

Thursday, 22. September 2022

KuppingerCole

A Comprehensive Approach to Solving SaaS Complexity


As businesses adopt cloud-based services as part of digital transformation programs to enable flexible working, boost productivity, and increase business agility to remain competitive, many IT and security teams are finding it challenging to gain oversight and control over the multitude of Software as a Service (SaaS) applications. Join security experts from KuppingerCole and Axonius as they discuss how to address common security challenges around SaaS such as a lack of visibility of applications and internal and external threats. They will also look at managing risk and explain how to go about delivering security that can drive and enable business growth.

Richard Hill, Senior Analyst at KuppingerCole, says security starts with an understanding of what IT assets require protection. He will therefore cover the importance of asset inventories, connections to IT applications and services options, compliance, and automation of policy enforcement. Amir Ofek, CEO of Axonius, will explore the key SaaS challenges, the pitfalls of existing approaches to solving them, and how both business value and risk management needs can be addressed in a single comprehensive solution that can also help optimize licensing and spend.




Identosphere Identity Highlights

Identosphere 100!!! Decentralized Ecosystem Governance vs Trust Registries • Cardano on the bandwagon? • OpenWallet Foundation

We do our very best to cram the top SSI\VC\DID news, ideas, and market insight into a single page EVERY WEEK!!!! Consider a monthly contribution to our patreon :)
Welcome to the 100th issue of Identosphere’s Weekly Digest!

Please consider contributing to our Patreon so we can keep rocking out and bring it even harder for the next 100!

Thanks to our Supporters! Contribute on Patreon …or reach out to Kaliya directly

Read previous issues and Subscribe : newsletter.identosphere.net

Content Submissions: newsletter [at] identosphere [dot] net

Upcoming

Messari Crypto - Mainnet2022 9/21-23 -Kim Hamilton Duffy Speaking

Risk Management: Gramm-Leach-Bliley Act Security Compliance 9/29 EdgeSecure

Self-Sovereign Identity - Blockchain’s Killer Application?! 9/27 Cryptovalley. Switzerland/Online 

Zero Trust Authentication – Blockchain government ID with PKI Orchestration 9/27-29

North Capital Forum by @USMexicoFound, 9/28-30 Mexico City - Transmute’s Thursday afternoon panel "Trust and Transparency in Trade"

European Blockchain Week 9/29-10/5 Slovenia and Croatia 

Infrachain Summit 10/4 about real business – not the next hype

Identity Week USA (formerly Connect:ID) Washington DC 10/4-5 (Kaliya and other DIF folks will be there)

Internet Identity Workshop #35 11/14-16, Mountain View CA

Community Project on mDL and VCs

Last week we shared about the Community Project on mDL and VCs

Next week we are hosting two community calls to collect input for the project: on Sept 27th in Asia morning time and on Sept 27th in US morning time.

Meme of the Week

Hiring

Vinícius Niche @viniciusniche of Truvity shares

Hey Tech Twitter, @TruvityHQ (where I work) is hiring engineers for the Infrastructure Developer (Go/Kubernetes) role, details are on the thread

Kaliya met the CEO this week at the Open Source Summit Dublin and was impressed. 

Explainer

TOWARDS A USER-CENTRIC DIGITAL IDENTITY SYSTEM Irish Tech News

Debunking Common Misconceptions About Passwordless Authentication auth0

From DID to DeFi and EVM Metadium

What is Web5? Timothy Ruff

Research \ Report

Cryptography Review of W3C Verifiable Credentials Data Model and DIDs Standards and Cryptography Implementation Recommendations SRI International

SRI focused primarily on the cryptographic algorithms used in the W3C standards, not on blockchain and DLT technologies or their use in operational systems. An algorithmic review is an important starting point for a full, system-level review of compliance with federal standards and other requirements.

Unmasking Power: Alternative Futures for Empowering Our Digital Identities. Chopra, Shreya (2022)

The project is directed primarily toward design and innovation teams, and associated knowledge workers, whose efforts have significant influence on future technologies, platforms, and their impacts. This work explores how we might deconstruct power dynamics prevalent in digital service design today. Through multiple analyses, maps and models of these systems, the paper reveals multiple opportunities for change. 

INNOPAY paper on data sharing published in CEUR Workshop Proceedings Innopay

This week, CEUR-WS.org has published the paper titled ‘Harmonization Profiles for Trusted Data Sharing Between Data Spaces: Striking the Balance between Functionality and Complexity’ in the CEUR Workshop Proceedings. 

Sellafield DLT Field Lab Harnessing the power of distributed ledger technology: how Digital Catapult’s Field Lab methodology can transform your business Condatis

The nuclear sector presents an exciting opportunity to implement advanced digital technologies for driving operational improvements and cultural transformation. Our DLT Field Labs showed how some of the challenges that seemed perplexing at the start of our journey have been deciphered through innovation and collaboration. 

Legal identity of a person in a digital world Vikas Malhotra

Today, Sep 16th is the International Identity Day, a commemoration of the UN Sustainable Development Goal 16.9 which calls for the provision of legal identity for all by 2030.

UNDP LEGAL IDENTITY AGENDA ONLINE FORUM: PRIVATE SECTOR ENGAGEMENT ROUNDTABLES: DATA PROTECTION AND PRIVACY

Marketing McK Insights: Why digital trust truly matters, and what it means for your bottom line Dr. Carsten Stöcker

The results of our survey of more than 1,300 business leaders and 3,000 consumers globally suggest that establishing trust in products and experiences that leverage AI, digital technologies, and data not only meets consumer expectations but also could promote growth.

Instagram “slide show” about SSI Market Potential mehdicherifm

GROWTH
- New revenue growth
- New revenue streams
- Turn regulation in to revenues
- More customer reach
- More efficient operations

Writing for Verifiable Credentials Marketing Workshop Indicio

An Interactive workshop designed to uncover the winning strategies, and pitfalls to avoid, when communicating decentralized identity to customers, internal stakeholders, and the world.

Interop Validated ID is set to complete the AS4EDI20 interoperability tests

This represents the last phase of the AS4EDI20 project to implement the CEF eDelivery AS4 profile in Europe. This project is co-financed by the European Commission through the CEF Telecom program and managed by HaDEA, with action number 2020-EU-IA-0024.

Wallets OpenWallet Foundation Nat Zone

The formation of the OpenWallet Foundation was announced at the Open Source Summit held in Dublin on the evening of the 14th. The OpenWallet Foundation is an open source wallet based on standard protocols

The Launchpad: Introducing the new ID Wallet Global ID

As the user-facing part of the Trust Triangle, your ID Wallet should be beautiful, secure, and convenient. 

An Identity Wallet Bill of Rights - Starting With the Mobile Driver License Spruce Systems

Spruce’s continued mission is to let users control their data across the web, whether it’s web2, web3, or beyond. This also applies to credentials issued by existing entities, such as the Mobile Driver License (mDL) issued by motor vehicle authorities across the world.

Global ID: Introducing our new ID Wallet FUTURE PROOF

Our biggest product release in some time, our new ID Wallet is a core pillar of our mission to enable anyone to create and own their digital identity. We spoke with GlobaliD’s Trey Steinhoff to discuss the launch.

Aries \ Indy \ AnonCreds the dialogue continues Learnings from Aries, Indy and Various Verifiable Credential Implementations Northern Block

Standards such as OIDC and mDL are all now in dialogue with W3C, AnonCreds, Aries, etc. Mobile is a predominant technology, just like the way laptops were once upon a time. To reduce consumer friction and drive adoption, convergence of all these different technologies is required inside a mobile environment

Hyperledger Aries is the Present and the Future of Internet-Scale Trusted Verifiable Credential Ecosystems Indicio

While no technology runs perfectly on every device, a signal strength of Aries, AnonCreds, and Indy is that they work on the vast majority of current devices and systems, including $35 smart phones and low powered IOT/embedded devices. They represent the most inclusive way into this technology, which is an important factor in their popularity.

AnonCreds Indy-Pendence Cheqd

Part 1: Decoupling the reliance on Hyperledger Indy and creating more extensible AnonCreds Objects with cheqd.

Standards Work Premature Standardization & Interoperability Continuum Loop

Here’s my premise – we don’t have standards nor interoperability – at least not as people really need. We have been through a process that is powerful and good – but what we have is what I call “premature standardization.” It’s a great start but nowhere near where things will be.

Notes from W3C TPAC on major deployments of Verifiable Credentials Manu Sporny via Phil Archer

Steel, Oil, Agriculture Shipments into US Customs ($2.3T in goods/year)

European Digital Wallet (€163M funding, 450M people)

Digital Education Credentials in Uganda, Nigeria, Kenya (323M people)

Digital Age Verification (152k retail stores, 200M people)

Content Authenticity Initiative (30M Adobe customers)

Digital Permanent Resident Cards (14M people)

IDnow joins Accelerate@IATA to shape the future of seamless air travel IDnow

The goal of IATA One ID is to set industry standards that further streamline the passenger journey with digitalization of admissibility and a contactless process through secure biometric enabled identification.

Cardano showing interest in our work Advancing digital identity through DID core specification IOHK

Good to see Cardano jumping on the bandwagon; it looks like they will join the fray and bring DIDs and VCs to Atala PRISM.

The recent DID core specification approval at the World Wide Web Consortium (W3C) provided clearer and stronger foundations for identity platforms building decentralized identifiers.

Governance Decentralized Ecosystem Governance: Better, More Effective, and More Robust than Trust Registries Indicio

Decentralized Ecosystem Governance makes verifying data an easy-to-play game of red light/green light. And, importantly, it decentralizes governance to the appropriate authorities.

Trust Registries Tweetstorm Continuum Loop 

We want to start a conversation on Trust Registries and get people thinking about how Trust Registries will help answer the hard questions an ecosystem needs to create a whole experience [tweetstorm]

DAOs are not corporations: where decentralization in autonomous organizations matters Vitalik Buterin

Because DAOs do not have a sovereign above them, and are often explicitly in the business of providing services (like currency and arbitration) that are typically reserved for sovereigns, it is precisely the design of sovereigns (political science), and not the design of corporate governance, that DAOs have more to learn from.

Web 3 Identity and Web3 auth0

A key opportunity Web3 presents in the identity space is the ability to interact with a user's blockchain data. This presents two benefits: enriching user profiles and streamlining the login process with federated logins using storage wallets.

Infura Announces Plan to Foster a Decentralized Infrastructure Ecosystem ConsenSys Blog

Decentralizing access to blockchain APIs is a vital step to improve network uptime and importantly, give people sovereignty of their personal data.

Ocean Protocol joins leading Web3 projects on the €20M+ Gaia-X moveID initiative to advance pan-European mobility Ocean Protocol

Ocean Protocol, the Web3 platform to unlock data services for AI and business innovation, has joined forces with Chainstep, Datarella, Fetch.ai, peaq and 51nodes to develop the system architecture for European mobility with the preservation of data autonomy as its core principle, within the Gaia-X moveID project.

Dock’s Web3 ID Now Available on Auth0 Marketplace Dock

Dock has partnered with Auth0, one of the world’s leading identity management companies. Auth0 has added the support for Dock’s Web3 IDs in their marketplace integration to enable Auth0’s enterprise customers to integrate Web3 IDs on their platforms.

Ethereum Merge The Merge Is Done. What’s Next for the Ethereum Ecosystem? ConsenSys Blog

Now that Ethereum runs on a PoS consensus mechanism, builders will be able to start using a form of data partitioning to improve throughput, called sharding. Sharding is a horizontal scaling technique common among other major database providers. 

Ethereum Blockchain Eliminates 99.99% of its Carbon Footprint Overnight After a Successful Merge According to New Report ConsenSys Blog

This upgrade transforms Ethereum, the world’s first and largest smart contract platform, into an almost net-zero technology positioned for sustainable future growth

Use Cases Transmute U.S. CBP Steel Tech Demo [video]

The story focuses on critical trade verifiable credentials being issued, presented, and verified by trade, CBP, and PGAs.

InfoCert, AUTHADA and Dr. Ing. Wandrei develop a new tool for QES in the circular economy Infocert

Signatures can now be created on mobile devices such as smartphones and tablets with the new NSUITE.mobile product, streamlining the entire process.

More Security in the Internet of Things – Thanks to ETO ETO

ETO uses a network of decentralized identifiers (DIDs) and verifiable credentials (VCs). A side benefit from the perspective of human Internet users: they regain sovereignty over their personal data. [github]

War Against the Robots – Pick your Side with SelfKey & Metaproof Platform SelfKey Foundation

Trying to ban or sideline bots is not a solution. The solution is to distinguish bots from humans so that humans can do human stuff and bots can do bot stuff. 

Creating a culture of recognition We Are Open co-op

Too often, though, these badges focus on credentialing rather than recognition. Open Recognition is the awareness and appreciation of talents, skills and aspirations in ways that go beyond

Self-Sovereign Digital Twins MOBI OpenEarth partners with BCGov to develop a digital trust marketplace for climate accounting OpenEarth Foundation

OpenEarth Foundation partnered with the Mines Digital Trust initiative to allow BC companies to also share their greenhouse gas (GHG) emissions credentials to the OpenClimate platform so that they can be integrated into BC’s subnational climate inventory and showcased to interested purchasers and civic society

✨ Thanks for Reading! ✨

Subscribe \ Read More: newsletter.identosphere.net

Support this publication: patreon.com/identosphere

Contact \ Submission: newsletter [at] identosphere [dot] net


FindBiometrics

Is Google Actually Selling Facial Recognition to Israel?

What are the Project Nimbus protests really about? Earlier this month, employees of Google and Amazon took to the streets to protest their employers’ work with the Israeli government through […] The post Is Google Actually Selling Facial Recognition to Israel? appeared first on FindBiometrics.



Indicio

Decentralization and the Future of Enterprise Security

The pandemic fundamentally changed the architecture of cybersecurity; now organizations must adapt to survive and eventually thrive.

By Chase Cunningham, Ph.D.

Imagine a wall with a gate. The gate allows you to monitor and decide who you let inside the wall. It’s somewhat effective at achieving these ends. Some people climb over, some tunnel under, and some fake their way inside with stolen access credentials. Improbably, some allegedly managed to get inside one of these walls hidden in a big wooden horse. It happened in Troy, remember that? 

Troy was our first proof of the failure of a perimeter-based security model, and it happened in 1184 B.C. We have known for over three millennia that this model is doomed to fail. Yet when it came to protecting digital assets, all we did was take a failed physical approach, digitize it, and expect to do better than the Trojans. Wow, were we wrong.

If I labor this point, it's because the digital walls we have built now have thousands of gates: one for every employee, one for every employee device, maybe even one for every cloud application. Working from home during the global pandemic turned the wall into a sieve, everywhere, all at once, operating (literally) at the speed of light.

The wall has fallen and isn’t coming back. It’s not simply that hybrid work environments blew it up, it’s that the rapid pace of digitalization makes it impossible to rebuild. The expansion, every day, of digital services and e-commerce shows that we, collectively, want the convenience, efficiency, and opportunity of a more digitized life. Businesses require speed and dynamism, users want security and ease of use anywhere at any time, and global connectivity demands that we interoperate and optimize now, not later.

The sooner we realize that we need some “post-wall” thinking about security, the sooner we’ll be able to deal with the exponential growth in attacks on company and organizational infrastructure. If we accept that we are now living in a digitalized society, we need to start by acknowledging some new ground rules:

1. People are not going to magically become security experts. You can train your workforce to be cyber aware, but all it takes is one person to click on a malicious link for your company to be phished. Training may reduce the odds, but not to zero. That we see phishing attacks work over and over again demands that we accept human nature as it is, not as we'd like it to be.
2. People will seek the most convenient process. We are energy misers: the course of action most likely to be taken is the one that requires the least amount of effort. So, while various forms of multi-factor authentication may increase security, they also increase cognitive burden and annoyance. We have to make security simple, a part of the architecture.
3. Employees are not going to want to submit their personal devices for securitization. Putting security and surveillance software on all the devices your employees are likely to interact with is going to be challenging and likely impossible, for practical, privacy-related, and political reasons. People don't want their employers to have access to and control of their personal mobile devices.
4. We already have physical means of verifying and validating "who" a person is and "what" they are trying to do (ever boarded an aircraft since 9/11?). We should be able to apply that same methodology and metric to today's digitized systems and not make people suffer through a network-enabled TSA.

What, then, is the answer?

If the wall has been dissolved or, at the very least, rendered useless by a thousand gates, then everything must be verified everywhere, continuously, because the gates are now everywhere. And the only way to do this easily, effectively, and inexpensively is by using solutions like verifiable credentials coupled with a security technology stack that removes friction while increasing adoption and use. This is applied to the user and doesn't rely on them to adopt any additional solutions.

Verifiable credentials make trust in identity and data portable, because the process of creating a verifiable credential means that the integrity of the transaction, and of any information the transaction contains, is guaranteed by a combination of decentralized identifiers, decentralized ledgers, cryptographic signatures, and biometric interfaces with mobile devices. To put it simply, using verifiable credentials means we know the "who" and the "what" and can validate that there is a reason for an action to occur. We don't just allow a connection because of a username and password, and we don't enable default connectivity because a firewall rule says "allow all".

These technologies allow trust to be triangulated between the issuer of a credential, the entity holding the credential, and the validator verifying the credential. And when the integrity of a verifiable credential is verified, the information associated with it is immediately actionable.
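The tamper-evidence behind this triangle can be sketched in a few lines of Python. This is a toy illustration, not any VC library's API: a stdlib HMAC stands in for the asymmetric signature (e.g. Ed25519) a real issuer would use, and the key, DID, and field names are made up for the example.

```python
import hashlib
import hmac
import json

# Toy sketch of the issuer/holder/verifier "trust triangle".
# NOTE: real verifiable credentials use asymmetric signatures, so the
# verifier checks the issuer's public key (resolved via its DID) and
# never holds the issuer's secret. The HMAC here is only a stdlib
# stand-in to show tamper-evidence.

ISSUER_SECRET = b"issuer-signing-key"  # hypothetical key material

def issue(claims: dict) -> dict:
    """Issuer: bind the claims to a signature over their canonical form."""
    payload = json.dumps(claims, sort_keys=True).encode()
    proof = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "proof": proof}

def verify(credential: dict) -> bool:
    """Verifier: recompute the signature; any tampering breaks it."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["proof"])

vc = issue({"holder": "did:example:alice", "role": "employee"})
assert verify(vc)                 # untampered: integrity check passes
vc["claims"]["role"] = "admin"    # holder (or attacker) edits a claim
assert not verify(vc)             # integrity check now fails
```

Because the signature covers the canonical form of the claims, any edit invalidates the proof, which is what makes the information immediately actionable once verified.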

A critical feature of decentralized identity is that it enables the separation of form from content. It’s the form in which digital information is held that makes the  information verifiable. This has huge implications: It removes the need to store critical data or personally identifiable information in a centralized database as part of the verification process as the nature of that validation triangulation is distributed from the start. All data is tied to its owner and shared in privacy-preserving ways over uniquely encrypted communications channels.

A verifiable credential replaces a login and password, and verification is instant and device agnostic. There is no “one ring to rule them all” that is the single point of failure in this model. If we take other security technologies and apply them to build out the security posture of the entities tied to that validated and verified credential, the rising tide of a better security position does finally raise all ships.

As the perimeter defenses of a billion digital entities dissolve, we can see online interaction as a much more open environment where information moves much more freely; that's what the internet is for. It was designed for that very purpose; it was not built to be secure. But with today's technologies applied in innovative but natural ways, we can operate in a secure and optimized manner. The only way to make this digital reality work is to accept that the old security architecture failed all of us and is never coming back. Thankfully.

The way we get better is to accept that reality and build our future to look the way that the internet was meant to operate. Distributed, diverse, adaptable, dynamic, and integrated with our physical selves.

The post Decentralization and the Future of Enterprise Security appeared first on Indicio.


FindBiometrics

EU Lawmakers Spar Over Facial Recognition: Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Regulatory Developments There is now majority support […] The post EU Lawmakers Spar Over Facial Recognition: Identity News Digest appeared first on FindBiometrics.



SC Media - Identity and Access

Passwordless Authentication: Getting Started on Your Passwordless Journey: Part 1

Everyone agrees passwords are bad, but few organizations know how to move on from them to a passwordless solution. Here's how passwords fail, why it's hard to break up with passwords and why now is nonetheless the time to go passwordless.



auth0

Choosing the Right .NET SDK for Auth0

A map to choose the right SDK to integrate Auth0 in your .NET application.

Northern Block

Feature Showcase – Presentation Proposals



Our NB Orbit Enterprise is a web-based platform that includes a cloud wallet, a connection management system, verifiable credential issuance and verification tools, and more.

By using our products, organizations can deploy digital credentialing ecosystems, become credential issuers, accept digital credentials within workflows and use a variety of digital wallets and agents to connect and exchange credentials.

Today, let us show you how our NB Orbit Enterprise Presentation Proposals function allows an Organizational Holder to send a proposal to a verifier entity. 

It allows an organization to initiate the workflow rather than waiting on a verifier to send a proof request. We have implemented the orchestration provided by Aries RFC 0037: Present Proof Protocol 1.0 to enable all this to happen.

We’ve written a blog post about why credential issuance should be holder-driven (here). We think that the same is true for verifications and other functions.

The steps are the following:

1. The Prover (your organization) sends the Proposal.
2. The Verifier can automatically send a proof request back to the Prover (this is configurable).
3. The Prover clicks on the Proof Request and selects a credential to provide in the proof presentation. The Prover can add self-attested attributes if they wish.
4. The Verifier then receives the Proof Presentation (in Verification Monitoring if they are using NB Orbit Enterprise).
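These steps map onto the Present Proof 1.0 message types defined in Aries RFC 0037 (propose-presentation, request-presentation, presentation). The sketch below is a simplified illustration of that prover-initiated flow; the helper functions and payload fields are assumptions for the example, not the NB Orbit or Aries framework API.

```python
# Simplified sketch of the Aries RFC 0037 Present Proof 1.0 flow,
# prover-initiated. Message type URIs follow the RFC; agent plumbing,
# attachments, and cryptography are omitted.

BASE = "https://didcomm.org/present_proof/1.0/"

def propose_presentation(attrs):
    """Prover -> Verifier: propose which attributes it can present."""
    return {"@type": BASE + "propose-presentation",
            "presentation_proposal": {"attributes": attrs}}

def request_presentation(proposal):
    """Verifier -> Prover: (possibly automatic) proof request."""
    return {"@type": BASE + "request-presentation",
            "requested_attributes": proposal["presentation_proposal"]["attributes"]}

def present(request, self_attested=None):
    """Prover -> Verifier: the presentation, with optional self-attested data."""
    return {"@type": BASE + "presentation",
            "revealed": request["requested_attributes"],
            "self_attested": self_attested or {}}

proposal = propose_presentation([{"name": "member_id"}])
request = request_presentation(proposal)
presentation = present(request, self_attested={"note": "on-site visit"})
assert presentation["@type"].endswith("presentation")
```

The point of the prover-initiated variant is visible in the call order: the proposal comes first, so the organization never has to wait for a verifier to open the exchange.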

The post Feature Showcase – Presentation Proposals appeared first on Northern Block | Self Sovereign Identity Solution Provider.


Dock

Web3 and Crypto Scams You Should Be Aware of and How to Avoid Them

From fake Bitcoin giveaways to impersonations and crypto rug pulls, crypto scams have troubled every Web3 and Cryptocurrency user. Here's the most updated guide on the most popular crypto scams and how to avoid them.
Introduction

In 2022, a total of $2 billion has already been lost in the cryptocurrency and Web3 space as a result of scams and hacks. Centralized crypto platforms attempt to provide users with security features like account passwords and 2-factor authentication to secure their funds. However, control of users’ crypto wallets and private keys still remains with the company.

This is why, when centralized crypto lending platforms go bankrupt, they take users’ funds with them. With a custodial wallet, another party controls your private keys, and most custodial wallets these days are web-based exchange wallets.

This is why we always say in crypto:

“Not your keys, not your crypto.”

This advice refers to two key points:

1) if you don’t truly own your private keys, you don’t truly own the cryptocurrencies in your wallet

2) if your private keys are, in any way, compromised, stolen, or in the hands of someone else, your funds are not safe.

The latest iteration of the World Wide Web, Web3 is built on the blockchain, with a decentralized infrastructure promising to give users complete control and ownership of their data with much greater security. Decentralization of data networks and the financial ecosystem also means that responsibility for the security of users’ data and funds comes back to the users.

In this blog, we are going to discuss in detail the types of scams Web3 investors fall victim to and how to stay safe against them. Users have suffered from these scams frequently.

Types of Crypto Scams

Web3 is no longer merely an idea; it is intrinsically linked to ownership of data and financial value. Millions of users are falling victim to scams in the Web3 and crypto industry, and billions of dollars have been lost. This is slowing Web3 adoption across industrial sectors, as the hacking incidents suggest a flaw in the technology.


Phishing Scams

Phishing scams are the most common type of scam in the crypto and Web3 space. Scammers target only one thing: your private key and seed phrase, because this information gives full access to your funds. Every attempt in this type of scam is aimed at getting your private keys in any way possible.

What are these ways? Here are the types of phishing scams that are frequently used to deceive users into submitting their private keys.

Seed Phishing Through Ads

In a recent incident from Q1 2022, scammers placed Google Ads promoting malicious URLs for websites claiming to be a wallet or a wallet aggregator service. They had users submit their seed phrase on the website, supposedly to access their crypto wallet on the platform. It turned out to be a simple form collecting users’ public addresses and seed phrases, and the scammers used that information to drain all the funds, leaving users’ wallets empty.

Continuing the same warning, NEVER EVER share your private keys and seed phrases with anyone or on any platform. No legitimate Web3 and cryptocurrency platform will ever ask you for your seed phrase.

Google Ad of a fraudulent website

Ice Phishing

In the real world, ice fishing is the practice of catching fish by cutting a hole, an opening, in the middle of a frozen lake. Hackers create a similar opening on smart contract platforms by tampering with the user interface. All the hacker needs to do is inject code that replaces the receiver’s address with the attacker’s address. Because addresses are randomly generated streams of characters, users don’t double-check before confirming the transaction and simply click ‘OK’.

Stay protected:

There is only one way to defend against this attack: DOUBLE-CHECK each and every character of the receiver’s address at least twice, both in the receiver’s address box and on the confirmation screen.
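As a toy illustration of that advice, here is a hypothetical helper (not part of any wallet) that reports where two copies of an address first diverge; the addresses below are made up for the example.

```python
# Hypothetical helper for the "double-check every character" advice:
# compare the address you pasted with the one shown on the confirmation
# screen and report the first position where they differ.

def first_mismatch(entered: str, confirmed: str) -> int:
    """Return -1 if the addresses match exactly, else the index of the
    first differing character (a length difference also counts)."""
    a, b = entered.strip(), confirmed.strip()
    for i, (x, y) in enumerate(zip(a, b)):
        if x != y:
            return i
    return -1 if len(a) == len(b) else min(len(a), len(b))

legit   = "0x52908400098527886E0F7030069857D2E4169EE7"
swapped = "0x52908400098527886E0F7030069857D2E4169EF7"  # one char substituted
assert first_mismatch(legit, legit) == -1
assert first_mismatch(legit, swapped) == 40  # differs near the end
```

A single substituted character near the end of an address is exactly what a casual glance misses, which is why an exact, position-by-position comparison is the only reliable check.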

Social Media Phishing

Crypto security firm CertiK released a quarterly report in July 2022 noting a 170% increase in phishing attacks. A striking fact CertiK also mentions in the report is that most phishing attacks are conducted on social platforms like Discord, Telegram, and Twitter. Social media is a huge hunting ground for crypto scammers.

Scammers will share quotes from famous billionaires or celebrities, slide into your DMs, or share links promising exciting rewards, all to lure users into giving up private information. On Telegram and Twitter this is easier, because the security layers groups and projects can implement are very limited. These scammers will ask you for funds while promising high returns or risk-free profits, offer to ‘secure’ your wallet, or try any other way to run off with your money or get hold of your private keys.

This is how fake accounts share fraudulent links on social media 

Stay protected:

- Never share your private keys or send a single penny of your funds to them.
- Do not click on any suspicious or unknown links.
- Keep two-factor authentication (2FA) enabled on all of your social accounts.
- Never reply to messages with suspicious links that you receive on your social accounts.
Employer Phishing

Employees of Web3 and crypto companies are constantly targeted with fake emails from scammers who pretend to be their seniors or bosses and ask them either to send funds or to share their wallet addresses.

Additionally, scammers target job seekers by either faking themselves as potential employers or promising them jobs with lucrative returns. The scammers ask the job seekers to share their wallet private keys to send bonuses or advance payments for the roles. Do not share anything with them! Again, no legitimate person or organization will ever ask for your private keys or seed phrase.

Stay protected:

- Always check the domain name of the email and make sure it is an official email address.
- Organizations should implement an anti-phishing code to help receivers verify that an email is authentic and was sent by the organization.
- If you receive any such email, inform your company seniors or official team members right away.
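The "check the domain name" advice above can be mechanized as a simple exact-match check. The helper and the allowlisted domains below are illustrative assumptions, not an official list:

```python
# Illustrative sender-domain check: extract the domain from an email
# address and compare it against a known-good set. Exact match only,
# so lookalike domains that merely contain the real name do not pass.

OFFICIAL_DOMAINS = {"metamask.io", "binance.com"}  # example allowlist

def sender_domain(address: str) -> str:
    """Lowercased domain part of an email address."""
    return address.rsplit("@", 1)[-1].lower()

def looks_official(address: str) -> bool:
    # "metamask.io.security-alerts.com" must NOT pass, hence exact match.
    return sender_domain(address) in OFFICIAL_DOMAINS

assert looks_official("support@metamask.io")
assert not looks_official("support@metamask.io.security-alerts.com")
```

Note that display names and even sender addresses can be forged in transit, so this check filters the obvious fakes; organizations rely on SPF/DKIM/DMARC and anti-phishing codes for the rest.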
Malicious Airdrops

People surfing Crypto Twitter or crypto-based Telegram channels have seen images similar to the ones below, containing promises by global icons like Elon Musk, Joe Biden, or Bill Gates running giveaways of millions of dollars in cryptocurrencies.


Clearly, these are fake! Despite Tesla holding millions of dollars’ worth of Bitcoin in its reserve, we are pretty sure it is not holding it to run airdrop campaigns for retail investors. Do not fall victim to these scams; it is super easy to create such Photoshopped images.

In addition to impersonating celebrities, scammers also create impostor accounts of crypto and Web3 projects, download the images from their official social profiles, and pretend to run airdrops to scam users.

Before falling for any of these seemingly lucrative offers, make sure to check the URLs and usernames of the channels and confirm with the admins on the official channels. Usually, the official channels of a project are listed either in the footer of its website or on its community pages.

Crypto Rug Pulls

Remember the Tom and Jerry cartoons where Tom pulls the rug from under Jerry’s feet and Jerry rolls over and falls on his head? Crypto and NFT investors often face a similar experience when they invest their hard-earned money in a new crypto/NFT project and the founders run away with the funds, dropping the price of the project’s token to zero. This type of scam is known as a crypto rug pull.

It all starts with scammers convincing investors that their project is legitimate and has a bright future. They do this in various ways: building a spectacular-looking website, adding the names of fake employees and advisors, and presenting a roadmap that suggests the project has a sustainable future. Retail investors around the world start investing in these projects, exchanging their tokens for the project’s tokens in IDO and ICO rounds and trading on exchanges.

Then the founders play the real game and begin to pull the rug in one of the following ways:

- Dump their tokens on the open market and run away with the users’ funds.
- Deploy a smart contract that locks users’ funds and prevents them from selling their tokens.
- Extract all the liquidity from the pools they started on decentralized exchanges, leaving users with only tokens and no medium to trade them.

Stay protected:

- Research every aspect of the project, starting with authenticating the team members and their backgrounds.
- Don’t fall for huge promises of returns or claims of becoming the next ‘Bitcoin’ or ‘Ethereum’.
- Research! Research! Research!

Price of a rug-pulled token dropping straight to zero

Malware

Malware has always posed a threat to crypto and Web3 users, since the inception of the industry. All scammers need is a way to inject a virus or malware into your computer or mobile device. We all copy and paste wallet addresses when we send crypto. This is exactly when such malware does its mischief, swapping the copied receiver’s address for the scammer’s address.

Because addresses are complex strings of alphanumeric characters, investors often don’t double-check them before sending, and end up sending their funds to the scammers.
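The habit of checking every character can be sketched in code. Below is a minimal, hypothetical helper with made-up example addresses; the point is that only a full comparison catches a swap that preserves the start and end of the string:

```python
def addresses_match(intended: str, pasted: str) -> bool:
    """Compare every character: clipboard malware often generates an address
    that shares the first and last few characters with the real one, so a
    glance at the ends of the string is not enough."""
    return intended.strip() == pasted.strip()


# Hypothetical addresses: the swapped one keeps the same start and end
real = "0x1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0b"
swapped = "0x1a2b" + "9" * 30 + "8f9a0b"

print(real[:6] == swapped[:6] and real[-6:] == swapped[-6:])  # ends match: True
print(addresses_match(real, swapped))                         # full check: False
```

The same idea is why sending a small test amount first is worthwhile: it verifies the whole address end to end, not just the characters you happened to glance at.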

Stay protected:

- Enter the receiver’s address by scanning a QR code.
- Triple-check each and every character of the receiver’s address.
- Try sending a smaller amount first and confirming with the receiver before making a big transaction.
- Keep your devices virus-free with the help of antivirus programs.
- Do not click on any suspicious links that you find on the internet or receive in your emails and messages.

Spoofing

Spoofing is when something pretends to be something else. Scammers attack via spoofed URLs, websites, emails, text messages, and IPs; with recent innovations in technology, facial and GPS spoofing have joined the list. For example, a spoofed URL is a fraudulent link that pretends to be a legitimate one and is designed to steal your data.

Scammers will often use the name of a big trusted Web3 organization and send you communications about your activity or promotions. For example, it can be a promotional email from Binance that says you have won $50 for being a valued customer. However, the email will contain malicious links that will send you to a website that looks like Binance. This spoofed Binance page will have a sign-in form that has been created by scammers to collect your username and password.

This is just one of numerous ways a user can be tricked into believing an online communication or platform is legitimate. Look at the example below, where the scammer has sent an email that appears to come from MetaMask’s support team. But if you look at the email address, the reality is different.

Email from a fake website impersonating MetaMask

Stay protected:

- Always check the URLs, email addresses, and any link that you are accessing.
- Once you know you are really on the authentic website, bookmark it to reduce your chances of clicking on a scam site that can appear on Google.
- Crypto and Web3 projects will never ask you for your seed phrase. Never share it with anyone.
- Do not click on any suspicious links that you receive in any communication. Mark all such emails as spam, as this helps other investors too.
- Never connect your wallet to an unknown platform.
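The bookmark-and-check habit can be approximated in code. This is a hypothetical sketch (the allowlist entries are just examples) that compares a link’s exact hostname against a short list of sites you actually use, which catches the look-alike domains spoofers rely on:

```python
from urllib.parse import urlparse

# Hypothetical personal allowlist of sites you have bookmarked
TRUSTED_HOSTS = {"www.binance.com", "metamask.io"}


def is_trusted(url: str) -> bool:
    """Parse the URL and compare the exact hostname against the allowlist.
    Look-alike hosts such as 'www.binance.com.rewards-claim.net' parse to a
    different hostname and are rejected."""
    host = urlparse(url).hostname or ""
    return host.lower() in TRUSTED_HOSTS


print(is_trusted("https://www.binance.com/en/login"))                   # True
print(is_trusted("https://www.binance.com.rewards-claim.net/login"))    # False
print(is_trusted("https://metamask.io.co/unlock"))                      # False
```

Note how the spoofed links contain the trusted name as a prefix, which is exactly why eyeballing a URL is unreliable and an exact hostname comparison is not.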

In contrast to the custodial wallets common on centralized crypto exchanges, some platforms offer non-custodial wallets, where responsibility for the security and safety of funds rests entirely with the user. With a non-custodial wallet, you have complete and sole control of your private keys, which means you fully control your cryptocurrency and can prove the funds are yours.

Other Ways to Get ‘REKT’

We have discussed the major scams that are burning the pockets of Web3 investors. In addition to those discussed above, there are a few other ways Web3 users can get REKT!

- Scammers might blackmail you, claiming they possess personal information, chat history, images, or videos of you, and demand funds.
- They might send ransomware to your devices, encrypt all the files, and demand a hefty fee in Bitcoin to decrypt them.
- They organize pump-and-dumps on low-trading-volume altcoins on centralized crypto trading platforms.
- They can build Ponzi schemes: at first they offer you magnificent returns on your investments, then run away once they convince you to deposit a large sum for higher returns.
- They may pose as a romantic partner, or as someone suffering from a life-threatening disease, to lure you into paying them money.
Conclusion

Web3 is full of creative ways to scam users and snatch their private keys and seed phrases to gain access to their funds and private information.

Despite all the variety of scams, there are really only a handful of things you need to keep yourself as safe as possible:

- Always triple-check the sender’s and receiver’s addresses, as scammers can swap in their own address so the assets go to them instead.
- Don’t click on any suspicious or unknown links, buttons, or ads.
- Never share your private information, private key, or seed phrase.
- Always check website URLs, email addresses, and redirects closely.
- Never send any money to an unknown person or company.
- Always use strong passwords and two-factor authentication on all your accounts.
- Research every corner of a project before investing your money in it.
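On the two-factor point: the six-digit codes an authenticator app produces are not magic, just the RFC 6238 TOTP algorithm. A stdlib-only sketch (the secret below is the RFC’s published test secret, not a real one):

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, at=None, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time counter,
    dynamically truncated to a short numeric code."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation offset from the last byte
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)


# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59s, 8 digits
SECRET = base64.b32encode(b"12345678901234567890").decode()
print(totp(SECRET, at=59, digits=8))  # RFC 6238 expects "94287082"
```

Because the code changes every 30 seconds and is derived from a shared secret that never travels over the network, a phished password alone is not enough to take over an account protected this way.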

If you implement all the points above every time you interact with a Web3, DeFi, or crypto platform, your funds will stay safe and secure. Your keys and your private information are yours alone to keep. While Web3 promises complete ownership of your data, it will be solely your responsibility to keep it secure.

Stay Safe!


KuppingerCole

CIAM Platforms

by John Tolbert

This report provides an overview of the market for Consumer Identity and Access Management solutions and provides you with a compass to help you to find the CIAM product or service that best meets your needs. We examine the market segment, vendor product and service functionality, relative market share, and innovative approaches to providing CIAM solutions.

1Kosmos BlockID

Reflections from UnionDigital Innovation Festival

Here at 1Kosmos, it’s been an action-packed September full of events, analyst mentions, awards, and more. One of these exciting events was an Innovation Festival in the Philippines hosted by our customer, UnionDigital. The three-day festival, themed “The Future of Banking is with U,” showcased the digital innovations of Union Bank and its subsidiaries in both the government and private sectors. Union Bank hosted the event, exhibiting along with its official partners. 1Kosmos, one of the official partners, was invited to hold workshops with key stakeholders and participate in the fireside chat titled “Is identity proofing the key to financial crime prevention?”

From the moment the team arrived in the Philippines, we were amazed by the hospitality of the UnionDigital team. Even though it was the first day of the event and the UnionDigital team had just moved into their new HQ, the dynamic and visionary CEO of UnionDigital, Arvie de Vera, came to meet with us during one of the workshops and to ensure that we were doing well. Our experience was very similar across the board with other folks at the event and outside. We are thankful to the entire UnionDigital and Aboitiz group for their warm welcome during our stay in the Philippines!

Over the several days of the event, our team quickly gained valuable insights. First, it was clear that everyone has a very progressive mindset: they are eager to adopt new technologies to stay ahead of the market. As a progressive digital identity company, our team was pleased to connect with these like-minded individuals.

One of Union Bank’s progressive offerings is their digital initiative which offers Banking as a Service. Through their partnership with 1Kosmos, digital identity is set to become a key security component driving this initiative. In fact, through various panel discussions, it was evident that Digital Identity has become a focal point that will play a crucial role in mitigating fraud, increasing efficiency and offering a better user experience. Improved digital identity capabilities will also help UnionDigital move toward Zero Trust while reducing the risk of cybercrimes such as ransomware.

The Philippines is among the fastest-growing economies and is slated to become a trillion-dollar economy by the end of this decade. With UnionDigital and the Aboitiz group looking to capitalize on this growth opportunity, they appear committed to continuing significant investments in their digital infrastructure. And we at 1Kosmos are very excited to be part of the journey as a trusted partner.

The post Reflections from UnionDigital Innovation Festival appeared first on 1Kosmos.


Ontology

OWN Insights 08: Add Value, Not Noise: Navigating Web3 Branding in a Market Downturn

This article was contributed by Rory O’Neill, Executive at strategic communications consultancy Wachsman.

The crypto market downturn has taken the world of Web3 by storm over the past three months. Subsequently, this has placed blockchain technology under the microscope of the media, leading to journalists and crypto sceptics alike flocking to provide varying opinions on what the future of the space might hold.

However, in the hangover of all of this hysteria, there is one issue which has perhaps been largely overlooked, and that is the need for strategic brand management by companies in this field, to effectively weather this crypto winter storm — however lasting it might prove to be.

In recent weeks, the crypto news cycle has been dominated by negative headlines and navigating this media landscape is no mean feat for firms deeply rooted in Web3 technology. While it might be tempting for brands to retreat into hibernation, it is almost certainly not the right time for those who intend to be in this industry for the foreseeable future to adopt a policy of radio silence.

Rather, in the throes of this tumult, it is important that blockchain brands act as the standard-bearers for their industry and demonstrate that there is infinitely more to Web3 and blockchain technology than many of these sensationalist headlines might suggest. Accordingly, this makes it the opportune time for companies in this space to keep the drum beating, even in the face of adversity. It’s a time for building, and direct attention should now be placed on the underlying value of Web3 technology.

We are at a juncture where it is crucial for brands to place a renewed emphasis on education, building and development, and spell out their long-term aspirations, not only for their in-house projects, but also for the industry as a whole. Only this will ensure that brands are well-positioned to enjoy the fruits of the future that lie ahead, in spite of the current downturn.

An abundance of media coverage is now — understandably — placing a significant emphasis on market conditions. However, against this backdrop, there exists fertile ground for thought-leaders to be a positive force, lean into their expertise, and provide direction to others in the sector on the business-critical decisions that must be made in the current market to enable them to weather this storm and poise themselves for success ahead of the next bull market. Not only does this type of commentary help cut through the noise that we are seeing in the headlines in the short-term, but it can serve to establish spokespeople as go-to sources for valuable analysis in this area, all while refocusing the narrative on the important mission of building towards Web3.

Beyond this, with blockchain technology continuing to become ever more a part of our vernacular, and with big name brands continuing to delve further into the world of Web3, there also exists significant scope to educate those who are new to the industry. At this difficult time, it is imperative to show newcomers and the crypto curious the positive improvements that this budding industry can provide to existing Web2 infrastructures. While authenticity is key, showcasing the exciting projects that continue to make the blockchain space appealing is a great way to avoid going quiet while also making a meaningful contribution to the conversation.

Further to this point, it is essential that brands use this market downturn as a time to grow and develop, and insulate themselves against the fallout of these types of market fluctuations in the future. What has allowed Web3 to reach previously lofty highs is not pure hype, but rather the strong fundamentals of the underlying technology. It is important to not lose sight of this. Now is the time to focus on these fundamentals and develop technology offerings that can add real value for the companies and consumers who will use them. New product offerings and use cases continue to make headlines, and they provide a tangible positive lifeline for an industry that has an appetite greater than ever for good news.

By continuing to focus on development while promoting use cases, brands can simultaneously highlight that the positive impacts offered by Web3 technology stretch far beyond short-term fluctuations in the market. Doing so can provide a timely reminder that this winter is something seasonal, not only for individual brands but for the industry as a whole; proof of the lasting value of this technology is found in the fact that it continues to survive the whims of the market. This is perhaps one of the most valuable lessons to be learned from the current crypto climate, and accordingly, it will prove integral to the success of brands in this space from here on outwards.

Brand management is unquestionably an area that will prove increasingly important as we manoeuvre our way out of this difficult chapter in Web3’s history. Amidst a flurry of media attention, having a strategic approach to navigating this tricky period is paramount to ensuring success. If implemented correctly, not only will a successful branding project facilitate the avoidance of negative coverage, but it could in fact amplify a brand’s awareness while doing so.

Ultimately, now is not the time to retreat to the shadows, but rather, to rise to the occasion and act as a light at the end of the tunnel for Web3!

OWN (Ontology Web3 Network) Infrastructure is a series of blockchain protocols and products that provide the much-needed tools to create an interconnected, interoperable global blockchain ecosystem. The infrastructure brings trust, privacy, and security to Web3 applications through decentralized identity and data solutions.

Aimed at allowing Web3 developers to quickly build Web3 applications, saving them from creating basic functions from scratch, OWN includes the Ontology blockchain, ONT ID framework and more. Individuals can also seamlessly and quickly access Web3 through products such as ONTO Wallet.

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

OWN Insights 08: Add Value, Not Noise: Navigating Web3 Branding in a Market Downturn was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


OWI - State of Identity

The Puzzle of KYC

What impact does eID have on the KYC space? On this week’s State of Identity podcast, host Cameron D’Ambrosi is joined by Liudas Kanapienis, Co-Founder & CEO at Ondato. This duo discusses the impact of eIDs on the broader KYC space and where the industry is headed. Find out what lessons the rest of the world can learn from Baltic nations, deployment of eID.


SelfKey

SelfKey Marketplace

A variety of financial, corporate, citizenship, and crypto services are available through the SelfKey Marketplace, and users can compare, contrast, and instantaneously sign up for them.

The post SelfKey Marketplace appeared first on SelfKey.


Infocert (IT)

InfoCert joins Wallife for the launch of WALLIFE® BIOMETRICS ID: the first insurance policy that protects biometric identity

According to a recent IPSOS survey, “Violazione dell’identità digitale, comportamenti adottati e rischi percepiti” (digital identity breaches, behaviors adopted, and perceived risks), nearly a third of Italians have suffered a breach of their digital identity, and 71% would consider insurance products to protect against such risks.

Just consider how widespread tools such as biometric authentication, or username-and-password login, have become for accessing bank accounts, payment accounts, and social profiles, and how poorly protected these “access keys” are.

This is the context for Wallife® Biometrics ID, an innovative insurance product that protects the sensitive data tied to an individual’s online life by insuring their digital biometric identity, preventing risk and mitigating damage.

The solution was created by Wallife, the world’s first insurtech able to address the security and protection of individuals against risks arising from the use of new technologies, and was presented at a press conference also attended by InfoCert, represented by Pasquale Chiaro, Head of Marketing.

How to reduce cyber risks and experience the digital world with confidence

InfoCert, the largest European Certification Authority and an AgID-accredited digital identity provider for SPID, was among the first to believe in Wallife’s new project.

“Cyberspace has rapidly established itself as a fundamental and indispensable dimension of our daily lives. Unfortunately, its rapid spread has not always been accompanied by adequate user education. The product designed by Wallife closes this gap, making it possible to prevent some of the risks inherent to the cyber world and, above all, to mitigate their damage. We are convinced that the partnership with Wallife will contribute to our daily commitment: developing and offering services and solutions that allow businesses, professionals, and citizens to use their digital identity, and to act in the digital economy, with ever greater confidence. This partnership confirms InfoCert’s commitment to supporting the most visionary and innovative startups in accelerating our country’s digital transformation.” Carmine Auletta, InfoCert CISO.

The post InfoCert con Wallife per il lancio di WALLIFE® BIOMETRICS ID: la prima polizza che protegge l’identità biometrica appeared first on InfoCert.

Wednesday, 21. September 2022

Radiant Logic

Thoughts on the Aite-Novarica Group 2022 Impact Innovation Case Study

The Aite-Novarica Group 2022 Impact Innovation case study details the transformational business value customers see from Radiant Logic. The post Thoughts on the Aite-Novarica Group 2022 Impact Innovation Case Study appeared first on Radiant Logic.

Indicio

Newsletter Vol 36

The post Newsletter Vol 36 appeared first on Indicio.

Now on LinkedIn! Subscribe here

Privacy vs Immutability: Safeguarding Digital Rights

Heather Dahl, CEO of Indicio, will be speaking on a panel at the OECD Global Blockchain Policy Forum on blockchain’s ability to meet national and supranational privacy rules and how these rules might be addressed from both a policy perspective and industry practice. Others on the panel include Philippe Thévoz, eGovernment Systems Vice-President of SICPA, Francesco Bruschi, Director, Blockchain Observatory of Politecnico di Milano, and Shin’ichiro Matsuo, Research Professor of Computer Science at Georgetown University.

Event Details

Identity Insights — Government Use Cases

Why are governments looking to adopt open-source decentralized identity initiatives to replace legacy systems? James Schulte, VP of Business Development at Indicio, joins us to discuss some of the interesting problems governments are tackling and how they are looking to solve them with decentralized identity solutions.

Watch the video

Decentralized Ecosystem Governance: Better, More Effective, and More Robust than Trust Registries

Sam Curren, Deputy CTO of Indicio, discusses recent work by the Decentralized Identity Foundation to make Decentralized Ecosystem Governance an open standard for verifiable credential ecosystems. Decentralized Ecosystem Governance makes governance machine-readable by treating a trust registry as a file rather than a service.

Read the Article

Identity Week Washington DC

Identity Week is a conference and exhibition that brings together the brightest minds in the identity sector to promote discussion, innovation and the development of more effective identity solutions. Indicio is featured in the Startup Village, so if you are attending, please come by and see some of the latest projects we’ve been working on. 

About the event

Hyperledger Aries is the Present and the Future of Internet-Scale Trusted Verifiable Credential Ecosystems

Indicio’s Sam Curren, Deputy CTO, and Mike Ebert, Engineering Team Lead, discuss how Hyperledger Aries, AnonCreds, and Hyperledger Indy have given enterprises and governments a powerful way to build and use open source, interoperable decentralized identity technology with privacy-preserving features. Sam and Mike also discuss some of the myths and realities behind each of these technologies and explain why they are currently the most popular way to implement decentralized identity.

Read more

News from around the community:

Register to hear David Lucatch, CEO of Liquid Avatar, speak at VRARA Retail Forum 2022

GlobaliD 101: Reusable Identity

FinClusive spoke on the impacts of de-risking on the Caribbean and strategies for ensuring financial access at the US house committee on financial services

Upcoming Events

 

Here are a few events in the decentralized identity space to look out for.

- Cardea Community Meeting 9/22
- Identity Implementors Working Group 9/22
- DIF DIDcomm Working Group 9/26
- Aries Bifold User Group 9/27
- TOIP Working Group 9/27
- Hyperledger Aries Working Group 9/28

The post Newsletter Vol 36 appeared first on Indicio.


Continuum Loop Inc.

Trust Registries – Beyond the Basics

The post Trust Registries – Beyond the Basics appeared first on Continuum Loop Inc..

Thank you to everyone who attended our Trust Registries – Beyond the Basics webinar, where we discussed the basics of Trust Registries, the current protocol specification, and what’s next, and answered some questions.

We have a few things to share for attendees and those who couldn’t make it.

Link to the recording

Trust Registries Beyond the Basics – PDF of the presentation from the webinar.

Trust Over IP Trust Registry Task Force, where we are developing the v2 Trust Registry Protocol Specification.

Continuum Loop Resources:

- 2021 Trust Registry Webinar
- Bubba’s Wallet
- The State of Digital Wallets Series
- The Current and Future State of Digital Wallets – Trust Hubs
- Trust Registries And Your Digital Wallet
- Detailed Capabilities of Wallets & Agents
- Managing Trust and Reputation via Trust Registries
- Trust Registry Task Force

QUESTIONS!

The question period began at around 33 minutes. We had some great questions sent in, including:

- Trust Registries will prove essential, but going one layer deeper, the next question is who will be trusted to create, maintain, and validate entrants to the registry. A consortium? In some cases, a regulatory or governmental body? The trust chain is difficult to seed.
- Do you see a model like a stock market for trust registries? One can buy stocks privately, but to trust that you are getting the correct information and buying legitimate stocks, you would go to one of several stock markets (each with its own governance details while also under regulation). There could exist markets of registries that are overseen by regulation (and you could still have private registries).
- It doesn’t end with a Trust Registry. How do trust registries verify you are the owner of credentials?

A participant shared: at the end of the presentation, Marcus Ubani mentioned that he created an open-source platform as a learning path for himself and newcomers to DID/SSI. The platform aims to provide easy access to, and an overview of, all the projects: the Open SSI / DID Directory.

Get Involved!

Now that we’re starting work on v2 of the Trust Registry Protocol Specification at the ToIP Foundation, we’re excited to open a dialogue on Trust Registries. We want to get people thinking about how Trust Registries will help answer tough questions within an ecosystem to create a whole experience.

We would love for you to get involved! If you are interested in joining the Task Force, please do not hesitate to participate. Your involvement is encouraged and welcomed.

If you have questions, please reach out to us through our Contact Us page, and we will be glad to schedule a call to discuss where a Trust Registry may fit into your business ecosystem. You can also follow us on social media and sign up for our newsletter to stay informed on the latest Trust Registry news and updates.

Join the SSI Crew!

The first step on your journey to understanding Decentralized Identity and taking control of your own digital world.


The post Trust Registries – Beyond the Basics appeared first on Continuum Loop Inc..


Anonym

7 Great Things Cardrates.com Told its 15 Million+ Readers About Sudo Platform and MySudo

The “Ultimate guide to credit cards” advisory service CardRates.com recently spoke to Anonyome Labs’ Director of Marketing, Rich Sordahl, and Fintech Operations Manager, Dave Glass, to review Sudo Platform and MySudo for CardRates’ 15 million+ users.

Skip straight to the review: Sudo by Anonyome Labs: Enabling Secure Digital Tools That Protect Identities and Transactions

CardRates, which reports the latest news, studies and current events from inside the credit card industry, broke down exactly what our privacy tools for both enterprise customers and everyday consumers do, and why they are essential for protecting identities and transactions online. 

Here are seven great things CardRates.com told its 15 million+ readers about Sudo Platform and MySudo:

“Anonyome Labs has built a simple but effective solution through its Sudo Platform — a set of tools that protect online activity through customizable virtual identities known as Sudos. Anonyome works with large enterprises interested in providing privacy protection to end users, and it offers the MySudo app directly to consumers.”
“It’s not too much to say that a privacy crisis threatens the ability of the internet to facilitate modern conveniences without destroying consumer privacy. Credit bureaus, insurance agencies, personal privacy protection companies, and other enterprise-level clients use Anonyome’s Sudo Platform to foster a more protective internet.”
“Sudos are at the heart of the platform. Many consumers typically keep separate work and personal profiles online. Sudos provide a platform that enables individuals to create up to nine virtual identities for different aspects of their online lives.”
“Companies determine how they deploy that functionality based on their use cases. Through Sudos, end users deploy virtual email addresses and phone numbers for voice calling and texting. The platform also houses a host of other privacy protections — all easy to integrate into existing corporate infrastructure via preconstructed user interface modules, sample apps, and software development kits (SDKs).”
“The Sudo Platform keeps communications private with tools that include encrypted messaging, voice, video, and email. Private browsing, ad blocking, and site reputation services protect end users from online threats. And the ability to create virtual credit and debit cards enables users to shop securely by creating a different card for every purpose.”
“Signup [to virtual cards] through the app or one of Anonyome’s partners requires a comprehensive KYC (know your customer) identity check that Anonyome’s services perform. After they’re validated, users connect a debit or credit card to make transactions. ACH based funding via bank account is coming soon. Creating and changing Sudos for shopping, browsing, social media, or anything else is simple.”
“There’s no turning back from the efficiencies and conveniences of digital life. But every click and tap has the potential to become part of a bank’s or merchant’s revenue model. Sudos give consumers a simple way to control how they present themselves online.”

Read the entire review: Sudo by Anonyome Labs: Enabling Secure Digital Tools That Protect Identities and Transactions

Anonyome Labs did not sponsor or pay for this review. 

*Virtual cards available for iOS and US only. Android and more locations coming soon.

The post 7 Great Things Cardrates.com Told its 15 Million+ Readers About Sudo Platform and MySudo appeared first on Anonyome Labs.


FindBiometrics

Integrations, Assessments, and a ‘Surprised’ Politician: Identity News Digest

Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Biometric Borders The Dominican Republic is looking […]

The post Integrations, Assessments, and a ‘Surprised’ Politician: Identity News Digest appeared first on FindBiometrics.


Northern Block

Northern Block and the Government of British Columbia Demonstrate Interoperability in the Sharing of Sustainability Data

NB Orbit Enterprise Platform used in Energy & Mines Digital Trust (EMDT) initiative to enhance Sustainability Reporting between multiple stakeholders.

As leaders in climate change legislation, the Government of British Columbia (B.C.) is helping to deploy cutting-edge technologies to complement them in their journey towards a low-carbon economy. In an effort to improve trust, accuracy, and efficiency when sharing sustainability data, Energy and Mines Digital Trust (EMDT) was established to incentivize the formation of a digital trust ecosystem.

Northern Block partnered with EMDT and provided our no-code NB Orbit Enterprise Platform to support two pilot projects aimed at better managing sustainable mining practices. 

Reporting environmental impact data can be a complicated and laborious process. Data is difficult to exchange internationally, and consumers cannot always access, or trust, reported data.

This project demonstrated an improved method for exchanging data, by allowing participants to share and receive digital credentials, including Verifiable GHG Emissions data related to specific mining sites and natural gas facilities.

From a technical standpoint, we accomplished some nice feats:

(1) Successfully demonstrated Interoperability between NB Orbit Enterprise and BC Traction in the exchange of Carbon Emissions Credentials.

Image taken from EMDT Case Study (here)

(2) Successfully piloted the Presentation Proposal function (Aries RFC 0037: Present Proof Protocol 1.0), so that mining companies (Holders) can initiate holder-driven proof requests with Verifiers such as the Climate Action Secretariat, to whom they owe sustainability reporting.


Northern Block is committed to driving digital transformation by supporting the implementation of digital trust solutions which lead to value creation for entire ecosystems. Our mission is to build impactful solutions to achieve self-sovereignty, and we are proud to be actively contributing to these goals in collaboration with the Government of British Columbia, PricewaterhouseCoopers (PwC), Copper Mountain Mining Corporation and many others.

Looking to leverage digital credentials to support your sustainability mandate?

Book a meeting with us today

Northern Block Media contact

Daniela Gutiérrez de P., Manager, Marketing and Communications, daniela@northernblock.io

 

The post Northern Block and the Government of British Columbia Demonstrate Interoperability in the Sharing of Sustainability Data appeared first on Northern Block | Self Sovereign Identity Solution Provider.


Global ID

Meet the Team: Nikhil Khare, Product Manager


Meet the Team is our ongoing series of employee profiles. It’s an opportunity for our users and partners to get to know us a little better. As a remote company with employees around the world — from the U.S. to Slovenia to Spain to New Zealand — it’s also an opportunity for our team to get to know each other a little better.

We’re thrilled to introduce Nikhil Khare, Product Manager of Identity Operations at GlobaliD. Nikhil has been working in the technology industry for about ten years and has held positions at Cisco Systems and Duo Security. He earned his undergraduate degree at the University of Michigan before receiving a Masters degree from San Jose State University.

Now, he is helping bring our digital identity solutions to businesses and the people who power them in order to help fix the broken identity systems that exist today.

Meet Nikhil

My name is Nikhil Khare and I’m the Product Manager of Identity Operations at GlobaliD. I’m based in San Francisco, California and I’ve been working in the identity access management space for about five years. I’m really excited to be at GlobaliD and help develop identity solutions for businesses that also enable users to take more control of their data and preserve their privacy.

What attracted him to GlobaliD

As I mentioned, I’ve been in the identity access management industry for a while and I also have a cybersecurity background. A lot of my work has been based around understanding who the user is, who the customer is, and enabling them to have a safer online experience. That’s why GlobaliD really resonated with me because I could help build a future where our online identity is privacy preserving by default. The company seemed to be one of the organizations in the industry that was ahead of the curve in that regard.

I was also looking to have a larger impact with my role and truly get in the trenches with my team. Since GlobaliD is a startup and still relatively small I thought I would have a great opportunity to do that.

Before joining the GlobaliD team

I graduated from the University of Michigan with a bachelor’s degree in social computing information systems. After finishing college I got a job in California at Cisco Systems where I started out as a front end developer. Eventually I moved into a product management role where I gained an interest in cyber security. I also received my Master’s degree in software engineering while attending a part-time program at San Jose State University. The majority of my identity access management experience comes from my time at Duo Security where I managed the core authentication product.

How GlobaliD can help users

The best way to think about how people can utilize GlobaliD is in situations where you have an option to sign in to a website with a third-party like Apple or Google. You can also sign in with GlobaliD which is a more privacy preserving way to access your account.

When you utilize a sign-in option from Apple or Google, you’re supporting their business model which is based on harvesting your data in order to try and better understand your online behavior.

GlobaliD has a really different approach and it’s primarily ethos driven. We want to give users control over what data they share. That is the methodology and the ethos behind the company. Our hope is that by bringing a privacy preserving solution to the market we can offer end users a different option compared to what exists today where they’re trading privacy for convenience.

How GlobaliD can help businesses

One of the biggest problems that businesses have is ensuring that people who are trying to interact with them online are who they claim to be. They’re constantly trying to make sure that people aren’t committing fraud or acting maliciously.

GlobaliD’s identity platform enables companies to offer secure onboarding and authentication via a passwordless, mobile app based process. For example, we have customers who require that their users verify a photo ID document as part of their onboarding process. They can add GlobaliD as an option to their login and configure a KYC (know your customer) check in our admin panel without having to handle any of the integration themselves.

That’s where we can add a lot of value because businesses just want to know that a customer went through a reliable KYC process. They don’t want to store images of all their customer’s driver’s licenses. They don’t want to store any PII (personal identifying information). GlobaliD allows businesses to not have to deal with that issue which in turn helps them manage risk.

If they don’t have PII data about their users then when the inevitable happens and they have a data breach or get hacked, bad actors can’t access anything.

Why he’s looking forward to verifiable credentials

GlobaliD is building support into our platform for a new W3C technology called verifiable credentials. This is really exciting because if we get this right, we have the opportunity to solve the online identity problem in a different way than what is offered today. Current solutions like liveness checks, SMS verification, email verification, and even physical document verification all have multiple points of failure, and the most determined bad actors can spoof them.

In contrast, verifiable credentials will enable a business to independently trust another organization that makes a claim about a GlobaliD user and then issues that user a verifiable credential. This gives businesses a higher degree of certainty that the user is who they claim to be, while making the verification process easier for the individual.
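To make the flow concrete, here is a minimal sketch of what a credential shaped like the W3C Verifiable Credentials data model looks like. The issuer, holder, and claim names are entirely hypothetical, and real credentials carry a cryptographic proof rather than the placeholder shown here.

```python
# A hypothetical verifiable credential: an issuer attests to a claim
# about a holder. Field names follow the W3C VC Data Model; the DIDs
# and the "kycVerified" claim are illustrative assumptions only.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "KYCVerification"],
    "issuer": "did:example:exchange",       # who makes the claim
    "credentialSubject": {
        "id": "did:example:holder",         # who the claim is about
        "kycVerified": True,
    },
    "proof": {"type": "Ed25519Signature2020"},  # signature placeholder
}

def is_well_formed(vc: dict) -> bool:
    """A verifier first checks the structure, then validates the proof."""
    return all(k in vc for k in ("@context", "type", "issuer", "credentialSubject"))
```

The point of the structure is that the business verifying the credential never needs to see the underlying documents, only the issuer's signed claim.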

An example where you might see this play out in real life is in the fast growing market of decentralized finance (DeFi). By nature, DeFi only requires a crypto wallet to access funds and transact on the internet. However, we still want to make sure these funds aren’t being laundered or used nefariously, so we have to bridge that gap between decentralization and security.

We can do this by having other centralized crypto exchanges who require users to go through a KYC process, like Coinbase or FTX, issue a verifiable credential to your GlobaliD. This enables smaller DeFi projects to be regulatory compliant while not requiring the user to repeat the same process over and over again.

While we still have a long way to go to achieve this vision, these identity building blocks are beginning to stack on top of each other. It’s really exciting to be a part of and I believe it’s what’s going to get us to the future we’re trying to build.

The benefits of a remote work lifestyle

We’re so fortunate to be working in technology and be given the ability to work remotely. One of the biggest positives is how my work and personal life have managed to meld together in a really flexible way. I’m very thankful for our leadership who prioritize our mental health and well being. They do a great job of making sure we’re not burnt out and allow us to step away and take care of ourselves when we need to. That allows me to bring my best self to work everyday.

It also makes it really exciting when we get to meet up and see each other in-person. When I first started I was able to attend the RSA Security Conference with Kevin Boehm (CEO) and Todd Collins (Head of Design). I also got to meet Trey Steinhoff (Product Marketing Manager) when he traveled to San Francisco for a company meeting. When you don’t see these people everyday and then you get the opportunity to meet them in-person it makes you really appreciate the interactions you’re able to have. You can figure out what their passions are both at work and outside of the company and it helps you build camaraderie with everyone.

On his hobbies and interests

Right now I’m learning how to play the piano. I’m taking lessons remotely every Wednesday. I also have a personal goal to get better at surfing throughout the rest of the year. I live in San Francisco which is next to the ocean, so I feel like I should take advantage of that and improve while I can. Those are the things I’m doing in my free time when I’m not focusing on identity standards.

If you would like to learn more about GlobaliD, sign up for the GlobaliD Insider newsletter. You’ll get monthly updates on digital identity, company news and more.

Also, ICYMI:

Meet the Team — Stefanie van Dyk, People Operations Manager
Meet the Team — Meaghan Decorpo, Customer Care Manager
Meet the Team — Todd Collins, Head of Design
Meet the Team — Kevin Boehm, Head of Product
Meet the Team — Erik Westra, Head of GlobaliD Labs

Meet the Team: Nikhil Khare, Product Manager was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Top Insights From Our 2022 State of Secure Identity Report

Attackers are getting better at compromising customer accounts. Here's how to stop them

Ontology

Ontology Weekly Report (September 14–19, 2022)

Highlights

Li Jun, Founder of Ontology, was invited to join the Ethereum Merge Watch Party hosted by Crypto Slate, sharing in-depth insights on the Ethereum Merge and the follow-up development of Web3.

Latest Developments

Development Progress

We are 100% done with the Rollup VM design. The White Paper will be published soon.
We are 98% done with the Rollup RISCV EVM actuator.
We are 87% done with the Rollup L1<->L2 cross-layer communication.
We are 89% done with the Rollup L1<->L2 Token Bridge.
We are 99% done with the L1 data synchronization server.
We are 94% done with the L2 Rollup Node.
We are 59% done with the L2 blockchain browser.

Product Development

ONTO App v4.3.6 brought support for the Aptos blockchain, upgraded WalletConnect features, added filtering of dApps by chain type in the Discover section and token swapping on Optimism and Arbitrum, and improved the UI of the search feature.
ONTO hosted a giveaway with Apeiron, with 80 Battlepass Limited NFTs up for grabs. Follow the @ONTO Wallet Official Announcement channel in Telegram for more details.

On-Chain Activity

154 total dApps on MainNet as of September 19th, 2022.
7,163,286 total dApp-related transactions on MainNet, an increase of 4,209 from last week.
17,953,133 total transactions on MainNet, an increase of 12,688 from last week.

Community Growth

We held our weekly Community Call, focusing on the Ethereum Merge. Community members discussed “What impact will the Ethereum Merge bring to the Web3 landscape” and “How Ontology should deal with this impact”, and talked about the future development of Ontology.
We held our weekly Telegram Community Discussion led by Ontology Loyal Members. There were lively discussions on “Why Ethereum Merge”, “Differences After Merge”, and “Benefits of Merge”: reduced energy consumption and improved network performance.
Ontology held the second Twitter Spaces of OWNInsights, inviting Ben Yorke and 0xYond from WOO Network and SnapFingers DAO to discuss “The disconnect between a DAO and developers”, “DAOs in the real world”, and “Obstacles for DAOs and what are the next big changes that bring adoption”.
As always, we’re active on Twitter and Telegram, where you can keep up with our latest developments and community updates.

Global News

Li Jun, Founder of Ontology, was invited as a guest on the all-day YouTube Watch Party livestream of the well-known blockchain media outlet Crypto Slate, where he shared his thoughts on “Why Ethereum converted to PoS”, “Network security after the Merge”, and “What is the impact on Web3?”.
Ontology Head of Community Humpty and Americas Ecosystem Lead Erick were invited to join a Twitter Spaces hosted by the well-known blockchain media outlet Crypto Sapiens, where they shared views on “Key Challenges of DID Construction” and “The Composition of Web3 Reputation”.

Ontology in the Media

Cointelegraph — What is decentralized identity in blockchain?

“With a decentralized identity, users can control their own PII and provide only the information that is required to be verified. Decentralized identity management supports an identity trust framework where users, organizations and things interact with each other transparently and securely.”

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (September 14–19, 2022) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Metadium

EVM and Solidity Overview

Understanding the Ethereum Virtual Machine and Solidity
Ethereum virtual machine

The Ethereum Virtual Machine (EVM) is a runtime environment for smart contracts in Ethereum. EVM is sandboxed and has the characteristic of running completely isolated from other environments. Thanks to this, the code running in the EVM cannot be accessed by other processes, nor can it be accessed by an unspecified network.

Image: Ethereum structure reorganization (source: https://ethereum.org)

The role of EVM can be broadly divided into two categories: Calculation and data storage.

Calculation

To avoid consensus problems, the EVM keeps its instruction set to a minimum. Basic algebraic, bitwise, logical, and comparison operations are supported, as well as conditional and unconditional jumps. In addition, the EVM provides instructions for accessing information such as the block number and timestamp, so contracts can make use of blockchain state.

With each hard fork, the cost of individual operations may change, and operations may be added or removed.

Data Storage

EVM’s data storage space is generally divided into three types: storage, memory, and stack. From the perspective of Ethereum as a whole, it can include Calldata or even block-level areas.

Storage is an area that permanently preserves data: a key-value store mapping 256-bit keys to 256-bit values. Because storage I/O is hardware- and time-intensive, storage-related instructions are relatively expensive and consume more gas. For that reason, minimizing storage usage is key to smart contract development. Storage access in the EVM is also governed by explicit read/write rules: state variables declared internal or private, rather than public, cannot be read directly by other contracts. Instead, they can be exposed through getter and setter functions whose visibility is set to public or external.

Memory is a temporary space used by the contract, freshly allocated for each message call. It grows linearly, and the more it expands, the more expensive it becomes; because the expansion cost includes a term that is quadratic in the memory size, memory-saving techniques matter in contract development.
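The quadratic growth described above can be sketched with the memory expansion formula from the Ethereum Yellow Paper, where memory is measured in 32-byte words and the total charge is 3 per word plus the square of the word count divided by 512:

```python
def memory_gas(num_bytes: int) -> int:
    """Total gas charged for an EVM memory size of num_bytes,
    per the Yellow Paper formula: 3*words + words**2 // 512."""
    words = (num_bytes + 31) // 32  # memory is sized in 32-byte words
    return 3 * words + words * words // 512

# The linear term dominates for small sizes, the quadratic term for large:
# expanding to 1 KiB costs 98 gas, but to 32 KiB already costs 5120 gas.
```

The charge for growing memory within a call is the difference between `memory_gas` at the new size and at the old size, which is why appending to a large in-memory array gets progressively more expensive.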

For reference, a message call also carries another data area, calldata; using it directly, without copying it into memory, avoids considerable memory-expansion cost. Note, however, that while memory can be modified, calldata is read-only.

The stack is the area that handles input and output values for operations. The EVM is not a register machine like typical computer hardware but a stack machine with a simple structure, so all operations are performed on the stack. Since the maximum stack depth is limited to 1024, be aware of this limit and use local variables and parameters appropriately.

Logs can also store data at the block level. A defined event is stored by emitting it; however, a contract cannot read a log after it has been created. Instead, logs are an efficient area to access from outside the blockchain. Because log data is also recorded in a Bloom filter, logs can be found in a cryptographically secure way even by light clients that do not hold the full blockchain data.
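The Bloom-filter idea behind log lookup can be sketched as follows. This is an illustrative toy, not Ethereum's actual 2048-bit log bloom construction; the bit-selection scheme here is an assumption chosen for brevity. The key property is that membership can be tested against a fixed-size bit field without storing the logged data itself, at the cost of occasional false positives but never false negatives:

```python
import hashlib

class Bloom:
    """A minimal Bloom filter sketch: k hash functions each set one bit."""

    def __init__(self, size_bits: int = 2048, hashes: int = 3):
        self.size = size_bits
        self.hashes = hashes
        self.bits = 0  # the whole filter is one big integer bit field

    def _positions(self, item: bytes):
        # Derive bit positions from slices of a single SHA-256 digest
        # (an illustrative choice, not Ethereum's scheme).
        digest = hashlib.sha256(item).digest()
        for i in range(self.hashes):
            yield int.from_bytes(digest[2 * i : 2 * i + 2], "big") % self.size

    def add(self, item: bytes) -> None:
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def might_contain(self, item: bytes) -> bool:
        # True means "possibly present"; False means "definitely absent".
        return all(self.bits >> pos & 1 for pos in self._positions(item))

bloom = Bloom()
bloom.add(b"Transfer(address,address,uint256)")  # e.g. an event signature
```

A light client can check such a filter in a block header and skip entire blocks whose filter definitely does not contain the event it is looking for.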

In addition, there is a virtual ROM area holding the contract code, which prevents tampering during execution, as well as areas storing block information and information about the blockchain itself.

Solidity

Solidity is a high-level, object-oriented language for implementing smart contracts targeting the Ethereum Virtual Machine. Arbitrary computation is possible, provided the fees paid stay within the block gas limit.

Solidity was strongly influenced by several existing programming languages. In the early days it was designed with much influence from JavaScript, visible in its curly-bracket syntax and import mechanism. Early versions were even closer, with function-level scoping and the var keyword, but these are no longer supported for security reasons. From version 0.4 it is closer to C/C++ in its variable declarations, looping statements, and overloading. Convenience and safety have improved through subsequent major versions, and the language now has its own distinct style, such as strict type conversion. It was also influenced by Python: decorators (called modifiers in Solidity), multiple inheritance, the super keyword, and the copy semantics of value and reference types.

Contract development languages

In addition to Solidity, EVM bytecode can be produced from various languages. Vyper is a Python-like contract language. Yul/Yul+ are intermediate, low-level languages used when optimization and security are essential; they serve mainly for implementing EVM-level code and as a common denominator with Ewasm and optimistic rollups. Fe is being developed as an easy-to-learn, developer-friendly language inspired by Python and Rust.

Although various languages exist, contract development through Solidity is the most active, for the following reasons:

Numerous tutorials and sample open-source code
A variety of tools that improve development and ease of use
A large developer community
Inline assembly for low-level development when needed, on top of reasonable built-in functionality

Solidity Inline Assembly

Solidity’s inline assembly is a feature that facilitates low-level development beyond the functionality of Solidity. Although it is an assembly, it is easy to develop by providing a functional style.

Example of low-level code:

3 0x80 mload add 0x80 mstore

can be written in the functional style as:

mstore(0x80, add(mload(0x80), 3))

The two have the same meaning, but the functional style improves readability and ease of development.
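The equivalence of the two notations can be illustrated with a toy stack-machine interpreter. This is a simulation sketch, not the real EVM: literals are pushed, mload/mstore read and write a word-addressed memory map, and add pops two operands. Running the stack program produces exactly the effect of mstore(0x80, add(mload(0x80), 3)):

```python
# Toy interpreter for a handful of EVM-like stack operations
# (illustrative only; real EVM memory is byte-addressed and 256-bit).
memory = {0x80: 39}  # pretend a previous step left 39 at offset 0x80

def run(program: str) -> None:
    stack = []
    for op in program.split():
        if op == "mload":
            stack.append(memory[stack.pop()])          # pop offset, push value
        elif op == "mstore":
            addr, value = stack.pop(), stack.pop()     # offset on top, then value
            memory[addr] = value
        elif op == "add":
            stack.append(stack.pop() + stack.pop())
        else:
            stack.append(int(op, 0))                   # literal: push onto stack

run("3 0x80 mload add 0x80 mstore")  # same effect as mstore(0x80, add(mload(0x80), 3))
```

Tracing it by hand: push 3, push 0x80, load 39, add to get 42, push 0x80 again, and store, so memory[0x80] ends up holding 42 either way.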

Inline assembly allows direct optimization of the code being written, but conversely it may not benefit from the optimizations performed by the compiler. Therefore, it is efficient to write inline assembly only where a clear optimization is possible and to leave the rest to the compiler.

Image: EVM-compatible blockchains (source: https://ethereum.org/en/eth/)

Most projects that forked Ethereum either use the EVM as-is or provide compatibility with it. That makes it possible to develop contracts in Solidity and to connect naturally to many existing tools and services, such as Remix, a web IDE that supports debugging, convenient contract deployment and execution, and multiple extensions.

However, since Ethereum's EVM instruction set and Solidity itself are updated steadily, a forked chain can be considered healthy only if it continuously reflects those upgrades.

Metadium

For example, Metadium applies the contents of Ethereum's Istanbul hard fork in its master branch and latest release version as of September 2022.

Istanbul is a hard fork with changes at the EVM level, including the addition of the CHAINID and EXTCODEHASH opcodes. The code confirms that these two opcodes have been added to Metadium.

Two block-related opcodes have been added to the Istanbul Hard Fork.

In Istanbul, the gas costs of existing opcodes were also changed several times. For example, the gas cost of SLOAD was changed from 200 to 800, and the related code can be found in Metadium.

Code reflecting the gas cost changes for trie-size-related opcodes.

Even after Istanbul, there have been many significant and minor changes to Ethereum, and their reflections can be found in the Metadium development branch. The London hard fork has been applied, including the well-known fee-market change, EIP-1559.

The BASEFEE opcode introduced by EIP-1559 can be verified in the code.

Although some of these changes still exist only in the development branch, Metadium continuously tracks and reflects Ethereum's updates.

References
https://ethereum.org/en/developers/docs/
https://docs.soliditylang.org/en/v0.8.16/
https://remix-ide.readthedocs.io/en/latest/

EVM and Solidity Overview was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 20. September 2022

KuppingerCole

Strengthening Cybersecurity Teams with Managed Detection and Response


Organizations that fail to digitalize their business processes will find it hard to remain competitive in the near future. But with increasing digitalization, cyber risks rise as well, because moving services to the cloud and the growing support for mobile and decentralized work rapidly expand the threat surface. IT security teams struggle to detect, respond to, and contain threats, not least because of small budgets and a lack of expertise, especially as these threats increasingly originate from state actors and are connected to supply chains.

Security experts from KuppingerCole Analysts and Sophos discuss the latest developments in the threat landscape and explain why internal security teams must be able to expand and improve their capabilities for detecting, analyzing, remediating, and recovering from cyberattacks in order to ensure security and compliance. Managed Detection and Response (MDR) solutions lend themselves to this.

Martin Kuppinger, Principal Analyst at KuppingerCole, will outline the key challenges of today's cybersecurity, discuss the latest thinking on cyber defense, and examine the threats behind some recent cyberattacks. He will also explain the need for around-the-clock monitoring, effective detection mechanisms, fast response times, and proactive defense capabilities.

Michael Veit, Technology Evangelist at Sophos, will explain the benefits of MDR for companies of all sizes, including the value of threat hunting, determining the scope and severity of threats, applying business context, neutralizing threats, and eliminating the root causes of incidents. He will also give insights into the Sophos offerings Managed Threat Response and Rapid Response.




FindBiometrics

Big Contracts, New Partnerships, and More: Identity News Digest


Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Airport Biometrics Australia is poised to finally […]

The post Big Contracts, New Partnerships, and More: Identity News Digest appeared first on FindBiometrics.


auth0

Marketplace Partner Spotlight: What's new from July and August

Learn about all the new integrations to hit our Marketplace.

Global ID

GlobaliD 101: Reusable Identity


So far, in the GlobaliD 101 series we’ve explored:

Part 1: What a smart and humanistic approach to digital identity would look like
Part 2: The Trust Triangle — the system of issuers, holders, and verifiers that illustrates how identity works (and how it’s broken today)
Part 3: Why the ID Wallet is the first step toward achieving a new vision for digital identity
Part 4: Why every company is an identity company
Part 5: What is Bring Your Own Identity?

Today, we’ll discuss a key topic that’s core to the future of digital identity — reusable identity.

Your driver’s license is a form of reusable identity. You receive one after you’ve verified yourself at the DMV with required identifying documents. (And, of course, after you’ve passed your road test.)

You can now present your driver’s license whenever some form of identity verification is required. The same goes for your passport.

That’s not how identity typically works online. Instead, you need to re-verify yourself with each new service. Imagine having to go back to the DMV every time you needed to prove that you’re legally allowed to drive.

Having to redundantly verify your identity online is also incredibly insecure. Every time you do so, your sensitive personal information is stored on yet another server on the internet. Security practices across businesses will vary. The point here is that each server is another potential target for hackers.

It’s also bad for business. Not only are there costs associated with managing user data, the need to verify identity every time you onboard a customer increases friction by an order of magnitude.

While some forms of reusable identity already exist, they are more often than not associated with your social media accounts — for instance, Google or Facebook. With one account, you can sign up and connect to a large network of services. This framework, known as Bring Your Own Identity, is discussed in our previous piece in the series. The problem is that these forms of identity don’t carry the level of trust necessary for more important tasks, say, opening a bank account.

In an ideal world, we would be able to reuse our identity just as we reuse our driver’s license. With an identity that you own, you could verify your driver’s license once and receive a credential for doing so. You should now be able to reuse that credential without going through the same verification process wherever a driver’s license is required.

Businesses love this form of identity because it reduces the hurdles customers need to jump over to use their service, expanding their pool of potential users. With a reusable credential that they can trust, they can have the peace of mind of knowing who their customers claim to be without needing to deal with all of that expensive management of sensitive data.

Bottom line, we all need an easier way to prove who we are online. Reusable identity is the next step towards that solution. In fact, the market for reusable digital identity applications is projected to reach $266B by 2027.

With the technology continuing to improve and governments continuing to adopt electronic identity systems, reusable identity is set to play a big part in making our digital lives easier.

If you’d like to learn more about GlobaliD, visit our website, contact our sales team or follow us on Twitter, LinkedIn and YouTube.

GlobaliD 101: Reusable Identity was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


Forgerock Blog

The Evolution of Digital Trust


If you look at the internet just 15 years ago, the concept of trust only went one way. It was all about organizations on the internet needing to trust the end user. They needed to make sure the end user was who they were claiming to be; after all, there was the famous New Yorker cartoon that stated, "On the internet, nobody knows you are a dog." This spoke to the anonymity that the internet provided in those days. That led organizations to the mentality that strong security was of the utmost importance — mitigating risks and assuring data integrity to protect themselves from bad actors — with little concern for the customer's experience or data.

In an effort to be sure of whom they were doing business with, companies would put in a lot of steps and collect a lot of data from end users to make sure they had the right person. And it didn't really matter. It didn't matter because the use of the internet wasn't as pervasive as it is today — it was almost more of a privilege for the tech-savvy. Customers didn't have choices, there weren't a lot of businesses on the internet, and people had no idea just how valuable their data was.

Everything has changed. The rise of Google and Facebook showed people just how valuable their data was, while exposing the common practice of many companies to share or sell data to other organizations. With this information out in the news, coupled with regulations like GDPR to protect PII, people have become much more protective of their data. Now, the pendulum has swung and the internet trust model has become much more circular, with customers very concerned about privacy and about ensuring that their data is only used for the actions they consent to. In today's internet, if organizations want to collect data from their end users, they have to make their customers feel that they can be trusted with the data, showing themselves to be good stewards of their customers' data.

To build that trust with end users, organizations need to give them the tools and the proper choices to feel that the organization can be trusted: tools that allow users to give consent for their data to be collected and used appropriately, for example, or to opt into a marketing campaign. Most importantly, users must have the ability to easily review and change those settings. Furthermore, as organizations leverage social media for user registration, it is important to give users choices. Some might feel more comfortable using one form of social media vs. another. Some companies are even exploring which social media providers actually reduce the initial perception of trust simply by being offered as a choice.

At ForgeRock, we help our customers grow and retain their user base by providing the tools that allow their users to easily manage their data. Through the use of our Intelligent Access Orchestration engine, users can easily opt in and out of marketing choices and give consent during the initial registration. Once an account has been created, we offer a dashboard where users can go and see how their data is being used and easily make changes to their settings. These changes are immediately acted on by setting off events to remove a user's data from platforms they no longer wish to be a part of. Lastly, we provide choices on what social media providers can be utilized for initial registration. All of these capabilities make ForgeRock the Identity provider that companies can rely on to build the trust in their digital identity brand.

Learn more about ForgeRock Identity Orchestration.


Shyft Network

Crypto Bill Introduced in Uruguay’s Parliament: What You Should Know?

The bill aims to clarify the country’s regulatory scenario for crypto assets.
It proposes to establish the country’s central bank as the regulatory authority over cryptocurrencies.
The proposal needs approval from the Chamber of Deputies and its Senate to be enacted into law.

The executive branch of Uruguay has submitted a bill to Congress that aims to clarify how crypto-related activities will be regulated in the South American country. The bill will give the country’s central bank legal powers to regulate cryptocurrencies if approved.


The bill is currently awaiting approval by both the Chamber of Deputies and the Chamber of the Senate for it to become law.

Virtual Assets Categorization

The document refers to virtual assets as securities and categorizes them into four types: tradable assets, stablecoins, governance tokens, and debt tokens.

On top of that, the bill proposes to create a new category of companies for virtual asset service providers (VASPs). It further seeks to amend the organic charter of the Central Bank of Uruguay (BCU) and put VASPs under the supervision of the central bank entity, the Financial Services Superintendence (SSF).

The SSF will ensure that malicious actors do not use crypto assets to bypass the country’s anti-money laundering (AML) and counter-terrorism financing (CFT) safeguards.

Speaking of AML & CFT safeguards, Uruguay hasn’t yet adopted the FATF Travel Rule, but if the country green-lights the crypto bill, it will likely adopt the Crypto Travel Rule as well.

Giving Central Bank Control

Introduced on Sept 5, the proposed bill aims to amend the Securities Market Law and treat crypto assets as book-entry securities.

Such a treatment means crypto can only be issued by a registered entity that complies with laws and regulations. With this, the document introduces another class of operators, the “virtual asset issuer.”

This new class of operator joins the companies that facilitate the purchase and exchange of virtual assets, custody providers, and third parties that provide financial services related to the sale of crypto assets, all of which the text establishes as part of this asset class.

“With the proposed amendments, both the previously regulated subjects and the new incorporated entities that operate with virtual assets will be subject to the supervisory and control powers of the Central Bank of Uruguay,” states the bill.
Crypto Regulation in Focus

In August of last year, Uruguayan Senator Juan Sartori introduced a bill to allow the use of crypto for payments and to regulate its use in the country. So far, it hasn’t been successful. The Central Bank of Uruguay later issued Binance a summons over its offering of savings-oriented crypto-based financial products.


Towards the end of 2021, the central bank started working on a plan to lay the foundation for regulating companies that offer crypto-related services.

The South American country has now introduced a new cryptocurrency bill that will address the gray area in its crypto sector.


Interestingly, Uruguay’s neighbor Brazil, the largest economy in the region, is also looking to change its legal framework to recognize crypto tokens as securities and bring them under the control of its securities regulator.

The bill’s final version was approved by the Senate in April 2022 and needs congressional revision before being signed by the president.


Meanwhile, in another South American country, Paraguay’s President Mario Abdo has vetoed a proposed crypto regulation bill, stating that crypto mining is an “energy-intensive” low-value-added activity. The bill has since been returned to Congress, which must either approve it again or reject it.

It remains to be seen whether Uruguay gets ahead of Paraguay and Brazil in giving legal recognition to virtual assets. That said, a comprehensive crypto bill is a good start nonetheless.

______________________

VASPs need a Travel Rule solution to begin complying with the FATF Travel Rule. Have you zeroed in on one yet? We have the best solution to suggest: Veriscope!

Veriscope is the only frictionless Crypto Travel Rule compliance solution.

Visit our website to read more: https://www.veriscope.network/ and contact our BizDev team for a discussion: https://www.veriscope.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations.

Crypto Bill Introduced in Uruguay’s Parliament: What You Should Know? was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


1Kosmos BlockID

Biometric Based MFA for Zero Trust

Why Implement Biometric MFA?

We are all aware of the risks associated with password-based authentication. The proof is in the seemingly endless list of password-based breaches that we see in the news every day. Some companies have attempted to mitigate password risks by adding two-factor authentication (2FA), pesky one-time codes sent via email or text.

The truth is, if you authenticate users who access your systems by way of username-password combinations, even when enhanced by so-called 2FA or MFA methods, you could significantly improve security, reduce user friction, and cut operating expenses by implementing a biometrics-based solution.

While some solutions claim to solve these problems, many are only exchanging the password for FaceID/TouchID. These legacy apps fail to do the one thing that they are supposed to do – prove who the user is.

Imagine if you could prove each user’s identity every time they logged into a resource. That would make all of the other Zero Trust pillars much easier to implement.

1Kosmos offers identity-based Zero Trust, with real biometrics and cryptographic proof of possession and audit.

Identity Is One of the 6 Pillars of Zero Trust


Other solutions base the “binding” on a username and password exchange which can be done by anyone. This is a problem because when an issue like a lost phone occurs, other solutions fall back to a username, password, 2FA as the bootstrapping process to re-enroll. What do you think the bad guys are going to do with this recovery process? That’s right, exploit it.

How Does 1Kosmos Prove Identity?

When a user downloads the BlockID app and enrolls, they take a live selfie as part of our LiveID. We then compare that selfie to the photo in a government-issued document such as a passport or driver’s license. Once the selfie and document match, 1Kosmos issues the user a digital certificate that verifies their identity and binds the account to the proven identity. When users authenticate through LiveID, we compare the live selfie with the one taken at enrollment to prove identity and grant access.
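The enroll-then-match flow described above can be sketched roughly as follows. The embedding function and similarity threshold here are toy stand-ins (real face matching uses certified liveness detection and trained models), so treat this purely as an illustration of the flow, not of 1Kosmos's actual implementation:

```python
import math

def embed(photo: str):
    # Toy stand-in for a face-embedding model: hash characters into a unit vector.
    vec = [0.0] * 8
    for i, ch in enumerate(photo):
        vec[i % 8] += ord(ch)
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def similarity(a, b):
    # Cosine similarity; vectors from embed() are already unit length.
    return sum(x * y for x, y in zip(a, b))

THRESHOLD = 0.99  # illustrative; real systems tune this against error rates

def enroll(live_selfie: str, document_photo: str):
    """Match the live selfie against the government-ID photo; on success,
    return the stored template that the account is bound to."""
    template = embed(live_selfie)
    if similarity(template, embed(document_photo)) >= THRESHOLD:
        return template
    return None  # enrollment rejected: selfie and document don't match

def authenticate(live_selfie: str, template) -> bool:
    """Later logins: compare a fresh selfie against the enrollment template."""
    return similarity(embed(live_selfie), template) >= THRESHOLD
```

Here the "selfie" and "document photo" are plain strings standing in for images; the point is only the shape of the flow — verify against a document once at enrollment, then authenticate against the stored template thereafter.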

To summarize, proving identity and reaching Zero Trust requires one platform that:
1) Establishes user-controlled identity
2) Proves authentication with that previously established identity

There are two key standards for identity and authentication: NIST 800-63-3 for identity enrollment and strong identity usage, and FIDO2 for passwordless authentication. No platform besides BlockID has combined both of these into one experience. When done right, this single experience can replace many legacy processes and provide a seamless user experience.

In addition to NIST 800-63-3 and FIDO2, BlockID’s live biometrics are iBeta certified, which validates the reliability of BlockID for accurately performing live biometrics for identity proofing.

The BlockID platform has other key strengths:
– Integrates with every system that matters: remote access (VPN, Citrix, ZScaler, Appgate, etc.), desktops (Mac/Windows), SSO systems (Okta/Azure AD/Ping), and PAM (CyberArk/Thycotic)
– Can be implemented in parallel, without impacting existing mechanisms
– Supports multiple company accounts in the same app: domain admin, regular users, AD, Unix accounts, etc.
– Supports password reset in the app
– Combines physical and logical access
– Has built-in integrity checking

Are you interested in learning more about how the BlockID platform can support your Zero Trust journey? Register for our upcoming webinar, “MFA tried to fix Passwords but how do we fix MFA?“.


The post Biometric Based MFA for Zero Trust appeared first on 1Kosmos.


Ockto

Blog: 5 challenges in KYC identification: from incorrect ID copies to GDPR requirements


An unredacted BSN on a passport copy or an incomplete dataset – quite a lot goes wrong when supplying data for the KYC process. What other important stumbling blocks and points of attention are there, and how can you deal with them? Read on in our blog.

Read the full blog here

The post Blog: 5 challenges in KYC identification: from incorrect ID copies to GDPR requirements appeared first on Ockto.


Blog: Everything you need to know about digital identification


When a customer applies for a financial product, there is no way around it: the customer must identify themselves and you must verify their identity. Ideally, this is done as efficiently as possible, without opportunities for fraud. How do you make that process less error-prone?

Read the full blog here

The post Blog: Everything you need to know about digital identification appeared first on Ockto.


ValidatedID

Validated ID is set to complete the AS4EDI20 interoperability tests

Validated ID is set to complete the AS4EDI20 interoperability tests to implement the CEF eDelivery AS4 profile in Europe. This project is co-financed by the European Commission through the CEF Telecom program and managed by HaDEA, with action number 2020-EU-IA-0024.

Ocean Protocol

OceanDAO Round 22 is Live


65,000 USD in OCEAN is available for Web3 data economy projects!

Hello, Ocean Community!

OceanDAO is a grant DAO to help fund Ocean community projects, curated by the Ocean community. Anyone can apply for a grant. The community votes at the beginning of the month, retroactively, rewarding the projects that contributed the most value to the ecosystem.

Round 22 has 65K USD worth of OCEAN, or 200K OCEAN available, whichever is larger in USD terms. (Calculation is on October 10th, 2022, at midnight UTC.) 55K USD is available via the regular grants program, and 10K USD is via our micro-grants run by Algovera AI.
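The "whichever is larger in USD terms" rule is just a max over the two denominations at the snapshot price. A quick sketch (the prices below are made up for illustration):

```python
def round_22_pool_in_ocean(ocean_price_usd: float) -> float:
    """Pool is 65K USD worth of OCEAN or 200K OCEAN, whichever is larger in USD
    terms. Comparing the two amounts in OCEAN is equivalent to comparing in USD."""
    return max(65_000 / ocean_price_usd, 200_000)

# At 0.50 USD/OCEAN: 200K OCEAN is worth 100K USD, beating 65K USD.
assert round_22_pool_in_ocean(0.50) == 200_000
# At 0.25 USD/OCEAN: 65K USD buys 260K OCEAN (200K OCEAN is only 50K USD).
assert round_22_pool_in_ocean(0.25) == 260_000
```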

Round 22 Dates

- Proposal Submission Deadline: October 4
- Add OCEAN to your voting wallet by the Proposal Submission Deadline
- Two-day proposal Due Diligence Period ends: October 6
- Voting starts: October 6
- Voting ends: October 10
- Funding deadline: October 24

All times are midnight UTC.

OceanDAO Evolution

We are tuning OceanDAO towards these goals:

- A culture of contributing: biasing toward those who are engaged with the community and aligned with our value-add mission.
- Quality outcomes: rewards (in OCEAN) flow to contributors who are effective at delivering results.
- Improved DAO efficiency: more efficient capital spending, yet lighter tracking overhead.

We aim for OceanDAO to stay minimal (just grants; no treasury) and become more minimal yet, doubling down on more automated incentives with a lower governance attack surface (veOCEAN & Data Farming).

Towards this, the OceanDAO R21 post described some recent steps for R21 and planned steps for R22. While that post targeted a move to retroactive grants, upon deeper investigation we found that it carries high operational overhead, so we have instead focused that effort on veOCEAN. Therefore, we are not moving to retroactive grants at this point in time.

We are moving towards these goals in other ways. First, in the vein of a minimal OceanDAO, we are winding down the Parameters WG due to its low governance scope, low turnout, and ongoing engineering work to further minimize governance. And the OceanDAO Engineering WG will streamline to use its internal OPF label, the “Nile” team.

Second, we are shifting more community budget to veOCEAN & Data Farming (more automated, yet aligned objectives), and away from grants (higher governance touch). Therefore, R22 total grants will be 65K USD worth of OCEAN, and R23 total grants will be 45K USD worth of OCEAN (both including 10K USD to Algovera micro-grants). Max funding amounts have been tuned proportionally. The increased funding to veOCEAN & Data Farming will lead to higher APYs for staking OCEAN and curating Ocean data assets.

Round 22 Specs

- Total funding of 65,000 USD (min.); 10,000 USD of this is allocated for Algovera AI micro-grants
- 100,000 USD max. per project in a lifetime
- 100,000 USD max. per team in a lifetime (not including Advisors)
- Funding ceiling per tier, per round (the max a project can request in a round):
  - New projects: 3K USD (never previously received a grant from OceanDAO)
  - Existing projects: 6K USD (have completed one grant from OceanDAO)
  - Experienced projects: 12K USD (have completed two or more grants)

Grant Funding Categories

OceanDAO provides grants for a variety of project types.

- Building / improving applications or integrations to Ocean
- Community or developer outreach (grants don’t need to be technical in nature)
- Unleashing data
- Building or improving core Ocean software
- Improvements to OceanDAO itself

Submitting Proposals

We encourage you to submit proposals early.

To submit proposals, use the Proposal Portal. A tutorial is available here.

Proposals will then be published on Ocean Port. Port is also where voters can read proposal details and ask questions.

All proposals that meet the essential criteria are then submitted to Snapshot.

Proposals that receive support from the DAO voters receive funding, shortly after voting concludes.

Teams can reach out to the Project-Guiding Work Group here for any guidance or assistance.

Please do not submit proposals directly through Ocean Port anymore, as you risk not having your proposal accepted.

Project Standing

If you have previously received a grant, you must update your Grant Deliverables inside each proposal to remain in Good Standing, to stay eligible in future Funding Rounds. The Project Standing Dashboard has details.

Funding Tiers

Funding Tiers are achieved by teams delivering on their grant promises, to unlock higher funding ceilings.

The Funding Tiers are:

New Project
- Funding ceiling: as described earlier
- Requires: no one in your project has ever received a grant from OceanDAO. Open to all.
- Benefits: earmarked funds; feedback during the application process; introductions to related projects; support during voting.

Existing Project
- Funding ceiling: as described earlier
- Requires: you have completed 1 or more grants.
- Benefits: same as above, plus promotion via Newsletter, Twitter, and other channels; support during voting.

Experienced Project
- Funding ceiling: as described earlier
- Requires: you have completed 2 or more grants.

The amount requested is in USD; the amount paid is in OCEAN token.

Earmarks

“Earmarks” means that there are funds available exclusively to the groups listed below, without having to compete. For example, New Teams (Outreach) have 6,000 USD available without having to compete against other projects. Beyond that amount, they have to compete.

- 3,000 USD for New Teams (non-outreach)
- 3,000 USD for New Teams (outreach)
- 12,000 USD for 2nd/3rd-time teams
- 12,000 USD for Core Tech Initiatives (listed below)
- 25,000 USD for General (all projects)

Core Tech Earmark & Initiatives

There is specific tech that the core team and the Ocean technical community would love to see built. Here’s the list for R22. It’s earmarked; therefore, building towards it gives you “air cover” with respect to competition.

You can find us in the Core-Tech WG discord channel.

Other OceanDAO Features

Quadratic Funding: Instead of projects receiving a binary grant (either 100% or 0% of what they asked for), all proposals will receive grant amounts relative to how much engagement they received, while still abiding by other rules such as Earmarks and Funding Ceilings.
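As a purely hypothetical sketch of how quadratic-style matching spreads a pool across proposals in proportion to engagement (OceanDAO's exact formula isn't specified here, and rules like Earmarks and Funding Ceilings would apply on top), a project's weight can be the square of the sum of the square roots of its individual votes, which favors broad support over a single large voter:

```python
import math

def quadratic_allocations(votes_per_project: dict, pool: float) -> dict:
    """votes_per_project maps project -> list of individual vote amounts.
    Weight = (sum of sqrt(vote))^2; pool shares are proportional to weight."""
    weights = {
        project: sum(math.sqrt(v) for v in votes) ** 2
        for project, votes in votes_per_project.items()
    }
    total = sum(weights.values())
    return {project: pool * w / total for project, w in weights.items()}

# 100 OCEAN from one whale vs. the same 100 OCEAN split across four voters:
alloc = quadratic_allocations(
    {"whale_backed": [100.0], "community_backed": [25.0, 25.0, 25.0, 25.0]},
    pool=65_000,
)
# Weights: 10^2 = 100 vs. (4 * 5)^2 = 400, so community_backed gets 4x the share.
```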

Minimum USD amount: Projects can set a minimum amount they are willing to accept, and the funds are returned if that minimum isn’t reached. On top of Quadratic Funding, this drastically increases the probability of receiving a grant. (You can reduce your Deliverables by commenting in Port if you only receive a minor part of your overall requested budget.)

Voting Boosts: Quadratic Voting helps distribute voting power better. The more projects you vote on, the higher your overall voting power becomes. In addition, you can now visit the Web3 Portal, register yourself with BrightID and receive a 400% voting boost.

Voter Delegation: Voter Delegation allows participants to delegate their decision making to ecosystem members they trust, easily assigning their voting power to other addresses. More info here.

Funds recycling: Any funds remaining inside an Earmark will be recycled into General Grants rather than burned. If there are any funds available at the end, they will be moved back into the treasury.

Working Groups

Here are the WGs:

- Core Tech Working Group
- Outreach Working Group
- Ambassadors Working Group

Ocean Discord has a channel for each WG. The pinned message has WG meeting times and other key info. WGs typically meet weekly or bi-weekly in their respective Discord voice channels. Anyone can listen in. More info.

Town Hall

There are weekly OceanDAO Town Halls, where OceanDAO WGs report progress. Also, any team in the Ocean ecosystem can request to present and share their progress.

Claiming Grants

You can claim your grant from the OceanDAO website. Please find instructions here.

OceanDAO Ecosystem

Visit Ocean Pearl to learn more about each project, track updates, get an overview of the grant process, and follow the Voting Leaderboard.

You can also find all Funding History & Round Data here!

Thank you!

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

OceanDAO Round 22 is Live was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


ValidatedID

A pilot project for interoperable decentralised identity between Aigües de Barcelona, CaixaBank and Validated ID

CaixaBank and Aigües de Barcelona have developed a proof of concept with the Catalan startup Validated ID, which has developed a decentralized digital identity solution based on blockchain technology.

Affinidi

Breaking Digital Borders: Scaling User-Owned Data from Zero to One Million — Key Takeaways


The Berlin Blockchain Week is an annual event that all web 3.0 enthusiasts, crypto experts, and techies from all walks of life look forward to. And this year, it was extra special for us at Affinidi, as we showcased our MVP Console to the world.

On September 14, we hosted some of the most curious minds in Berlin at our “Breaking Digital Borders: Scaling User-Owned Data from Zero to One Million” event, as a part of Berlin Blockchain Week. Over some delicious food and drinks, Marco Podien, our Developer Evangelist, opened the evening with an enthusiastic welcome.

Glenn Gore, the CEO of LemmaTree and Affinidi, was our first speaker, and he brought out the importance of changing data ownership for good and why it matters to everyone today. He described LemmaTree in general and Affinidi in particular, and also touched on the role of other ventures within LemmaTree, such as Trustana and GoodWorker.

Following Glenn, Junius Ho, our Chief Product Officer, talked about one of Affinidi’s most successful projects to date — the CommonCheck. He described the critical role that CommonCheck played in seamlessly verifying COVID vaccination credentials while preserving the privacy of every individual.

Next, Niandong Wang, our Chief Foundations Officer, took the stage. He started by explaining the ecosystem around data ownership and where Affinidi fits in. Next, Niandong talked about Verifiable Credentials (VCs), Affinidi Console, and the importance of a collaborative approach.

This introduction built the expectations for Affinidi Console, and Zain Yousaf, our Product Principal, and Marco took over to showcase Console to the audience. They did a walkthrough of the Console to help everyone understand how they can build privacy-preserving apps through this no-code dev tool.

This walkthrough and the potential impact of Console created a buzz among our guests, and some of them wanted to know more about it. After an informal Q&A session, we wrapped up a memorable day with pride in the journey so far, and looking forward to creating more impact in the world of data ownership and identity.

To learn more about our Console and understand how it can fit into your organization, talk to Marco Podien today. Also, sign up for early access to our Console for a hands-on feel of the platform.

Join the Community

- Get conversations going with #teamaffinidi and the community on Discord
- Follow us on LinkedIn, Twitter, and Facebook
- Get the latest updates on what’s new at Affinidi: join our mailing list
- Interested in joining our team?
- Start building with the Affinidi tech stack now
- For media inquiries, please get in touch with Affinidi’s PR team via pr@affinidi.com

Breaking Digital Borders: Scaling User-Owned Data from Zero to One Million — Key Takeaways was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.


SelfKey

The Ethereum Merge


As a result of the upgrade, Ethereum implemented proof-of-stake, a more energy-efficient and environmentally friendly consensus mechanism.

The post The Ethereum Merge appeared first on SelfKey.

Monday, 19. September 2022

Holochain

The DWeb Is An Ensemble Piece

Making music together at DWeb Camp 2022

Back in 2020, I was going through a crisis of faith. I wasn’t sure if my involvement with Holochain was worth the effort. Sure, it was a noble idea. I mean, who doesn’t want to be more in charge of their own online lives? But I worried that it was a lot of energy wasted on the wrong solutions — maybe we’d convinced ourselves that the tech was more important than it really was. Maybe it was time for me to move on.

Then I learned about a group called Digital Democracy. They were building a mapping tool called Mapeo with the help of Hypercore, a protocol for peer-to-peer data sharing. But this tool wasn’t just for people who wanted to find the nearest coffee shop. It was being used by indigenous groups in the Amazon basin, people whose way of life was threatened by oil exploration. Mapeo allowed them to record the richness around them, from ancient burial sites to important food sources to ant nests. Eventually they were able to compile such a compelling argument for the preservation of their homeland that they won a lawsuit against a decision made by the Ecuadorian government that would have threatened their ecosystem; they currently live free from immediate danger.

That’s when I realised that decentralised tech matters. No, it’s not going to save the world, but it is a powerful, even transformative, tool in the hands of the right people. Mapeo couldn’t have been built as a traditional cloud app — there’s no internet out there in the rainforest. Only a peer-to-peer protocol was up to the task.

Two years later, I find myself at DWeb Camp, an event in northern California hosted by the Internet Archive. All of us there have weathered many storms, from COVID to political unrest, and we’ve come out just a little disillusioned. But here, among colleagues, friends, and strangers, I find hope. It’s not that the people here are untouched by the grief and confusion of the past two years — it’s that they still believe we can move humanity’s course in better directions.

Here, I meet people who recognise that the tech that’s gotten us into such a nasty spot can also be used to help us create patches of light in the midst of darkness.

I hear Jill Burrows and Leanne Ussher of hREA describe the way our global economy creates flows of goods and resources that degenerate the health of ecosystems and communities, then offer a new way of looking at economies that could shift us toward regeneration using digital tools for whole-systems accounting. And I get excited when I learn that real groups are ready to use them for their sustainable textile, food, and electronics supply chain networks.

I listen to Ben Tairea talk about Āhau, an app he’s creating with the help of Secure Scuttlebutt. Āhau lets Māori families record and share their stories to keep them alive. For Ben, peer-to-peer tech is not just an ideological choice; it’s deeply rooted in pragmatism. He tells me that “from our villages’ perspectives, yeah, it was absolutely about — we want to have our own stories; we don’t want to be putting them on these platforms; we don’t want to be entrusting somebody else to hold onto that information for us. And we have heard horror stories of people that had been using different services that no longer existed, and so they lost their database.”

I witness demos of Social Sensemaker and We, working examples of Holochain apps that help groups create healthy online social spaces to work and play in. These tools are built on peer-to-peer tech because, in the eyes of the projects’ creators, that’s the only way for a group to assert their power to create the software, data, and infrastructure that suits them best. We lets you cobble together little configurable applets — “generic tech, specific culture”, in the words of Neighbourhoods founder Sid Sthalekar. Both of these projects aim to carve out areas of sovereignty in an internet where big platforms define the game and choose who can play, often leading to ideological polarisation and disempowerment.

I listen to DWeb Fellows from Brazil, Colombia, India, and Mexico share their experiences helping build community-governed internet and cellular networks where they live. These networks give indigenous and rural people access to the power of global connectivity while preserving their right to engage on their own terms.

DWeb Fellows share their experiences helping to build community-run mesh and cell networks

I chat with countless people who are devoting themselves to beautiful, courageous, demanding visions of a future they want to live in. Every one of their stories is a gift of hope that I will carry home with me.

I see people from all sorts of outlooks — not just rich white tech bros but children, wise sages, trans people, people from the global south and indigenous territories in the north.

One thing that I notice repeatedly is people building bridges instead of shouting at each other from their own islands. They arrive with open minds, ready to share and listen, ready to build a web with many winners. Scuttlebutt and Holochain folks show up at each other’s workshops. People from various governance camps put their heads together to find new approaches to the wicked problems of the internet. And many people without a stake in any project show up to discover allies and tools that can help them in their journey to improve their corner of the world.

People discuss digitally augmented governance at a workshop

This feels different from DWeb Camp 2019 and the DWeb Summits that came before it. I’m not the only one to notice it; afterwards some of my colleagues echo the same feeling. We think things in the DWeb community are shifting — from solo acts to ensemble pieces, from head to heart, from thoughts to actions, from beliefs to relationships. This event both displays and nurtures that spirit, thanks in large part to the careful and heart-led work of Wendy Hanamura and her team of organisers, space stewards, and volunteers.

And that makes sense. We’re here, not because of technology, but because of the things we want to see birthed in the world. Of course, we’re united in believing the DWeb is a tool that can help our work immensely. But it’s ultimately just a tool. We — you and I, and the rest of us — are the ones who are going to make change happen.

How about Holochain?

While I would love to focus entirely on the beautiful community spirit, I know some of you want to know what the Holochain community was up to. Here are some highlights, in no particular order:

Jill Burrows, Leanne Ussher, and Wesley Finck presented a talk and a workshop on hREA and its underlying ontology, ValueFlows, which describes itself as “a vocabulary for the distributed economic networks of the next economy”. The workshop, which focused on practical applications to real economies, was attended by all sorts of people interested in regenerative economics, not just Holochain folks.

Jamie Klinger gave a lightning talk on the ideas behind Metacurrency and Deep Wealth.

Wes and Eric Harris-Braun gave a lightning talk on the organisational principles behind Acorn.

Emaline Friedman, Michael Hueschen, and Sid Sthalekar gave a demo of Neighbourhoods’ Social Sensemaker engine, which allows groups to define their own cultural and reputational norms in the form of metadata that can be calculated against contributions, then filtered and sorted. Once again, a lot of people from other projects showed up, which led to a lively discussion of how reputation systems can and should be designed.

Guillem Cordoba and Eric Harris-Braun gave a demo of We, showing how groups can create and customise their own spaces to communicate and get things done. I was particularly impressed by a toggle that lets you switch your UI from a group-centric perspective to an agent-centric perspective that shows information from all the groups you participate in.

Viktor Zaunders led a brainstorming session to identify useful applets to add to We. Once again, this event attracted people from outside the Holochain sphere, which led to small group discussions about one of my favourite topics — how to build bridges between different DWeb protocols.

Eric and Art led a discussion on governance in DAOs and other forms of organisations, along with the tools and processes needed to support them.

Paul Krafel, an unwitting mentor of many people in the Holochain community through his book Seeing Nature, held a Q&A about his book and led a few exercises in learning to see flows in nature and human interactions.

Eric Bear gave from his heart as a volunteer and Space Steward, organising event schedules, helping people feel supported, and generally cultivating a feeling of care and connection among participants.

I took lots of photos and videos for the Internet Archive — I even got to film an impromptu wedding!

Some of us participated in a meet-and-greet session — Marcus (the lead facilitator of our recent Developer Immersive) and his partner Kaitlyn treated everyone to charcuterie, and we all got to share what Holochain is about with anyone who was curious.

Every one of us found ourselves in many conversations about a thousand subjects — governance, education, economics, sovereignty, and of course Holochain.

After DWeb Camp ended, the Internet Archive graciously lent us space in their San Francisco headquarters so the remaining participants could plan the future of Holochain and its community together.

Postscript

I want to leave you with this piece I recorded at DWeb Camp called Five by Four, performed by the Del Sol Quartet and written by their violist Charlton Lee. As you listen to it, pay attention to the story the musicians are telling. At first, it almost seems confused, as if they’re trying to figure out how to play with each other. The notes don’t fit quite right. Some are playing in 4/4; others in 5/4.

As the piece moves on, the apparent discord begins to resolve — gradually at first, then rapidly. Rather than give up their individual perspectives, they blend them together in an upward spiral of greater coherence, greater beauty. Some are still playing 4/4 and others 5/4, but their contributions only add to the richness; each individual gracefully dances with the rest of the ensemble.

And I love watching their body language. The subtle glances, the smiles and grimaces, the moment-to-moment changes of bow and finger. There’s no leader to keep them all in sync, and the sheet music only gives the notes. The music comes alive because of these human signals, carrying a much higher bandwidth than conductor or ink could.

For me, this piece tells the story of the DWeb, both the present moment of the people building it and the rhythm and notes of the societies we hope to help nurture. We’re growing in our capacity to cooperate, and I think that’s an important prerequisite if we want to help the world do the same. Let’s hope that we may continue to build bridges, connect our islands, come together to create many ensemble pieces.


FindBiometrics

Falsely Accused of Vehicular Homicide, Exonerated by Clearview’s Biometrics: Identity News Digest


Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Regulatory Developments The White House’s Office of […]

The post Falsely Accused of Vehicular Homicide, Exonerated by Clearview’s Biometrics: Identity News Digest appeared first on FindBiometrics.


1Kosmos BlockID

1Kosmos Recognized as a Leader in KuppingerCole Providers of Verified Identity 2022

It’s been a very exciting summer for the team here at 1Kosmos. Analysts are taking notice! We are honored to be recognized as an overall leader in the KuppingerCole Leadership Compass Providers of Verified Identity 2022. The Leadership Compass Providers of Verified Identity 2022 analyzes digital identity solutions that perform identity proofing and verification or … Continued The post 1Kosmos Re

It’s been a very exciting summer for the team here at 1Kosmos. Analysts are taking notice! We are honored to be recognized as an overall leader in the KuppingerCole Leadership Compass Providers of Verified Identity 2022.

The Leadership Compass Providers of Verified Identity 2022 analyzes digital identity solutions that perform identity proofing and verification or enable a verified digital identity to be imported and easily reverified. The Leadership Compass compares full-service verified identity providers — vendors that conduct digital identity verification alongside providing several of the identity lifecycle stages, as defined by KuppingerCole.

This could include:

Vetting/Proofing
Registration
Authentication
Additional services: fraud reduction, electronic signing, attribute verification, orchestration

1Kosmos was recognized as a leader in three of the four categories:

An Overall Leader
A Product Leader
An Innovation Leader

We would be honored just to be mentioned, but Leadership recognition is tremendous, especially as a technology leader. It validates our push to provide a verified, portable identity and a passwordless experience to all, and it rewards the hard work and dedication of the 1Kosmos team. We created a platform to identify users at the other end of the digital connection while taking user privacy and data security into consideration. Organizations around the globe have recognized it as ideal for passwordless authentication because it drastically simplifies the user experience, has a Kubernetes underpinning, and integrates easily via SDK / API, and it does all of this without introducing new vulnerabilities or requiring additional network tailoring or IT / security talent.

The 1Kosmos BlockID platform uses a private blockchain that enables organizations to create irrefutable decentralized digital identities and unifies identity proofing with passwordless authentication. The Gartner Hype Cycle for Digital Identity, 2022 provides evidence that interest in decentralized identity is increasing and will enable new digital business opportunities.

1Kosmos BlockID is built with specific capabilities for the onboarding, verification and authentication of employees, contractors, customers and citizens. 1Kosmos digitally transforms the standard onboarding process, by automating and delivering the highest degree of end-user assurance. This transformation eliminates the need for new users to share copies of government IDs, protecting their privacy.

By binding users to their proofed identity, 1Kosmos BlockID creates an identity-based biometric authentication and a passwordless experience. Users will utilize their trusted mobile device for daily authentication and step-up authentication for physical or logical access. As a result, each access event is associated with a real, verified identity.

To download the report click here

The post 1Kosmos Recognized as a Leader in KuppingerCole Providers of Verified Identity 2022 appeared first on 1Kosmos.


Shyft Network

Veriscope Regulatory Recap (September 12th — September 18th)

Veriscope Regulatory Recap (September 12th — September 18th)

Welcome to another exciting edition of Veriscope Regulatory Recap! In this week’s edition, we look at the recent crypto regulation guiding principles published by Abu Dhabi FSRA to the White House’s first-ever digital assets regulatory framework. So, without further ado, let’s dive straight into it.

Abu Dhabi FSRA Publishes Crypto Regulation Guiding Principles (Image Source)

In new guidelines that define six key principles on which its approach is built, the Financial Services Regulatory Authority (FSRA) of the Abu Dhabi Global Market (ADGM) free economic zone set out its regulatory approach to virtual asset activities.

The agency highlighted four “risk drivers” that need heightened focus, one of which relates to the obligations of virtual asset custodians, even when they contract out work to external parties.

There will also be an emphasis on technology and governance controls that businesses use. The FSRA also plans to examine whether companies adhere to legal obligations when exchanging virtual assets, particularly to ensure that laws intended to safeguard virtual asset investors, such as disclosure requirements, are being followed.

As such, the new guide reaffirms the FSRA’s dedication to providing a strong, transparent, and consistent regulatory framework for companies engaging in virtual asset activities, prosecuting regulatory violations, and upholding “high standards” throughout the authorization process.

Read more here:

https://www.adgm.com/documents/legal-framework/guidance-and-policy/fsra/fsra-guiding-principles-for-virtual-assets-regulation-and-supervision-ia-202209012.pdf

SEC’s Gary Gensler Says Proof-of-Stake Crypto May be Securities (Image Source)

Recently, Gary Gensler, the current SEC supremo, insisted that any cryptocurrencies that follow the Proof-of-Stake consensus mechanism are likely to be securities.

In his own words, PoS blockchains “might pass a key test used by courts to determine whether an asset is a security.”

As for how a crypto asset that is a security is distinguished from one that is not, courts apply the Howey test, which asks whether an investor expects to profit from owning the asset.

This statement from Gensler is pretty significant, considering that it comes on the heels of the successful Ethereum Merge rollout, which marks the chain’s move from the Proof of Work (PoW) consensus mechanism to Proof of Stake (PoS).

Interestingly, there were rumors that Gensler wanted to backtrack from the previous SEC stance that Ethereum is a non-security, but he was held back by a lack of sound reasoning to pull this off. That may change now with Ethereum’s move to Proof of Stake, as it may very well be the ammo that Gensler needed.

Read More Here:

https://www.coindesk.com/policy/2022/09/15/secs-gensler-signals-extra-scrutiny-for-proof-of-stake-cryptocurrencies-report/

US DOJ Launches Digital Asset Coordinator Network (Image Source)

The US Department of Justice (DOJ) launched the National Digital Asset Coordinator Network, with over 150 federal prosecutors joining the team. According to the authority, the new network will advance its efforts to counter the growing threat that the misuse of digital assets by malicious actors poses to the American public.

According to the DOJ, the DAC Network will be the main venue for prosecutors to receive and disseminate specialized training, technical know-how, and advice on investigating and prosecuting digital asset offenses.

Assistant Attorney General Kenneth A. Polite Jr. of the DOJ’s Criminal Division said, “Through the creation of the DAC Network, the Criminal Division and the National Cryptocurrency Enforcement Team will continue to ensure that the department and its prosecutors are best positioned to combat the ever-evolving criminal uses of digital asset technology.”

Read More Here:

https://www.justice.gov/opa/pr/justice-department-announces-report-digital-assets-and-launches-nationwide-network

White House Publishes First-Ever Crypto Regulatory Framework

The White House published the “First-Ever Comprehensive Framework for Responsible Development of Digital Assets” to enhance regulatory control of the digital asset market.

The crypto framework is pretty detailed and consists of segments on investor protection, safe access to cost-effective financial services, strengthening the financial ecosystem, promotion of responsible innovation, ways to maintain the country’s financial competitiveness and leadership, combating illicit finance, and pros and cons of CBDCs.

It also lays the groundwork for regulators like the SEC and CFTC to coordinate among themselves to ensure strict regulatory compliance in the crypto space and build an information-sharing mechanism to keep tabs on consumer complaints.

The White House report further shines a light on “a potential US CBDC,” directing the US Treasury to consider the “potential implications of a US CBDC, leverage cross-government technical expertise, and share information with partners.”

Read More Here:

https://www.whitehouse.gov/briefing-room/statements-releases/2022/09/16/fact-sheet-white-house-releases-first-ever-comprehensive-framework-for-responsible-development-of-digital-assets/

Important Announcement: 10,000 SHFT on Offer! (Image Source)

After Shyft DAO approved the Veriscope VASP grant proposal, an aggregate of 10,000 SHFT was granted for Virtual Asset Service Providers (VASPs) that integrate with the Veriscope mainnet by September 30, 2022. The fund will enable VASPs to pay Shyft Network gas fees while using Veriscope. The offer remains valid until December 31, 2022, or until a VASP exhausts its SHFT grant.

Interesting Reads:

#1. SEC Head Honcho Gary Gensler Doubles Down on Demand for Strong Crypto Regulations and Investor Protection

#2. Transcript: Senator Pat Toomey on the Bad State of Crypto Regulation

#3. Industry reps suggest improvements to the Stabenow-Boozman crypto regulation bill

#4. Report: SEC Accounting Guidance Creates Hurdle for Banks’ Crypto Plans

______________________

VASPs need a Travel Rule solution to begin complying with the FATF Travel Rule. Have you zeroed in on one yet? We have the best solution to suggest: Veriscope, the only frictionless Crypto Travel Rule compliance solution.

Visit our website to read more: https://www.veriscope.network/ and contact our BizDev team for a discussion: https://www.veriscope.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations.

Veriscope Regulatory Recap (September 12th — September 18th) was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


SC Media - Identity and Access

Digital trust important but not prioritized, report shows

TechRepublic reports that while 98% and 63% of professionals around the world considered digital trust important and integral to their positions, respectively, only 12% noted their organizations have a dedicated digital trust staff role.



FindBiometrics

Northern Europe’s NOBID Digital ID Project Plans to Use Biometric Onboarding


The Nordic-Baltic eID Project (NOBID) will likely use biometric onboarding, a technology partner has confirmed. The project, announced last week, represents a six-nation effort to develop a digital identity wallet […]

The post Northern Europe’s NOBID Digital ID Project Plans to Use Biometric Onboarding appeared first on FindBiometrics.


SelfKey

SelfKey Exchange Marketplace


The SelfKey exchange marketplace is where you can compare different exchange accounts and sign up instantly.

The post SelfKey Exchange Marketplace appeared first on SelfKey.


KuppingerCole

Ransomware: The Invisible Enemy of Organizations


by Marina Iantorno

The cost of cyber-attacks to the global economy has risen to more than 400 billion US dollars per year. Cyber-attacks affect every business, from small companies to enterprises and governments. Recovering from attacks is expensive and time-consuming. So, how can an organization become more resilient to cyber-attacks in today’s digital world?

Digital transformation has changed business and delivered new opportunities to organizations. However, going digital has also increased the risk of becoming the victim of a cyber-crime. Fighting cyber-attacks is an unfair battle because companies face a silent enemy: in most successful attacks, organizations only learn about the attack when it is already too late. This is the case with ransomware, an attack that encrypts files, and frequently also steals an organization's data, until a ransom payment is made. Nitish Deshpande describes the evolution of ransomware in the article "How to Protect Yourself From Ransomware in 2022".

Attacks have no borders 

In recent years we have seen many different ransomware attacks across the globe, affecting every industry and organizations of all sizes. Several of these attacks caused major damage. Here are some prominent examples from the past few years:

In 2019 the city of Baltimore was the victim of a ransomware attack called “RobinHood”. The attack encrypted files and shut down all servers except essential services. The attackers demanded a ransom of 100 thousand US dollars to restore access to the city and threatened to permanently delete the data. The consequences reached enormous proportions: real estate transactions could not be completed, card payment and debt checking were inaccessible, and city employees lost access to their email accounts. Recovery took the city several months and cost approximately 18 million US dollars.

At the beginning of 2021 the Colonial Pipeline ransomware attack put the US in check. Malware infiltrated the system and impacted the computerized equipment managing the pipeline. President Biden declared a state of emergency in Virginia, since 71% of the filling stations in the area were out of fuel for several days. This became known as the largest cyber-attack on an oil infrastructure target in history. Overseen by the FBI, the company paid the 4.4 million US dollars the attackers demanded to resolve the problem.

In May 2021, the Health Service Executive (HSE) of Ireland suffered a major ransomware attack that affected the entire country and caused IT systems to be shut down nationwide. The attackers demanded 20 million US dollars and threatened to publish patients’ information online. The Irish government did not pay the ransom and, unfortunately, the data of 520 patients was published. Ireland spent more than 100 million US dollars to recover the HSE system after the attack.

In May 2022, the Ministry of Finance of Costa Rica was hit by a ransomware attack that affected customs and HR systems, among others. The consequences hit the national economy, especially because customs stopped processing import and export taxes and the salaries of public employees were suspended for a month. The government declined to negotiate with the attackers, but it ended up paying high expenses anyway: 9 million US dollars were taken from reserves to pay the salaries and pending payments that were on hold due to the attack. The country remains under a national emergency alert.

Beneath the tip of the iceberg

Unfortunately, it is not possible to prevent attack attempts, and there has been a considerable increase in ransomware attacks since the COVID-19 outbreak: Ransomware During the Pandemic Crisis. Companies need to prepare themselves as best they can to survive that moment. The obvious detriments relate to data compliance, technical issues, and large investments in the cybersecurity system. However, this is just the tip of the iceberg.

Damage to a brand name is difficult to reverse. The brand is one of the most valuable assets in business, and being the target of a cyber-crime devalues the name and reputation of the company. This is a major issue, since several costs arise from it: lost reputation and broken customer relationships, devaluation of the organization’s trade name, and operational disruptions.

Intellectual property loss is another major issue. It damages the name of the organization as well as slowing business growth. If third parties access trade secrets or publish documents that are under copyright, it is hard to recover customers’ trust. Recovery requires effort from different teams, which means time spent sorting out the problem, and thus lost money.

The only thing a company needs to become a target is to be online. As such, the key is to stop thinking in terms of “if a cyber-crime occurs” and change the mindset to “when the attack comes”. Becoming more resilient and vigilant is part of a good cybersecurity strategy. Mike Small, a Senior Analyst at KuppingerCole, explains this very well and advises on steps to follow in the worst-case scenario: When will Ransomware Strike? Plan for the Worst.

It is vital to realize that while cybersecurity professionals work on refining prevention tools, hackers are developing their tactics, techniques, and procedures in parallel to bypass the controls and succeed in their next attack. The priority, then, is understanding the potential risks and focusing on programs that minimize the impact and help the organization weather a cyber-crisis.

To learn more about different ways of protecting your organization against a ransomware attack you can watch the following video: Analyst Chat #3: Protecting Your Organization Against Ransomware.

Learn about the business impact of cybercrime at CSLS 2022 

At the Cybersecurity Leadership Summit 2022 there will be many sessions dealing with the business impacts of cybercrime. Join us if you want to learn how CISOs can promote business agility while minimizing risks, how cybersecurity can become a business enabler, how Denmark provided eIDs to citizens and businesses, or how to realize the business benefits of security automation.


PingTalk

Authorized Push Payment and Social Engineering: How to Fight Back | Ping Identity


When fraud occurs as a result of scams and social engineering, organizations can struggle to stop it. This is because when legitimate customers fall prey to online imposter scams—for instance, in the case of authorized push payment fraud (APP)—the impact from losses can snowball, affecting not just the customer, but the organization at which the fraud took place.

 

In fact, according to the FTC, American consumers reported losing over $2.3B to imposter scams in 2021. Meanwhile, across the pond, UK Finance reported that losses due to authorized push payment fraud rose by 71% in the first half of 2021 in the UK, the same report stating that the amount of money stolen through this type of scam even overtook card fraud losses.

 

Ultimately, financial institutions need to find ways to effectively combat these massive losses from APP fraud before they’re left footing the bill.

Sunday, 18. September 2022

KuppingerCole

Analyst Chat #141: What Defines Modern Cybersecurity Leadership


How do you implement modern cybersecurity leadership between compliance, threat protection, privacy, and business enablement? To answer this question, Matthias invited the CEO of KuppingerCole Analysts, Berthold Kerl, who was and is active in various roles as a leader in cybersecurity. Together they explore questions such as how important knowledge of basic cybersecurity technologies is and what the necessary management tasks are in an organization.

 



Saturday, 17. September 2022

SelfKey

“The Merge”


The “merge,” also known as Ethereum’s long-awaited transition to proof of stake, finally took place.

The post “The Merge” appeared first on SelfKey.

Friday, 16. September 2022

Forgerock Blog

The Implications of the Uber Breach

How to protect your organization from a social engineering attack

Cyberhacks are commonplace in today's world, and they can happen to any company. Today it's Uber, last week it was U-Haul and the week before it was Samsung. At the root of many of these attacks is a malicious actor masquerading as a corporate IT manager or other technical role. Using this disguise, the perpetrator knows that all they have to do is convince one employee or contractor to share their credentials to gain a foothold into the targeted company's internal network. This tactic is called social engineering and is one of the key methods used in attacks that result in data breaches. These types of "unauthorized access" attacks account for 50% of all data breaches and can cost companies as much as $9.5M per incident to remediate.

It's a frustrating reality for CIOs because these types of breaches are preventable as long as their companies implement the right technologies, policies, and procedures and educate their employees on how to be extra vigilant in the digital world.

In my experience, one of the most difficult aspects of protecting an organization is making sure all team members understand the importance of building security into their everyday roles. Employees need to adopt the mindset of "security first." Simple things such as not clicking on unknown attachments or deleting a suspicious request for a passcode are easy ways to stay out of harm's way.

CIOs must come to grips with the fact that we're all human and that shortcuts don't have a place when developing strong cybersecurity habits. Even the smallest and seemingly insignificant technical toe-stub can open the door to a potential bad actor. Our job is to look at the problem holistically. No amount of investment in the latest technology can provide 100% protection because technology alone is not enough. It has to be complemented with well-designed, enforceable policies, proven procedures, and strong system hygiene coupled with continuous education and awareness.

Let's take a look at some practical steps you can take to protect your organization now:

Embrace zero trust.
Build strong policies and procedures.
Continuously educate your workforce.

Zero Trust Goes Beyond Products

Zero trust is built on the principle that no person or device inside or outside of an organization's network should be granted access to connect to systems until authenticated and continuously verified. A CIO's top priority is to incorporate zero trust into their approach to cyber security.

User access must be calculated constantly from as many different data points as possible. Companies need to leverage artificial intelligence (AI) to combat account takeovers and tackle fraud at the front door and subsequently throughout their networks.

As an example, the ForgeRock Identity and Access Management (IAM) platform continuously assesses risk as part of its access management capability to provide a zero trust environment. However, zero trust needs to be applied to everything we do online. This means deploying the best cybersecurity technology that implements a zero trust paradigm; developing and implementing policies and procedures that reinforce zero trust and redundancy; and educating users and systems administrators to follow procedures that mitigate risk.
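The continuous, multi-signal risk evaluation described here can be sketched in a few lines. This is a hedged illustration, not ForgeRock's actual scoring model: the signal names, weights, and thresholds below are all hypothetical.

```python
from dataclasses import dataclass


@dataclass
class AccessSignal:
    """One observation about a session; names and weights are illustrative."""
    name: str
    risk: float  # 0.0 (benign) .. 1.0 (highly suspicious)


def session_risk(signals: list[AccessSignal]) -> float:
    """Combine signals into one score: the probability that at least one
    signal indicates compromise, assuming the signals are independent."""
    safe = 1.0
    for s in signals:
        safe *= 1.0 - s.risk
    return 1.0 - safe


def decide(signals: list[AccessSignal],
           step_up_at: float = 0.3, deny_at: float = 0.8) -> str:
    """Map the combined score to an access decision."""
    score = session_risk(signals)
    if score >= deny_at:
        return "deny"
    if score >= step_up_at:
        return "step-up"  # e.g. prompt for a second factor
    return "allow"


signals = [
    AccessSignal("new_device", 0.2),
    AccessSignal("impossible_travel", 0.0),
    AccessSignal("unusual_hour", 0.15),
]
print(decide(signals))  # combined risk ≈ 0.32, so the user is stepped up
```

The point of the sketch is the decision logic, not the arithmetic: each new data point nudges the score, and the journey adapts (allow, step up, or deny) on every evaluation rather than once at login.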

Build Strong Policies and Procedures

Most successful attacks are the result of routine lapses and failing to know what endpoints are connecting to your network. To prevent this, it is important to practice good cyber hygiene by patching operating systems and applications, backing up data, updating and whitelisting applications, limiting privileges, and using multi-factor authentication (MFA).

CIOs need to understand where their technology assets are, what software is in use company-wide, and identify unsanctioned software which, more often than not, has not been properly configured, patched, updated, or secured, creating an attractive entry point for attackers.

Internal hygiene is critical, and that includes ensuring that system-level usernames and passwords are not hard-coded. Where hard-coding is unavoidable, make sure those system passwords are rotated regularly. Performing regular, routine maintenance, automated patching, and isolating personal devices is all part of ensuring your organization is protected.
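A first pass at catching hard-coded credentials can be automated with a simple source scan. This is only a sketch of the idea; the patterns are illustrative, and real secret scanners use entropy analysis and far larger rule sets.

```python
import re

# Naive patterns for embedded credentials; purely illustrative.
SECRET_PATTERNS = [
    re.compile(r'(?i)\b(password|passwd|pwd)\s*=\s*["\'][^"\']+["\']'),
    re.compile(r'(?i)\b(api[_-]?key|secret)\s*=\s*["\'][^"\']+["\']'),
]


def find_hardcoded_secrets(source: str) -> list[tuple[int, str]]:
    """Return (line_number, line) pairs that look like embedded credentials."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if any(p.search(line) for p in SECRET_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits


sample = 'db_user = "app"\npassword = "hunter2"\napi_key = "sk-123"\n'
for lineno, line in find_hardcoded_secrets(sample):
    print(f"line {lineno}: {line}")
```

Run against the sample, the scan flags lines 2 and 3. A check like this can run in CI so hard-coded secrets are caught before they ever reach production.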

Continuously Educate Your Workforce

Outside of technology, there is the element of human error and risk. It is critical to any business that employees are regularly educated and tested to ensure they have a strong understanding of cyber risk and the part they play in minimizing it.

One important and often overlooked element is social engineering education. For example, how much personal information are your employees actually sharing online via social media such as LinkedIn or FaceBook? This is an increasingly strong attack vector that bad actors use to understand the structure of a company. I've seen attackers target new, junior employees with a high pressure, time-sensitive request seemingly from a senior member of staff to unlock access to a company's network.

Finding new ways to keep your teams educated requires some serious thought. Training should not be a "one size fits all" approach. It needs to be tailored to teams and roles. I am sure that sometimes I drive my team crazy, but I have been known, on occasion, to contact my help desk team in an attempt to impersonate various employee roles to ensure they stay on their toes. I think it's an important aspect of understanding the maturity of a team with elevated access to many systems.

Phishing scams are another common method used to gain unauthorized access. Employees should be trained to identify malicious emails, URLs, and attachments that often are sent by attackers to their corporate or personal inboxes, or any other avenue, such as SMS.

Lastly, training has to be non-negotiable. Companies need to follow up to ensure that employees have participated in relevant training and fully understood its implications. Furthermore, employers need to make it clear there are ramifications should individuals intentionally circumvent security policies and procedures.

The correct balance for preventing any cyber attack spans people, process and technology. If you get that trio right, your organization will thrive.

Learn more by checking out these resources:

Blog: How Your Organization Can Prevent Account Takeover
White Paper: Cloud Without Compromise: IAM for the Hybrid Enterprise

Find out more about social engineering threats here: https://www.cmu.edu/iso/aware/dont-take-the-bait/social-engineering.html


Finicity

Five Key Trends to Consider When Integrating Digital Payments Into Fintech Innovations


Consumers are embracing digital payments and turning to fintech for everyday finance needs. According to Mastercard’s 2022 Global New Payments Index, emerging payment methods like account-to-account payments, digital wallets and Buy Now, Pay Later are all on the rise. Eighty-five percent of consumers have used a digital payment method within the last year. And 93% are likely to use a digital payment method in the coming year.

Many of these emerging digital payments are powered by open banking and are a natural progression of the shifting landscape of payments. Through our latest research, we wanted to share five key trends to consider when integrating digital payments into your fintech innovation.

1. Consumers want convenience when paying bills

Consumers across the globe are relying on digital channels for paying bills because it is more convenient and makes it easier to manage finances.

Subscriptions, bills, utilities, loan repayment and retail payments are more convenient with open banking-powered apps and services. Eighty-one percent of consumers have already heard of account-to-account payments, but they may not know that open banking added speed and convenience to A2A, or that A2A payments can now be made at the point of sale, without typing in card details or writing checks.

2. Consumers seek flexibility in making payments

The majority of global consumers want the flexibility and control to optimize digital payments. Similar to the motivations around bill pay, consumers are connecting their accounts to automate repayment for BNPL and installment loans. Fifty-eight percent of consumers are open to connecting their bank account to other financial services to enable automatic repayments, and 52% percent say that they use digital repayment tools because they help to prevent missed or late payments.

 

3. Security is top of mind

Consumers recognize the convenience that digital payments offer, but security remains a top concern, highlighting an opportunity for providers to build trust. Building comfort with emerging digital payments is key to supporting future adoption, since comfort and adoption grow together. Faster transactions, convenience and transparency are the top reasons that help consumers overcome security concerns.

4. Consumers rely on fintech to manage finances

Consumers are relying on fintech, and indirectly open banking, to accomplish everyday financial tasks. Eighty-three percent of consumers have used digital tools for at least one financial task, and over half use technology to accomplish five or more tasks. The majority see making a payment as the most beneficial use case.

5. Emerging payments are strongest among Gen Z and Millennials

Younger generations have gone more digital in their purchasing and payments behavior globally, and it's anticipated that their use will continue to increase. These generations are less likely to make in-person purchases and payments: 50% of Gen Z compared to 78% of Boomers. They are also less likely to use cash for purchases. While security remains a concern for them, it is less heightened than for older audiences.

Building the Future of Payment Choice at Mastercard

At Mastercard, we have always powered experiences that enable customer choice. Our solutions are built to meet consumers’ financial needs and designed with security at the center.

In recent years, we have further differentiated Mastercard in the market by diversifying beyond the card. We've built a complementary open banking platform that enables ACH and account-based payments with best-in-class capabilities across infrastructure, applications and services. It empowers people to pay and get paid using a card, bank account, cryptocurrency, or even cash; on any device or no device; in real time or later, giving people true flexibility and control.

 

Certain open banking solutions are provided by Finicity Corporation, a Mastercard company.

The post Five Key Trends to Consider When Integrating Digital Payments Into Fintech Innovations appeared first on Finicity.


1Kosmos BlockID

Gartner IAM Thoughts and Observations


1Kosmos had the pleasure of attending the Gartner IAM Summit 2022 in Las Vegas. It was our first time exhibiting and sponsoring the event. I have to say it was great to see long-time friends, former colleagues, customers, partners and prospects. It was a great feeling to meet people face to face again. As we move beyond the COVID shutdown, it makes me appreciate the things I had stopped appreciating. For me, the show was very busy. It felt as though everyone came out just to meet the folks they used to be in touch with!

I attended as many sessions as I could balancing my time between meeting customers at the booth and attending the highly informative sessions.

In this blog, I wanted to share some observations and highlights of the event because I know not everyone had an opportunity to attend.

1. Passwordless is a Day 0 imperative

My first observation was that passwordless authentication was a frequent (and top of mind) topic. Gartner VP Analyst Ant Allen has been talking about the end of passwords and the need to eliminate them for years! Based on the presentations and the focus of the exhibitors, this is a hot topic. I was happy to see that the discussion has moved past just passwordless; in fact it is simply "expected" now. Gartner took it much further and described the journey to MFA and risk-based authentication (RBA). And then they dropped the big one: Continuous Adaptive Trust! Learn from the environment, learn from the user's previous access, and form an opinion about the inherent risk of each new access. I believe this is where the world of user authentication is headed and, selfishly, no one can provide Day 0 passwordless the way we can here at 1Kosmos. But, yes, it's good to see the market and analysts move in our direction. Also, passwordless is accelerating across workforce, customer, and citizen use cases.

In speaking with attendees, many expressed doubt about their ability to achieve the goal of going passwordless, or said they don't know where or how to start. Of course, I was more than happy to talk about our approach and show prospects how it's not only possible but, with our approach, very practical, and that it provides an excellent experience for users. I'll say it here: passwordless is just the beginning, and there is much more to come to remove friction from secure user access journeys!

2. Signal Orchestration with Predictive Analytics

Multi-factor authentication is an absolute must these days. That's nothing new. But what was new is the need to move past a risk-based authentication (RBA) model and into a continuous authentication model. The shift will ultimately mean that authentication journeys will not be preordained as they are today. The industry will move from a stepwise, imperative model to a desired-end-state, declarative one!

With the addition of signals, security teams can orchestrate the appropriate journey based on the user and their action. For example, security teams could base authentication on a NIST Identity Assurance Level (IAL). If the user does not meet the IAL requirement, say they are currently IAL1 but need to be IAL2, then the system would adaptively put them through a journey to achieve an assertion corresponding to NIST IAL2!
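That kind of adaptive step-up journey might look roughly like the following sketch. The step names are hypothetical, only loosely inspired by NIST SP 800-63A proofing requirements, and not taken from any vendor's API.

```python
def required_steps(current_ial: int, required_ial: int) -> list[str]:
    """Return the illustrative proofing steps needed to raise a user's
    identity assurance level from current_ial to required_ial."""
    # Each entry lists roughly what that level adds on top of the one below it.
    upgrades = {
        2: ["verify_government_id", "match_selfie_to_id"],
        3: ["in_person_or_supervised_remote_proofing"],
    }
    steps = []
    for level in range(current_ial + 1, required_ial + 1):
        steps.extend(upgrades.get(level, []))
    return steps


# An IAL1 user requesting an IAL2-protected resource gets routed
# through identity proofing before the assertion is issued.
print(required_steps(1, 2))  # ['verify_government_id', 'match_selfie_to_id']
```

This is the declarative idea in miniature: the journey is not hard-wired in advance; the orchestrator computes whatever steps close the gap between the user's current state and the required end state.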

This move to a predictive – based on AI/ML – approach to access via declarative journeys and orchestration will be the future of access management. The shift here is that currently security teams already know the signals they are supposed to be looking for. So security is based on what is known. But the world is moving on. The hackers have moved on. We need to move to a continuous model – Adaptive trust. This is another way of saying we need to try to predict when the system might be at a higher level of compromise or threat, and then go and introduce obstacles, sensible obstacles, contextual obstacles, into the authentication journey. That’s really what they were trying to drive towards.

3. Web3 and the Rise of Privacy!

Web3 was a surprising discussion point. What was clear is that it is in its infancy. Many are talking about specs, requirements, technology, and even implementations, but the strategic imperative is clear.

Web3 is promising a complete redesign of the web as we know it.
Decentralized compute, storage, and always-on application execution will be executed via distributed ledgers.
Everything will live on a blockchain and will have an address. All nodes work together to compute, verify, and record the state change of the tokens (the things that represent value), which will lead to a token economy.
An application called a wallet is the human interface to the blockchain technology stack. It holds the seed phrase, all your blockchain accounts/addresses, the corresponding public/private key pairs, and your tokens, and above all, it initiates state changes. Early talks are underway on a standard to allow wallets to work anywhere.
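The account-and-address machinery a wallet manages can be sketched in miniature. This is a toy: real wallets derive secp256k1 key pairs from a BIP-39 seed phrase, whereas here the "public key" is just a hash of the private key so the example stays standard-library only.

```python
import hashlib
import secrets


def new_wallet_account() -> dict:
    """Toy sketch of wallet account creation (NOT real cryptography)."""
    private_key = secrets.token_bytes(32)              # never leaves the wallet
    public_key = hashlib.sha256(private_key).digest()  # stand-in for an EC point
    # Ethereum-style address: "0x" plus 40 hex chars derived from the public key.
    address = "0x" + hashlib.sha256(public_key).hexdigest()[-40:]
    return {"private_key": private_key.hex(), "address": address}


acct = new_wallet_account()
print(acct["address"])  # a fresh 42-character address, e.g. "0x3f9a..."
```

The structure is the point: the address everyone sees is publicly derivable from the key material, while the private key that signs state changes never needs to be shared, which is what makes the wallet the natural place to anchor identity and consent.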

This is just the tip of the iceberg. Web3 promises a utopia where everyone online is a known entity. No more anonymity but the fundamental principle to uphold will remain user privacy. Users and creators alike will need, and regulations will mandate, a consent driven journey with sufficient recourse available to users for consent revocation. These drivers will, over time, fundamentally alter the way we engage online. Gartner is already beginning the analysis.

While this is just a high level view, there was more to the event than just this. But these were my key takeaways. As a PM I already have much of this in production, in development, or on a roadmap. As an organization and the leader of the strategic direction of our platform I believe we are headed in the right direction to help you meet the future demands of your workers, customers or citizens. 

The post Gartner IAM Thoughts and Observations appeared first on 1Kosmos.


FindBiometrics

‘We Will Execute You’ – Biometrics and the Russo-Ukrainian War: Identity News Digest


Welcome to FindBiometrics’ digest of identity industry news. Here’s what you need to know about the world of digital identity and biometrics today: Regulatory Developments The European Commission has published […]

The post ‘We Will Execute You’ – Biometrics and the Russo-Ukrainian War: Identity News Digest appeared first on FindBiometrics.


KYC Chain

Identity verification for Fintechs with KYC-Chain

Fintech companies have revolutionized the way people and businesses access and use financial services. However, as the possibilities for new and exciting ways to interact with financial systems have exploded, so too have the methods that criminals use to take advantage of them. As fintechs battle the rising threat of fraud and money laundering, efficient and automated KYC can be the key

This week in identity

E9 - Gartner Security & Risk Management London / Outcome Driven Metrics for Cyber & Identity / International Identity Day


In episode 9, Simon and David briefly discuss International Identity Day, promoted on Sept 16, which aims to include, protect and empower citizens globally in the pursuit of government-issued identities for all. Simon attended the Gartner SRM conference this week in London, where there was a left-shifting of identity into the app-sec and network-sec worlds, as well as a detailed discussion on outcome-driven metrics and making sure the business knows how its cyber and IAM investments are doing.


SelfKey

Web3 identity verification


Next-generation digital identity management might develop into a fully decentralized peer-to-peer networking system with the rise of Web3.

The post Web3 identity verification appeared first on SelfKey.


KuppingerCole

Raising User Awareness to Increase Cyber Resilience


by Raj Hegde

With the common use of heuristics and especially the advance of Artificial Intelligence (AI) in automated cyber threat mitigation, why do we still need to focus on user cybersecurity awareness? 

In cybersecurity, there has never been just one solution to reduce risk or mitigate threats; you always need multiple layers of security, and each layer adds its own protections. Software and hardware measures are excellent at catching mass and known threats, and with advances in AI they can also detect new threats based on known issues. But we can never neglect the human factor as the first, and also the most important, line of defense.

I often give the example of driving a car. You can make a car very safe to drive with airbags, ABS, and even some measures with automatic street lane detection and object detection. But even with all the advances in self-driving, at least for now, you still need a human who knows how to drive so that a person can react in new or unexpected situations. This is the same for technology and specifically cybersecurity; therefore users need to be aware that we need them to fight cyber threats and crime. If users know how important they are and how they can help, this can only be to our advantage. So give them the knowledge, train them, and users can be the first, the last, and often also the best line of defense. 

What are methods of cyber social engineering and how to identify them?  

Social engineering is an age-old tactic to trick people into doing something they do not want to do. So users are tricked by communication, emails, phone, SMS, or videos pretending to be from trusted parties such as banks, colleagues, executives, online payment processors, IT admins, or social websites. So all social engineering attacks have one or more common traits. The scammer will focus on your good nature and human emotions. The criminal will use a false title, a position of authority, for example, as someone you would normally trust, a bank manager, social services, a doctor, etc. and they will try to create an urgency, a time-sensitive situation which will put you under pressure and to act. They may request urgent information, money, or something similar for their criminal gain. So currently the most prevalent social engineering attacks we have seen are via email, voice, SMS, and social messaging.  

Social engineering attempts via email are called email phishing or business email compromise. Even with the constant barrage of emails everyone receives, they are normally not difficult to recognize. The first question you should ask yourself is: was the email expected? Check who the email is from. Is the sending domain suspicious, or are there obvious misspellings? Many companies also use technology to insert an external email banner warning to indicate that the message comes from outside of the organization. Always remain suspicious if an attachment asks you to enable macros, mostly in Office documents, and be very careful if the email tries to raise a sense of urgency; "You need to do this immediately" is always a warning sign.  
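The red flags described above translate naturally into a simple heuristic checker. This is purely illustrative: the phrase list and domain check are assumptions for the example, and production mail filters rely on ML models and sender-reputation data rather than keyword lists.

```python
# Illustrative urgency phrases; real filters use far richer signals.
URGENCY_PHRASES = ["immediately", "urgent", "within 24 hours", "account suspended"]


def phishing_warnings(sender: str, subject: str, body: str,
                      trusted_domains: set[str]) -> list[str]:
    """Return human-readable warnings for the red flags discussed above."""
    warnings = []
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain not in trusted_domains:
        warnings.append(f"external or unknown sender domain: {domain}")
    text = (subject + " " + body).lower()
    if any(phrase in text for phrase in URGENCY_PHRASES):
        warnings.append("urgency language detected")
    if "enable macros" in text:
        warnings.append("attachment asks to enable macros")
    return warnings


# A lookalike domain (examp1e.com), urgency wording, and a macro request
# together trip all three checks.
print(phishing_warnings(
    sender="it-support@examp1e.com",
    subject="Urgent: verify your password immediately",
    body="Open the attached doc and enable macros.",
    trusted_domains={"example.com"},
))
```

Even this crude version shows the pattern users should internalize: no single trait proves a phish, but an unexpected sender plus urgency plus an unusual request should always trigger a pause.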

This is very similar to phone phishing attempts, where criminals may try to lure victims into handing over valuable information: employee names, title roles, or anything like that. They will normally call via the switchboard or directly from an unfamiliar number, claiming to be a colleague, a well-known vendor, or a third party, and again create a sense of urgency with their request. Similarly, with SMS phishing, people may receive SMS messages asking for a piece of personal information or asking them to click on links. The SMS may look legitimate, appearing to come from a bank, a parcel delivery service, or something similar. Before clicking or replying, always check whether the SMS was expected and whether the link goes to a legitimate website. And of course, keep your mobile phone updated to avoid security exploits. 

How did the increased Working from Home change the cyber threat landscape in the last two years from a user perspective? 

Let's start with something outside of the cyber domain, but still related to safety, specifically health and safety. What we have learned in the last two years is that it is important to have an environment at home which enables a safe, healthy, and economical workspace. So this means having sufficient and good lighting, a comfortable and ergonomic chair and the right size desk is much better than sitting on the kitchen table or working from a sofa. But working from home reminded us also to be mindful of who and what is around us when conducting confidential work conversations. So do not discuss confidential information with your IoT devices like Alexa or Google Home nearby, in case they may be listening and sending it to their servers. So, collaboration with your team is very important. It is vital when you are working remotely, but make sure to only use company-approved methods for communication. Don't use any personal or social media apps to discuss work-related topics and of course, the usual advice, upgrade your home network and any personal computers, game consoles, internet-connected TVs, baby monitors, and other IoT devices, and never use a default password. So the security basics also apply at home - or even more at home. 

What can we expect from your talk at CSLS 2022? 

We have traditionally seen cybersecurity as a domain for technology, with the expectation that solutions for cyber resilience have to be provided by IT. And we happily accepted this challenge and delivered numerous software and hardware solutions, design and development principles, policies, and process controls. However, the most successful cyber attacks in recent times have started with targeting users with phishing emails or other kinds of social engineering, and therefore raising awareness of the users' role in increasing cyber resilience is at least as important as providing just a technical solution. So my presentation will be based on a very famous example, a real robbery in Berlin in the early 19th century, showing how user awareness can become an important line of defense in cybersecurity. 


IDnow

Has the UK’s light-touch approach to KYC nurtured an environment of world-leading fraud?

It’s time we had a serious discussion about the risk-based system… 

A fraud epidemic is sweeping the UK. 

Criminals bilked UK victims of a record £1.3 billion in 2021, according to new figures by top industry lobby group UK Finance. The numbers were so bad that publication of the report was delayed for banks to consider their response. 

Approximately 91% of identity fraud cases reported to UK authorities in 2021 were carried out online. Although anyone can fall victim to online fraud, tackling it is not a priority for law enforcement, according to the government. Instead, firms are being asked to do more to protect their customers and themselves or run the risk of heavy fines or even criminal sanctions for non-compliance. 

2020 saw a record-breaking £9.2 billion in penalties for KYC and customer identification failures, which triggered a demand for outsourced services and better technology, with automation of key onboarding processes deemed particularly vital.   

Digitization has revolutionized the finance landscape and left manual ID verification processes firmly in the past; customers no longer want to wait days or weeks for their accounts to be approved, and they want to be able to do all this through their phones. 

Speaking to IDnow on condition of anonymity, the Head of Operational Risk at a mid-tier UK investment bank said: “I believe, and I am not alone, there are significant problems, weaknesses with how external auditors check over all applications manually. It’s just too time-consuming, expensive and prone to errors. We are looking at how we can automate more and it’s the same issue whoever you ask up and down the street.” 

Lost in the wash 

The advanced payments infrastructure, almost zero policing of fraud-related crime, and use of the world's most widely used business language have made Britain the global incubator for scams.

Money launderers exploit the poor standard of identification checks, the weaknesses in manual processes and physical documentation requirements to slip through the net.  

Just two years ago, Commerzbank’s London branch was fined more than £37 million for a series of compliance failings, including outdated customer onboarding processes. The bank allowed a queue of 2,350 accounts awaiting background checks to build up, but some were able to trade freely, in clear contravention of the Financial Conduct Authority (FCA) rules.  

Despite the near-constant updates to anti-money laundering and counter-terrorist financing (AML/CTF) laws, knowing exactly who their clients are appears a problem for many UK financial services firms. 

The risk-based approach to Know Your Customer (KYC) processes favored by the UK, but eschewed in most other developed nations, is gatekeeper to an environment where fraud continues to run rampant.  

“In the Western world, it is almost impossible to find a major, global bank that has not been sanctioned for AML or other financial crime failings in recent years. Now that AML-related concerns and failings have resulted in many large banks being sanctioned, regulators are beginning to pay increased attention to other areas of financial services.”

Malin Nilsson, Managing Director, Financial Services Compliance and Regulation at Kroll

For example, regulators have started to implement guidance for digital identity verification, such as in the EU’s 5th Anti-Money Laundering Directive released in 2020. It states that an obliged entity must identify the customer either through traditional documentary evidence or through information obtained from a reliable and independent source, including electronic identification means. Those electronic identification methods must comply with Regulation (EU) No. 910/2014, which sets out criteria for identity verification services. 

Outdated processes 

Before assessing the risks associated with a customer in relation to money laundering or financing of terrorism, financial institutions must first verify the individual’s identity. This is done via the KYC process, which involves scrutiny of government-issued identification documents, such as a passport or driver’s licence, or other means, such as utility bills or bank statements to ensure the customer or client is who they say they are, and to prevent illicit activities. 
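The document-scrutiny step of such a KYC flow might be sketched as follows. The field names and checks are illustrative assumptions; real systems add liveness detection, sanctions screening, and database lookups on top of basic document validation.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative set of document types a hypothetical flow accepts.
ACCEPTED_TYPES = {"passport", "driving_licence"}


@dataclass
class IdDocument:
    doc_type: str          # e.g. "passport"
    holder_name: str
    expiry: date
    mrz_checksum_ok: bool  # result of a machine-readable-zone check


def verify_identity(applicant_name: str, doc: IdDocument) -> list[str]:
    """Return a list of failure reasons; an empty list means the
    document passed these basic checks."""
    failures = []
    if doc.doc_type not in ACCEPTED_TYPES:
        failures.append("unsupported document type")
    if doc.expiry < date.today():
        failures.append("document expired")
    if not doc.mrz_checksum_ok:
        failures.append("MRZ checksum failed (possible tampering)")
    if doc.holder_name.strip().lower() != applicant_name.strip().lower():
        failures.append("name mismatch")
    return failures


print(verify_identity(
    "Ada Lovelace",
    IdDocument("passport", "Ada Lovelace", date(2030, 1, 1), True),
))  # [] — all basic checks pass
```

Automating exactly these mechanical checks is what lets digital onboarding replace the days-long manual review the article describes, while leaving genuinely risky cases for human analysts.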

What that process looks like differs around the world (which creates complications of its own), but there are a set of general guidelines overseen by a global group of regulators from various jurisdictions sitting as the Financial Action Task Force (FATF).  

The UK’s interpretation of these rules requires firms to take a risk-based approach, which means assessing the specific threats the business is exposed to, such as banking vulnerable or politically exposed individuals.  

The current rules and regulations were originally created to address the risks and controls associated with retail banking, which has unintentionally created regulatory barriers and AML complications for non-banks like PayPal, alongside other fintechs, and sectors like online gambling and crypto trading companies. A “low risk” for a bank may be a “high risk” for another sector. 

As legislation around anti-money laundering and risk-based-approaches was largely written in the pre-digital era, access to the data that helps firms calculate risk was limited. Ironically, there is now almost too much data to handle, with many organizations struggling to take advantage of the insights and technology available to them. 

A Thomson Reuters investigation into KYC challenges impacting financial institutions and their corporate clients found rising onboarding costs, lengthy onboarding times and sub-par ongoing maintenance of client records.  

Costs are spiralling, with resourcing and the need to hire more staff a major concern. It still takes too long to onboard customers, and the time taken to refresh client data is torturous.  

Multinationals face further complications with the UK’s risk-based approach, which allows firms more time to make decisions on threats but has no international standard to follow. Because the regulations can be interpreted differently, banks often make inconsistent requests of their clients, creating operational inefficiencies of their own.  

One of the most common issues that IDnow hears from financial leaders responsible for supervising KYC and AML compliance within their organizations is the disconnect between their work and the business functions of the institution.  

Making sense of the exponential amount of data and documentation required for KYC and AML checking can be tough and, where processes are complicated and inefficient, can stall onboarding altogether. 

Watchdogs like the FCA say cryptocurrencies and crypto assets are only magnifying the fraud problem, despite the growing sector working hard to shake off its reputation as a haven for bad actors. 

New school thinking 

Ironically, the answer to avoiding ever-tougher controls, circumventing inflexible processes, and improving both standards and efficiencies in the KYC process may come from an unlikely source: the world of crypto.  

Virtual asset service providers (VASPs) must keep detailed records of beneficiaries, complete enhanced due diligence on politically exposed persons (PEPs), and appoint an individual to oversee compliance and regulatory issues in the wider financial space. 

KYC is one of the biggest roadblocks that cryptocurrency firms face, especially given that digital tokens are designed to protect or shield the identity of the owner.  

Cryptocurrency firms have an opportunity to give regulators and customers confidence, outshine competition and demonstrate credibility by implementing industry-leading compliance practices.

Ben Luddington, Director at PwC UK. 

Unburdened by years of legacy system upgrades, bolt-ons and software changes, alternative finance companies have automated large swathes of the KYC process and built systems that can process a much wider source of data. This has created faster, safer and more robust paths to verification than in place at traditional banks and financial institutions.  

Configurability is essential to addressing the challenges faced by firms operating in multiple jurisdictions, but rigid technology stacks at many financial organizations jam up KYC compliance workflows. Indeed, every bank, insurance, accounting, and law firm will likely have dusty rooms full of photocopied identity documents clogging up its filing archives. 

Challenger finance apps and crypto exchanges have also advanced the integration of Multi-Factor Authentication systems based on SCA (Strong Customer Authentication), Liveness checks, and other forms of biometric-based authentication. These types of identity verification processes, which sometimes require little more than a selfie and a PDF of a household bill, dramatically improve the customer’s onboarding experience. 
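One widely deployed second factor behind SCA is the time-based one-time password (TOTP) defined in RFC 6238, which is simple enough to sketch with nothing beyond the standard library:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, timestamp=None, digits: int = 6, step: int = 30) -> str:
    """Minimal RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time step."""
    if timestamp is None:
        timestamp = int(time.time())
    counter = struct.pack(">Q", timestamp // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Against the RFC 6238 test vector (ASCII secret "12345678901234567890", timestamp 59), this produces the expected 6-digit code 287082.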

How intelligent technology enhances the KYC process 

Document verification determines if something is genuine or fake through a series of checks, but the quality of the process varies wildly. Field-to-field consistency, data validations and font anomaly detection, among others, are popular, but where smaller, nimbler services beat the larger incumbents is by combining the identity verification document check with biometrics.  
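A minimal sketch of that combination, with invented check names and an assumed face-match threshold, might look like this:

```python
# Illustrative sketch of combining a document check with a biometric face
# match. Check names and the threshold are invented; real providers run far
# more signals (MRZ checksums, hologram detection, font anomalies, ...).

def verify_identity(document_checks: dict, face_match_score: float,
                    threshold: float = 0.85) -> bool:
    """Pass only if every document check succeeds AND the selfie matches the
    document photo above the confidence threshold."""
    return all(document_checks.values()) and face_match_score >= threshold

# Usage: a consistent document with a weak selfie match is still rejected.
checks = {"field_consistency": True, "data_validation": True, "font_anomaly": True}
assert verify_identity(checks, face_match_score=0.92) is True
assert verify_identity(checks, face_match_score=0.60) is False
```

Requiring both signals to pass is what lets the biometric layer catch a genuine document presented by the wrong person.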

These platforms have also made good use of electronic signatures to communicate with their customers, further allowing them to promote additional financial products and services. 

The smartest solutions have AI-based face and emotion recognition capabilities, or the ability to scan the internet for negative sentiment, making KYC procedures more cost-effective, accurate and customer-friendly. 

All firms must turn to technology to bolster KYC frameworks and accelerate the detection and mitigation of financial crime, Luddington said. This may be a build vs buy decision. 

“Deciding which processes will benefit from what technology requires considerable and careful thinking, but this will bear fruit in the long run,” he said. “We’re already seeing the FCA using analytics to innovate its regulatory approach and recognizing the importance of technology in driving financial crime compliance.” 

Risk management has become more challenging over time as regulations tighten, and financial institutions have faced larger fines where compliance programs have failed. 

Banks have long struggled to balance fraud prevention and customer service, and existing paper-based and risk-based processes are simply ill-suited to the demands of modern finance. Advanced, intelligent identity-proofing developments are proving critical. 

“It’s more important than ever for financial services firms, particularly those in the UK, to understand the benefits of automation and apply them to the KYC process in order to truly manage fraud.

Regulators should do more to create robust identity verification guidelines and set expectations that firms leverage better technology in order to stop the unwanted headlines about bank fraud.” 

Rayissa Armata, Head of Regulatory Affairs at IDnow

For more information about how automated KYC solutions can enable organizations to offer a safer, more secure, and more efficient verification process, check out our ‘What is KYC’ page. 

By

Jody Houton
Content Manager at IDnow
Connect with Jody on LinkedIn

Thursday, 15. September 2022

KuppingerCole

Zero Trust Is Driving the Evolution of Authorization


Verifying what specific applications, files, and data that a human or non-human entity has access to, is at the heart of cybersecurity in the face of increasing theft of data for espionage or other criminal purposes. Authorization, therefore, is extremely important to security, but it is also key to boosting brand trust and improving user experience. Join security experts from KuppingerCole Analysts and Ping Identity as they discuss the challenges of authorization in modern IT environments, market trends and changes, and how a modern approach to authorization can address the challenges, including how to cater for consumers and other external parties.

Osman Celik, Research Analyst at KuppingerCole, will look at the business and security benefits of moving to policy-based access control (PBAC), how this supports a Zero Trust approach to security, and how PBAC can be practical and scalable in hybrid and multi-cloud IT environments. Adam Rusbridge, Senior Product Manager at Ping Identity, will highlight the main authorization use cases that are driving change in enterprise architecture teams. He will also share recommendations on how organizations can improve security, enhance brand trust, and deliver better user experiences.
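The core idea of PBAC can be illustrated in a few lines: access is computed by evaluating declarative policies against attributes of the subject, resource, and context, rather than via static role-to-permission mappings. The policy structure and attribute names below are invented for illustration:

```python
# Toy policy-based access control (PBAC) engine. Policies are declarative
# rules over subject, resource, and context attributes; access is deny-by-
# default. Attribute names and the policy shape are illustrative only.

POLICIES = [
    {
        "effect": "allow",
        "action": "read",
        "condition": lambda sub, res, ctx: (
            sub["department"] == res["owner_department"] and ctx["device_trusted"]
        ),
    },
]

def authorize(subject: dict, action: str, resource: dict, context: dict) -> bool:
    """Allow only if some 'allow' policy for this action matches the attributes."""
    return any(
        p["effect"] == "allow"
        and p["action"] == action
        and p["condition"](subject, resource, context)
        for p in POLICIES
    )
```

Because the decision depends on live context (here, device trust), the same user can be allowed on a trusted device and denied on an untrusted one, which is precisely the Zero Trust posture the webinar describes.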




1Kosmos BlockID

Is There Really No Easy Fix for ID Verification for Government Benefits?


Recently, I came across an article titled “No easy fix for ID verification for government benefits” that explored a major challenge that governments have been facing for the past few years. This problem involves balancing fraud prevention in unemployment insurance claims and ensuring that qualified individuals can access the benefits that they are entitled to. The concern is that anti-fraud measures may prevent benefits from reaching the constituents that they are intended to help.

Since COVID-19 relief programs started in 2020, states have grappled with a dramatic increase in fraudulent claims. This led 27 state governments to work with a commercial identity verification provider in hopes of flagging potential fraud. Unfortunately, this surfaced a new issue that made the news earlier this year: identity decisioning bias, in which algorithms designed to verify identity are built in a way that disadvantages certain populations, such as racial and gender groups. Depending on the identities the AI algorithms were trained on, the systems show bias toward certain populations, meaning some users may not be recognized and are denied access to services.

What’s the solution to this problem? First, pooling data to build “intelligence” is risky because it leaves private data exposed to hacking. Instead, data should be stored on a private, immutable blockchain to eliminate the possibility of theft. Additionally, live biometrics should be used for identity proofing instead of device-level biometrics. The 1Kosmos BlockID platform can use mobile phones to perform a live selfie or “liveness test,” which can determine whether the face in front of the camera is real with 99% accuracy. This eliminates the possibility of identity decisioning bias in the 1Kosmos BlockID platform. Our platform was tested by iBeta and produced zero false matches in 200 live biometric attempts on iOS and Android devices, validating the reliability of the BlockID platform for accurately performing live biometrics for identity proofing.
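The “immutable” property described here can be illustrated with a toy hash chain, in which each record commits to the hash of its predecessor, so tampering with any earlier entry invalidates every hash that follows. This is a teaching sketch, not the actual 1Kosmos implementation:

```python
import hashlib
import json

# Toy append-only hash chain. Each entry stores the previous entry's hash,
# so altering any historical record breaks verification of the whole chain.

def append(chain: list, payload: dict) -> None:
    """Append a payload, linking it to the hash of the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    chain.append({"prev": prev_hash, "payload": payload,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def is_valid(chain: list) -> bool:
    """Re-derive every hash; any tampering breaks the chain from that point on."""
    prev = "0" * 64
    for entry in chain:
        body = json.dumps({"prev": prev, "payload": entry["payload"]}, sort_keys=True)
        if entry["prev"] != prev or entry["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True
```

A production system adds distribution and consensus on top, but the tamper-evidence shown here is the core guarantee being claimed.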

As the world continues to move to an online-first environment and begins to adopt zero-trust constructs at scale, we find we have architected a solution to suit the security needs of government and organizations today and well into the future.

The post Is There Really No Easy Fix for ID Verification for Government Benefits? appeared first on 1Kosmos.


Shyft Network

SEC Head Honcho Gary Gensler Doubles Down on Demand for Strong Crypto Regulations and Investor…

SEC Head Honcho Gary Gensler Doubles Down on Demand for Strong Crypto Regulations and Investor Protection
Gary Gensler, the current SEC Chairman, rejected the notion that current securities law is incompatible with the crypto market.
The SEC Supremo also asked crypto intermediaries such as crypto exchanges to register with them as securities exchanges and broker-dealers.
He also supported giving regulatory authority over the crypto industry to the CFTC when it comes to nonsecurity tokens.

Gary Gensler, the Chairman of the US Securities and Exchange Commission (SEC), is yet again calling for crypto regulation.

A “vast majority” of crypto tokens are securities issued to the public in violation of federal law, he said at the Practising Law Institute.

“These are not laundromat tokens. Promoters are marketing, and the investing public is buying most of these tokens, touting or anticipating profits based on the efforts of others,” highlighted Gensler.

Can Securities Law be Extended to Cryptocurrencies?

Many in the crypto industry have been calling for a new set of rules designed specifically for crypto assets, arguing that existing securities law is unsuitable. Gensler completely rejected that notion, saying, “Nothing about the crypto markets is incompatible with the securities laws.”

It is important to point out that SEC officials, both current and former, have said multiple times that most cryptocurrencies are actually securities, Bitcoin being the exception.

Ethereum, the second-largest cryptocurrency by market cap, has also previously been cleared of security status. Still, the current SEC Chair, Gary Gensler, has declined to give any definitive statement on Ethereum’s status, so the SEC’s stance toward Ethereum is unclear at the moment.


However, a piece of news making the rounds on crypto Twitter suggests that Gensler would like to back away from the previous SEC stance that Ethereum is a non-security but hasn’t found a credible way to pull this off.

Protecting Investors

The focus has been on protecting the investors, which is “just as relevant” in crypto “regardless of underlying technologies,” he added. According to him, investors need disclosure to “help them sort between the investments that they think will flourish and those that they think will flounder.”

The SEC chairman further called for crypto intermediaries like exchanges to be registered with the agency as securities exchanges and broker-dealers. On top of that, he wants crypto and stablecoin operators to “get their tokens registered and regulated.”

Gensler concluded his remarks by quoting Joseph Kennedy, the first Chairman of the SEC, “No honest business need fear the SEC.”

Jerome Powell, chair of the Federal Reserve, echoed Gensler’s sentiments, saying the sector needs to be “appropriately regulated” if it intends to play any role in the global financial system.

Sharing Power With the CFTC

At a separate event, Gensler signalled that he would support Congress handing the Commodity Futures Trading Commission (CFTC) greater power to oversee certain cryptocurrencies.

Speaking at an industry conference, the Chairman said he looked forward to working with Congress to give the SEC’s sister markets regulator added power to oversee and regulate “non-security tokens…and the related intermediaries.”

Gensler also said he is open to working with the lawmakers as long as they “don’t inadvertently undermine securities laws.”

______________________

VASPs need a Travel Rule Solution to begin complying with the FATF Travel Rule. So, have you zeroed in on one yet? We have the best solution to suggest: Veriscope!

Veriscope is the only frictionless Crypto Travel Rule compliance solution.

Visit our website to read more: https://www.veriscope.network/ and contact our BizDev team for a discussion: https://www.veriscope.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations.

SEC Head Honcho Gary Gensler Double Downs on Demand for Strong Crypto Regulations and Investor… was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Ocean Protocol

OceanDAO Round 21 Results


13 proposals received and 416,395 OCEAN granted

OceanDAO Grants

Hello, Ocean Community!

The OceanDAO is honoured to share the results of the 21st round of our community grants initiative:

A total of 416,395 OCEAN was available, at a conversion rate of 0.18 USD/OCEAN. A final amount of 75,000 USD (at the time of the voting deadline) was available for batch grants, of which 10,000 USD was assigned to Algovera for their Micro-Grants program.
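The figures are mutually consistent, as a quick arithmetic check shows (75,000 USD for 416,395 OCEAN works out to 0.18 USD per OCEAN):

```python
# Sanity check of the round's budget figures.
ocean_available = 416_395
usd_budget = 75_000

usd_per_ocean = usd_budget / ocean_available
assert round(usd_per_ocean, 2) == 0.18          # the stated conversion rate

assert round(ocean_available * 0.18) == 74_951  # ≈ 75,000 USD at that rate
```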

All funds that were not granted will be recycled back into the treasury as part of our initiatives to continue leveraging treasury funds into greater outcomes.

Round 21 included 5 first-time projects and 8 returning projects.

416,395 $OCEAN have been granted

Congratulations to all grant recipients! For the full vote results please see the Voting Page. You can also view the expanded proposal details on our Round 21 Ocean Port Forum!

For the most recent updates, check: Twitter @oceanprotocol, @oceandao_, and our blog.

For information on getting started with OceanDAO, we invite you to get involved and learn more about Ocean’s community-curated funding on the OceanDAO website.

Thanks to all proposers, participants, and voters who engaged in Round 21!

OceanDAO Round 21 Results

You can find the final results on the Oceanpearl Leaderboard, or for a fully detailed overview on our Round 21 — Votes page.

Round 21 Rules

Proposals with 50% or more “Yes” votes received a grant, in descending order of votes received, until the “Total Round Funding Available” was depleted.
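The rule above amounts to a simple greedy allocation, which can be sketched as follows (the proposal data is invented for illustration):

```python
# Greedy grant allocation per the stated round rules: proposals with at least
# 50% "Yes" votes are funded in descending order of Yes votes until the
# budget is exhausted; the last proposal may be partially funded.

def allocate(proposals: list, budget: float) -> dict:
    grants = {}
    eligible = [p for p in proposals if p["yes"] >= 0.5 * (p["yes"] + p["no"])]
    for p in sorted(eligible, key=lambda p: p["yes"], reverse=True):
        if budget <= 0:
            break
        grant = min(p["requested"], budget)  # partial funding at the cutoff
        grants[p["name"]] = grant
        budget -= grant
    return grants
```

For example, with a budget of 100 and requests of 60 each, the top proposal is fully funded, the runner-up partially, and a proposal failing the 50% threshold receives nothing.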

Claiming your Grant

If your Proposal was voted to receive a grant, you have 14 days to claim your granted $OCEAN.

You can find instructions on claiming your grant here.

Deadline to claim your grant (grants cannot be claimed after this period):
September 26th, 2022, midnight UTC

Funding Tiers — All Categories (max per team):

1. New Project

Funding Ceiling: $3,000 USD. Requires: No one in your project has ever received a grant from OceanDAO. Open to all. Benefits: Earmarked. Receive feedback during the application process. Introduced to related projects.

2. Existing Project

Funding Ceiling: $7,500 USD. Requires: You have completed 1 or more grants. Benefits: Same as above. Receive promotion via Newsletter, Twitter, and other channels.

3. Experienced Project

Funding Ceiling: $15,000 USD. Requires: You have completed 2 or more grants.

Earmarks

“Earmarks” means that funds are reserved exclusively for the first three groups listed below, giving them an incentive to apply without having to compete for general funds.

6,000 USD for New Teams (non-outreach category)
3,000 USD for New Teams (outreach category)
17,500 USD for 2nd/3rd Time Teams (all teams)
13,000 USD for Core Tech Initiatives (listed below)
35,500 USD for remaining General Grants

To distribute the 75,000 USD in OCEAN, each earmark was scaled proportionally to the round total of 416,395 OCEAN. The final amounts distributed are as follows:

33,311 OCEAN for New Teams (non-outreach category)
16,655 OCEAN for New Teams (outreach category)
97,158 OCEAN for 2nd/3rd Time Teams (all teams)
72,175 OCEAN for Core Tech Teams
197,093 OCEAN for remaining General Grants
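These OCEAN figures follow from scaling each USD earmark by the round total and truncating to whole OCEAN, which a short script reproduces (the category keys are illustrative names, not official labels):

```python
# Reproducing the published OCEAN earmarks: each USD earmark is scaled by the
# round total (416,395 OCEAN for 75,000 USD) and truncated to whole OCEAN.
usd_earmarks = {
    "new_non_outreach": 6_000,
    "new_outreach": 3_000,
    "second_third_time": 17_500,
    "core_tech": 13_000,
    "general": 35_500,
}
TOTAL_OCEAN, TOTAL_USD = 416_395, 75_000

# Integer arithmetic avoids floating-point surprises in the truncation.
ocean_earmarks = {k: v * TOTAL_OCEAN // TOTAL_USD for k, v in usd_earmarks.items()}
# Matches the figures above: 33,311 / 16,655 / 97,158 / 72,175 / 197,093 OCEAN.
```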

The grant proposals from the snapshot ballot that met these criteria were selected to receive their OCEAN Amount Requested to foster positive value creation for the overall Ocean ecosystem.

Voting opened on September 8th at midnight UTC
Voting closed on September 12th at midnight UTC

Proposal Vote Results:

13 proposals submitted
11 funded or partially funded
54 unique wallets voted (plus delegations)
6,150,205 OCEAN voted Yes on proposals
0 OCEAN voted No on proposals
6,150,205 OCEAN tokens voted across all proposals
416,395 OCEAN have been granted
0 OCEAN will be returned to the treasury

Winning Proposals

Compare: Oceanpearl Leaderboard

Including Completely and partially funded projects

General Grants

DATALATTE: Share your data to Earn passive income! We empower internet users to monetize their own data and provide data scientists with access to non-identifiable users’ data using AI Feature Store at an affordable price.

WeDataNation: WeDataNation offers an anonymous and secure platform for user data aggregation. Users can upload their data from social media, e-commerce, gaming, streaming and Web3 apps on WeDataNation. Their data runs through algorithms (in an anonymous way) and is pooled into a dataset. Insights of the dataset can then be purchased by various players such as marketing agencies to improve their marketing strategy. The generated revenue is shared between all users.

Walt.id: With this project we enable NFT-based access for applications used by the OCEAN community like Discord or Oceanpearl as well as for one of the most used open source access management tools globally: KeyCloak.

New Entrants

Visualizing C2D: Intuitive Visual Apps for Monitoring Compute and Access Services

Startin’blox: Data app framework for Ocean’s ecosystem

New Outreach

HYGROPRO Defarm: We build controlled environment devices that use an array of sensors and actuators to tailor the plant environment and empower the user to achieve different plant expressions. So, not only are our users farming crops, they are also generating a lot of data valuable to large-scale operations. That’s why we want to create proof of growing data market, the Defarm.

Stories_of_ai: Community outreach to minority AI teams beyond web3 to create awareness of how to monetize data sets and decentralized AI projects to the Ocean market

Wat: Fully bootstrapped, we distributed the Wat API keys to more than 20 projects, from trader.xyz to Reservoir.

2nd/3rd Time Proposals

ITRMachines: With the use of the AI libraries created by ITRMachines for the Ocean community (oceanai-js & oceanai-py), we propose the creation of a dapp that will provide developing environments as a service to facilitate the construction, testing and optimization of AI models to forecast time series.

Onboard — web3, in your hands: Onboard is creating a seamless onboarding experience into OceanDAO, using our no-code platform for creating gamified learning pathways; we are starting with creating a ‘Lite’ version of the Ocean Academy, with the aim of stimulating curiosity and priming new users to become Ocean Ambassadors.

Eden Protocol of Developer DAO: We believe our ethos around data interoperability really aligns deeply with the OceanDAO philosophy. In order to be able to add to the Ocean Ecosystem with algorithms & data, we need to grow Eden beyond Developer_DAO.

OceanDAO Ecosystem

Continue to support and track the progress of all of the grant recipients inside of Ocean Pearl or view our Funding History page to learn more about the history of the grants program.

Much more to come — join our Town Halls to stay up to date, and see you in Round 22.

Thank you!

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

OceanDAO Round 21 Results was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


Innopay

INNOPAY paper on data sharing published in CEUR Workshop Proceedings

bauke | 15 September 2022 - 15:01

This week, CEUR-WS.org has published the paper titled ‘Harmonization Profiles for Trusted Data Sharing Between Data Spaces: Striking the Balance between Functionality and Complexity’ in the CEUR Workshop Proceedings. Co-authored by INNOPAY’s Bauke Rietveld and Vincent Jansen in collaboration with TNO, the paper explores what is required to enable interoperable data sharing between a wide variety of data spaces, in alignment with the EU Data Strategy. The paper was presented at the I-ESA 2022 conference earlier this year.

The ambition of the EU Data Strategy can be summarised as a ‘federation of interoperable data spaces’. Current data space implementations are already seen to have a wide range of architectures, frameworks and protocols. The Data Sharing Coalition and the NL AI Coalition Data Sharing Working Group have done considerable work to identify the concept of harmonisation profiles required for interoperability. Building upon this work, the paper presents a framework for structuring these harmonisation profiles.

CEUR-WS.org is a publication channel for workshops and conferences from the computer science and information systems domain.


OWI - State of Identity

The Three Pillars of Cybersecurity


On this week’s State of Identity podcast, host Cameron D’Ambrosi sits down with serial entrepreneur, Mickey Boodaei, CEO and Co-Founder of Transmit Security. This duo discusses the challenges of finding an internal stakeholder champion to "own" identity across business units, why the UX battleground isn't just about your competitors, it's about any consumer experience across industry verticals, and the importance of shifting enterprise perspective on identity to encompass the entirety of the "digital identity lifecycle.”


Forgerock Blog

Raising the Bar on Identity Orchestration

ForgeRock introduces next-generation identity orchestration features to deliver superior digital identity experiences

ForgeRock announced today a major update to our industry leading identity orchestration capability, known as Intelligent Access Trees. This is a proud day for the ForgeRock team as we advance our vision around identity orchestration. Enterprises can now deliver great user experiences with much less effort. It's faster and easier for organizations to personalize customer and workforce journeys, deliver innovative experiences with no-code development, integrate identity into applications, and measure user engagement with the industry's first journey analytics dashboard.

Five New Capabilities Put Intelligent Access Trees in a Class of their Own

Our approach to identity orchestration is grounded in the belief that creating effortless digital experiences should be as natural as breathing. That's why it was natively built into our unified platform from the start. These 5 new capabilities help organizations take the user journey to the next level:

Dynamic and Contextual User Journeys Make Personalization a Breeze

We've introduced the ability to enhance user journeys and make them "just right" for an individual without disrupting their experience or creating extra work for developers. It's easier than ever for organizations to change a person's user interface, language, and terms and conditions — or even offer a more accessible experience for someone with unique needs.

No-Code Drag-and-Drop Nodes Support Broad Use Cases

Pre-built out-of-the-box nodes span a wide variety of use cases, ranging from registration, social authentication, multi-factor authentication (MFA), and A/B testing to zero trust, to name a few. Plus, more than 150 additional third-party integrations are now available through the ForgeRock Trust Network. All of this adds up to faster out-of-the-box use case solutions, while reducing development and maintenance costs.

Infusing AI Directly into the Journey

We've combined the best of Intelligent Access Trees and Autonomous Access. Our updated orchestration engine is infused with artificial intelligence (AI)-powered threat protection that can be leveraged with no-code drag-and-drop capabilities. By leveraging AI, organizations can optimize a journey to minimize friction and improve the user experience while protecting against account takeover and fraud. We are especially excited to share that we continue to focus on what's important: making it as easy as possible to use AI with the latest advanced enhancements.

Measure User Engagement with Journey Analytics

The new journey analytics dashboard helps administrators measure and project trends for multiple parameters: total and engaged users, number of new sign-ups, and success and failure rates of individual user journeys across the identity lifecycle. These powerful analytics give organizations insights into how friction in user journeys affects the user experience.
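The kind of metric such a dashboard surfaces can be illustrated with a small sketch that derives per-journey success rates from raw completion events (the event shape is invented for illustration):

```python
# Illustrative journey analytics: compute per-journey success rates from a
# stream of (journey_name, succeeded) completion events.
from collections import Counter

def journey_rates(events: list) -> dict:
    """Return the fraction of successful completions for each journey."""
    totals, successes = Counter(), Counter()
    for journey, ok in events:
        totals[journey] += 1
        if ok:
            successes[journey] += 1
    return {j: successes[j] / totals[j] for j in totals}
```

A journey with a persistently low success rate is a signal that some step is introducing too much friction, which is exactly what the dashboard is meant to expose.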

Managing Journeys at Scale

Organizations often house large libraries of journeys that are difficult and costly to manage. To lighten the burden, we've added features for testing and debugging, tagging, organizing, and searching journeys. These features vastly simplify processes for developers, and are particularly helpful when journeys are connected to a specific region, brand, or customer segment.

Natively Built Into Our Unified Platform

Our vision came into being with our 2018 release of Intelligent Access, a first-of-its-kind no-code/low-code identity orchestration capability that allows organizations to quickly and easily define personalized and secure user journeys across the identity lifecycle. Intelligent Access enables organizations to finely orchestrate every aspect of the user journey — from registration to password and profile self-service, to authentication and ongoing authorization. Because all these features are now native to our unified platform, organizations reap these benefits:

Reduced complexity: Development and administration from a single pane of glass means configuration of end-to-end flows does not require integration of disparate, bolted-on products which are administered separately. Simpler, less risky policy creation: Policies for registration, authentication, and authorization are configured in one place to avoid inevitable "drift" that arises when security is configured in unintegrated systems. Flexible user journey design: Context can be injected at exactly the proper point in the user journey, if needed. The user journey can dynamically respond to context and then adjust for less friction or more friction, in line with requirements. Easier, faster updates and changes: Our SDKs are fully integrated and automatically reflect any changes to user journeys without the need to republish applications. Adding new MFA flow to journeys? No problem. There's no need for re-integration. Minimized risk: Execution takes place inside the same runtime as the rest of our platform. This means all of the security, controls, service uptime, and performance characteristics are also unified. Organizations can take comfort in knowing that our proprietary, patented cloud security architecture with full tenant isolation applies to all of our products equally, with nothing bolted on that could expose the organization to increased risk. Streamlined management and operations: Built-in orchestration means there is only one product to manage from an operational and SLA perspective. Furthermore, it's available across the platform, whether you use our cloud service, self-managed software, or you're operating a hybrid environment. Troubleshooting and diagnosing issues does not require stitching together complex logs from different systems without the ability to trace transactions end to end.

Native, built-in orchestration is a smart choice. Organizations benefit from lower risk and tremendous cost savings in development and ongoing operations and maintenance.

We're looking forward to seeing what our customers do with these new capabilities.

Learn More

Want to understand how it all works? Visit: https://www.forgerock.com/platform/intelligent-access/orchestration/capabilities

New to Intelligent Access Trees? Check out this introduction video.

Interested in the benefits of Orchestration Built Into a Unified Platform? Read our documentation.


SelfKey

Ethereum Merge

The merging is now finished. The Merge is Ethereum's transition to proof of stake, a more energy-efficient way of validating the transactions that take place on the platform. The post Ethereum Merge appeared first on SelfKey.



Okta

Communicate Between Microservices with Apache Kafka


One of the traditional approaches for communicating between microservices is through their REST APIs. However, as your system evolves and the number of microservices grows, communication becomes more complex, and the architecture might start resembling our old friend the spaghetti anti-pattern, with services depending on each other or tightly coupled, slowing down development teams. This model can exhibit low latency but only works if services are made highly available.

To overcome this design disadvantage, new architectures aim to decouple senders from receivers, with asynchronous messaging. In a Kafka-centric architecture, low latency is preserved, with additional advantages like message balancing among available consumers and centralized management.

When dealing with a brownfield platform (legacy), a recommended way to decouple a monolith and ready it for a move to microservices is to implement asynchronous messaging.
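The decoupling idea can be shown without any broker at all. The following is a minimal, illustrative Java sketch (the class and method names are hypothetical, and an in-memory BlockingQueue stands in for Kafka): the sender hands off an event and returns immediately, while the receiver pulls messages whenever it is ready.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Toy stand-in for a message broker: producer and consumer never call each
// other directly, so they can evolve and fail independently.
public class AsyncDecouplingDemo {

    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();

    // The sender enqueues the event and returns immediately (no waiting on the receiver).
    public void publish(String event) {
        queue.offer(event);
    }

    // The receiver pulls the next event when it is ready; returns null if none is pending.
    public String consumeNext() {
        return queue.poll();
    }

    public static void main(String[] args) {
        AsyncDecouplingDemo bus = new AsyncDecouplingDemo();
        bus.publish("store-updated");          // producer continues without blocking
        System.out.println(bus.consumeNext()); // consumer processes on its own schedule
    }
}
```

A real broker adds durability, replication, and consumer groups on top of this basic hand-off, which is exactly what the Kafka-based setup below provides.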

In this tutorial you will learn how to:

Create a microservices architecture with JHipster
Enable Kafka integration for communicating microservices
Set up Okta as the authentication provider

Table of Contents

What is Kafka?
Microservices communication with Kafka
Configure microservices deployment with Docker Compose
Add OpenID Connect (OIDC) authentication
Use Spring Cloud Config to override OIDC settings
Communicate between store and alert microservices
Enable debug logging in production
Add an EmailService to the alert microservice
Add a Kafka consumer to persist alert and send email
Microservices + Kafka container deployment
Learn more about Kafka and microservices

What is Kafka?

Apache Kafka is a distributed streaming platform. It was initially conceived as a message queue and open-sourced by LinkedIn in 2011. Its community evolved Kafka to provide key capabilities:

Publish and subscribe to streams of records, like a message queue.
A storage system, so messages can be consumed asynchronously. Kafka writes data to a scalable disk structure and replicates it for fault tolerance. Producers can wait for write acknowledgments.
Stream processing with the Kafka Streams API, which enables complex aggregations or joins of input streams onto an output stream of processed data.

Traditional messaging models are queue and publish-subscribe. In a queue, each record goes to one consumer. In publish-subscribe, the record is received by all consumers.

The Consumer Group in Kafka is an abstraction that combines both models. Record processing can be load balanced among the members of a consumer group and Kafka allows you to broadcast messages to multiple consumer groups. It is the same publish-subscribe semantic where the subscriber is a cluster of consumers instead of a single process.
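The consumer-group semantics above can be sketched with a toy model (no real broker; the class and method names are hypothetical): records hash to partitions, each partition is owned by exactly one consumer within a group, and every group independently sees the whole stream.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy model of Kafka consumer-group delivery, for illustration only.
public class ConsumerGroupDemo {

    // Assign each partition to one consumer in the group, round-robin,
    // mimicking a balanced partition assignment.
    public static Map<Integer, Integer> assignPartitions(int partitions, int groupSize) {
        Map<Integer, Integer> owner = new HashMap<>();
        for (int p = 0; p < partitions; p++) {
            owner.put(p, p % groupSize);
        }
        return owner;
    }

    // Deliver records to one group: returns, per consumer, the records it processes.
    public static List<List<String>> deliver(List<String> records, int partitions, int groupSize) {
        Map<Integer, Integer> owner = assignPartitions(partitions, groupSize);
        List<List<String>> perConsumer = new ArrayList<>();
        for (int c = 0; c < groupSize; c++) {
            perConsumer.add(new ArrayList<>());
        }
        for (String record : records) {
            int partition = Math.floorMod(record.hashCode(), partitions); // keyed routing
            perConsumer.get(owner.get(partition)).add(record);
        }
        return perConsumer;
    }

    public static void main(String[] args) {
        List<String> records = List.of("store-1", "store-2", "store-3", "store-4");
        // Within one group, records are split among its consumers...
        System.out.println("group A (2 consumers): " + deliver(records, 4, 2));
        // ...while a second, independent group still receives every record.
        System.out.println("group B (1 consumer):  " + deliver(records, 4, 1));
    }
}
```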

Popular use cases of Kafka include:

Traditional messaging, to decouple data producers from processors with better latency and scalability.
Site activity tracking with real-time publish-subscribe feeds.
As a replacement for file-based log aggregation, where event data becomes a stream of messages.
Data pipelines, where data consumed from topics is transformed and fed to new topics.
As an external commit log for a distributed system.
As backend log storage for event sourcing applications, where each state change is logged in time order.

Microservices communication with Kafka

Let’s build a microservices architecture with JHipster and Kafka support. In this tutorial, you’ll create store and alert microservices. The store microservice will create and update store records. The alert microservice will receive update events from store and send an email alert.

Prerequisites:

Java 11+
Docker 20.10.17
Node.js 16.17.0
Okta CLI 0.10.0

Install JHipster.

npm install -g generator-jhipster@7.9.3

Running jhipster --version should output something like this:

$ jhipster --version
INFO! Using bundled JHipster 7.9.3

Create a directory for the project.

mkdir jhipster-kafka
cd jhipster-kafka

Create an apps.jdl file that defines the store, alert, and gateway applications in JHipster Domain Language (JDL). Kafka integration is enabled by adding messageBroker kafka to the store and alert app definitions.

application {
  config {
    baseName gateway,
    packageName com.okta.developer.gateway,
    applicationType gateway,
    authenticationType oauth2,
    prodDatabaseType postgresql,
    serviceDiscoveryType eureka,
    testFrameworks [cypress]
  }
  entities Store, StoreAlert
}
application {
  config {
    baseName store,
    packageName com.okta.developer.store,
    applicationType microservice,
    authenticationType oauth2,
    databaseType mongodb,
    devDatabaseType mongodb,
    prodDatabaseType mongodb,
    enableHibernateCache false,
    serverPort 8082,
    serviceDiscoveryType eureka,
    messageBroker kafka
  }
  entities Store
}
application {
  config {
    baseName alert,
    packageName com.okta.developer.alert,
    applicationType microservice,
    authenticationType oauth2,
    serverPort 8083,
    serviceDiscoveryType eureka,
    messageBroker kafka
  }
  entities StoreAlert
}
enum StoreStatus {
  OPEN,
  CLOSED
}
entity Store {
  name String required,
  address String required,
  status StoreStatus,
  createTimestamp Instant required,
  updateTimestamp Instant
}
entity StoreAlert {
  storeName String required,
  storeStatus String required,
  timestamp Instant required
}
microservice Store with store
microservice StoreAlert with alert

Now, in your jhipster-kafka folder, import this file with the following command:

jhipster jdl apps.jdl

Configure microservices deployment with Docker Compose

In the project folder, create a sub-folder for Docker Compose and run JHipster’s docker-compose sub-generator.

mkdir docker-compose
cd docker-compose
jhipster docker-compose

The generator will ask you to define the following things:

Type of application: Microservice application
Type of gateway: JHipster gateway based on Spring Cloud Gateway
Leave the root directory for services as default: ../
Which applications to include: gateway, store, alert
Which applications do you want to use with clustered databases: (none)
If monitoring should be enabled: No
Password for JHipster Registry: <default>

As the generator completes, a warning shows in the output:

WARNING! Docker Compose configuration generated, but no Jib cache found
To generate the missing Docker image(s), please run:
  ./mvnw -ntp -Pprod verify jib:dockerBuild in /home/indiepopart/jhipster-kafka/alert
  ./mvnw -ntp -Pprod verify jib:dockerBuild in /home/indiepopart/jhipster-kafka/gateway
  ./mvnw -ntp -Pprod verify jib:dockerBuild in /home/indiepopart/jhipster-kafka/store

You will generate the images later, but first, let’s add some security and Kafka integration to your microservices.

Add OpenID Connect (OIDC) authentication

This microservices architecture is set up to authenticate against Keycloak. Let’s update the settings to use Okta as the authentication provider.

Before you begin, you’ll need a free Okta developer account. Install the Okta CLI and run okta register to sign up for a new account. If you already have an account, run okta login. Then, run okta apps create jhipster. Select the default app name, or change it as you see fit. Then, change the Redirect URIs to:

http://localhost:8081/login/oauth2/code/oidc,http://localhost:8761/login/oauth2/code/oidc

Use http://localhost:8081,http://localhost:8761 for the Logout Redirect URIs.

What does the Okta CLI do?

The Okta CLI streamlines configuring a JHipster app and does several things for you:

Creates an OIDC app with the correct redirect URIs (the default values are shown below):
  login: http://localhost:8080/login/oauth2/code/oidc and http://localhost:8761/login/oauth2/code/oidc
  logout: http://localhost:8080 and http://localhost:8761
Creates ROLE_ADMIN and ROLE_USER groups that JHipster expects
Adds your current user to the ROLE_ADMIN and ROLE_USER groups
Creates a groups claim in your default authorization server and adds the user’s groups to it

NOTE: The http://localhost:8761* redirect URIs are for the JHipster Registry, which is often used when creating microservices with JHipster. The Okta CLI adds these by default.

You will see output like the following when it’s finished:

Okta application configuration has been written to: /path/to/app/.okta.env

Run cat .okta.env (or type .okta.env on Windows) to see the issuer and credentials for your app. It will look like this (except the placeholder values will be populated):

export SPRING_SECURITY_OAUTH2_CLIENT_PROVIDER_OIDC_ISSUER_URI="/oauth2/default"
export SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_OIDC_CLIENT_ID="{clientId}"
export SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_OIDC_CLIENT_SECRET="{clientSecret}"

NOTE: You can also use the Okta Admin Console to create your app. See Create a JHipster App on Okta for more information.

In the project, create a docker-compose/.env file and add the following variables. For the values, use the settings from the Okta web application you created:

OIDC_ISSUER_URI={yourIssuer}
OIDC_CLIENT_ID={yourClientId}
OIDC_CLIENT_SECRET={yourClientSecret}

Edit docker-compose/docker-compose.yml and update the SPRING_SECURITY_* settings for the services store-app, alert-app, gateway-app, and jhipster-registry:

- SPRING_SECURITY_OAUTH2_CLIENT_PROVIDER_OIDC_ISSUER_URI=${OIDC_ISSUER_URI}
- SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_OIDC_CLIENT_ID=${OIDC_CLIENT_ID}
- SPRING_SECURITY_OAUTH2_CLIENT_REGISTRATION_OIDC_CLIENT_SECRET=${OIDC_CLIENT_SECRET}

Use Spring Cloud Config to override OIDC settings

An alternative to setting environment variables for each application in docker-compose.yml is to use Spring Cloud Config. JHipster Registry includes Spring Cloud Config, so it’s pretty easy to do.

Open docker-compose/central-server-config/application.yml and add your Okta settings there.

spring:
  security:
    oauth2:
      client:
        provider:
          oidc:
            issuer-uri: /oauth2/default
        registration:
          oidc:
            client-id: {yourClientId}
            client-secret: {yourClientSecret}

The registry, gateway, store, and alert applications are all configured to read this configuration on startup.

Communicate between store and alert microservices

The JHipster generator adds a spring-cloud-starter-stream-kafka dependency to applications that declare messageBroker kafka (in JDL), enabling the Spring Cloud Stream programming model with the Apache Kafka binder for using Kafka as the messaging middleware.

Spring Cloud Stream was recently added back to JHipster. Now, instead of working with Kafka Core APIs, we can use the binder abstraction, declaring input/output arguments in the code, and letting the specific binder implementation handle the mapping to the broker destination.

IMPORTANT NOTE: At this moment, JHipster includes Spring Cloud Stream 3.2.4, which has deprecated the annotation-based programming model, @EnableBinding and @StreamListener annotations, in favor of the functional programming model. Stay tuned for future JHipster updates.
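As a rough sketch of where that functional model is heading (illustrative only, and shown without a Spring context so it can run standalone): a handler becomes a plain java.util.function.Consumer which, in a real Spring Cloud Stream application, you would expose as a @Bean and bind to a destination via configuration such as spring.cloud.function.definition.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Functional-style handler, with no @EnableBinding or @StreamListener.
public class FunctionalStyleAlert {

    final List<String> processed = new ArrayList<>();

    // In Spring Cloud Stream's functional model this would be a @Bean; the
    // framework invokes it for each message arriving on the bound destination.
    public Consumer<String> storeAlert() {
        return payload -> processed.add("alerted: " + payload);
    }

    public static void main(String[] args) {
        FunctionalStyleAlert app = new FunctionalStyleAlert();
        app.storeAlert().accept("Candle Shop CLOSED"); // simulate one incoming message
        System.out.println(app.processed);
    }
}
```

The tutorial below still uses the annotation-based model that JHipster generates today; the sketch above is only meant to show how little code the functional replacement requires.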

For the sake of this example, update the store microservice to send a message to the alert microservice through Kafka, whenever a store entity is updated.

First, create an outbound binding for a new topic store-alerts. Add the interface KafkaStoreAlertProducer:

package com.okta.developer.store.config;

import org.springframework.cloud.stream.annotation.Output;
import org.springframework.messaging.MessageChannel;

public interface KafkaStoreAlertProducer {

    String CHANNELNAME = "binding-out-store-alert";

    @Output(CHANNELNAME)
    MessageChannel output();
}

Include the outbound binding in the WebConfigurer:

package com.okta.developer.store.config;

@EnableBinding({ KafkaSseConsumer.class, KafkaSseProducer.class, KafkaStoreAlertProducer.class })
@Configuration
public class WebConfigurer implements ServletContextInitializer {
    ...

Add the binding configuration to application.yml:

spring:
  cloud:
    stream:
      ...
      bindings:
        ...
        binding-out-store-alert:
          destination: store-alerts-topic
          content-type: application/json
          group: store-alerts

In the store project, create an AlertService for sending the event details.

package com.okta.developer.store.service;

import com.okta.developer.store.config.KafkaStoreAlertProducer;
import com.okta.developer.store.domain.Store;
import com.okta.developer.store.service.dto.StoreAlertDTO;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.MessageHeaders;
import org.springframework.messaging.support.GenericMessage;
import org.springframework.stereotype.Service;
import org.springframework.util.MimeTypeUtils;

import java.util.HashMap;
import java.util.Map;

@Service
public class AlertService {

    private final Logger log = LoggerFactory.getLogger(AlertService.class);

    private final MessageChannel output;

    public AlertService(@Qualifier(KafkaStoreAlertProducer.CHANNELNAME) MessageChannel output) {
        this.output = output;
    }

    public void alertStoreStatus(Store store) {
        try {
            StoreAlertDTO storeAlertDTO = new StoreAlertDTO(store);
            log.debug("Request the message : {} to send to store-alert topic ", storeAlertDTO);
            Map<String, Object> map = new HashMap<>();
            map.put(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON);
            MessageHeaders headers = new MessageHeaders(map);
            output.send(new GenericMessage<>(storeAlertDTO, headers));
        } catch (Exception e) {
            log.error("Could not send store alert", e);
            throw new AlertServiceException(e);
        }
    }
}

Create the referenced AlertServiceException class.

package com.okta.developer.store.service;

public class AlertServiceException extends RuntimeException {

    public AlertServiceException(Throwable e) {
        super(e);
    }
}

And add a StoreAlertDTO class in the ...service.dto package.

package com.okta.developer.store.service.dto;

import com.okta.developer.store.domain.Store;

public class StoreAlertDTO {

    private String storeName;
    private String storeStatus;

    public StoreAlertDTO(Store store) {
        this.storeName = store.getName();
        this.storeStatus = store.getStatus().name();
    }

    public String getStoreName() {
        return storeName;
    }

    public void setStoreName(String storeName) {
        this.storeName = storeName;
    }

    public String getStoreStatus() {
        return storeStatus;
    }

    public void setStoreStatus(String storeStatus) {
        this.storeStatus = storeStatus;
    }
}

Inject the AlertService into the StoreResource API implementation, modifying its constructor. Also modify the updateStore call to publish a StoreAlertDTO for the alert service:

@RestController
@RequestMapping("/api")
public class StoreResource {

    ...

    private final StoreRepository storeRepository;

    private final AlertService alertService;

    public StoreResource(StoreRepository storeRepository, AlertService alertService) {
        this.storeRepository = storeRepository;
        this.alertService = alertService;
    }

    ...

    @PutMapping("/stores/{id}")
    public ResponseEntity<Store> updateStore(
        @PathVariable(value = "id", required = false) final String id,
        @Valid @RequestBody Store store
    ) throws URISyntaxException {
        ...
        Store result = storeRepository.save(store);
        log.debug("SEND store alert for Store: {}", store);
        alertService.alertStoreStatus(result);
        ...
    }
    ...
}

Enable debug logging in production

Since you are going to deploy the prod profile, let’s enable logging in production. Modify the store/src/main/java/com/okta/.../config/LoggingAspectConfiguration.java class:

@Configuration
@EnableAspectJAutoProxy
public class LoggingAspectConfiguration {

    @Bean
    @Profile({JHipsterConstants.SPRING_PROFILE_DEVELOPMENT, JHipsterConstants.SPRING_PROFILE_PRODUCTION})
    public LoggingAspect loggingAspect(Environment env) {
        return new LoggingAspect(env);
    }
}

Edit store/src/main/resources/config/application-prod.yml and change the log level to DEBUG for the store application:

logging:
  level:
    ROOT: INFO
    tech.jhipster: INFO
    com.okta.developer.store: DEBUG

Add an EmailService to the alert microservice

Now let’s customize the alert microservice. First, add the consumer declaration KafkaStoreAlertConsumer to the config:

package com.okta.developer.alert.config;

import org.springframework.cloud.stream.annotation.Input;
import org.springframework.messaging.MessageChannel;

public interface KafkaStoreAlertConsumer {

    String CHANNELNAME = "binding-in-store-alert";

    @Input(CHANNELNAME)
    MessageChannel input();
}

Include the binding in the WebConfigurer:

package com.okta.developer.alert.config;

@EnableBinding({ KafkaSseConsumer.class, KafkaSseProducer.class, KafkaStoreAlertConsumer.class })
@Configuration
public class WebConfigurer implements ServletContextInitializer {
    ...

Add the inbound binding configuration to application.yml:

spring:
  cloud:
    stream:
      bindings:
        ...
        binding-in-store-alert:
          destination: store-alerts-topic
          content-type: application/json
          group: store-alerts

Create an EmailService to send the store update notification, using the Spring Framework’s JavaMailSender.

package com.okta.developer.alert.service;

import com.okta.developer.alert.service.dto.StoreAlertDTO;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.mail.SimpleMailMessage;
import org.springframework.mail.javamail.JavaMailSender;
import org.springframework.stereotype.Service;

@Service
public class EmailService {

    private JavaMailSender emailSender;

    @Value("${alert.distribution-list}")
    private String distributionList;

    public EmailService(JavaMailSender emailSender) {
        this.emailSender = emailSender;
    }

    public void sendSimpleMessage(StoreAlertDTO alertDTO) {
        try {
            SimpleMailMessage message = new SimpleMailMessage();
            message.setTo(distributionList);
            message.setSubject("Store Alert: " + alertDTO.getStoreName());
            message.setText(alertDTO.getStoreStatus());
            message.setFrom("StoreAlert");
            emailSender.send(message);
        } catch (Exception exception) {
            throw new EmailServiceException(exception);
        }
    }
}

Create the referenced EmailServiceException.

package com.okta.developer.alert.service;

public class EmailServiceException extends RuntimeException {

    public EmailServiceException(Exception exception) {
        super(exception);
    }
}

Add a StoreAlertDTO class in the ...service.dto package.

package com.okta.developer.alert.service.dto;

public class StoreAlertDTO {

    private String storeName;
    private String storeStatus;

    public String getStoreName() {
        return storeName;
    }

    public void setStoreName(String storeName) {
        this.storeName = storeName;
    }

    public String getStoreStatus() {
        return storeStatus;
    }

    public void setStoreStatus(String storeStatus) {
        this.storeStatus = storeStatus;
    }
}

Add a new property to alert/src/main/resources/config/application.yml and to alert/src/test/resources/config/application.yml for the destination email of the store alert.

alert: distribution-list: {distributionListAddress}

NOTE: You’ll need to set a value for the email (e.g. list@email.com will work) in src/test/.../application.yml for tests to pass. For Docker, you’ll override the {distributionListAddress} and {username} + {password} placeholder values with environment variables below.

Update spring.mail.* properties in application-prod.yml to set Gmail as the email service:

spring:
  ...
  mail:
    host: smtp.gmail.com
    port: 587
    username: {username}
    protocol: smtp
    tls: true
    properties.mail.smtp:
      auth: true
      starttls.enable: true

Add a Kafka consumer to persist alert and send email

Create an AlertConsumer service to persist a StoreAlert and send the email notification when receiving an alert message through Kafka. Add StoreAlertRepository and EmailService as constructor arguments, and a consume() method annotated with @StreamListener to process incoming messages.

package com.okta.developer.alert.service;

import com.okta.developer.alert.config.KafkaStoreAlertConsumer;
import com.okta.developer.alert.domain.StoreAlert;
import com.okta.developer.alert.repository.StoreAlertRepository;
import com.okta.developer.alert.service.dto.StoreAlertDTO;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.messaging.Message;
import org.springframework.stereotype.Service;

import java.time.Instant;

@Service
public class AlertConsumer {

    private final Logger log = LoggerFactory.getLogger(AlertConsumer.class);

    private StoreAlertRepository storeAlertRepository;

    private EmailService emailService;

    public AlertConsumer(StoreAlertRepository storeAlertRepository, EmailService emailService) {
        this.storeAlertRepository = storeAlertRepository;
        this.emailService = emailService;
    }

    @StreamListener(value = KafkaStoreAlertConsumer.CHANNELNAME, copyHeaders = "false")
    public void consume(Message<StoreAlertDTO> message) {
        StoreAlertDTO dto = message.getPayload();
        log.info("Got message from kafka stream: {} {}", dto.getStoreName(), dto.getStoreStatus());
        try {
            StoreAlert storeAlert = new StoreAlert();
            storeAlert.setStoreName(dto.getStoreName());
            storeAlert.setStoreStatus(dto.getStoreStatus());
            storeAlert.setTimestamp(Instant.now());
            storeAlertRepository.save(storeAlert);
            emailService.sendSimpleMessage(dto);
        } catch (Exception e) {
            log.error(e.getMessage(), e);
        }
    }
}

NOTE: Any unhandled exception during message processing will make the service leave the consumer group. That’s why there’s code above that catches Exception.
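The reasoning behind that catch block can be illustrated with a small, hypothetical wrapper (not part of the tutorial code) that records handler failures instead of propagating them, so the listener keeps its consumer-group membership:

```java
import java.util.function.Consumer;

// Illustrative log-and-continue wrapper for message handlers.
public class SafeListener {

    static int failures = 0;

    // Wrap a handler so exceptions are recorded rather than rethrown; a
    // listener that throws would otherwise drop out of its consumer group.
    public static <T> Consumer<T> safely(Consumer<T> handler) {
        return message -> {
            try {
                handler.accept(message);
            } catch (Exception e) {
                failures++; // in real code: log the error, mirroring AlertConsumer's catch block
            }
        };
    }

    public static void main(String[] args) {
        Consumer<String> risky = m -> { throw new IllegalStateException("boom"); };
        safely(risky).accept("store-alert"); // the listener survives the failure
        System.out.println("failures: " + failures);
    }
}
```

The trade-off is that swallowed messages are effectively dropped; production systems often route them to a dead-letter topic instead.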

As a last customization step, update the logging configuration the same way you did for the store microservice.

Microservices + Kafka container deployment

Modify docker-compose/docker-compose.yml and add the following environment variables for the alert application:

- SPRING_MAIL_USERNAME=${MAIL_USERNAME}
- SPRING_MAIL_PASSWORD=${MAIL_PASSWORD}
- ALERT_DISTRIBUTION_LIST=${DISTRIBUTION_LIST}

Edit docker-compose/.env and add values for the new environment variables:

MAIL_USERNAME={yourGmailAccount}
MAIL_PASSWORD={yourPassword}
DISTRIBUTION_LIST={anotherEmailAccount}

Make sure Docker Desktop is running, then generate the Docker image for the store microservice. Run the following command from the store directory.

./mvnw -ntp -Pprod verify jib:dockerBuild
# `npm run java:docker` is a shortcut for the above command

NOTE: If you’re using Apple Silicon, you’ll need to use npm run java:docker:arm64.

Repeat for the alert and gateway apps.

Before you run your microservices architecture, make sure you have enough RAM allocated. Docker Desktop's default is 2GB; I recommend 8GB. This setting is under Docker > Resources > Advanced.

Then, run everything using Docker Compose:

cd docker-compose
docker compose up

You will see a huge amount of logging while each service starts. Wait a minute or two, then open http://localhost:8761 and log in with your Okta account. This is the JHipster Registry which you can use to monitor your apps’ statuses. Wait for all the services to be up.

Open a new terminal window and tail the alert microservice logs to verify it’s processing StoreAlert records:

docker logs -f docker-compose-alert-1 | grep Consumer

You should see log entries indicating the consumer group to which the alert microservice joined on startup:

2022-09-05 15:20:44.146 INFO 1 --- [ main] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-store-alerts-4, groupId=store-alerts] Cluster ID: pyoOBVa3T3Gr1VP3rJBOlQ
2022-09-05 15:20:44.151 INFO 1 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-store-alerts-4, groupId=store-alerts] Resetting generation due to: consumer pro-actively leaving the group
2022-09-05 15:20:44.151 INFO 1 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-store-alerts-4, groupId=store-alerts] Request joining group due to: consumer pro-actively leaving the group
2022-09-05 15:20:44.162 INFO 1 --- [ main] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values:
2022-09-05 15:20:44.190 INFO 1 --- [ main] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer-store-alerts-5, groupId=store-alerts] Subscribed to topic(s): store-alerts-topic
2022-09-05 15:20:44.225 INFO 1 --- [container-0-C-1] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-store-alerts-5, groupId=store-alerts] Resetting the last seen epoch of partition store-alerts-topic-0 to 0 since the associated topicId changed from null to 0G-IFWw-S9C3fEGLXDCOrw
2022-09-05 15:20:44.226 INFO 1 --- [container-0-C-1] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-store-alerts-5, groupId=store-alerts] Cluster ID: pyoOBVa3T3Gr1VP3rJBOlQ
2022-09-05 15:20:44.227 INFO 1 --- [container-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-store-alerts-5, groupId=store-alerts] Discovered group coordinator kafka:9092 (id: 2147483645 rack: null)
2022-09-05 15:20:44.229 INFO 1 --- [container-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-store-alerts-5, groupId=store-alerts] (Re-)joining group
2022-09-05 15:20:44.238 INFO 1 --- [container-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-store-alerts-5, groupId=store-alerts] Request joining group due to: need to re-join with the given member-id
2022-09-05 15:20:44.239 INFO 1 --- [container-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-store-alerts-5, groupId=store-alerts] (Re-)joining group

Once everything is up, go to the gateway at http://localhost:8081 and log in. Create a store entity and then update it. The alert microservice should log entries when processing the received message from the store service.

2022-09-05 18:08:31.546 INFO 1 --- [container-0-C-1] c.o.d.alert.service.AlertConsumer : Got message from kafka stream: Candle Shop CLOSED

If you see a MailAuthenticationException in the alert microservice's log when it attempts to send the notification, the cause might be your Gmail security configuration.

alert-app_1 | org.springframework.mail.MailAuthenticationException: Authentication failed; nested exception is javax.mail.AuthenticationFailedException: 535-5.7.8 Username and Password not accepted. Learn more at
alert-app_1 | 535 5.7.8 https://support.google.com/mail/?p=BadCredentials *** - gsmtp
alert-app_1 |
alert-app_1 |   at org.springframework.mail.javamail.JavaMailSenderImpl.doSend(JavaMailSenderImpl.java:440)

To enable the login from the alert application, go to https://myaccount.google.com and then choose the Security tab. Turn on 2-Step Verification for your account. In the section Signing in to Google, choose App passwords and create a new app password. In the Select app dropdown set Other (Custom name) and type the name for this password. Click Generate and copy the password. Update docker-compose/.env and set the app password for Gmail authentication.

MAIL_PASSWORD={yourAppPassword}

IMPORTANT: Don’t forget to delete the app password once the test is done.

Stop all the containers with CTRL+C and restart again with docker compose up. Update a store again and you should receive an email with the store’s status this time.

In this tutorial, authentication (of producers and consumers), authorization (of read/write operations), and encryption (of data) were not covered, as security in Kafka is optional. See Kafka’s documentation on security to learn how to enable these features.

Learn more about Kafka and microservices

This tutorial showed how a Kafka-centric architecture allows decoupling microservices to simplify the design and development of distributed systems. To continue learning about these topics check out the following links:

JHipster: Using Kafka JHipster: OAuth2 and OpenID Connect Apache Kafka Introduction

You can find all the code for this tutorial on GitHub in the @oktadev/okta-kafka-microservices-example repository.

There are also a few tutorials on Kafka, microservices, and JHipster that you might enjoy on this blog:

Reactive Java Microservices with Spring Boot and JHipster Secure Kafka Streams with Quarkus and Java A Quick Guide to Spring Cloud Stream

Please follow us @oktadev on Twitter for more tutorials like this one. We also have a YouTube channel where we frequently publish videos.


Metadium

Metadium Tech: The Series

Do you understand blockchain and its concepts? From DID to DeFi and EVM, Metadium has your back.

Dear Metadium community,

We understand that with the constant development of blockchain technology sometimes it’s difficult to stay updated and understand all its concepts and implementations. Nevertheless, it’s important that more people get familiar with all these notions in order for the technology to continue expanding its reach. This is why the Metadium Team created a series for those who want to learn the necessary notions in an easy way:

Metadium Tech Series

In the upcoming weeks and months, our experienced developers will guide you in a series of posts that introduce and explain multiple topics ranging from technical content, like the basic structure of the blockchain and the principles of operation, to analyzing the limitations and problems the technology faces.

Is there a topic you would like to know more about? Leave us a comment on this post or on our social media and we will select the most popular requests.

— Metadium Team.

If you have questions about developing on Metadium, please contact our team of developers.

Metadium Tech: The Series was originally published in Metadium on Medium, where people are continuing the conversation by highlighting and responding to this story.


Imageware

Danger: Device Biometrics are NOT Safe!

The post Danger: Device Biometrics are NOT Safe! appeared first on Imageware.

Blockchain Commons

Silicon Salon 2 Posted!


Thanks to everyone who joined us for our second Silicon Salon, this one focused on Secure Boot, Supply-Chain Security, and Firmware Upgrades. Courtesy of some terrific presentations, we were able to get into good depth on these topics, both discussing the state of the art and what we could do better. You can find all the info on the Silicon Salon 2 pages of siliconsalon.info.

At the website, you can find the videos, slides, and transcripts of all the presentations, plus an overview of the discussion. We’re also continuing to use our Silicon Salon topic to further these discussions as part of our Community repo.

If you would be interested in sponsoring or planning Silicon Salon 3, either this fall or winter, please email us.

Wednesday, 14. September 2022

Continuum Loop Inc.

Trust Registries Tweetstorm

The post Trust Registries Tweetstorm appeared first on Continuum Loop Inc.

Darrell’s tweetstorm over the weekend has been getting a lot of attention (Threadreader unroll here). We want to start a conversation on Trust Registries and get people thinking about how Trust Registries will help answer the hard questions an ecosystem needs to create a whole experience. 

He points out the easy questions we already know to ask:

Where do the systems you want to work with go to get the answers to questions like: How do I know I can trust — this Issuer for that Credential? — that Verifier? — that Wallet App? 2/

— Darrell O'Donnell 🇨🇦🆔 (@darrello) September 11, 2022
However, there is much more to explore – many other questions are just as essential. We are starting the v2.0 of the Trust Registry Protocol Specification over at the ToIP Foundation to further investigate some of these questions.

A few things to mull over are below:

questions like... What credential formats does the ecosystem define? -- NO - there is no interoperable standard for real-world use of #VerifiableCredentials, so you can't say, "just use W3C Verifiable Credentials". /5

— Darrell O'Donnell 🇨🇦🆔 (@darrello) September 11, 2022

How do you do delegation in your ecosystem? Is it baked into your DID Method? How does someone confirm delegation is still valid (revocation may help - it may not)? /10

— Darrell O'Donnell 🇨🇦🆔 (@darrello) September 11, 2022

How do you support revocation in your ecosystem - if you need it? most don't - do your homework on that one - i.e. not more later. /9

— Darrell O'Donnell 🇨🇦🆔 (@darrello) September 11, 2022

The Trust Registry Task Force will discuss the questions above at the end of this month. We are starting the Trust Registry Protocol Specification next week, and you are welcome to join us at the Task Force meeting. We would love to have individuals with diverse backgrounds join us and share their knowledge and expertise. If this interests you or your organization, we hope you consider joining the working group!

Ecosystems are evolving quickly, and it’s time to start thinking about how we will manage trust and reputation, and how Trust Registries help to anchor the confidence we need in our systems.

Visit our blog, follow us on social media, and subscribe to our newsletter to stay up-to-date on the latest Trust Registry news and updates.



Indicio

Market Signals – Why is Training Important?

The post Market Signals – Why is Training Important? appeared first on Indicio.
The value of continued training is often overlooked in the hectic business world. We take a look at some examples of why it is important to make it a priority.

By Tim Spring

Department of State Cybersecurity Training Boosts Global Resilience Against DPRK Malware

The Department of State’s Bureau of Cyberspace and Digital Policy (CDP) is sponsoring a training series to counter malware created by the Democratic People’s Republic of Korea (DPRK). The nine-day training program will be offered to six partner nations from Africa, Asia, and the western hemisphere, predominantly targeting employees and their public sector equivalents on security incident response teams. Increased training for cybersecurity across the globe is a key part of the US commitment to promoting a stable and reliable internet for all. In recent years, malicious cyber activities by the DPRK have increasingly threatened the integrity and stability of global financial systems, even stealing directly from banks.

Patreon Lays Off Its Entire Security Team

This is probably the biggest news in the security space this week, and it has been covered by several news outlets. Here, we have a brief summary by Nathaniel Mott, a contributing writer at PCMag. While some believe this to be a particularly bold move, Patreon claims that it will in no way impact their ability to provide a secure platform and that they have partnered with a number of external organizations to continue to develop their security capabilities. Security is an extremely important aspect of any business or product, and can be very costly when it goes wrong. One great aspect of verifiable credentials that Indicio is always stressing is their ability to tie into Zero Trust and your security stack.

Security Awareness Training Must Evolve to Align With Growing E-Commerce Security Threats

Digital transformation is creating new security headaches for e-commerce, says Bruno Farinelli, a Fraud Analytics Manager for ClearSale. He points to shipping fraud, email phishing attacks, and ransomware and malware that exploit hybrid and remote workforce trends as the major trends. Citing TransUnion’s “2022 Global Digital Fraud Trends” Farinelli points out that shipping fraud has skyrocketed year-on-year since 2019, with package- rerouting scams exploiting good customer service practices. Similarly, business email compromise is a growing problem, with hackers pretending to be trusted suppliers or vendors. In this fascinating article, Farinelli argues that e-commerce businesses need to see employee security awareness as a process rather than a one-off training, as the points of attack and the means of defrauding business are constantly evolving. While this is ideal, it is unrealistic to expect employees to maintain sufficient security vigilance to defeat fraud. The reality is that all these problems can be solved with verifiable credentials that provide the kind of reliable authentication within a Zero-Trust framework.

‘Cyber insecurity’ in healthcare is leading to increased patient mortality rates

Esther Shein, a freelance writer for TechRepublic, summarizes a recent report that found ransomware attacks are actively delaying procedures and tests, leading to increased complications and worse patient outcomes. 

The article breaks down statistics from the report, some of the most staggering being: 

- The study surveyed 641 healthcare IT and security practitioners; more than 20% of organizations that experienced the most common types of attacks (cloud compromise, ransomware, supply chain and business email compromise) saw increased patient mortality rates.
- Healthcare organizations have an average of more than 26,000 network-connected devices, yet only 51% include them in their cybersecurity strategy.
- Ransomware attacks are most likely to have a negative impact on patient care, leading to delays in procedures or tests in 64% of the organizations.
- Only 59% address employees’ lack of awareness, with 63% conducting regular training and awareness programs and 59% resorting to monitoring employee actions.

Shein says that training and awareness programs are the top two defenses for healthcare providers, but lack of funding and resources is a consistent hurdle for security teams. The article leaves us with Ryan Witt, healthcare cybersecurity leader at Proofpoint, weighing in on the state of cybersecurity in healthcare – “Healthcare has traditionally fallen behind other sectors in addressing vulnerabilities to the growing number of cybersecurity attacks, and this inaction has a direct negative impact on patients’ safety and wellbeing… as long as cybersecurity remains a low priority, healthcare providers will continue to endanger their patients.”

The Business and Investor’s Guide to Self-Sovereign Identity

As you can see from the above articles good training is important for both the organization and the employees that help make it successful. For those of you interested in training your own teams in decentralized identity, I highly recommend taking a look at some of the workshops offered by Indicio. The Business and Investor’s Guide to Self-Sovereign Identity is a great place to start for leadership or investors looking to understand the technology and be able to talk through their ideas with an industry expert. 



Global ID

FUTURE PROOF EP. 22 — Introducing our new ID Wallet


Our biggest product release in some time, our new ID Wallet is a core pillar of our mission to enable anyone to create and own their digital identity. We spoke with GlobaliD’s Trey Steinhoff to discuss the launch.

Download the latest version of the GlobaliD app

Trey on Twitter

Past episodes:

- EPISODE 21 — Building awareness around digital identity
- EPISODE 20 — Telling our story with the new GlobaliD website
- EPISODE 19 — Making decentralized identity mainstream
- EPISODE 18 — Everyone will have an ID wallet
- EPISODE 17 — Digital wallets of tomorrow will be PRIVATE
- EPISODE 16 — How XUMM Wallet is changing the game
- EPISODE 15 — Olympic hopeful Lila Lapanja is a GlobaliD ambassador
- EPISODE 14 — What we learned at Solana Breakpoint
- EPISODE 13 — DeFi and Identity: Compliance in a decentralized world
- EPISODE 12 — The future of GlobaliD Groups
- EPISODE 11 — The XRP Card and the future of communities
- EPISODE 10 — How to decentralize identity and empower individuals
- EPISODE 09 — Understanding GlobaliD’s identity platform
- EPISODE 08 — Owning your identity and data with VP of Design Antoine Bonnin
- EPISODE 07 — Understanding the future of fintech with Ayo Omojola
- EPISODE 06 — Establishing trust and safety in tomorrow’s networks
- EPISODE 05 — How ZELF combines the power of payments and messaging
- EPISODE 04 — The future of blockchain with the creator of Solana
- EPISODE 03 — Should we trust Facebook?
- EPISODE 02 — JP Thieriot on why Uphold isn’t delisting XRP
- EPISODE 01 — The SEC’s crypto turf war and why XRP isn’t a security

Have a question for us? A topic you’d like covered? A guest you’d like to see? Let us know!

GlobaliD on Twitter

FUTURE PROOF EP. 22 — Introducing our new ID Wallet was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


auth0

Debunking Common Misconceptions About Passwordless Authentication

Increase user security, convenience, and privacy by enabling authentication using device biometrics

Infocert

InfoCert, AUTHADA and Dr. Ing. Wandrei develop a new tool for QES in the circular economy

Signing documents involves a lot of effort in the office: documents travel back and forth between departments, must be printed out and copied. If the parties involved are also in different locations and for example suppliers are involved, the process of signing becomes even more time-consuming and complicated. For a long time, this posed challenges for those involved in the circular economy.
Thanks to the cooperation of Dr. Ing. Wandrei, AUTHADA and InfoCert, this process has been optimised and an important milestone in the digitalisation of the circular economy has been achieved for all parties involved.

Since 2010, under the electronic waste records procedure (eANV), actors in the German circular economy such as carriers and disposers have been obliged to sign disposal records and consignment notes digitally by means of a remote signature, the Qualified Electronic Signature (QES). Until now, however, this was only possible in a roundabout way, using card readers: a procedure that no longer reflected the actual processes and necessities.

Dr. Ing. Wandrei, a Berlin software company with 30 years of experience building software solutions characterised by cloud capability, platform-independent standards and a range of interfaces, took on this problem and looked for partners with whom it could further develop and optimise a mobile solution for NSUITE for the electronic waste records procedure (eANV) and expand it to include a hardware-independent QES.

The sticking point during the search was that most signature service providers only offered the possibility to sign Word or PDF documents via remote signature. AUTHADA and InfoCert provide a solution in which documents can be signed in the prescribed XML format, making them ideal partners for Dr. Ing. Wandrei.

Uncomplicated and fast with new technology

Thanks to the technology from AUTHADA and InfoCert implemented in NSUITE, the signature process in the circular economy is now faster and more streamlined. Drivers no longer have to carry signature cards and expensive hardware, go to the office or call an employee from the office. Instead, the signature can now be done on mobile devices such as smartphones and tablets with the new NSUITE.mobile product, with a consequent streamlining of the entire process.

Read the full Press Release

The post InfoCert, AUTHADA and Dr. Ing. Wandrei develop a new tool for QES in the circular economy appeared first on InfoCert.


Ontology

Ontology Weekly Report (September 6–13, 2022)

Highlights

The 7th OWN Insights “The Financialization of DAOs and DID” has been published, written by Erick Pinos, Ontology Americas Ecosystem Lead, sharing the pathway to the broad adoption of DAOs and DID.

Latest Developments

Development Progress

- We are 100% done with the Rollup VM design. The White Paper will be published soon.
- We are 98% done with the Rollup RISCV EVM actuator.
- We are 85% done with the Rollup L1<->L2 cross-layer communication.
- We are 87% done with the Rollup L1<->L2 Token Bridge.
- We are 99% done with the L1 data synchronization server.
- We are 93% done with the L2 Rollup Node.
- We are 56% done with the L2 blockchain browser.

Product Development

- ONTO has published its August monthly report, summarizing a series of functional optimizations, various appealing activities and cooperations, as well as the progress of its global ecosystem growth.
- ONTO hosted a big giveaway with ElementWorld; 6,666 ELM were up for grabs! Follow the @ONTO Wallet Official Announcement channel on Telegram for more details.

On-Chain Activity

- 154 total dApps on MainNet as of September 13th, 2022.
- 7,159,077 total dApp-related transactions on MainNet, an increase of 5,436 from last week.
- 17,940,445 total transactions on MainNet, an increase of 19,969 from last week.

Community Growth

- We held our weekly Community Call, focusing on the topic “Web2 & Web3”. Starbucks launched its own Web3 platform recently, as many brands have done previously, so community members discussed the relationship between Web3 and Web2, and the ways that Web3 can attract more users.
- We held our weekly Telegram Community Discussion led by Ontology Loyal Members, discussing “Funding Education in Web3”. The realization of Web3 is not only a problem of technical implementation, but also one of user education; the progress of education is closely tied to the adoption of Web3.
- As always, we’re active on Twitter and Telegram, where you can keep up with our latest developments and community updates.

Global News

Ontology cooperated with Coinhub Wallet, a well-known multi-chain wallet, to hold an AMA. Global users asked questions about the latest features and security of Coinhub Wallet, and won generous rewards while learning about the application.

Ontology in the Media

Foresight -The Financialization of DAOs and DID

“There are a plethora of Web3 technologies and even more projects implementing them. Financialization of these technologies isn’t all that matters to drive mass adoption, but it is a powerful driving force. We must take great care in implementing it sustainably and with properly aligned incentives in place to ensure Web3 develops optimally.”

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (September 6–13, 2022) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.


Dock

Dock’s Web3 ID Now Available on Auth0 Marketplace

Dock has partnered with Auth0, one of the world’s leading identity management companies. Auth0 has added the support for Dock’s Web3 IDs in their marketplace integration to enable Auth0’s enterprise customers to integrate Web3 IDs on their platforms.

Web3 developers can now grant access and verify user eligibility based on private user data.


Zug, Switzerland – 14 September, 2022 – Dock, a pioneer in the Decentralized Identity space, today announced the availability of Web3 ID on Auth0 Marketplace, a catalog of trusted technology integrations to extend the functionality of Auth0’s customer identity management platform. Dock’s Web3 ID enables privacy-preserving user verification for Web3.

Web3 ID is a blockchain-based authentication and authorization system that allows developers to grant access and verify end-user eligibility by requesting private data from users' non-custodial identity wallet apps. Web3 developers can now implement privacy-preserving age verification for Web3 gaming, gambling and entertainment; verify eligibility to enter Metaverse and IRL (in real life) experiences based on private data; or verify token ownership without users revealing all their wallets’ contents. Dock’s Web3 ID complements Auth0’s extensible identity platform and the ease with which customers can integrate adjacent technologies to facilitate the successful execution of larger projects such as digital transformation, threat detection, compliance, and customer conversion.

Web3 introduced the ability to sign in using cryptocurrency wallets. But with crypto wallets, all the user data that a developer can request and verify has to be publicly available on the blockchain. Dock's Web3 ID gives developers the ability to verify private user data. With this integration, Dock brings decentralization, ownership of their identifiers, and the ability to privately sign in to apps and services to Auth0 customers.
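The request-and-verify shape described above can be illustrated with a short sketch. To be clear, this is not Dock's actual API: the function names, the single `age_over_18` attribute, and the HMAC stand-in for a credential signature are all hypothetical, chosen only to keep the example self-contained. In a real Web3 ID flow, the proof would be a verifiable credential checked against the issuer's public DID, not a shared secret. The point is that the wallet presents only a signed predicate, never the underlying birthdate:

```python
import hmac
import hashlib
import json

# Hypothetical issuer key; a real credential would be signed with the
# issuer's private key and verified via its DID document.
ISSUER_KEY = b"demo-issuer-key"

def issue_claim(attributes: dict) -> dict:
    """Issuer signs a minimal claim, e.g. {'age_over_18': True}."""
    payload = json.dumps(attributes, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"attributes": attributes, "signature": sig}

def verify_claim(claim: dict) -> bool:
    """Verifier checks the signature without ever seeing the birthdate."""
    payload = json.dumps(claim["attributes"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, claim["signature"])

# The wallet presents only the predicate, not the data it was derived from.
claim = issue_claim({"age_over_18": True})
assert verify_claim(claim)
```

Tampering with the presented attributes invalidates the signature, which is what lets a verifier trust the predicate without access to the source data.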

“Dock Web3 ID was built to provide all the convenience of OAuth style login while also ensuring that individuals don’t leak their data, and remain in control of their login credentials at all times. Working with the industry-leading customer identity platform, Auth0, is a key aspect of Dock’s growth strategy as their reach amongst development teams is extensive and we anticipate this collaboration paving the way for Dock Web3 ID to be used at scale.” said Nick Lambert, Dock’s CEO.

“It has been really exciting to work with Dock on a newly-built partner integration for Auth0 Marketplace. This best-in-class solution adds an integral layer to our platform that provides our customers with greater choice and flexibility,” said Cassio Sampaio, SVP of product at Auth0, a product unit within Okta. “After speaking with many customers, we have identified the types of integrations that matter to them, and we are so thrilled to have Dock’s Web3 ID as a vetted and valuable vendor in Auth0 Marketplace.”

Auth0 reviews the functionality of each partner integration and makes available integrations that are easy for customers to discover and adopt. Partners can participate in the growing demand for digital identity solutions and increase their visibility as part of Auth0 Marketplace; learn more here: https://auth0.com/partners.

About Dock

Dock brings decentralization, ownership of digital identifiers and privacy-preserving user verification to identity management. Dock's Decentralized Identity Platform provides a highly secure and scalable solution for businesses and developers to issue and verify digital identity credentials that are instantly verifiable using blockchain technology, enabling organizations and individuals to create and share verified data.

For more information, visit https://dock.io


Ocean Protocol

New grantee Directimo joins Ocean Shipyard

The project focuses on a versatile open source data aggregation system for the Ocean Protocol Community

Directimo, a company driven by the mission to create a database of verified properties for the proptech industry, is the latest grantee to benefit from Shipyard, the $2M grants program for entrepreneurs looking to build open-source Web3 solutions on Ocean.

Directimo will develop an open-source parsing system able to gather valuable data from any public website for anyone in the Ocean Protocol community. The project aims to deliver the foundation to scrape most of the websites around the globe by developing a tool to extract data from various sites for further processing. The solution will minimize the programming knowledge that developers need to have to extract data from multiple sites in parallel.

This kind of data will provide access to powerful market insights. For instance, the tool can be used to collect daily inputs from a real estate agency, or to extract all product data from competitors in one go, including full pricing and product descriptions, category, and brand information; the data can then be plotted to discover new trends and create reports. Directimo will support all users in bootstrapping their projects and finding solutions for specific use cases.

Sheridan Johns, Ecosystem & Partnerships, Ocean Protocol said:
“The data parsing tools Directimo is building are a great match for Ocean Shipyard. We are particularly excited because Directimo’s tools will be available to the entire Ocean community and will dramatically help onboard quality data onto Ocean.”
Alexandru Dan, CTO of Directimo further commented:
“Web scrapers automatically collect information and data that are usually only accessible by visiting a website in a browser. By doing this autonomously, web scraping scripts open up a world of possibilities in data mining, data analysis, statistical analysis and much more. Everyone with some basic computer knowledge will be able to clone the open source project and start a journey into web parsing and data aggregation in order to collect meaningful information from the web.”

The project will employ advanced technologies including Python, Redis and Kafka to create a distributed on-demand scraping cluster. The goal is to distribute URLs among many waiting spider instances, whose requests are coordinated via Redis.

The benefit of using a distributed system is that it enables communication between spiders and the composition of flows. If a user wants to obtain the titles from a paginated list, they can use two spiders: one to open each page and one, instructed by the first, to extract the titles. Using Kafka, Directimo will also allow the system to track the status of each request and retrigger requests if the configuration is faulty.

The goal is to create customizable spiders, allow users to write their own, and eventually pave the way for future orchestration. All required data will be exported to a database in order to create specific reports for the users. Directimo aims to leverage this to further facilitate a real-estate model that scales globally and a platform that integrates marketing tools for 4x superior conversions, lead nurturing and buyer advisory services.
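The URL-distribution pattern described above can be sketched in a few lines. This is an illustrative sketch, not Directimo's code: a `collections.deque` stands in for a Redis list (the `push`/`pop` methods mirror Redis `LPUSH`/`RPOP`), and all names are hypothetical.

```python
from collections import deque
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UrlQueue:
    """In-process stand-in for a Redis list used as a shared work queue."""
    _items: deque = field(default_factory=deque)

    def push(self, url: str) -> None:       # cf. Redis LPUSH
        self._items.appendleft(url)

    def pop(self) -> Optional[str]:         # cf. Redis RPOP
        return self._items.pop() if self._items else None

def spider(name: str, queue: UrlQueue, results: dict) -> None:
    """Drain URLs from the shared queue; each URL is claimed exactly once."""
    while (url := queue.pop()) is not None:
        results[url] = name  # placeholder for the actual fetch-and-parse step

queue = UrlQueue()
for u in ["https://example.com/1", "https://example.com/2"]:
    queue.push(u)

results: dict = {}
spider("spider-a", queue, results)  # further instances could run concurrently
```

In production, multiple spider processes would block on the Redis list (e.g. with `BRPOP`) so that each URL is claimed by exactly one instance, which is what makes the cluster scale horizontally.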

If you are a passionate individual or team looking to build something valuable on Ocean Protocol, this is your best chance to kickstart your dream project and be a pioneer of the new data economy.

Applications for the next wave of Shipyard contenders are opening on September 15. Start filling out your application today, visit https://oceanprotocol.com/shipyard

New grantee Directimo joins Ocean Shipyard was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Nov 15, 2022: Making Passwordless Authentication a Reality: The Hitchhiker’s Guide

In this webinar, Bojan Simic, founder and CEO at HYPR, and Martin Kuppinger, Principal Analyst at KuppingerCole Analysts, will share their insights and experience on what to consider when moving towards passwordless authentication, and making this a reality.

UbiSecure

Digital Currencies, with David Birch, Principal at 15 Mb – Podcast Episode 75

The post Digital Currencies, with David Birch, Principal at 15 Mb – Podcast Episode 75 appeared first on Ubisecure Customer Identity Management.
Let’s talk about digital identity with David Birch, Principal at 15 Mb and author, advisor and commentator on digital financial services.

In episode 75, David Birch discusses digital currencies: the differences between digital currency and cryptocurrency, the role that identity plays in digital currency, and the importance of identity verification within digital currencies.

[Transcript below]

“Digital currency needs some form of digital identity, that might actually drive digital identity forward and help digital identity to develop into the mass market.”

David G. W. Birch is an author, advisor and commentator on digital financial services. Principal at 15 Mb, his advisory company, he is Global Ambassador for the secure electronic transactions consultancy Consult Hyperion, Fintech Ambassador for Digital Jersey and Non-Executive Chair at Digiseq Ltd. He is an internationally recognised thought leader in digital identity and digital money. Ranked one of the top 100 fintech influencers for 2021, previously named one of the global top 15 favourite sources of business information by Wired magazine and one of the top ten most influential voices in banking by Financial Brand, he created one of the top 25 “must read” financial IT blogs and was found by PR Daily to be one of the top ten Twitter accounts followed by innovators (along with Bill Gates and Richard Branson).

His latest book “The Currency Cold War—Cash and Cryptography, Hash Rates and Hegemony” (published in May 2020) “paints a fascinating and stimulating picture of the future of the world of digital payments and its possible impact on the wider global and economic orders” – Philip Middleton, OMFIF Digital Monetary Institute. His previous book “Before Babylon, Beyond Bitcoin: From money we understand to money that understands us” was published in June 2017 with a foreword by Andrew Haldane, Chief Economist at the Bank of England. The LSE Review of Books said the book should be “widely read by graduate students of finance, financial law and related topics as well as policy makers involved in financial regulation”.  The London Review of Books called his earlier book “Identity is the New Money” fresh, original, wide-ranging and “the best book on general issues around new forms of money”.

More information is available at www.dgwbirch.com and you can follow him @dgwbirch on Twitter.

Connect with David on LinkedIn.

We’ll be continuing this conversation on Twitter using #LTADI – join us @ubisecure!


Podcast transcript

Let’s Talk About Digital Identity, the podcast connecting identity and business. I am your host, Oscar Santolalla.

Oscar Santolalla: Hello and welcome to a new episode of Let’s Talk About Digital Identity. Today, we’ll talk about digital money, especially a type of digital money that I see that not many people are discussing today. Except, of course, our special guest who is David G. W. Birch. He is an author, advisor and commentator on digital financial services, Principal at 15 Mb, his advisory company. He is Global Ambassador for the secure electronic transactions consultancy, Consult Hyperion. He is Fintech Ambassador for Digital Jersey and Non-Executive Chair at DIGISEQ Limited. He is an internationally recognised thought leader in digital identity and digital money. Also, author of several books including his latest book, The Currency Cold War: Cash and Cryptography, Hash Rates, and Hegemony. Hello, David.

David Birch: Hello, Oscar. Thank you so much for inviting me.

Oscar: It’s a real pleasure talking with you and super interesting topic we’re going to discuss today about digital money. So yeah, let’s start a conversation. Let’s talk about digital identity. I would like to hear first, a bit about yourself and your journey to the world of identity.

David: Oh, sure. OK. Well, my background originally was in secure communications, originally for military and government purposes. And then, of course, in the ’80s, the financial services sector was developing networking and suddenly those skills were needed in financial services. And then I began to specialise in the area of payments, which are very, very interesting to me.

But after a while, I began to realise that a lot of the problems that we were facing in the payment sector, things like fraud and so on, were really not payment problems, they were really identity problems. And so, I became very fascinated by the world of digital identity, and how to, if you like, re-imagine identity for an online and interconnected world. And so that’s how I came to originally edit a book. And then I came to write a book about it.

And as time goes on, I’ve become… well, I’ve become more convinced than ever, that digital identity is the fundamental enabler for all sorts of new business, new ways of working and new society, but also a fundamental problem, because we don’t seem to have been able to fix it. And we all still struggle with passwords and logging on and scamming and fraud and all sorts of things. So yeah, so I began to realise that digital identity was really the fundamental problem needed to be solved. And then I became very interested in how to solve it in different ways.

Oscar: Yeah, super interesting. And as you said, there are problems that have been dragging on for many years already and we’re still dealing with them. But you are working with the very latest. As I said earlier, not many people are discussing what we’re going to discuss today about digital money: there is a type called CBDCs, digital currencies. So, could you tell us, what is that?

David: People are very familiar with cryptocurrencies. And there’s lots of fun and interesting things happening in the world of cryptocurrencies. I mean, there’s also lots of crazy things happening and lots of criminal things happening. But nonetheless, the technology is very interesting. But cryptocurrencies are valued by the market. They have no inherent value. A Bitcoin is worth what people will pay for a Bitcoin. It’s not backed by anything.

A digital currency is something that’s backed by something. So, for example, you could have a digital currency like the circle, you know, USDC, where you have cryptographic tokens, but they’re redeemable for one US dollar. You could have tokens that are backed by commodities, like oil or gold or something like that. Or you could have tokens that are backed by goods and services, or future products and services, all sorts of things.

But in my very nerdish distinction, cryptocurrencies are backed by supply and demand only, whereas digital currencies are backed by something that has value, which might be another currency, a commodity, a company, or whatever. Now, within that world of digital currencies, lots of central banks around the world are starting to think, “Well, we want our citizens to have access to new innovation, new ways of doing things, better ways of doing business. So perhaps we should look at making digital versions of our currencies.” These are what people call digital fiat, because central bank currencies are what are called fiat currencies, or Central Bank Digital Currencies, CBDCs. I’ll use that latter term because, although it’s longer, it’s what people have come to use.

So, when we’re talking about Central Bank Digital Currencies, we’re talking about using some of the technologies of cryptocurrencies and crypto assets and decentralised finance and all that kind of thing. But we’re using them to transport these tokens which are backed by something else. And if you think about it, I mean the reason why lots of people want to do that is because it can be much more efficient to trade values across distributed ledgers in a decentralised manner, using decentralised finance protocols. You can easily see why this is.

Because after all, if, let’s say, you buy a share in Apple from me, for you to get that share in Apple involves all sorts of third parties in different layers: brokers and dealers and market makers, front office, back office and middle office, settlement, reconciliation, clearing. Lots of things have to happen. But if I send you a token that’s worth one share in Apple, it just goes from my wallet to your wallet. So, the serious financial services people are very interested in this, not because they have any ideological commitments, but because it’s a much more cost-effective, lower-cost way of doing business.

So Central Bank Digital Currencies exist in that world. And you know, the idea that you and I might trade through some protocols, in some sort of decentralised finance market, and I send you my token that represents an Apple share, and you send me some tokens that represent dollars. That’s not crazy. That’s where I think we’re going. And I think that’s what’s going to happen.
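The token-for-token exchange David describes can be sketched as a minimal atomic swap: both legs settle together or not at all, with no brokers, clearing, or settlement layers in between. This is only an illustration under assumed names (the `Ledger` class and the token labels are hypothetical, not any real DeFi protocol):

```python
# Minimal sketch of an atomic token swap: both transfers apply together or not at all.
class Ledger:
    def __init__(self):
        # balances[(holder, token)] -> amount
        self.balances = {}

    def mint(self, holder, token, amount):
        key = (holder, token)
        self.balances[key] = self.balances.get(key, 0) + amount

    def balance(self, holder, token):
        return self.balances.get((holder, token), 0)

    def atomic_swap(self, a, b, token_a, amt_a, token_b, amt_b):
        """Move amt_a of token_a from a to b and amt_b of token_b from b to a, atomically."""
        if self.balance(a, token_a) < amt_a or self.balance(b, token_b) < amt_b:
            return False  # neither leg settles
        self.balances[(a, token_a)] -= amt_a
        self.mint(b, token_a, amt_a)
        self.balances[(b, token_b)] -= amt_b
        self.mint(a, token_b, amt_b)
        return True

ledger = Ledger()
ledger.mint("david", "AAPL-token", 1)
ledger.mint("oscar", "USD-token", 150)
assert ledger.atomic_swap("david", "oscar", "AAPL-token", 1, "USD-token", 150)
```

In a real decentralised-finance setting the atomicity would come from a smart contract or a hashed time-lock rather than a single in-memory ledger, but the settlement logic is the same shape.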

Oscar: Yes, seeing it as a token, something simple that can be, as you said, passed directly from one person to another, simplifies a lot of how things are done today, as with your example of selling me an Apple share. But how does that token go from you to me? I read in one of your articles that it has to work offline, device to device, or is that known as…

David: Yes. So, this is– that’s a slightly different argument. So, if we say, well, OK, a Central Bank Digital Currency is a good thing. It would be nice if we had tokens that were pounds or dollars, pesos that we could exchange with each other. That’s good. But actually, if a central bank is going to provide those, and it’s going to provide them for all its citizens, not just some people that have nice computers and high-speed broadband, if it’s going to provide those for all of its citizens as a potential alternative to cash, then it has to satisfy some additional criteria. And in particular, it has to be able to work where there are no networks.

So, if you and I – I mean I always take a simple example, which is car parking. If I go to the underground car park, I want my car to be able to use digital currency to pay for its parking place. But in the underground car park, there’s no mobile signal. So, it’s annoying, when you try to use apps and things like that. A digital currency that’s going to be a cash alternative has to be able to function in a device-to-device mode, when there’s no mobile network, no internet, possibly even no electricity.

I should be able to transfer money from my phone to a merchant’s phone so that I can buy some milk, even when there are no networks, perhaps because there’s a natural disaster or a power failure or so on. Because otherwise, you’re not really providing a real alternative to cash. If it’s a real alternative to cash, it has to work everywhere, all the time. And this means it has to operate in a device-to-device mode, so that I can send money from my USB stick to your phone directly, by NFC or by tapping them together, or whatever.
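The offline requirement boils down to a transfer the payee’s device can verify with no network at all. The sketch below illustrates the idea with Python’s standard library; the keyed BLAKE2b MAC stands in for a real digital signature (a deployed system would use asymmetric signatures, e.g. Ed25519 in a secure element, so the merchant never holds the payer’s key), and all names are hypothetical:

```python
import hashlib
import json

# Sketch of an offline, device-to-device transfer: the payer's device signs a
# transfer message that the merchant's device can verify later without any
# network. A keyed BLAKE2b MAC stands in for a real asymmetric signature.

def sign_transfer(device_key: bytes, note_id: str, payee: str, amount: int) -> dict:
    msg = json.dumps({"note": note_id, "payee": payee, "amount": amount}, sort_keys=True)
    tag = hashlib.blake2b(msg.encode(), key=device_key).hexdigest()
    return {"msg": msg, "tag": tag}

def verify_transfer(device_key: bytes, transfer: dict) -> bool:
    expected = hashlib.blake2b(transfer["msg"].encode(), key=device_key).hexdigest()
    return expected == transfer["tag"]

key = b"secure-element-key"          # would live in the phone's secure hardware
t = sign_transfer(key, "note-42", "merchant-7", 250)
assert verify_transfer(key, t)       # checkable offline, reconciled online later
```

The point is that verification needs only the message and the key material on the devices themselves, which is what makes a true cash alternative possible in an underground car park or during a power failure.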

Oscar: Yeah, it has to work with or without internet, as you said, right? To be a real replacement for cash.

David: Yeah, yeah, that’s right.

Oscar: OK. And also to clarify, because many people have been not only talking about but also buying and selling cryptocurrencies, well, until very recently. It’s a term that has been used a lot in the last year, so people are more familiar with it. Maybe not everybody understands it, but we have an idea of what a cryptocurrency is. What are the main differences? You already mentioned a few, but let’s say the top differences between a cryptocurrency like Bitcoin and a CBDC.

David: Well, like I said, cryptocurrencies don’t have anything behind them. They’re only valuable according to what people will pay for them. Their value is set by supply and demand. But digital currencies have a backing, and in the case of Central Bank Digital Currencies, that’s central bank money. So, if the Bank of England issues a Central Bank Digital Currency, it will be backed by Sterling. You’ll be able to take your Central Bank Digital Currency tokens and get Sterling for them. There’ll be a reserve in Sterling. The real difference between cryptocurrencies and digital currencies is that digital currencies have a backing: some asset sitting behind them that you can have access to. And in the case of Central Bank Digital Currencies, that’s central bank money.
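The backing model David describes reduces to one invariant: tokens in circulation never exceed the reserve held behind them. A minimal sketch, with all names hypothetical:

```python
# Sketch of the backing model: an issuer mints digital-currency tokens only
# against reserve it actually holds, and redemption burns tokens one-for-one.
# The invariant "tokens outstanding <= reserve" is what makes the token backed.
class BackedIssuer:
    def __init__(self):
        self.reserve = 0       # e.g. Sterling held at the central bank
        self.outstanding = 0   # tokens in circulation

    def mint(self, deposit: int) -> int:
        self.reserve += deposit
        self.outstanding += deposit
        return deposit  # tokens issued

    def redeem(self, tokens: int) -> int:
        if tokens > self.outstanding:
            raise ValueError("cannot redeem more tokens than exist")
        self.outstanding -= tokens
        self.reserve -= tokens
        return tokens  # fiat paid out

    def fully_backed(self) -> bool:
        return self.outstanding <= self.reserve

issuer = BackedIssuer()
issuer.mint(1000)
issuer.redeem(400)
assert issuer.fully_backed() and issuer.outstanding == 600
```

A cryptocurrency has no such redeem path; that missing path is exactly the distinction being drawn here.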

Oscar: Yeah, exactly. That’s a core difference. OK, thank you for clarifying that again. And also, anybody could create a cryptocurrency, right? Basically, you can. That’s why there are so many, too many, and many come and go. But of course, a CBDC is very different in that way, so there will be rules around it. Is there already regulation on this, or what kind of regulation would come in?

David: So, in general, we need proper regulation of the whole sector, of course. But many of these finance people I’ve spoken to have said that when there is appropriate regulation in place, they intend to tokenize everything, equities, bonds, commodities, everything. Because trading tokens is a more efficient way of trading. So, lots of people want that regulatory structure to come into place.

In the case of Central Bank Digital Currencies, you can already see what that should be like, because in Europe we already have the example of the electronic money regulations. We can already issue electronic money against reserves of sufficient quality, what is called tier one capital or HQLA (High Quality Liquid Assets). So, we can sort of see what that regulation should look like. Basically, anyone will be able to issue a currency provided they have this backing. And in the particular case of Central Bank Digital Currency, bear in mind that no central bank really wants to issue this to customers directly. They want to do it through intermediaries, like banks.

Any digital pound token that you hold will be backed by a Bank of England pound, I imagine. I mean, who knows exactly what those regulations will look like. But I imagine those regulations will say that you’re not allowed to charge fees on transferring in and out; I think they will have to say that. I don’t know exactly what the regulations will be, but I can see roughly what they will be. And in that case, I assume, and I think a lot of central banks assume this, the reason that those digital currencies will become worthwhile and useful isn’t to replicate what we already have. I don’t need digital currency to go to the supermarket. I already have a debit card, it works fine. That’s not the issue.

But the potential for innovation with those currencies, I think, is very real. Some people talk loosely about imagining money with an API, and I think that’s an interesting way of thinking about it. If you have cash that has an API, then a whole lot of creativity could come into that space. I’m sure even now students in a lab somewhere could be building some terrific new products and services, and maybe I’m too old to imagine what those will be, but I think micropayments will be a good example. The idea of the regulation is not to create something which allows us to do what we do now, but to put in place a platform for innovation: new ways of doing things, new products and services in the future.
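One concrete reading of “money with an API”, and of the micropayments example, is a payment channel: deposit once, tick off tiny charges locally, settle the net amount at the end. A minimal sketch, with every name hypothetical:

```python
# Sketch of "money with an API": a micropayment channel that lets a client pay
# per request without a full settlement each time. Deposit once, accumulate
# tiny charges, settle the net amount when the channel closes.
class MicropaymentChannel:
    def __init__(self, deposit_cents: int):
        self.deposit = deposit_cents
        self.spent = 0

    def pay(self, cents: int) -> bool:
        if self.spent + cents > self.deposit:
            return False  # channel exhausted; caller must top up
        self.spent += cents
        return True

    def settle(self) -> tuple:
        """Close the channel: (amount owed to the provider, refund to the client)."""
        return self.spent, self.deposit - self.spent

channel = MicropaymentChannel(deposit_cents=500)
for _ in range(30):            # thirty 10-cent API calls
    assert channel.pay(10)
owed, refund = channel.settle()
assert (owed, refund) == (300, 200)
```

With a programmable digital currency, the `pay` call could be a real value transfer rather than a local counter, which is the kind of product the passage imagines students building.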

Oscar: Yeah, exactly. One you already mentioned: you are in the underground car park and you cannot use the internet to pay electronically. There are many other cases like that where there’s no solution today. I guess the most interesting thing to hear now is, what is the role of identity in this type of Central Bank Digital Currency?

David: Well, look, I mean, it’s very hard to imagine. It’s very hard for me to imagine that central banks will allow unlimited amounts of anonymous digital currency into circulation. And that would be catastrophic, because it would enable the criminals and terrorists and oligarchs to be free of any kind of control. And I don’t think any of us would want to live in that kind of society.

So therefore, there must be some kind of identity in the digital currency space. And in fact, I’d go further than that: we want some kind of digital identity in that space. It’s a good thing. And actually, because digital currency needs some form of digital identity, that need might actually drive digital identity forward and help it develop into the mass market.

Now, exactly what form that digital identity will take, that’s a very interesting subject for discussion. Should it be pseudonymous in some way? Who should know who these identities are? Who should be able to follow the transactions and manage them in this kind of thing? Those are complicated discussions, much too complicated just for a podcast. But you see what I’m driving at. There has to be some kind of digital identity.

And therefore, it’s good to start the discussion now and bring the stakeholders together, and by stakeholders I mean not just banks and central banks, but law enforcement and citizens’ groups and lawyers and regulators. There are lots of people involved, because there are many issues around this to do with financial inclusion and this kind of thing. There are lots of issues that need to be resolved to get a Central Bank Digital Currency together, and identity is a really important part of that. I think it’s useful to begin that discussion now so that people can have an educated and informed debate on how exactly that digital currency should work.

Oscar: Yeah, absolutely. I agree, it’s a topic that people are not just talking about; I think it’s going to happen. Every time you write, you say this is going to happen. From what I can see, the options range from completely anonymous, like cash, which is not a desirable option, through pseudonymous, to fully verified identity. We still don’t know who would do the verifying. But I agree with you, a lot of that discussion has to happen now.

David: Yeah, I’m not saying, I’m not saying I know exactly what all the answer to that should be Oscar, like, where exactly we should set the dial? I do feel that we need to have an informed debate about where to go. I think the people who are the extremists, the people who say that digital cash should be completely anonymous are clearly wrong.

But simultaneously, the people who say that every transaction should be traced and monitored, they’re probably wrong as well. We need to figure out exactly where the balance lies. And I don’t think that decision should be left to technologists, like me, I think that decision should involve civil society. And it will take some time, people like the Bank of England are saying that a Central Bank Digital Currency could be sort of five years away. I agree with them. But you know, it could easily take five years to work out all of these issues as to how exactly it should all work.

Oscar: Yeah, it could be five years, as you said. And do you know whether in some countries there is more progress or more work on this, or how is it going around the world?

David: Well, look, I mean, I think, you know, in different parts of the world, it’s progressing in different ways, because there are different cultural attitudes to how this should all work. So, what people in America might want and what people in China might want, will be very different, I think with this sort of thing. So, around the world, it’s progressing in different ways.

In the sort of developed countries, I think that those debates actually have some time to go, I really do. I don’t think it’s going to happen tomorrow. It’s just too complicated and too important not to get those decisions right. The decisions that we make about how exactly Central Bank Digital Currency is going to work are important decisions that have ramifications for a long time. So, I am a big fan of Central Bank Digital Currency. I want to see it, but I don’t need to see it now. I need to see it in the future, working in such a way as to benefit everybody.

Oscar: Yeah, exactly. And I guess the countries and central banks that get to the right solution first will also gain an advantage, that’s what I can foresee. It’s very difficult to see the future, but what is your best guess at the first implementations? How would you see it working in practice?

David: We’re already seeing pilot implementations in China and in other places around the world. In the US and in the UK, my feeling is that we will choose privacy-enhancing technologies to be part of the infrastructure. If you talk to the technologists, they already have techniques: cryptographic blinding, homomorphic encryption, zero-knowledge proofs, this kind of thing. They already know how to do things in a more private way. What we need to decide is how exactly those will all come together to form the infrastructure.
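As one small illustration of the privacy-enhancing building blocks mentioned here, a commitment scheme lets a payer commit to an amount without revealing it, then open the commitment later to an auditor. The salted hash below is the simplest stand-in; real designs would use algebraic commitments (e.g. Pedersen) that also support zero-knowledge range proofs, and all names here are illustrative:

```python
import hashlib
import secrets

# Sketch of a hash commitment: publish a digest of (salt, amount) now,
# reveal the amount and salt later for verification.

def commit(amount: int) -> tuple:
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + str(amount).encode()).hexdigest()
    return digest, salt  # publish digest; keep (amount, salt) private

def open_commitment(digest: str, amount: int, salt: bytes) -> bool:
    return hashlib.sha256(salt + str(amount).encode()).hexdigest() == digest

c, salt = commit(250)
assert open_commitment(c, 250, salt)       # an honest opening verifies
assert not open_commitment(c, 9999, salt)  # a different amount does not
```

The “dial” the conversation keeps returning to is a policy choice about who holds the opening information: the payer alone, a regulated intermediary, or an authority under warrant.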

But I’m actually quite optimistic about it, Oscar, because I think those technologies are already there. They already work. It could be that the legislators and the regulators may not understand how powerful those tools are or what they can do. But I do feel that we have the tools that we need. So once these stakeholders can come together and say, how anonymous or how non-anonymous it needs to be, then I think we can implement the relevant digital identity infrastructure. I think we already have all the technology; we need to do that. So yes, it will take a while, but I’m optimistic.

Oscar: OK. So, there are already some pilots in some countries. Interesting. Definitely something for everybody to have a look at, and to be prepared for. Because no matter what our job function is, as you say, it shouldn’t only be technologists taking part in deciding how this system is going to be designed. It’s very difficult to see the future, but we’ll see quite a lot of progress, as you say, over the next five years.

Final question I would like to ask you, David, for all business leaders that are listening to us right now, what is the one actionable idea that they should write on their agendas today?

David: I think for most businesses that I’m involved with, and I do a lot of work in finance and payments, what they need to do is to have a strategy towards digital currency. They don’t need to implement digital currency right now, that’s downstream. But they have to have a strategy, because we need them to provide input into that stakeholder consultation process. So, it makes sense for retailers, for people in the value chain (acquirers, issuers, processors), for governments and law enforcement, to have their strategy towards digital currency, because we need that strategy as input.

So just because it’s not going to happen for four or five years, well, four or five years is not a long time if you’re a bank. So, what I would say to them is, you don’t have to do anything about digital currency tomorrow, but you do have to start building a strategy towards digital currency. I hope that’s helpful.

Oscar: Yes, I agree with having a strategy, preparing for the next 5 to 10 years when this will surely happen. OK, well, thank you very much, David. This conversation about Central Bank Digital Currencies has been super interesting. We’re definitely going to hear more and more about them in the coming months and years. For someone who would like to get in touch with you or follow you, what are the best ways?

David: Just follow me on Twitter, @dgwbirch, or on LinkedIn, DGWBirch, or you can always just go to my website, which is dgwbirch.com.

Oscar: OK, excellent. Again, Dave, it was a pleasure talking with you and all the best.

David: Thank you for an interesting discussion, Oscar. It’s a pleasure talking to you.

Thanks for listening to this episode of Let’s Talk About Digital Identity, produced by Ubisecure. Stay up to date with episodes at ubisecure.com/podcast or join us on Twitter @ubisecure and use the #LTADI. Until next time.

The post Digital Currencies, with David Birch, Principal at 15 Mb – Podcast Episode 75 appeared first on Ubisecure Customer Identity Management.


Aergo

Layer 1.5 AERGO — going forward


AERGO was an official sponsor of Korea Blockchain Week 2022. AERGO took this opportunity to introduce Dapps such as BOOOST, Plusfi and CCCV. CCCV also introduced the CSB (CCCV Soul Bound Badge).

BOOOST is the first Live-to-Earn lifestyle app that pushes the boundaries of ‘to-earn’ models while creating a Web 3.0 ecosystem that seamlessly integrates with your day-to-day activities. BOOOST recently released its litepaper, which you can find on its website.

AERGO has three key features: an SQL base, D-DPoS (deterministic DPoS) consensus and the Merkle Bridge. It is well known as the leading hybrid blockchain for enterprises. Living up to that name, AERGO has built partnerships and use cases with numerous companies.

Since December 2014 (even before Ethereum appeared), AERGO’s main aim has been to become the de facto blockchain for enterprises. Recognising companies’ need to process large volumes of data in real time with SQL/JDBC, AERGO targeted the private blockchain market specifically, providing a conducive environment in which traditional businesses can create their own mainnet immediately. AERGO’s sidechains allow enterprises to connect with AERGO’s public blockchain based on SQL’s high compatibility. Lastly, the Merkle Bridge, which facilitates interoperability and asset transfer between each blockchain and its sidechains, allows AERGO-enabled enterprises to create their own blockchain and manage it flexibly. In summary, AERGO’s public mainnet acts as a kind of central dogma for the sidechains.
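The mechanism behind a Merkle bridge can be sketched generically: a sidechain publishes only a Merkle root on the main chain, and any individual record can later be proven to belong to that root with a short inclusion proof. This is an illustration of Merkle proofs in general, not AERGO’s actual bridge implementation:

```python
import hashlib

# Generic Merkle tree: the main chain stores only the root; a short proof
# shows that a given leaf (e.g. a sidechain transaction) belongs to it.

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:             # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    proof, level, i = [], [h(leaf) for leaf in leaves], index
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = i + 1 if i % 2 == 0 else i - 1
        proof.append((level[sibling], i % 2 == 0))
        level = [h(level[j] + level[j + 1]) for j in range(0, len(level), 2)]
        i //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, leaf_is_left in proof:
        node = h(node + sibling) if leaf_is_left else h(sibling + node)
    return node == root

txs = [b"tx-a", b"tx-b", b"tx-c", b"tx-d"]
root = merkle_root(txs)                      # all the main chain has to store
assert verify(b"tx-c", merkle_proof(txs, 2), root)
```

This is why sidechains scale: the public chain anchors a constant-size root while the data itself lives on each sidechain.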

During KBW 2022, AERGO was introduced as a ‘Layer 1.5’. The main reason AERGO is unique enough to be called a ‘Layer 1.5’ is its versatility: AERGO has the characteristics of both a Layer 1 and a Layer 2.

It is worth mentioning that AERGO has a public chain with its own consensus structure, like other layer 1s. AERGO also has layer 2 advantages, such as high ETH compatibility, low fees and high TPS, as well as the layer 2 characteristic that not all data is stored on a single chain but across individual sidechains.

In summary, AERGO allows users to:
1) configure on-demand sidechains;
2) interconnect each sidechain through AERGO public; and
3) provide interoperability with large blockchains such as Ethereum.

AERGO is an accessible smart contract platform for enterprises. Furthermore, AERGO enables enterprises to anchor data easily and transfer assets to a public chain by connecting their private chains in a single sidechain form. AERGO SQL, a smart contract engine, supports connecting existing DBs around the private chain.
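Data anchoring, as described above, can be sketched generically: a private chain (or DB) periodically writes a hash of its state to a public chain, so anyone can later check the private records against the anchored digest without the public chain storing the data itself. The `PublicChain` class below is a hypothetical stand-in, not AERGO’s actual API:

```python
import hashlib
import json

# Sketch of data anchoring: the public chain stores only digests; tampering
# with the private data is detectable against the anchored digest.

class PublicChain:
    def __init__(self):
        self.anchors = []   # append-only list of (height, digest)

    def anchor(self, height: int, digest: str):
        self.anchors.append((height, digest))

def state_digest(rows) -> str:
    # Deterministic hash over the private DB rows being anchored.
    return hashlib.sha256(json.dumps(rows, sort_keys=True).encode()).hexdigest()

public = PublicChain()
private_rows = [{"id": 1, "owner": "acme", "qty": 10}]
public.anchor(height=100, digest=state_digest(private_rows))

# Later: verify the private data still matches what was anchored.
assert public.anchors[-1][1] == state_digest(private_rows)
private_rows[0]["qty"] = 99                                 # tampering...
assert public.anchors[-1][1] != state_digest(private_rows)  # ...is detectable
```

The private data never leaves the enterprise; only its fingerprint is made public, which is the appeal of the hybrid model.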

Going forward, AERGO aims to expand its service scope for not only enterprises but also individual users, allowing for the issuing and utilisation of various tokens / STO / NFT / SBT, etc between all Layers.

These plans will be embodied through the upcoming AERGO Dapps.

AERGO participated as an official partner in Korea Blockchain Week (KBW), held from 7 to 14 August. At the event, AERGO presented the blockchain ecosystem it has built with its existing partners and unveiled BOOOST, the first Live-to-Earn app built on the AERGO chain; the DeFi service Plusfi; and the CSB (CCCV Soul Bound Badge) on the NFT marketplace CCCV.

BOOOST is a live-to-earn lifestyle app that breaks through the limits of existing ‘to-earn’ models, including P2E, and builds a Web 3 ecosystem organically connected to our daily activities. The recently released BOOOST litepaper is available on its website.

What drew particular attention at AERGO’s KBW booth was the new slogan ‘Layer 1.5’. The existing slogan, ‘the hybrid blockchain for enterprises’, was grounded in AERGO’s many corporate partnerships and commercial use cases, so what does Layer 1.5 mean? Let’s look at what AERGO means by Layer 1.5 and what it is doing to get there.

AERGO combines the strengths of a layer 1 with its own consensus structure with the strengths of a layer 2: high ETH compatibility, high TPS and low fees. Of course, several layer 1 chains have also grown rapidly in recent years by supporting EVM compatibility to absorb liquidity from the Ethereum ecosystem.

AERGO was conceived in December 2014, before Ethereum appeared, with the initial goal of being a blockchain for enterprises. Across industries and eras, there has been a constant need to process large volumes of data quickly and in real time through SQL/JDBC support. AERGO has targeted the private blockchain market by providing an environment in which existing enterprise customers can connect to a blockchain mainnet immediately.

Building on its core features of an SQL language base, a D-DPoS consensus structure and the Merkle Bridge, AERGO has created sidechains through which resources and logic can be linked and used across environments ranging from enterprise to public blockchains.

In particular, the Merkle Bridge is the technology that facilitates interoperability and asset transfer between each blockchain and its sidechains. Through the Merkle Bridge, AERGO supports the easy, sidechain-based creation of enterprise blockchains and their flexible operation. The AERGO public mainnet acts as a kind of central dogma for the sidechains. In the AERGO ecosystem, data is not stored on a single chain but stored and operated on each sidechain. It has the typical characteristics of a layer 2 while also having the layer 1 qualities of AERGO public: quite literally, a ‘Layer 1.5’.

To summarise once more, AERGO implements the following three core functions:

1) enterprise customers and users configure separate sidechains suited to their purposes;

2) each sidechain is interconnected through the main network; and

3) finally, strong interoperability is formed with various blockchains, including Ethereum.

Fundamentally, AERGO is a smart contract platform that is very easy for enterprises to approach. Notably, AERGO’s enterprise product supports connecting existing DBs through the Aergo SQL smart contract engine around a private chain. Building on this advantage, a private chain can be linked as a single sidechain, so that data can easily be anchored to the public chain or assets transferred to it.

Going forward, AERGO is planning various blockchain services centred not only on enterprises but also on individual users at every level. To that end, it aims to provide the technology and services that enable asset transfers between blockchains and the active issuance and use of utility tokens, security tokens, NFTs, SBTs and more.

These plans will take concrete shape through the upcoming AERGO Dapps.

Layer 1.5 AERGO — going forward was originally published in Aergo blog on Medium, where people are continuing the conversation by highlighting and responding to this story.

Tuesday, 13. September 2022

KuppingerCole

Managing Cyber Risk in a Hybrid Multi-Cloud IT Environment


Today’s IT environments blend applications and services from multiple public cloud networks, private clouds and on-prem networks, making it difficult to view and inventory assets deployed across complex hybrid networks, and keep track of the security risks. Organizations need to find a way to improve visibility, identify and prioritize risks, and maintain cyber resiliency. Join security experts from KuppingerCole Analysts and cybersecurity posture automation firm Balbix as they discuss the need for a common approach to measuring and managing cyber risks across modern IT environments, which is a major challenge to successful digital transformation.

Mike Small, Senior Analyst at KuppingerCole, will describe the challenges of managing risk in hybrid environments, the current hybrid cloud approaches and platforms, and the approaches needed to reduce complexity and govern the cyber risks to business-critical services on-prem and in the cloud. Chris Griffith, Chief Product Officer at Balbix, will explain the business and security benefits of a unified risk model across cloud and on-prem environments. He will describe how to create a unified view of cyber risk in monetary terms and automate the identification, prioritization, dispatch and mitigation of vulnerabilities, with reference to real-world implementations.




PAM versus CIEM: Clash of Identity Management Cultures or Saviour?


Paul Fisher, Lead Analyst at KuppingerCole will discuss how the seemingly different capabilities of PAM platforms and CIEM platforms are in fact beginning to converge as multi-cloud architectures come to dominate. The advanced tools appearing in both platforms will benefit CISOs and IT managers struggling to manage privileged access to key assets in the cloud. Paul will explain how to secure ROI from PAM and CIEM, and how in continually expanding IT universes, architecture matters less than ensuring widespread and disparate identities have access to global assets.




Panel | Best Practices for Effective Privileged Access Management




The Business Case for PAM in Financial Services

The Business Case for PAM is more than just keeping data safe, or making regulators happy. We’ll talk about 4 of the most common business cases for improving a security posture covering detection, remediation, audit, and forensics.  Looking forward to questions on which may be most relevant to your enterprise.





PAM: The Access Foundation for the Age of the Limitless Stack




Lessons learned from a PAM rollout


In the last couple of years, ransomware attacks and other cybersecurity threats have not only brought cybersecurity topics to the boardrooms but also widened the attack surface beyond the multinationals and the financial institutions.

The cybersecurity approaches and strategies that work well for a multinational with a large and well-funded cybersecurity department may not be as applicable for a mid-sized company where the security department may be a single person.

Still, if the partner company that delivers the cheese to a retailer falls to a cybersecurity attack, there is simply no cheese to sell to the customers, so the retailer not only loses money but also fails at its most basic task. So how do we as multinationals help our partners implement basic controls such as PAM in a way that works in their business reality?

In this session we will look at what you, as a relatively cybersecurity-mature company, can do to help your less mature partners. It is also suitable for people who have been asked to launch a cybersecurity or PAM program without being given the full resources to execute one.




Expert Chat: Interview with Denny Prvu

KC Analyst Paul Fisher interviews Denny Prvu, Global Director of IAM at Royal Bank of Canada.





IBM Blockchain

Automating EDI to the max: no partner left behind


At an IBM event a few years ago, I watched a customer present on the great benefit he was experiencing from his Document Conversion Services (fax and email to EDI) solution. I was coming from a global, standards-driven, high-automation point of view and was surprised that he was so effusive about a solution that sounded to […]

The post Automating EDI to the max: no partner left behind appeared first on IBM Supply Chain and Blockchain Blog.


1Kosmos BlockID

1Kosmos Mentioned in 5 Recent Gartner® Hype Cycle™ Reports


We are excited to share that 1Kosmos has been mentioned in 5 recent Gartner® Hype Cycle reports: 

Hype Cycle for Privacy 2022  Hype Cycle for Digital Government Services, 2022  Hype Cycle for Emerging Technologies, 2022  Hype Cycle for Digital Identity, 2022  Hype Cycle for Blockchain and Web3, 2022 

If you are not familiar with the Gartner Hype Cycles, they “provide a graphic representation of the maturity and adoption of technologies and applications, and how they are potentially relevant to solving real business problems and exploiting new opportunities. Gartner Hype Cycle methodology gives you a view of how a technology or application will evolve over time, providing a sound source of insight to manage its deployment within the context of your specific business goals” (Gartner Hype Cycle). In our opinion, being recognized in the analyst space is a validation and something every organization covets. We believe the recognition justifies the time, effort, and vision of an organization, or at least that’s how we feel here at 1Kosmos.

We believe recognition in these Hype Cycle reports validates our push to provide a verified portable identity and a passwordless experience to all. This also means that the market is catching up to where we already are.

According to the Gartner Hype Cycle for Digital Identity, 2022, “Existing approaches for digital identities cannot scale to the accelerating needs of a digital society.” Fragmentation is a problem due to service providers (banks, retailers and governments) forcing consumers to create individual identities for every service. Decentralized identity (DCI) offers an approach with increased security, privacy and usability compared to traditional digital identity approaches. As standards continue to be refined, and legislative efforts around the world are multiplying, use cases for DCI are emerging in the market. 

Users gain greater control of their identities and data, and service providers gain higher trust, speed and confidence. Currently, providers collect huge amounts of identity information about users, in order to increase assurance to an acceptable level. DCI can help identity and service providers increase trust, security, privacy, and access convenience for end users without the need for centralized data; thereby reducing risks of data breaches, account takeovers and privacy compliance violations.[1]

The 1Kosmos BlockID platform uses a private blockchain that enables organizations to create irrefutable decentralized digital identities and unifies identity proofing with passwordless authentication. 

The Gartner Hype Cycle for Digital Identity, 2022 states that the growing adoption of DCI is a key driver for implementing VCs to streamline business processes.[1] 

1Kosmos BlockID is built with specific capabilities for the onboarding, verification and authentication of employees, contractors, customers and citizens. 1Kosmos digitally transforms the standard onboarding process, by automating and delivering the highest degree of end-user assurance. This transformation eliminates the need for new users to share copies of government IDs, protecting their privacy. 

By binding users to their proofed identity, 1Kosmos BlockID creates an identity-based biometric authentication and a passwordless experience. Users will utilize their trusted mobile device for daily authentication and step-up authentication for physical or logical access. As a result, each access event is associated with a real, verified identity. 

[1] Gartner, Hype Cycle for Digital Identity, 2022, authored by analyst Felix Gaehtgens, published 25 July 2022.

GARTNER and HYPE CYCLE are registered trademarks and service marks of Gartner, Inc. and/or its affiliates in the U.S. and internationally and are used herein with permission. All rights reserved. Gartner does not endorse any vendor, product or service depicted in its research publications and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s Research & Advisory organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

 

The post 1Kosmos Mentioned in 5 Recent Gartner® Hype Cycle™ Reports appeared first on 1Kosmos.


Indicio

Hyperledger Global Forum Provides Launch Point for Development and Deployment News from Across Expanding Ecosystem

The post Hyperledger Global Forum Provides Launch Point for Development and Deployment News from Across Expanding Ecosystem appeared first on Indicio.
Indicio hosts Meetups once a month where we bring together leaders in the decentralized identity community to discuss current topics and trends. Join our community here to stay up to date on our next one: https://www.meetup.com/indicio-identity-community/ We recently had the chance to sit down with Gabriel Rene, executive director of VERSES and author of “The Spatial Web: How Web 3.0 Will Connect Humans, Machines, and AI to Transform the World.” A video of this interview is currently live on our YouTube channel; you can watch it here: https://www.youtube.com/watch?v=S2DntbiRmrE&t=342s Below is a transcript of the event, slightly edited for continuity.

Maya

Hi everyone, and welcome to this Indicio identity community meetup for January 2022. We are very excited to have Gabriel Rene with us today, and our topic is how web 3.0 will connect humans, machines, and AI to transform the world. Before we get started we would like to have a few words from Heather Dahl, CEO of Indicio.

 

Heather

Hi Maya, thank you, and welcome to everyone. I’m excited to kick off the first of our monthly meetups for 2022. We are moving into just our second year of providing this type of content to the community, and I think the topics that we tackle and the experts that we bring on make for really intriguing conversations. I want to give a special welcome to anyone who’s here and new to our community; it’s so cool to see in the chat where everyone’s coming from. Hamburg, I heard a shout out from Brussels, Dublin, go Fairfax, Virginia, we have Seattle, Canada. By all means, if you’re new to the meetup or you’re returning, add your location to the conversation, because I think it’s really interesting that we bring together this global community. For those of you who aren’t familiar with Indicio, we are a public benefit company, and our mission is to advance decentralized identity. A part of our mission is running educational events like this, and because we’re out there building ecosystems for trusted data, we find today’s topic really exciting and absolutely provocative. We’re excited to see Gabe here to talk about the internet of everything, and what it means for people, digital objects, and non-digital objects to have identity and to communicate with each other in ways that are secure and trustworthy. Ken was supposed to be here today to lead the conversation, but he couldn’t make it, so Senior Architect Sam Curren is here to lead the conversation and interview with our absolutely terrific guest, Gabriel Rene. Thank you all for being here, and I’m gonna turn it over to you, Sam.

 

Sam

Thanks Heather, and thanks Gabriel for being here. I guess we should start with an introduction; that’s appropriate. Gabriel is the architect of the spatial web, a vision he’s described with his co-author Dan Mapes in the international best-selling book titled The Spatial Web: How Web 3.0 Will Connect Humans, Machines, and AI to Transform the World. He’s the executive director at VERSES and the Spatial Web Foundation, and he and his team are bringing this vision to life. The groundbreaking hyperspatial markup language is currently being discussed at a new working group at the IEEE. Both web3 and the metaverse have been hot topics in the last couple of months, but the spatial web is where the real becomes virtual and the virtual real. So welcome, Gabriel Rene.

 

Gabriel

Thank you so much for having me, Sam. Thank you, Heather, and Ken, when you watch this, hopefully we’re doing good work on your behalf and doing you proud. So Sam, thank you, thanks for hosting me today. It’s an honor to be here and have a chance to speak to everyone.

 

Sam

Excellent. I’ve been stunt doubling for Ken in a few areas, and I’m not as good looking as Ken is, but I’ll do my best. All right, let’s dive into some questions here, and please answer with whatever you think is relevant, and we can take this discussion wherever we need to go. So you’ve described the spatial web as the internet of everything, and you’ve thought a lot about what the internet of everything will look like. What do you see?

 

Gabriel

Well, the first thing I see is a more holistic approach to the concept of the application of network computing technologies, like the internet. A lot of the time, perspectives on emerging technology are industry specific or technology specific. We talk about AI, or we talk about XR, we talk about IoT, or we talk about blockchain, or we talk about quantum, or it’s by industry: this applies to government, this applies to supply chain, this applies to retail, this applies to healthcare. So these narratives are constantly emerging, where we even have these themes: human 2.0, dialogues around web 3.0, industry 4.0, and the Japanese are talking about society 5.0.

 

In all of those cases, almost the exact same set of technologies is being discussed, and it’s the usual suspects I just rattled off. Some people started talking about the internet of things a handful of years ago, this idea that it wasn’t just the internet where we had sort of interconnected communications, but rather that these physical connected smart devices would form this new network. That was very exciting, and it remains exciting, though we’ve yet to see a whole lot of it work. I can tell you that the smart home struggle is real, and interoperability is at the heart of every problem. So this idea that you can connect things is one level, and then there’s another debate about things being smart. At one level you’ve got the internet of things, and then there’s been spatial computing, which really emerged around this idea and is now being repackaged again into the metaverse narrative, which you might call the internet of non-things, or sort of virtual things, things that are not things or “what are they” things. And then there’s been this idea from the blockchain folks of the internet of money, or the internet of trust, which starts to intersect with the work that your community is doing. And there’s the internet of intelligence thing, which is the AI folks attempting the semantic web again. Whether we use the internet, or the web, or some other term, we’re all talking about variations of this kind of a network where the data is intelligent or smart in and of itself.

 

Then there’s the term smart, which has taken over everything. There’s the smartphone, which was the first thing; it’s just a computer, but that’s fine, it’s a smartphone. Why? Because it has some connected capability, and in the beginning smart meant connected. Then machine learning and deep learning found their footing over the last decade, and the idea of things becoming smart meant actually being intelligent via feedback loops. An adaptive smart home now means that your Nest adjusts the temperature for you, and that the lights turn on and off based on some feedback loop or information. Then of course we’re talking about smart cities, and so we go from smartphones, smart things, smart cars, to smart cities, and the idea is that these cities become intelligent and adaptive. Your autonomous car is able to just drive, because traffic is this sort of amorphous thing, like a bunch of starlings moving through. Drones and automated functionality, the Amazon Go-ification of everything: you walk into a store, you take what you want, you walk out. All these sorts of smart functions might be built into government services, and all these sorts of future sci-fi smart city narratives. But when I spoke to a community of smart cities four years ago in Europe, cities that came together to explore this, their number one question was: how do we connect with other smart cities? How do we form a network of smart cities? Well, smart cities with all these smart cars and smartphones and smart devices and smart things, with these internet and web connections across these various sets of interests and industries, form what you might call a smart world.
And that idea that you’ve got this global cyber-physical network now that encompasses all of these things, and that the addressability of an object isn’t just that it’s connected. It might just be that it’s being viewed by a camera; it could be an object that has no internet connection in and of itself, but it still essentially shows up on the network. That network we call the spatial web.

 

Sam

Really good explanation. So, a natural follow-on to that: I realize that web 3.0 has become a loaded term. You used it first in the title of your book, obviously, so it’s recorded there, but it’s come to mean some different things in recent times. Using sort of your original meaning of web 3.0, why doesn’t web 2.0 already solve this? What is missing from what we’ve got today that we need to go where we need to go?

 

Gabriel 

Well, there’s a handful of things, and I think that some of the architects of web 1.0 have lamented some of the infrastructure decisions that were made at the time. There were technical limitations, or practical limitations; hardly anyone actually cared, but there were debates around identity and privacy and security. There was a tendency, being academics and scientists, not to think too deeply into all of the societal implications of these technologies. So the anticipation that people would use this for good, and not for personal, selfish, or otherwise harmful ends, was the default. “Oh, it’s gonna be this awesome network and we’re gonna do information sharing.” Great, problem solved. Decades of work had gone into that, from the ARPANET all the way up to the world wide web, where you have Tim Berners-Lee pulling a bunch of threads together and stitching that into what became the backbone of this amazing communications network.

 

Identity was actually one of the real missing ingredients, in my opinion, and that anonymity part, the thing that makes the web open, is also the thing that makes the web dangerous. The hackers, trackers, and fakers are permeating our lives, from simple issues around misrepresentation, all the way to identity theft, all the way to now ransomware, and we’re about one hop away from biotech, and people being able to hack devices that we’re wearing, or that are on our bodies, or that are in our bodies. So the openness of the web is a double-edged sword that seems to cut both ways. That allowed the web 2.0 era organizations and companies to maximize the capital potential of that structure, and so they essentially monetized that openness. That anonymity was a function of the fact that you didn’t have an identity, so they said: we’ll rent you one. We’re Google, here’s ours; we’re Facebook, here’s ours; Apple, here’s one for you as well; how about one for LinkedIn too? All right, so we all run around with these sort of two-dimensional avatars, locked in these digital silos where we serve certain functions. We get certain services from that, so I don’t think that a purely polarized view, looking at web 2.0 as inherently negative, is fair or accurate, but these are major holes in the architecture that allowed for this type of capitalism to thrive in a new dimension, and that thriving has compromised some of the inherent freedoms, and some might say human rights. Now it’s becoming part of the debate: what property do we own here with respect to our data? And I think that those holes have produced really horrific outcomes, not just on the monetization side, but in how we are programmed into wanting or thinking certain things.
Whether those things are products and services, our own ideas about ourselves and our identity and our self-image, or our viewpoint on the effectiveness of vaccines and science, and what political representative we think is in our best interests. You can no longer separate the technical work that we’ve done here with the web from its societal implications, and that’s the real zeitgeist challenge that I think we face at this moment. It requires some new ways of thinking about it.

 

Sam

So you’ve talked about some architectural adjustments that will help get us there. One of those pieces of architecture, I think, is the hyperspatial markup language, HSML, that is part of your work at the IEEE. Can you describe what HSML will help us with?

 

Gabriel

Yeah, so just one quick parenthetical, and it’s a function of a little bit of us copying something. It is HSML, but it’s hyperspatial modeling language instead of markup language, although there was quite a bit of debate internally about whether that was an accurate description. The functions of a markup language and a modeling language are different, but depending on your perspective, in this case it is about modeling at the heart of it. How do we model the world digitally in a way that is coherent with the way that humans cognize the world? And I provide that caveat because there are other ways of cognizing the world that we’re unable to be aware of, and that is not our concern with respect to what we need to teach machines or AIs. We need to ensure that they can cognize the world in a way that is similar, or compatible, or at least incorporates our understanding of reality.

 

What HSML does is function in a sort of semiotic structure of core elements, of which there are 12, that act like a language. Just like nouns, verbs, prepositions, and adjectives are sufficient components with which to construct descriptions and representations of reality, and predominantly that is all we’re using today. Today I’m just using magic mouth sounds to paint pictures in everyone’s heads, and if I did it with squiggly lines it would do nearly as well. But if I could do what we’ve been able to see some whales and some dolphins do, which is to project an entire three-dimensional holographic representation of a particular bay and the position of all the fish, kind of like a sonar snapshot, and just send that to you 20 miles away (“hey, the party’s over here, here’s where the fish are, guys”), that might be another way to communicate. So HSML enables multiple IoT devices to do what’s called sensor fusion, or data fusion, even policy fusion. You can take information from sensors, you can take information from databases, and you can take policy-related information, laws, or instructions, or whatever, and you can fuse those together into a multi-layered scene graph, which is a way of representing the information about any set of activities in any given space: all of the objects in the space, all the users in the space, all the activities in the space, all the permissions in the space. At the end of the day it is intended to enable you to construct descriptions of states of things, and then describe workflows for what needs to happen. So you could conceivably program all the smart things in your house to work in a very particular way, one could inform another, and this could then extend all the way out into the city, and so forth, to the world at large.
So the idea of making a standard here is that in many of the same ways that HTML allows us to structure content and information on pages, HSML lets us structure context and information in spaces.
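As a rough sketch of that idea, here is one way a multi-layered scene graph could be modeled. To be clear, the element names, DID strings, and structure below are illustrative assumptions of ours, not actual HSML, whose core elements are still being defined at the IEEE working group:

```python
# Illustrative sketch only: the layers (objects, users, activities, permissions)
# mirror the description above; the field names and DIDs are made up.

scene = {
    "space": "did:example:living-room",  # every entity can carry an identifier
    "objects": [
        {"id": "did:example:thermostat-1", "type": "Thermostat", "state": {"temp_c": 21}},
        {"id": "did:example:door-1", "type": "Door", "state": {"locked": True}},
    ],
    "users": [
        {"id": "did:example:heather", "role": "owner"},
    ],
    "activities": [
        {"action": "enter", "actor": "did:example:heather", "target": "did:example:door-1"},
    ],
    "permissions": [
        {"actor": "did:example:heather", "action": "enter", "target": "did:example:door-1"},
    ],
}

def is_permitted(scene, activity):
    """Check an activity against the scene's permission layer."""
    return any(
        p["actor"] == activity["actor"]
        and p["action"] == activity["action"]
        and p["target"] == activity["target"]
        for p in scene["permissions"]
    )

for act in scene["activities"]:
    print(act["action"], is_permitted(scene, act))  # enter True
```

The point is only the shape: state, actors, activities, and governance live in one coherent structure, so rules can be evaluated against everything in the space at once.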

 

Sam

So I like that; the modeling of context is a sign of something that has sort of begun to happen, but without much organization. When the web was new it was a new location: we would visit websites and we would go there, and it was all about this new world and this new place, and honestly web 2.0 was a lot of that too. IoT starts to get a little interesting because now there’s this internet of things, but the things generally exist in real life and also on the internet, so it’s blending now between the two. Of course now we’ve got augmented reality, and virtual reality, and a lot of things that pull that together. So when we think about all this stuff in real life that’s sort of blending together with a digital life, that means a proliferation of cameras and sensors and objects that we can interact with, etc. So this means identity for everything, which is great. But there are some implications here for the tinfoil hat wearers among us, and I occasionally wear one of a particular shape. What are the implications of all of this interconnectedness upon the privacy we have, and maybe the governance that can help us with that?

 

Gabriel

Yes, so that was the question we started to ask ourselves back in 2016: all of that connectedness, all of that capability, all of that monitoring was going to happen anyway. That was the clear thing that was happening, so then the question became, okay, can you ensure certain levels of interoperability? And you need that in order to enable governance; if things are not interoperable you can make as many rules as you like, and it doesn’t matter. So there’s a bit of a logical layer cake here: coherence is the first requirement, and if you have coherence then interoperability is a function of your description of coherence, of what is or isn’t coherent. If you can describe the physics of the environment, and you want to say how you navigate through it, you can’t do the second until you’ve done the first. And then governance is a function of saying what you can or cannot do, the ways that you must then navigate that environment, or who can, and at what time, and all the parameters of activity management or activity governance. Because governance at the end of the day is really about changing states in the world. It’s not a theoretical function, it’s a practical function, and so that was what we saw as being really the scariest thing I’d ever noticed.

 

All this cyber stuff is going to get interconnected, and you’re going to end up with this internet of everything. There was no term “spatial web” in that sense; the GIS community had been using it for many years, but not in the sense of this multi-dimensional, multimodal sort of thing. It was like, okay, this internet of everything is going to happen, and there are a handful of real challenges with that, including the possibility for abuse, and looking at how wonderfully we did with web 2.0, this was just going to cascade into essentially Black Mirror. That’s what we’re all concerned about. So it was at that time that it became clear to me that identity was a fundamental infrastructural, architectural requirement. Having been in telecom before that, and looking at telemedicine services, that was when I started looking at blockchain and the idea that you might need to own these assets related to your medical history.

 

I wasn’t interested in crypto in the beginning. Bitcoin seemed interesting, but not enough for me to buy any or care; blockchain looked really interesting, and this idea that you would own your sort of property. At that point I started looking for what kind of new identity structure, or what would become like a spatial web URL, some sort of key identifier, and that’s when I came across the work that many of you on this call were pioneering. You were working on it before that, of course, but right around that time it was a real eye opener, and it frankly made it possible for us to focus on the other bits, because this community is completely dedicated to identity. In our version of this, the human is one of those entities, or one of those actors, but every single object, including conceptual objects, can have a DID in the spatial web, and all of these are structured using HSML, and some of these other ontology and taxonomy structures you’re looking at, to maintain that coherence. That then grants you this ability to do governance, which gives you the second portion of the standard, which is called HSTP, the hyperspace transaction protocol. What that does is basically say, “okay, world, give me all the information about the states of all these objects, users, actors,” and again, they could be holographic objects in that space that you otherwise wouldn’t be aware of; the IoT device isn’t going to pick that up, but the space knows that it’s there. You sent me something in this room so I could present it on this call, as an example. I wanted to get the exact x-y-z position, and I knew it was coming.

 

Today one challenge is that you can send anything you want to my email address, and I can’t stop you; you can come to my website, and I can’t stop you; in fact you can send money to my ETH wallet, and I cannot prevent that. But in the spatial web everything is sort of permission-based, and so the ability to transport objects, users, and activities, and the governance around those, is described in HSML and essentially activated using HSTP, which asks: what are those states, I would like this state to change this way, or if the state changes, what is or isn’t allowed in this case in this space? And HSTP says that transaction has passed or failed. So if only Heather is allowed to enter this room, if the temperature is x, and the time of day is y, and her facial expression or other biometric information says she’s in a certain state, all of which are just inputs that would be available with any modern technology, the system can now coherently say, okay, Heather is in the right state, the door can open; and if she’s not, then the door remains closed. So you are able to program cyber-physical activities, and define the governance for those if you are an authority of that space, and that is the key requirement.
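The door example above (an authorized actor, a temperature range, a time window, a biometric state, with the transaction passing or failing) can be sketched roughly as follows. This is an illustrative stand-in for the idea of an HSTP-style evaluation, with made-up policy fields, not the actual protocol:

```python
# Hedged sketch: a requested state change is evaluated against space-level
# policy and either passes or fails; all names here are our own illustration.
from datetime import time

def evaluate_transaction(request, policy):
    """Return True (pass) or False (fail) for a requested state change."""
    checks = [
        request["actor"] in policy["allowed_actors"],
        policy["min_temp_c"] <= request["temp_c"] <= policy["max_temp_c"],
        policy["open_from"] <= request["time"] <= policy["open_until"],
        request["biometric_state"] == "verified",
    ]
    return all(checks)

door_policy = {
    "allowed_actors": {"did:example:heather"},
    "min_temp_c": 15, "max_temp_c": 30,
    "open_from": time(8, 0), "open_until": time(18, 0),
}

request = {
    "actor": "did:example:heather",
    "temp_c": 21,
    "time": time(9, 30),
    "biometric_state": "verified",
}

print("door opens:", evaluate_transaction(request, door_policy))  # door opens: True
```

Any failed check (wrong actor, out-of-range temperature, after hours, unverified biometric) fails the whole transaction, which is the pass/fail behavior described above.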

 

Sam

I think it’s important that this doesn’t invent some brand new type of authority, but rather gives existing authorities the ability to express rules about what they have authority over. So our house still remains our house, and I posit that we’ll have even more control over what’s actually going on in it, from a digital perspective, than we do today.

 

Gabriel

Yes, and so you could ensure that nothing is installed in your house if it doesn’t meet a certain threshold of some privacy standard, or if it requires some certificate and doesn’t have it, it can’t be there, or it can’t turn on. The city could say drones are only allowed to fly in the city in these specific flight lanes, and if they have cameras on them those cameras are allowed to be on, but the drone cannot fly within five miles of the airport. It can’t have its cameras on when it’s flying over schools, or certain private properties, or certain zones, and all this can be spatially defined as if it were a video game. But now you can do it in the physical world, in any digitally mediated set of functions.

 

Sam

And that’s an incredibly powerful piece of this. So, to switch gears just a little bit: we’ve talked about the blending of the digital and the real-life thing, but with the better technology that we’ve got for things like virtual reality, you can now sort of have a spatially aware virtual thing, instead of a website that we’re looking at as a document on a screen. So how do these same concepts apply in a pure virtual space, where I might be in a fictitious world rather than in real life, and what’s the blend between the real and the virtual?

 

Gabriel

I love that question. So if you’ll allow me to nerd out for a moment: if you’ve seen the original Avengers series of movies that Marvel put out, there’s a moment where you get the arc of Bruce Banner and the Hulk, like when he can or cannot become the Hulk, and everyone’s afraid he’s going to do it. In the final scene of the first Avengers movie they’re like, “Bruce, get angry,” and he goes, “that’s my secret, I’m always angry.” So the parallel here is that it’s actually always virtual; in one instance it must map to the physical world, and in all other instances it doesn’t need to. So a Harry Potter world, or the Death Star rendered in a 3D engine somewhere, can have all the exact same authorities, and policies, and functions that apply. Vader is allowed to apparently strangle people at the helm of the thing, and if Dumbledore does something wrong the Ministry fires him for a couple of weeks and he’s got to go into hiding. So there actually are policies and things that exist in these virtual spaces. As we look at things like Decentraland, and ownership, and NFTs, and assets and rights around this, we’re going to need certain guarantees around what we can trust in these environments: how these authority structures are set up, what is transparent, what we can investigate, what we can audit. Building property in a virtual world that you don’t ultimately have ownership of, or where you can’t take those assets somewhere else, is a problem. For example, you don’t have any rights in Roblox or Fortnite; they can do whatever they want. It’s a corporation, they own it, they can just change it, change all your assets, change your skins; they can flip the whole thing, they can sell it, they can take the company public. There are all kinds of other functions and interests at play.

 

So if we expect, and of course it seems natural, that we’re going to want digital assets, they are becoming increasingly valuable, and they’re also becoming increasingly realistic, and to that degree at some point even experiences themselves will be indistinguishable from reality. Our senses are not that sophisticated, and you have people like Elon and others working on Neuralink, and others working on neural dust, that at some point are not going to require any sort of ocular interface. There’s not just a singularity of AI; there’s a singularity of reality that is a whole consideration that’s coming, so having certain guarantees around these environments will become increasingly important. Hence this whole move with cryptocurrency, which is native internet money, and now NFTs, which are really smart assets. Well, not yet; they’re smart receipts for a virtual asset. I think DIDs really get you all the way across the line there, and HSML, by the way, would allow you to describe the asset in such a way that it would match up. So with the DIDs in HSML I would know all the attributes, down to color, texture, size, or whatever, of that version of that thing, and not just have a receipt that says I own something, which could be copied a thousand times. A bit of a wandering answer, but hopefully that hit a couple of notes.
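One minimal way to picture the difference between a bare receipt and a DID-anchored description is to bind the asset’s attributes to its identifier, so a claim can be checked against the anchored description rather than just the receipt. Everything here, the DID string, the attribute names, the registry, is hypothetical and only illustrates the idea:

```python
# Hedged sketch: a copied receipt still fails the check below unless the
# claimed attributes match the description anchored to the asset's DID.
import hashlib
import json

def fingerprint(attrs):
    """Stable hash of an asset's attributes (canonical, sorted-key JSON)."""
    return hashlib.sha256(json.dumps(attrs, sort_keys=True).encode()).hexdigest()

# Description anchored to the asset's DID (e.g. in an HSML-style registry).
registry = {
    "did:example:sword-42": fingerprint(
        {"color": "silver", "texture": "brushed", "size_cm": 120}
    )
}

def matches_description(did, claimed_attrs):
    """Check a claimed asset against the description anchored to its DID."""
    return registry.get(did) == fingerprint(claimed_attrs)

print(matches_description(
    "did:example:sword-42",
    {"color": "silver", "texture": "brushed", "size_cm": 120},
))  # True
```

The design point is that ownership claims become verifiable against the described thing itself, not against a token that could be duplicated.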

 

Sam

No, that’s fantastic. So I want to draw something out a little bit. You sort of talked about how the same applies when I brought up this virtual-versus-real thing; you said the real is the only instance where it actually needs to line up with reality, and all these other things can exist virtually as well. That sounds like the metaverse that has been talked about by a lot of folks, and Facebook audaciously renamed themselves to sort of try and own the space. What’s the difference here between the hyperspatial web that you see and the vision of the metaverse that Facebook or other companies might be interested in offering?

 

Gabriel

Okay, great question, so I’ll try and provide a nuanced answer. I actually think everyone is making their best attempt to describe the same thing, which is an open and interoperable network. I think the battle right now is not for Facebook to become AOL; everyone’s seen this game before, and you don’t ultimately win with a closed version of this thing. In fact Facebook thrives on an open web, so it’s not a requirement that they close their borders and defend them. I think the battle right now is over who’s the WordPress of the metaverse: where are you going to go to build your sites, your stuff, who’s going to provide all the services, all the plugins, all that. So it’s not about Facebook changing their name to Meta and then saying we’re all about the metaverse, even though they’ve been arguably the least trustworthy brand in the modern era since, I don’t know, Exxon, and so we assume they want to keep doing the same thing and making money the same way, and that’s what Meta is for. I don’t actually think that. I think this is a massive pivot for the company, and that otherwise they’re doomed. I talk about this in the book. I don’t think that surveillance capitalism will be a profitable business in 10 years. I think it completely comes apart; look at GDPR right now saying Google Analytics is not going to work here anymore. That’s web 2.0 getting smashed, these pillars are already coming down, and I think Mark Zuckerberg has seen this coming since the day he walked in to Palmer Luckey’s place and put on something that was a space and realized that he had a company about a book. He must have said, “oh, it’s no longer about pages, it’s about spaces,” and we’re just seeing the arc of that.

 

Mark wants an economy. He wants to build the largest economy with the largest number of services to capture the most, and people will willingly hand over their data in order to optimize and maximize all kinds of inputs and data services around that. He doesn’t have to stay in that same game, and if he doesn’t get out of that game while building hardware that captures biometric information, the regulators are just going to keep squeezing, and the laws and the walls are going to close in. So I don’t see a way out of this for surveillance-capitalism-centered businesses, and I think they’re smart enough to know that. This is why Twitter already announced last week, “hey everyone, we have NFTs now”; you can take them and move things around. You can build assets as well. Say the asset I would like to move around is everything that you have about me: let me NFT my data, and then I want to move it around. That’s what I would like. So I think everyone’s still pointing in the same direction, and if you’re not, and you think that you’re going to be the Prodigy or AOL of this era, it didn’t work last time, but sure, really give it a shot this time. I think everyone’s smart enough not to repeat that play. But just like the oil companies and the automobile companies dragged their feet as long as possible, squeezing every dollar out of something that was causing arguable harm to the world, they still need to maximize within the lens they operate on, which is that, fundamentally, they’re corporations with capital-based goals. Now everyone wants an electric car, but it took someone like Elon and Tesla to come in and change the whole game. So I think that’s what’s happening here: the battle is for the WordPress of the spatial web, or the WordPress of the metaverse. No one’s going to win with the siloed version of the metaverse where you don’t own any of your stuff. That’s going to be a wasteland.

 

Sam

That’s a fantastic answer. So we’ve talked about this vision, and I think we all have a picture in our minds of what it looks like; we even see films depict it. Ready Player One, for example, does a really great job of laying out a universe of what it could possibly be like in the future. What’s less clear is how we get from today to there. What does this rollout actually look like? Is there a cut-over? Is there a gradual way we get there? What’s it like?

 

Gabriel

Well, I can only guess like the rest, but the hardware is the problem. That’s why you’re seeing billions and billions of dollars being poured into that space, and more or less everyone is going there, the way everyone once wanted a smartphone. Does everyone remember the Facebook smartphone that came out? That lasted like an hour. So Mark was very smart in jumping in on the VR thing when everyone thought it was just whatever, and Google said they’d play around in the AR space, but their commitment has been soft: two steps forward, one step back, and a lot of people in that community have not been happy about it. Then Magic Leap emerged and said, we’re going to be the new Apple of this space, we only need a few billion dollars, and then, “oh, we spent it all,” because it’s hardware, and hardware is absolutely the worst possible thing to work on; it’s super hard. You’re talking about innovation in battery life, innovation in optics, innovation in computing, and there are just so many pieces.

 

It’s like that smartphone moment again, which is why Apple just takes their time and does the exact same thing they always do: waiting in the background until a lot of the froth is squeezed out and some of the core problems are solved, then finding their own path in. They’re like, we don’t care about the metaverse, we’re not talking about the metaverse, and they’re going to come out with something different. But you’re going to see four or five of these all start to hit the market in a meaningful way, variations on AR and VR, or hybrid mixed-reality devices or glasses. Then you’ve got Qualcomm and others building chips now with so much capability in them, AI-powered chips with powerful spatial computing capabilities. So you’re going to get 25 Chinese companies making essentially the Android versions of these things. I think by next year we’re going to see that “iPhone 1” moment, and that’s going to kick off the party. By the end of the decade, for some that can afford it, because the prices will go down, you will have those early “Ready Player One” moments. Halfway into the next decade it becomes the dominant thing. That’s the arc.

 

Of course, we always tend to look at this through the consumer-market lens, but where a lot of this is really needed is in industrial spaces: warehouses and ports, mining and agriculture. And that’s where the holographic part of this is only one layer. You could think of the metaverse as the internet of experience, a human-facing interface layer of the spatial web, while the internet of things is a machine interface. You having an experience where you go into a store, grab some stuff, a bunch of cameras identify you, the objects, your account, and your wallet, and you walk out and it just charges you: that’s not a metaverse experience. That’s you embedded in a mesh network of the internet of things, but from our lens that’s still the spatial web, and if something then popped up in front of you with a holographic receipt, maybe that’s part of the metaverse. But if, as you’re walking into the environment, you see information displayed on top of all of the items saying “hey, you’re gluten-free this week, how about 10% off that,” that’s a metaversal layer on top of an IoT-mediated experience. So you have to consider that component, that digitization of the physical and physicalization of the digital. These things are intersecting, and one abstract point here is that that’s what language is.

 

The experience that a deer has of the physics of a space is almost identical to mine. Put the deer in my living room: it can navigate that room better than I can, it perceives the objects, but it doesn’t know what they are. I’ve embedded the meaning: oh, that’s my couch, that’s my TV, this is a picture of my family. It can see all those things, but it doesn’t know what they are. So I’ve embedded this virtual layer of meaning into the world; all these objects have terms, functions, and ideas, and I know what they’re supposed to do. A monkey can still tell that it can sit on a couch. It understands the basic physics, it understands gravity, it can even understand the function of a couch in that manner, but it doesn’t know whether the couch might be meaningful socially to me, whether it’s from Ikea or from some fancy Italian store. That’s a social, virtual layer that the human monkey has embedded into the world around us. We’ve configured the world physically, we shaped all these objects, but we’ve also added this whole virtual layer. Now we’re just doing it in a completely multi-dimensional way: digital twinification is putting those data sets onto the objects and into the environment, and then we’re going into those imaginary spaces that we used to put into our brains, into books, into pictures, into paintings, into films, and making them fully experiential. I think that’s the big shift: moving from the information-based web to this experience-based web, whether that’s in the physical world or the digital world. Those blurred lines really create some new kind of digital environment and digital era. I don’t have a good word for it; that’s not a good word, forget it.

 

Sam

That’s a good start though. So you’ve implied that the rollout of the spatial web is not tied to the success of augmented reality or virtual reality hardware, meaning it can start sooner. So what does the rollout of spatial web technologies look like, as opposed to things that are tied to a VR or AR experience?

 

Gabriel

Yeah, what a good question. One of the things we just spoke about is really the IoT-first perspective. In a warehouse you’ve got workers who essentially spend eight hours a day walking around 100,000 square feet trying to locate an item in a box. So you build a three-dimensional map in your head, because some things are high, some things are low, some things are on the second floor, and then you get an order on your Android phone that says go to row 17, aisle 12, section L and grab item XJ4-92. Okay, where am I? You do all this three-dimensional triangulation, and then you route yourself. What we found is that if you build a digital twin of that entire environment and map the inventory into every single location, so you’ve embedded the information into the space, you can take that same Android phone and route that person to that location, and the cognitive load drops in half, and they basically speed up by 30 percent. There’s the productivity and performance, and also the mental cost, the tax. We think, “oh, people are working hard for eight hours and now you’re making them more productive”; there are some societal concerns about optimizing human activity in these environments, and there have been some real concerns about how Amazon is doing it. At the same time, if you can reduce the cognitive load, that’s actually the most expensive part for them, and so that’s one of the functions. So you’re building a three-dimensional twin, you’re embedding the information into the environment, and you’re routing the human worker through it. Ultimately that human worker is hands and feet, because you’re no longer relying on the brain to do that much, so the ability to then have humans and robots working in that same space, or humans in the day and robots at night, or any combination thereof, is a function of creating that shared model.
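The digital-twin routing idea above can be sketched in a few lines. Everything here (item codes, coordinates, the nearest-first heuristic) is invented for illustration, not taken from any real warehouse system:

```python
# Minimal sketch: a digital twin maps every inventory item to a location in
# the warehouse, so the system, not the worker, does the spatial lookup and
# routing. All item ids and coordinates are hypothetical.
from math import hypot

# item id -> (row, aisle, shelf height in meters)
digital_twin = {
    "XJ4-92": (17, 12, 2.0),
    "KL7-03": (3, 5, 0.5),
}

def walking_distance(worker_pos, item_id):
    """Straight-line floor distance from the worker to an item's bin."""
    row, aisle, _height = digital_twin[item_id]
    return hypot(row - worker_pos[0], aisle - worker_pos[1])

def next_pick(worker_pos, order):
    """Route the worker to the nearest item on the order first."""
    return min(order, key=lambda item: walking_distance(worker_pos, item))

print(next_pick((0, 0), ["XJ4-92", "KL7-03"]))  # KL7-03 is the closer bin
```

A real system would route along aisles rather than straight lines, but the point stands: once the information is embedded in the space, the routing decision moves off the worker’s head and onto the phone.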

 

If there’s a shared model, whether it’s in the city and how drones decide which flight lanes they have, or government policy. Say in Europe the EU wants to say that the total flight height for all drones of a certain class is 120 meters, but the Netherlands might want to say it’s 100, and Brussels maybe says it’s 80. As a drone passes through, it can never go higher than 120, but as it goes through the Netherlands and over Brussels it has to come down, and then down again. This is the machine’s view of the spatial web, and all of those capabilities are necessary right now. Look at the supply chain: we had 500,000 cargo containers sitting off the coast here over Christmas, and that’s because the ability of these systems to communicate with each other is really poor, and our understanding of supply and demand globally is really bad. Something like the pandemic hits, the butterfly effect happens all over the place, and we’re not resilient enough to adapt at this scale. So that’s really what the ultimate value of something like the spatial web means: adaptation. The internet of experience is over here on the metaverse side; the internet of things, this Industry 4.0 narrative, is about automation.
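The drone-ceiling example can be expressed as a tiny rule lookup. The altitudes are the ones from the example above; everything else is a hypothetical sketch of how a machine might consume such a shared policy model:

```python
# Hypothetical sketch: a drone's permitted ceiling is the minimum of its
# class-wide limit and the rule of whichever jurisdiction it is over.
# Jurisdiction names and ceilings come from the example in the text.

CLASS_CEILING_M = 120  # EU-wide limit for this drone class

JURISDICTION_CEILINGS_M = {
    "netherlands": 100,
    "brussels": 80,
}

def permitted_ceiling(jurisdiction: str) -> int:
    """Altitude ceiling (meters) for this drone class over a jurisdiction."""
    local = JURISDICTION_CEILINGS_M.get(jurisdiction, CLASS_CEILING_M)
    return min(CLASS_CEILING_M, local)

# As the drone crosses borders, its allowed ceiling steps down and back up.
route = ["germany", "netherlands", "brussels", "netherlands"]
print([permitted_ceiling(j) for j in route])  # [120, 100, 80, 100]
```

The interesting part isn’t the arithmetic; it’s that the policy lives in a shared, machine-readable model rather than in each vendor’s firmware.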

 

That’s really the goal. Can we use machines and algorithms to automate physical functions in the system? If we did, the impact on climate and the amount of carbon we use would be hyper-optimized; we could set thresholds and regulations, and the machines could adapt and adjust. Instead, something like the pandemic hits, we all go to Amazon and buy everything we ever wanted to buy because we can’t go anywhere, and then we logjam every port in the world. This allows us to start to get a more holistic view, and the function is not so much monitoring and surveillance, because those capabilities will be there, but if you use things like verifiable credentials and DIDs, and these get built into law and regulation, then we can share this kind of information with each other. So you can set up a relay race of an adaptive supply chain: don’t send the automated truck until the thing arrives at the port, minimizing the total carbon output. This adaptive behavior is very much what organic systems do; it’s how the body manages. You’ve got the autonomic nervous system; we need autonomous supply chains. These are the kinds of functions you can have in the spatial web which are absolutely critical, and they’re not about having an awesome virtual experience or trying to sell a JPEG for a million bucks.
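The “relay race” idea, dispatching each leg only when its predecessor finishes instead of scheduling everything up front, can be sketched as a toy dependency walk. All leg names here are invented for illustration:

```python
# Toy sketch of an adaptive supply chain: each leg waits on the leg before
# it, so nothing is dispatched (and no carbon is spent) until the upstream
# step has actually completed. Leg names are hypothetical.
from collections import deque

# leg -> leg it waits on (None = can start immediately)
legs = {
    "ship-to-port": None,
    "truck-to-warehouse": "ship-to-port",
    "last-mile": "truck-to-warehouse",
}

def dispatch_order(legs):
    """Return legs in the order they may be dispatched (assumes no cycles)."""
    done, order = set(), []
    pending = deque(legs)
    while pending:
        leg = pending.popleft()
        dep = legs[leg]
        if dep is None or dep in done:
            done.add(leg)
            order.append(leg)
        else:
            pending.append(leg)  # requeue until the dependency completes
    return order

print(dispatch_order(legs))  # ['ship-to-port', 'truck-to-warehouse', 'last-mile']
```

A real autonomous supply chain would react to live events (arrivals, delays) rather than a static dependency table, but the relay structure is the same.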

 

Sam

So you’re claiming that the world needed the spatial web yesterday, and that the visualization can currently be done, maybe in a 2D form factor on a smartphone or the other flat screens we have, and then as soon as AR and VR are ready, they’ll slide right into that visualization function.

 

Gabriel

That’s a really important point. In the warehouse we have an application that is multi-modal, and some people prefer just the audio, like turn-by-turn instructions. Some people want to look at the map, some prefer to hear it, some want both; in our case a lot of the workers are Spanish-speaking, so they want Spanish audio. Generally they don’t want 3D. We started with 3D holograms, and people were like, “this is awesome, but I just need this; this is what’s most useful for me.” So even the idea that 3D information is better and holograms are better: we started doing that four years ago, and everyone was impressed, and it was faster and easier, but it hurt to wear the headset for eight hours and the battery would only last three. So the hardware isn’t ready on that front, but cameras, AI, algorithms, and computer vision are ready to go, and that’s a place where this is able to start. It’s really quite useful in some of these critical environments that are far more relevant to our day-to-day lives than entertainment or gaming. There will be education; we won’t be doing Zooms like this, we’ll all be awkwardly sitting around a very large table. But there are functions today that are super meaningful for deploying the capabilities of the spatial web, and this is without even getting into the broader needs around how DIDs can permeate, and how self-sovereign identity is relevant in and of itself. Just as land titles bound ownership by dimensions, boundaries are going to end up being the domains of the spatial web: your home is a domain, your property is a domain, the city’s a domain. Today we think of domains as URLs, but domains in the spatial web are spaces.

 

Sam

Absolutely, so you’ve hinted at this a little already, which is really powerful. What do governments and companies do now to prepare for the coming spatial web?

 

Gabriel

I think there are a few bodies of work that governments, researchers, and regulators need to start becoming aware of. Predominantly, that’s the work being done around DIDs and DIDComm, verifiable credentials, and the Spatial Web working group at the IEEE. I think those are the critical components you need to understand. There’s a more provocative branch of this question, which is: what role do governments play in virtual worlds? A virtual world is outside your jurisdiction, so what is legal or not legal, and who says, and how? Thinking about your geopolitical domain is one thing; thinking about people physically in your jurisdiction who are experiencing something in another dimension over which you have no jurisdictional power or authority is another question. This ties into larger questions of governance. I mean, Minecraft, I think, is eight times the size of Earth. That’s a big space, and that’s one little world; there will be thousands, then hundreds of thousands. There are 300 million websites today; there will be billions and billions of these environments and spaces, and at some point AIs will just auto-generate whatever experience you want, a choose-your-own-adventure in real time based on biometric feedback. So the possibilities here are going to get really weird, really interesting.

 

The Chinese saying “may you live in interesting times” is a double-edged sword. These are interesting times, and so these are the kinds of questions. The last question I want to leave you with on this point: if Disney or H&M or whoever builds a store in Decentraland, can I build a Disney store right next to the Disney store in Decentraland? Who’s preventing it? Are they respecting the trademark and IP agreements of the United States? We will need treaties between organizations, countries on Earth, and these virtual worlds, as if they were other planets. In the same way that we had to look at these challenges when we formed the League of Nations or the United Nations, we’re going to have other dimensions we need to interact with, with entire economies we trade in, where existing jurisdictions don’t apply. Then again, whom do you trust in those environments, relative to what authorities? What prevents me from selling Disney merch in Decentraland, or modifying it with Marvel or Warner Brothers stuff, or whatever I like? So I think this is where trust, and guarantees around verifiability and the credentialing of authorities, become really critical. If we’re going to digitize all of our reality in this next age, with all this money flowing into the space, these are the kinds of big questions that we’re constantly thinking about and working on with respect to the standards.

 

Sam

Fantastic. We are unfortunately out of time, but it has been great to hear your vision and thoughts around these things. Is there anything you’d like to leave us with?

 

Gabriel

Yes. I think the role of identity is a fundamental pillar, and in the arc of history, when we look at property rights (physical property rights, and the intellectual property rights I just talked about), data property rights and digital asset property rights are the next challenge. This is not a purely philosophical question; it’s going to come down to law. So we need to start asking what digital law means in the 21st century, and making sure that the work we’re doing addresses some of those questions or hosts some of those conversations. I’d like to see more constitutional lawyers involved in these conversations. I’d like to see people who are really working through this lens of data rights and ethics having these dialogues. That’s why I think the next battleground is property rights, with respect to the function of data and the things that data form, which become these new hyperspatial objects in this next era.

 

Sam

Fantastic. Gabriel, thank you for spending this time with us; we’re grateful for your willingness to be here. I hope everyone enjoyed this conversation, and we look forward to seeing you all in the future.

 

Gabriel

Thank you so much for having me, take care everybody 

The post Hyperledger Global Forum Provides Launch Point for Development and Deployment News from Across Expanding Ecosystem appeared first on Indicio.


Northern Block

Learnings from Aries, Indy and Various Verifiable Credential Implementations


September 13, 2022 By Mathieu Glaude

Introduction

My name is Mathieu Glaude. I’m the CEO of Northern Block, a company bringing products to market that will help people regain sovereignty and privacy over their lives.

We’re hopeful that we can be one amongst the many contributors that put the building blocks in place to ensure that people’s rights are protected.

We’ve therefore chosen to build our products (Orbit Edge, Orbit Enterprise) on self-sovereign identity principles and open standards & technologies.

I only personally started diving deeper into the world of digital identity in 2019, when as a company, we lacked an elegant way to let users interact with decentralized governance systems that we were building using combinations of blockchains, crypto, smart contracts, etc.

I must say that these past few years have been quite the learning journey for me. I’m fortunate to be surrounded by a team of brilliant, curious individuals who push my knowledge every day.

I also took it upon myself in 2021 to start the SSI Orbit Podcast with thought leaders in the digital ID space, with the objective of pushing the industry forward.

Today, Northern Block is one of the many organizations contributing to Hyperledger Aries (and consuming it).

At this time, I felt like it made sense for me to share some of our learnings as contributors and developers of systems that use Aries, Indy and various verifiable credential implementations.

As I wrote this blog post, two things became apparent to me. 

1. The various elements are evolving with the community to address specific use cases, and interdependencies between them are being eliminated to allow for flexibility and interoperability where required.
2. Convergence seems to be happening across various elements of the ecosystem. Standards such as OIDC and mDL are now in dialogue with W3C, AnonCreds, Aries, etc. Mobile is a predominant technology, just as laptops once were; to reduce consumer friction and drive adoption, convergence of all these different technologies is required inside a mobile environment.

I hope this blog post is helpful both for people familiar and unfamiliar with Aries, Indy and AnonCreds. 

I look forward to continuing learning together to help people regain sovereignty and privacy over their lives.


Learnings from Aries, Indy and Various Verifiable Credential Implementations
1  Aries Protocols vs Aries Frameworks

When talking about Aries, I think it’s important to first differentiate Aries Protocols from Aries Frameworks.

Aries Protocols are the defined set of Requests for Comments (RFCs) for the Aries project. They describe the important topics we want to standardize across the Aries ecosystem.

On the other hand, Aries Frameworks are implementations of the Aries Protocols. Examples include Hyperledger Aries Cloud Agent – Python (ACA-Py), Aries Framework JavaScript (AFJ), Hyperledger Aries Framework Go, and others.

Some folks in the community have reasons to use one Aries Framework over another. We at Northern Block have chosen AFJ for our mobile wallets and ACA-Py for our cloud wallets. We wrote about some of the reasons for choosing AFJ in this recent blog post.

Challenging implementations is one thing, but the protocols are what the community is committed to continuously improving together.

The tools (frameworks) can come and go, but if we mess up the protocols, it won’t matter what tool we wish to use to implement them.


2  Aries Protocols Aren’t Tied to Indy

Aries is a messaging protocol between peers, with no reference to specific implementations, meaning that Aries is detached from Indy.

And what you move on these messaging pipes needs to refer to something stored externally, such as a wallet, ledger, or context.

You don’t know what tech stack the recipient is using, so you need interoperability on the pipes so that peers can receive and interpret the messages. How they store data after that doesn’t matter, which is why Indy isn’t actually on the pipes.

Aries refers to Indy today only through reference implementations. Although Indy is mentioned, we shouldn’t read that as tight coupling; it is a reference storage format. While it is important that protocol definitions stay independent of technical stacks, today they are not fully so, and work must be done to make clear in the documentation that they aren’t tightly bound.

Let’s take an example of an Aries RFC that refers to Indy. In the Present Proof protocol, you can implement the Aries protocol and attach different presentation formats.

From Aries 0454-present-proof-v2, not tying Aries to any presentation formats

The Aries protocol choreography (the heart of Aries) also shows that any presentation format can be attached to run it. It has nothing to do with any particular storage technology.

From Aries 0454-present-proof-v2, Choreography Diagram
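As a rough illustration of that separation, here is a simplified sketch of an RFC 0454 request-presentation message as a Python dict (the ids and payload are illustrative, and real messages base64-encode the attachment). The envelope is pure Aries; the `formats` entry declares which credential format the attachment carries:

```python
# Simplified sketch of a present-proof v2 request-presentation message.
# The protocol choreography only cares about the envelope; the credential
# format lives entirely in the attachment. Ids below are made up.
request_presentation = {
    "@type": "https://didcomm.org/present-proof/2.0/request-presentation",
    "@id": "0ac534c8-98ed-4fe3-8a41-3600775e1e92",
    "formats": [
        {
            "attach_id": "indy-proof-req-0",
            "format": "hlindy/proof-req@v2.0",  # could instead be a DIF or other format
        }
    ],
    "request_presentations~attach": [
        {
            "@id": "indy-proof-req-0",
            "mime-type": "application/json",
            "data": {"base64": "<base64-encoded proof request>"},
        }
    ],
}

# Only the attachment parser needs to understand the chosen format.
attached_formats = [f["format"] for f in request_presentation["formats"]]
print(attached_formats)
```

Swapping the format string and attachment body is all it takes to carry a different presentation format over the same choreography, which is exactly why Aries isn’t tied to Indy.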
3  Aries Protocols Aren’t Tied to AnonCreds 

Now that we’ve separated the messaging protocol from the storage, let’s look at how credentials can be stored.

Right now Northern Block is focused on building use cases which are privacy-preserving, and thus best suited for AnonCreds. We have also looked at use cases where JWT and JSON-LD credentials would be a good fit.

The AnonCreds specification is being developed, with participation from the community, to address privacy-preserving use cases.

It’s an independent specification which has evolved into its own thing. AnonCreds is completely separate from Aries, and the community is working towards making AnonCreds independent of any ledger stack.

It’s good to see the data exchange and business process protocols being developed independently of how credentials are stored. Aries addresses data exchange and business process protocols, whereas AnonCreds addresses how credentials are stored and cryptographically secured. They are totally independent.

Depending on the use case, you can choose the right combination of business process protocols; mDL is one example. For interoperability to work, one can replace Aries with mDL and still use AnonCreds. Work is also in progress to use the AnonCreds specification to store credentials that can be consumed by OIDC.

Ultimately the goal is interoperability and this is what we’re moving towards. That’s where community contributions are happening.

The next version of the AnonCreds specification (tentatively v1.0) will remove from the v0.1 specification any dependence on Hyperledger Indy by removing any requirements related to the storage of the objects used in AnonCreds, whether they be stored remotely on a “Verifiable Data Registry” (including Hyperledger Indy) or in local secure storage.


4  AnonCreds are Purposely Different from the W3C VC Data Model

AnonCreds are intentionally different for a good reason:  the ideal number of credential formats/types is > 1. 

Certain credential types are better for different use cases. Indy/Aries does support multiple credential types to allow for this. 

The biggest concern is with high-assurance government credentials, such as a verified person. This type of credential must support the privacy preservation provided by zero-knowledge proofs and supported by AnonCreds. We don’t want to create the equivalent of a modern tracking cookie bound to a high-assurance credential that uniquely identifies a person.
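As a concrete illustration, here is a hedged sketch of an Indy-style AnonCreds proof request using a zero-knowledge predicate: the verifier learns that the holder is over 18 without ever seeing the underlying value. The credential-definition id and attribute names are made up:

```python
# Sketch of an AnonCreds (Indy-style) proof request. The predicate below is
# proven in zero knowledge: the verifier learns only that age >= 18 holds,
# not the age itself. All ids and names are illustrative.
proof_request = {
    "name": "age-check",
    "version": "1.0",
    "nonce": "1234567890",
    "requested_attributes": {
        # a revealed attribute: just the given name
        "attr1_referent": {"name": "given_name"},
    },
    "requested_predicates": {
        # zero-knowledge predicate: prove age >= 18, reveal nothing more
        "pred1_referent": {
            "name": "age",
            "p_type": ">=",
            "p_value": 18,
            "restrictions": [{"cred_def_id": "EXAMPLE:3:CL:13:TAG"}],
        }
    },
}

pred = proof_request["requested_predicates"]["pred1_referent"]
print(f'{pred["name"]} {pred["p_type"]} {pred["p_value"]}')  # age >= 18
```

A plain W3C credential presentation of the same data would typically reveal the attribute value itself, which is exactly the tracking-cookie risk described above for high-assurance credentials.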

Further, we might want to use different credential formats for different schemas. Combining different credential formats could be great for certain use cases.

For example, I may want my address to be privacy-preserving (AnonCreds), whereas my tickets to a sports game shouldn’t be, and should be transferable (pick your W3C credential data model).

Again, if you’re looking for more details, we’d suggest referring to Daniel’s paper, which does a great job of explaining this.

Though current implementations post credential and schema definitions on an Indy ledger, nothing stops anyone who implements an Aries framework and the AnonCreds specification from using the ledger technology of their choice.


5  CL signatures

Currently, CL signatures are used to sign the credential payload prior to storage.

First, CL signatures have a long history of successful usage, and real-world deployments show their performance is acceptable on current hardware.

Secondly, AnonCreds are not explicitly tied to CL signatures, which are replaceable/upgradeable in the future.

In the current state, CL signatures solve specific use cases around privacy preservation that are not possible with traditional public/private key cryptography. 

Privacy preservation for high assurance issued credentials is a must have requirement for certain use cases and must be addressed by the digital ID community.

Now that some of the advantages of BBS+ signatures have become relevant for privacy-preserving use cases (such as selective disclosure), Northern Block is actively involved in having W3C verifiable credentials with BBS+ signatures in Aries agents. It’s being developed right now and is close to being available across the various frameworks.

This is an area of considerable alignment across all standards, although there are still some technical challenges to solve.  As an industry we need to push NIST, BSI and other standards bodies to fully certify the core crypto technologies. This is not only an Aries problem.


6  Aries Protocols Are Built with Interoperability in Mind

If VCs can be stored in different formats, can we achieve interoperability when business processes involve multiple participating entities who may choose different tech stacks? 

Yes.

Peers can be on different tech stacks. For example, Northern Block uses ACA-Py for cloud agents and AFJ for mobile agents.

Interoperability isn’t about how credentials are stored; it’s about the proofs that are exchanged. How a holder stores data doesn’t matter in this context; what is returned is what matters. What a holder exchanges is a proof, not the credential itself.

Aries frameworks are where interoperability gets demonstrated and tested; the message sent by one peer must be interpretable by the recipient peer. Developers of Aries agents and frameworks must ensure they are developing with interoperability in mind.

For those developers, there are the Aries RFCs that outline all of the protocols, and interopathon events that put all of the parties together. There are multiple reference implementations any party can use to test their implementation, and all of this is open source.

Interoperability is assessed using the Aries Agent Test Harness (AATH), open-source software that runs a series of Aries interoperability tests. It uses a Behavior-Driven Development (BDD) framework to exercise the community-designed Aries protocols, as defined in the Aries RFC GitHub repo.

Towards the end of 2021, Northern Block demonstrated AIP 1.0 Compliance of its Orbit Edge Mobile Wallet against the AATH (here).

AIP 2.0 builds on AIP 1.0, including how Aries agents can exchange several types of verifiable credentials, including W3C standard verifiable credentials.


7  Aries Frameworks Are Built For Mobile

Now let’s move to Aries frameworks.

At Northern Block, we’ve built multiple Aries wallets. Over the past few years, we’ve built them using different architectures. 

Lately, we have selected AFJ (here), but earlier we were using other technologies. 

Over time, more choices have become available, and as the adoption of SSI increases, we expect even more options to emerge.

For the time being, there is no Aries Framework Swift – not because it is impossible – just that not enough people have wanted to put in the effort to make one. The same thing is true with Flutter. There is nothing stopping people from writing these and making them open-source.

AFJ currently fills this role, with support for React Native cross-platform mobile development.

Newer wallets are 50 megabytes or less in size, which represents about 0.3% of the storage space on a low-end (cheap) smartphone.
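The arithmetic behind that figure, assuming a 16 GB low-end device (the 16 GB capacity is our assumption for illustration):

```python
wallet_mb = 50
storage_gb = 16                              # assumed low-end smartphone capacity
fraction = wallet_mb / (storage_gb * 1024)   # 1 GB = 1024 MB
print(f"{fraction:.1%}")                     # → 0.3%
```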

The size issue we have seen, however, concerns a limitation in the revocation registry rather than storage. This is an issue to be addressed, but revocation is proven to work today and is sufficiently scalable for real-world large credential deployments. No standard has a great revocation story right now, and we need to ensure revocation scalability is there for the use cases that need it.

8  Aries Protocol And Frameworks Continue to Evolve Through Community Learnings

Open standards should be given time to emerge, evolve, and mature. And this has been the case with Aries protocols and frameworks. This has been a global effort.

For example, let’s take the evolution of how connections have been handled by Aries:

RFC 0160 -> RFC 0023 -> RFC 0434

You can see an evolution of the protocol as we continued to learn, making this one a good example of agility and upgradeability. 

With the first implementation (RFC 0160), there were challenges: connections could not be re-used, and support for different types of DID Documents was limited.

It evolved into the second RFC (0023), based on DID Exchange and supporting multiple types of DID Documents.

It was then superseded by 0434 which addressed remaining challenges and provided a way to support the previous two, while creating a space for new handshake protocols to be supported as they evolve. 

The main value is that when I create an invitation, I can list the handshake protocols that are supported, any number of them, which removes the tie to any one specific way of doing things. Earlier this wasn’t the case.
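As a sketch, an RFC 0434 out-of-band invitation advertises the handshake protocols it accepts, and the receiving agent picks one it supports. The label, endpoint, and service values below are illustrative placeholders, and the field layout is a simplification of the RFC:

```python
import uuid

# The receiver picks the first offered protocol it supports, so new
# handshake protocols can be added without breaking older peers.
invitation = {
    "@type": "https://didcomm.org/out-of-band/1.0/invitation",
    "@id": str(uuid.uuid4()),
    "label": "Faber College",                     # placeholder label
    "handshake_protocols": [
        "https://didcomm.org/didexchange/1.0",    # RFC 0023
        "https://didcomm.org/connections/1.0",    # RFC 0160 (legacy)
    ],
    "services": [{"id": "#inline", "type": "did-communication",
                  "serviceEndpoint": "https://example.com/agent"}],
}

def pick_handshake(invite: dict, supported: set) -> str:
    """Return the first offered protocol the local agent supports."""
    for proto in invite["handshake_protocols"]:
        if proto in supported:
            return proto
    raise ValueError("no mutually supported handshake protocol")

chosen = pick_handshake(invitation, {"https://didcomm.org/didexchange/1.0"})
```

Because the list is ordered, a newer protocol can be preferred while older peers still find one they understand.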

We expect other protocols to continue evolving through community participation.

At Northern Block, we want to continue building robust privacy-preserving solutions and would love to engage in more conversations on how those solutions can be built while keeping interoperability with other use cases and tech stacks while avoiding vendor lock-in. 

Schedule a call with us to discuss how you can leverage some of our products and/or professional services to achieve your goals!

The post Learnings from Aries, Indy and Various Verifiable Credential Implementations appeared first on Northern Block | Self Sovereign Identity Solution Provider.


Global ID

The Launchpad: Introducing the new ID Wallet


The Launchpad is a monthly series highlighting updates to the GlobaliD product ecosystem.

This is an exciting one. On September 12th, we had our biggest product launch in quite some time — an all new, revamped ID Wallet, the first of a series of major updates planned for the coming months.

The ID Wallet is a core pillar for achieving GlobaliD’s mission to enable everyone to own and control their digital identity.

Learn more: GlobaliD 101: ID wallets

The all new ID Wallet

As the user-facing part of the Trust Triangle, your ID Wallet should be beautiful, secure, and convenient. This latest release represents our first step toward building a best-in-class ID Wallet that’s useful from day one. We’ll be continuously updating the new ID Wallet in the quarters to come.

With this latest release, you’ll be able to digitize, store, and manage your most important identifying documents all in one secure place — your phone.

Key features and updates:

- The ID Wallet is now the homepage of the GlobaliD app. When you open the app, you will instantly be able to view and add your most important identifying documents so you always have them at your fingertips.
- One-tap copy and paste. That’s any field from any of your digitized credentials. Need to share your driver’s license number online? Simply open the GlobaliD app, tap your driver’s license number to copy it, then paste it into the desired location. It’s that simple.
- Easily share your credentials. Need to share a copy of your credential with an Airbnb host or provide a copy of your driver’s license to a car rental service? Just select the share option on the ID Wallet screen and instantly send an image of your credential to anyone.
- An intuitive, refreshed design. With our improved wallet and credential designs, it’s easier than ever to access your credentials in one convenient place. Your most-used IDs, like your driver’s license and passport, can be found on the home screen, while your less-used ID information can be found by clicking “Identity checks.”
- Ready for web3. Our new ID Wallet is built on portable and interoperable standards, including those recently finalized by the W3C. Learn more about the power of verifiable credentials in our recent podcast with Indicio.

Visit the App Store or Google Play Store to download the latest version of the GlobaliD app today.

If you would like to learn more about GlobaliD, visit our website or follow us on Twitter, LinkedIn and YouTube. You can also sign up for the GlobaliD Insider newsletter to receive monthly updates on digital identity, company news, and more.

The Launchpad: Introducing the new ID Wallet was originally published in GlobaliD on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

IndyKite Knowledge-Based Access Control (KBAC)


by Richard Hill

Given the growing ecosystem of digital identities that extend beyond humans to bots, devices, and machines in a modern world, there is also a corresponding need to manage the increasing complexity of interactions between them to determine access decisions. Using the identity data, context, and relationships, IndyKite offers a progressive approach to authorization access controls with its Knowledge-Based Access Control (KBAC) solution.

Shyft Network

Veriscope Regulatory Recap (5th — 11th September)


Welcome to another edition of Veriscope Regulatory Recap! We are back with exciting news from the world of crypto regulations this week, from the US Treasury’s decision to warn the White House over digital assets to the latest UK notifications over sanctions compliance to VASPs. So, without further ado, let’s dive straight into it.

US Treasury to Warn the White House About Crypto-related Risks

As per recent reports, the US Treasury Department will alert the White House regarding cryptocurrencies if the federal government doesn’t enact significant new crypto laws, given the serious financial risks to US consumers and the economy.

The Treasury intends to release four distinct reports that will showcase the viewpoint of top economic officials towards cryptocurrencies while pitching for strict regulation, according to the Washington Post.

Read More here: https://www.cnet.com/personal-finance/crypto/treasury-to-call-for-more-crypto-regulation-report-says/

Indian Finance Minister Asks IMF to Take the Lead in Crypto Regulation

Nirmala Sitharaman, India’s finance minister, has urged the International Monetary Fund (IMF) to play “a lead role” in regulating cryptocurrencies while meeting IMF Managing Director Kristalina Georgieva in New Delhi.

Ms. Georgieva, too, agreed, saying that the IMF is prepared to collaborate with India on issues like climate change, cryptocurrency legislation, and other contemporary world problems.‍


While speaking to CNBC TV18, Ms. Georgieva reiterated that the IMF supports Indian crypto legislation and emphasized the significance of achieving a balance between the advantages and disadvantages after she met with Sitharaman.

She also noted, “India’s digital ecosystem is on steroids as cryptos have emerged strongly without regulations. It is important to remember that cryptos are like the wild, Wild West.”

Read more here: https://www.investing.com/news/cryptocurrency-news/indias-finance-minister-meets-imf-chief-to-discuss-globally-coordinated-approach-to-crypto-2888358

New German Regulation to Issue Investment Fund Units as Crypto Fund Units
It is now possible to issue units of German investment funds digitally, popularly referred to as Crypto Fund Units, according to the new German Regulation on Crypto Fund Units (Verordnung über Kryptofondsanteile — KryptoFAV).

As for how this will work, a distributed ledger technology (DLT) or blockchain-based decentralized Crypto Securities Register will be used to issue Crypto Fund Units.

The fund’s depository will be responsible for maintaining the relevant Crypto Securities Register, but they can alternatively assign this responsibility to a third-party business that is permitted to do so.

Interestingly, German investment funds organized as contractually constituted common funds (Sondervermögen) are subject to the KryptoFAV. However, corporate-organized investment funds are not.

Read more here: https://www.jdsupra.com/legalnews/germany-new-regulation-kryptofav-6746579/

The UK Mandates Crypto Exchanges to Notify Suspected Sanctions Violations

Under new regulations, put in place amid growing concerns that sanctioned parties are turning to crypto assets to get around the restrictions imposed in response to Russia’s invasion of Ukraine, virtual asset exchanges are now required to notify the UK authorities of any suspected sanctions violations.


Official regulations were updated on August 30 to clearly list “crypto assets” as one of the key items that must be prohibited if sanctions are imposed on a person or an entity.

Any Virtual Asset Service Providers (VASPs) that fail to report clients who are subject to sanctions will be in violation of rules set forth by the United Kingdom Office of Financial Sanctions Implementation (OFSI) of Her Majesty’s Treasury.

The agency further stated that if “any other payment instruments” and assets, including cryptocurrency, are used with the intention of obtaining funds, goods, or services, it will be a gross violation of the sanctions.

Read more here: https://www.theguardian.com/technology/2022/sep/04/crypto-exchanges-suspected-sanction-breaches-russia

Important Announcement: 10,000 SHFT on Offer!

After the Shyft DAO approved the Veriscope VASP grant proposal, an aggregate of 10,000 SHFT was granted for Virtual Asset Service Providers (VASPs) that integrate with the Veriscope mainnet by September 30, 2022. The fund will enable VASPs to pay the Shyft Network gas fees while using Veriscope. This offer will remain valid until December 31, 2022, or until the VASP exhausts its SHFT grant.

Read more here: https://medium.com/shyft-network/shyft-dao-approves-veriscope-vasp-grant-proposal-to-enable-free-fatf-travel-rule-transactions-77e51d735cd6

Interesting Reads
Why IMF Calls for Regulating Crypto?
Crypto assets are no longer niche, says IMF
How Germany’s regulators beat the SEC in the race for crypto regulation–and convinced me to establish my business there?
Australian police unveil its crypto regulation unit
‘Regulation by Enforcement’ Won’t Work for Crypto, Argues SEC Commissioner
Trending Event

Shyft Office Hours with guest Oliver from Panther

Click here to set a reminder: https://twitter.com/i/spaces/1BRKjZpRzMoKw?s=20a

______________________

VASPs need a Travel Rule Solution to begin complying with the FATF Travel Rule. So, have you zeroed in on one yet? We have the best solution to suggest: Veriscope! Veriscope is the only frictionless Crypto Travel Rule compliance solution.

Visit our website to read more: https://www.veriscope.network/ and contact our BizDev team for a discussion: https://www.veriscope.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations.

Veriscope Regulatory Recap (5th — 11th September) was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Indicio

Hyperledger Aries is the Present and the Future of Internet-Scale Trusted Verifiable Credential Ecosystems

Hyperledger Aries, AnonCreds, and Hyperledger Indy have given enterprises and governments a powerful way to build and use open source, interoperable decentralized identity technology. The reasons? A large and diverse community of developers and users is driving a virtuous cycle of robust, flexible technology and innovative implementations that meet market needs.

By Sam Curren and Mike Ebert

Hyperledger Aries, AnonCreds, and Hyperledger Indy are dedicated to making privacy-preserving identity solutions a reality, the exchange of trusted data easy and reliable, and the interactions between people, organizations, and systems enriching.

In combination, they have demonstrated that they are robust, flexible, and extensible: capable of incorporating many new codebases, technologies, and cryptographic signatures, and of delivering powerful solutions that have led to significant growth in implementations, community members, and customers.

While no technology runs perfectly on every device, a signal strength of Aries, AnonCreds, and Indy is that they work on the vast majority of current devices and systems, including $35 smartphones and low-powered IoT/embedded devices. They represent the most inclusive way into this technology, which is an important factor in their popularity.

But as with any technology, they are also constantly evolving to meet the growing demand for decentralized identity solutions. To date, scaling has not been the primary challenge, but Aries and Indy are responding to this need. Important progress has been made, such as incorporating the Askar wallet, and work continues in areas such as clustering and multi-tenancy. The vision of having billions of verifiable credentials, devices, and users participating in trusted digital interactions all around the world has never been closer.

While solutions can’t be simultaneously new and tried-and-true, AnonCreds has been used since 2017. It is a battle-tested and reliable solution to identity and verifiable credential problems. Perhaps even more important, it is the only current credential specification able to meet the EU’s and other real-world privacy requirements; it offers the key privacy-preserving features needed for marketplace identity solutions: selective disclosure, predicate proofs, and privacy-preserving holder binding. Formalization of the proven standards and best practices into an official specification is underway.

AnonCreds also coexists beautifully with other signature styles. Each has its own strengths and weaknesses, but when used for the right purposes, they complement each other powerfully. Aries added support for W3C credentials two years ago and is committed to supporting the best credentials and signature styles; Indy is incorporating cross-ledger compatibility and governance feedback.

All of this shows that Indy, Aries, and AnonCreds are robust, proven, trusted, dynamic, and adaptable — as are the community members, organizations, and customers that back them. They will continue to grow and be an important part of the decentralized identity and verifiable credential space, and we, as backers of these solutions and communities, are committed to creating the best solutions and making the right choices to implement and improve identity for all.

Some myths about Hyperledger Aries, AnonCreds, and Hyperledger Indy — and the reality

Inaccurate claims spread effortlessly because they simplify, while understanding often requires mastering and translating complexity. That’s the challenge in communicating advances in science and technology. It’s easy to misunderstand what’s happening if you’re not immersed in the data or the code, and it’s hard to correct errors once they reach critical velocity or become sunk intellectual costs.

As leaders and active participants in the open source community, we urge people to attend these working groups to learn about what they are doing, to ask questions, and, above all, to help solve the problems that you see. This is the magic of open source technology: it’s open! And this openness accounts for its success in creating robust code and innovation.

Of course, not everyone is able to do this. Which is why we want to address some of the myths and misconceptions about the core open source technologies that we build on, contribute to, and use: Hyperledger Aries, AnonCreds, and Hyperledger Indy.

Myth 1: Hyperledger Aries is fragile and isn’t flexible or extensible enough for decentralized identity to scale in part because it’s tied to AnonCreds

Reality: The reason Hyperledger Aries is the most widely used codebase for implementing verifiable credentials, by enterprises and governments alike, is that it currently provides the best extensible, flexible, and reliable platform for building verifiable credential solutions.

Far from being structurally dependent on the AnonCreds verifiable credential specification, Hyperledger Aries supports multiple credential formats including those that follow the W3C data model. Again, this flexibility is why Hyperledger Aries is so popular: it’s the easiest way to get a verifiable credential solution implemented.

In addition to the support for JSON-LD W3C credentials that has been available for years, the Aries community is working on chained credentials which will, in many cases, carry both AnonCreds and JSON-LD credentials in the same message of a credential workflow. The DIDComm protocols for issuance and presentation first developed with Aries have no opinion about which credential types and signature schemes are passed within its messages. 

As with all the open source codebases mentioned here, Aries has a large developer community devoted to improving the specification and code. One strength of this is that the community is able to respond to real-world needs in enterprise and public sector implementations. Technology evolves through use, and this synergy between implementation and development is an example of the virtuous cycle created by open source technology.

Some of the criticism of Aries appears to be driven by a “perfect is the enemy of the good” attitude; to which the answer is, “well, come join the Aries community and make the good better.” We’re at the beginning of the decentralized identity journey, not at the end, and we’re in a phase of rapid growth and innovation.

Myth 2: Aries doesn’t adapt well to mobile use

Reality: We’ve heard concerns that there just isn’t enough space on a mobile phone to use Aries, or that the space issue will block scale. Newer digital wallet apps are around 50 MB in size, and some are smaller. This represents just 0.3% of the storage on a low-end $35 smartphone.

Other concerns focus on the unavailability of libraries for different mobile codebases. As Aries is an open source effort, libraries will be added or expanded when interest exists and contributors show up for the work. We’ve seen interest and work expand and anticipate it will continue.

Myth 3: Hyperledger Indy can’t manage large-scale issuance or verification

Reality: The idea that Hyperledger Indy networks can’t manage the mass use of verifiable credentials is contradicted by real world use. The Government of British Columbia has issued millions of credentials based on a few (<10) writes to a ledger.

Any evaluation of scale requires understanding how a ledger is used. With AnonCreds, the use of a ledger scales with the number of issuers and credential schemas and, as nothing specific to a credential issuance is written to the ledger, issuance scales without impact on the ledger. 

Verification only requires a cached copy of the relevant ledger assets, allowing verification to scale according to the number of verifiers, calibrated by caching policy.

In other words, both issuance and verification scale independently of the number of credentials issued or verified, and the assets required for verification are downloaded and cached prior to verification, which solves the speed problem.
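A rough model of that scaling, with deliberately simplified transaction counts (one schema plus one credential definition per issuer/schema pair; revocation-registry transactions are ignored, so this is an illustration rather than the exact Indy transaction model):

```python
def ledger_writes(issuers: int, schemas_per_issuer: int) -> int:
    """Approximate ledger transactions: one schema transaction plus one
    credential-definition transaction per issuer/schema pair.
    Issuing an individual credential writes nothing to the ledger."""
    return issuers * schemas_per_issuer * 2

# One issuer with two credential types, issuing millions of credentials,
# stays within a handful of ledger writes.
writes = ledger_writes(issuers=1, schemas_per_issuer=2)
credentials_issued = 2_000_000   # has no effect on ledger load
assert writes < 10
```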

Myth 4: Current revocation is insufficient for business needs 

Reality: Hyperledger Indy has the only working open-source privacy-preserving revocation implementation; it has not proved a limiting factor in any current business or public sector deployment that we’ve built and implemented.

Will it get better? Yes, the community is working to make it better. But the concerns about business needs are theoretical when contrasted with real world practice.
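A toy sketch of the issuer-side bookkeeping in a fixed-size revocation registry; real AnonCreds revocation publishes cryptographic accumulator deltas so a holder can prove non-revocation without revealing its registry index, which this sketch deliberately omits (the registry size below is an arbitrary example):

```python
class RevocationRegistry:
    """Issuer-side state for a fixed-size registry. In AnonCreds the
    published artifact is an accumulator, not this raw set, so verifiers
    never learn which index a given holder occupies."""
    def __init__(self, size: int):
        self.size = size
        self.revoked = set()
        self.next_index = 0

    def issue(self) -> int:
        """Hand out the next credential slot; a full registry forces the
        issuer to publish a new one (the size limit discussed above)."""
        if self.next_index >= self.size:
            raise RuntimeError("registry full: publish a new registry")
        idx = self.next_index
        self.next_index += 1
        return idx

    def revoke(self, idx: int) -> None:
        self.revoked.add(idx)

    def is_revoked(self, idx: int) -> bool:
        return idx in self.revoked

reg = RevocationRegistry(size=32768)   # example capacity
a = reg.issue()
reg.revoke(a)
b = reg.issue()
assert reg.is_revoked(a) and not reg.is_revoked(b)
```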

Myth 5: The AnonCreds specification is out of date 

Reality: There are two key reasons why governments, in particular, choose to build verifiable credential solutions using AnonCreds and why the specification is the most widely used in decentralized identity solutions around the world. 

First, the AnonCreds credential format is battle-tested. As an open source codebase, it’s been around and continuously updated since 2017. You can find the specification here.

Second, it’s the only credential specification that provides the privacy-preserving features that enable verifiable credentials to comply with data privacy law. No other credential specification is able to meet these critical legal and political needs. This is a non-trivial issue (see answer below).

On both counts, newer isn’t better. But at the same time, open source specifications are constantly evolving. You want to make AnonCreds better? Then get involved in the AnonCreds Specification Effort.

Myth 6: AnonCreds doesn’t align with the W3C Verifiable Credential Data Model.

Reality: The AnonCreds specification provides privacy-preserving features that no other credential specification provides: selective disclosure of data and zero-knowledge proofs. AnonCreds also predates the W3C credential data model; as updating AnonCreds would yield few improvements in interoperability, such work has not been prioritized by the community.
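One simplified way to convey the selective-disclosure idea is salted hash commitments (as in SD-JWT-style schemes). AnonCreds itself uses CL signatures and zero-knowledge proofs rather than this construction, so treat the following only as an intuition pump with made-up attribute values:

```python
import hashlib
import secrets

def commit(value: str) -> tuple:
    """Commit to a value with a random salt; only the digest gets signed."""
    salt = secrets.token_hex(8)
    digest = hashlib.sha256(f"{salt}:{value}".encode()).hexdigest()
    return salt, digest

# Issuer commits to each attribute and signs only the digests.
attrs = {"name": "Alice", "birth_date": "1990-01-01"}
salts, signed_digests = {}, {}
for k, v in attrs.items():
    salts[k], signed_digests[k] = commit(v)

# Holder discloses one attribute (value + salt); verifier recomputes
# the digest and matches it against the signed set.
disclosed = {"name": (attrs["name"], salts["name"])}
for k, (v, salt) in disclosed.items():
    assert hashlib.sha256(f"{salt}:{v}".encode()).hexdigest() == signed_digests[k]
# birth_date stays hidden behind its digest
```

Zero-knowledge predicate proofs (e.g. proving age over 18 without revealing the birth date) go beyond what hash commitments can do, which is the gap AnonCreds’ cryptography fills.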

The features provided by AnonCreds are essential for organizations to comply with data privacy law. This is not a theoretical point: Indicio has many global enterprise customers and they want and need privacy-preserving features in their verifiable credential implementation. If the current amendments to the European Union’s digital identity proposal are accepted, AnonCreds will be the only current verifiable credential specification that meets the EU’s privacy requirements. 

The power of Hyperledger Aries and Indy is that they can support multiple credential specifications and thereby provide enterprises and organizations with the flexibility they need to build their solutions. The focus within the Aries community has been to support W3C credentials alongside AnonCreds, as they generally serve different use cases but are complementary.

Two ways to learn more

Nothing beats learning by doing. And to this end, Indicio has created a series of hands-on workshops on every aspect of open source decentralized identity technology for technical and non-technical audiences. We understand that the multiple concepts and codebases and workflows can be difficult to grasp on paper! 

For those willing to dive in directly, we invite you to attend the working group meetings at Hyperledger. It’s always best to talk to the people actually developing these codebases rather than rely on second-hand opinions!

The post Hyperledger Aries is the Present and the Future of Internet-Scale Trusted Verifiable Credential Ecosystems appeared first on Indicio.


Ocean Protocol

Ocean Protocol joins leading Web3 projects on the €20M+ Gaia-X moveID initiative to advance…

Ocean Protocol joins leading Web3 projects on the €20M+ Gaia-X moveID initiative to advance pan-European mobility

The initiative will leverage the Ocean tech stack to implement decentralized, digital identities and power the future of mobility

Singapore, September 13: Ocean Protocol, the Web3 platform to unlock data services for AI and business innovation, has joined forces with Chainstep, Datarella, Fetch.ai, peaq and 51nodes to develop the system architecture for European mobility with the preservation of data autonomy as its core principle, within the Gaia-X moveID project.

Ocean Founder, Bruce Pon said:
“Ocean Protocol is proud to support the moveID initiative with the technology stack needed for building a digital identity infrastructure and an inclusive Data Economy”

Rising as an alternative to centralized platforms, Gaia-X is a pan-European cross-sector initiative for a more open and democratic data infrastructure, bringing together more than 300 companies and organizations, including top industry names such as Bosch, Continental, and Airbus.

The moveID consortium is focused on driving innovative solutions for decentralized data sharing and generating real-world mobility use cases for blockchain technology. The consortium members will work to deliver secure and verifiable digital IDs. This is a crucial foundational feature enabling all mobility participants — humans or machines — to identify each other, transact, and interact between themselves without mediation from centralized third parties.

Peter Busch, Product Owner for Distributed Ledger Technologies (Mobility) at Bosch said:
“With moveID, we are setting the foundation for a hyper-connected mobility infrastructure by leveraging self-sovereign device identities and decentralized data sharing to enable hundreds of potential use cases. We are thrilled to be exploring this exciting prospect side by side with some of the leading projects in the decentralized space such as Fetch.ai Network, peaq, Ocean Protocol, Datarella, 51nodes, deltaDAO, and Chainstep.”

Through BigchainDB, Ocean Protocol will contribute key technical elements needed for building moveID’s infrastructure. Essential components of Ocean’s decentralized data marketplace technology, Compute-to-Data, and data pricing mechanisms are being leveraged to build a system architecture that ensures a seamless exchange of information between providers and customers of mobility applications. deltaDAO will support the effort by implementing the Ocean tech stack into the moveID infrastructure.

The other participating Web3 protocols will also take on the following roles:

Chainstep, one of the leading deep tech companies for enterprise implementation and connection of DLT, IoT, and Machine Learning, will develop connectors to build the link between the digital and physical world. The company is developing a solution for connecting Edge IoT (V2X) devices securely to the Gaia-X cloud platform. Self-sovereign identities (SSI) work as the basis for decentralized integration security, allowing the removal of the single point of failure in relying on a centralized authorization provider.

Datarella, a leading developer of industrial blockchain solutions, will co-develop the infrastructure for decentralized digital identities and data sharing, leveraging SSI components and Autonomous Agents. To ensure future market acceptance of applications based on moveID, Datarella also contributes its experience in Web3 mobility applications, such as MOBIX.

Fetch.ai is developing the infrastructure and tooling for creating Web2 and Web3 AI applications. The Fetch.ai Network has a Cosmos SDK-based self-sovereign blockchain ledger, supporting tools for developing DApps on the network, and modular Autonomous Economic Agents (AEAs) and Digital Twin Platforms that can efficiently and securely communicate peer-to-peer and provide interconnectivity with multiple networks.

peaq, the Web3 network powering the Economy of Things, will conduct intense research and development within the project. peaq will further build up its layer-one blockchain in line with the co-created requirements and standards, aiming to grant Gaia-X’s moveID the perfect infrastructure for decentralized mobility applications. peaq will also provide its core functions: peaq ID — Self-Sovereign Machine Identities (SSMIs), peaq access, and peaq pay.

51nodes, a Web3 development and integration company, brings to the project its experience with SSI architecture and implementation in the context of decentralised mobility infrastructure. In addition to developing and integrating various SSI technologies, the focus of its work will be on SSI interoperability challenges.

deltaDAO, a web3 software development, integration and consulting company, will co-develop decentralized data infrastructure and federation services in the context of the broader Gaia-X ecosystem. deltaDAO contributes its extensive knowledge regarding Gaia-X compliance, interoperability, and integration.

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, GitHub & Newsletter for project updates and announcements. And chat directly with other developers on Discord.

Ocean Protocol joins leading Web3 projects on the €20M+ Gaia-X moveID initiative to advance… was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


IDnow

IDnow joins Accelerate@IATA to shape the future of seamless air travel

IDnow is supporting IATA to simplify the passenger journey through digital identity

Munich, September 13, 2022 – IDnow, a leading European identity proofing platform provider, is pleased to announce its participation in the International Air Transport Association’s (IATA) Accelerate@IATA 2022 accelerator program. IDnow is working with IATA and its members, providing expertise and regulatory know-how in the field of identity proofing and digital identity. Together, IATA and IDnow are working on the shared goal of making flying more seamless and low-touch for passengers, while lowering fraud risks for airlines. 

IATA is the trade association for the world’s airlines, representing some 290 airlines or 83% of total air traffic. As a leading industry association, IATA is shaping industry standards and the future of aviation.

Through the Accelerate@IATA program, IDnow is contributing to the IATA One ID working group. The goal of IATA One ID is to set industry standards that further streamline the passenger journey with digitalization of admissibility and a contactless process through secure biometric enabled identification. Under the One ID vision, upon completing document checks remotely, passengers will be able to arrive at the airport ready to fly and proceed through each airport touchpoint via simple biometric recognition. The objective is to achieve a truly interoperable global system across airports, airlines and governments.

IDnow is collaborating with airlines and other technology-provider members of the IATA Think Tank on a white paper about the application of digital identity in the air travel industry. In parallel, a proof of concept is being developed to show how digital identity can help people with reduced mobility travel, and help airlines and airports allocate the staff required to support them.

“We are very pleased to have IDnow participation in the heart of industry innovation conversations. IDnow brings their expertise on the secure decentralized digital identity to support the airline industry for the journey to customer-centricity.” – Kat R. Morse, Senior Manager Innovation, Partnerships and Events at IATA.

“We understand that in the modern air travel industry, digital identities have a huge potential to shape a whole new traveler experience. From lowering the fraud risk for airlines, to creating a more seamless and low-touch experience for passengers and offering a GDPR compliant solution for storing and sharing of biometric pictures for airports and other players – Digital identity is the key. We are excited and proud to have been selected by IATA to work together and to partner with some of the most prestigious airlines of the world to shape the future of air travel”, says Michael A. Binner, Director Digital Identity at IDnow.


KuppingerCole

Who is Afraid of Security Automation?


by Alexei Balaganski

If there is one universally true statement about every organization regardless of size, location, or industry – it is that they all have too many security problems to deal with comfortably and in time. If you believe that you have your cybersecurity well under control, you probably simply don’t have full visibility into every corner of your IT… Unsurprisingly, the idea of replacing overworked (and increasingly scarce) humans with some kind of automation both for daily administrative routine and for responding to security incidents looks universally appealing to everyone in IT. Or does it? 

According to a recent market study, the majority of organizations have experienced problems implementing security automation for a variety of reasons, ranging from a lack of skills and budgets (obviously) to a much more bizarre claim of not trusting the outcomes of automation. In fact, many companies that already have automation capabilities in place (such as SIEM or EDR products) do not trust them to perform any operations more advanced than sending out an alert. 

For years, we have blamed industrial security experts for sticking to their old ways and putting safety and process continuity above security. Meanwhile, AI and machine learning technologies have been making great strides and entire new market segments for intelligent and highly automated security solutions have emerged. And yet, even people not involved in OT security are still afraid of them. But why? 

Understanding security automation 

Before diving into technical details of implementing security automation, it can perhaps be useful to address a couple of common misconceptions about it. First, the goal of automating cybersecurity is not to eliminate humans from the decision-making process or drive security analysts to unemployment. On the contrary, the whole idea is to automate the least interesting but most tedious parts of the repetitive manual activities we have to face daily – from separating meaningful security events from false positives to making disjointed legacy tools work together efficiently. 
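As a toy illustration of this point (the rules, field names, and thresholds below are invented for illustration, not any real SIEM or EDR API), the tedious first pass might be a simple triage rule that suppresses known noise and escalates everything else to a human analyst:

```python
# Toy sketch of automated alert triage: suppress known-noisy, low-severity
# alerts and escalate only events worth a human analyst's time.
# Rule names and severity scale are illustrative assumptions.

KNOWN_NOISY = {"dns-timeout", "cert-self-signed-dev"}

def triage(events):
    """Split raw events into (escalate, suppress) lists."""
    escalate, suppress = [], []
    for event in events:
        if event["rule"] in KNOWN_NOISY and event["severity"] < 7:
            suppress.append(event)   # routine noise: no human needed
        else:
            escalate.append(event)   # potentially meaningful: notify analyst
    return escalate, suppress

events = [
    {"rule": "dns-timeout", "severity": 2},
    {"rule": "lateral-movement", "severity": 9},
]
escalate, suppress = triage(events)
```

The analyst still makes the final call on every escalated event; the automation merely removes the menial filtering step.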

Unless this unqualified menial labor is the only thing you do as a security analyst, there is really nothing to be worried about – automation will actually allow you to spend your time on more challenging and rewarding tasks. You will also always have the final say in every potentially disruptive decision (and a good automation tool will not just help you make the right one, but also to evaluate its potential risks beforehand). Unless, of course, you’ll decide to block any future attacks of the same kind automatically… 

Having said that, it is also important to stress that investigating and responding to security incidents is definitely not the only area of cybersecurity that needs to be automated. Proactive measures such as identifying vulnerabilities in IT infrastructure or application code, as well as regulatory compliance, can greatly benefit from intelligent automation.  

And speaking of intelligence: contrary to many people’s beliefs, artificial intelligence and machine learning should not be the primary focus of any automation strategy. While AI/ML provide a multitude of opportunities to automate specific narrow tasks, on a strategic scale, automating processes, workflows and collaboration between people is much more important.  

Security Orchestration, Automation and Response (SOAR) products are perhaps the most promising class of solutions that helps, well, orchestrate the processes between other tools and products that do not support this integration natively. However, this market is still evolving rapidly, and most currently available tools tend to focus on automating forensic investigations and incident response. Designing a universal platform that could automate all security-related processes in an organization is thus still a challenge that every company must address individually. 

In a sense, security automation as a concept can be compared to another popular buzzword – Zero Trust. Both do not refer to specific technologies or products, but rather deal with architectures and guiding principles. However, whereas you still “cannot buy Zero Trust”, the situation with security automation tools nowadays is the complete opposite. There are so many products that promise to solve all your automation needs that finding the right combination of tools becomes a problem that might need a bit of automation itself… 

Would you like to know more? 

Security automation will be one of the key topics at the Cybersecurity Leadership Summit in Berlin this November. Whether you are looking for strategic advice from industry analysts or are more interested in technical implementation details, you will surely find the right people and relevant presentations at the event. For example, check out “Sustainable vulnerability management: Case Study by KuppingerCole” by Christopher Schütze, “Security Automation Strategies to Succeed or Fail: You Choose” by Dr. Donnie Wendt of Mastercard, or a panel discussion on “Implementing Enterprise Security Automation for Threat Detection and Intelligence”. 

Monday, 12. September 2022

Identosphere Identity Highlights

Identosphere 99 • New Community Project on mDL & VCs • FTC on Commercial Surveillance and Data Security Rulemaking • Microsoft Entra Verified ID Generally Available

Weekly edition of news and updates surrounding decentralized id, data sovereignty, relevant public policies, events and other updates!
Welcome to Identosphere Weekly Digest! Thanks to our Supporters! Contribute on Patreon …or reach out to Kaliya directly

Read previous issues and Subscribe : newsletter.identosphere.net

Content Submissions: newsletter [at] identosphere [dot] net

Upcoming

NEW! Exploring Digital Identity for Policy Makers 9/19 London, UK (by Kaliya Young & Newspeak House)

Self-Sovereign Identity - Blockchain’s Killer Application?! 9/27 Cryptovalley, Switzerland/Online 

Zero Trust Authentication – Blockchain government ID with PKI Orchestration 9/27-29

Blockchain Expo 9/20-21 Amsterdam\Online

European Blockchain Week 9/29-10/5 Slovenia and Croatia 

North Capital Forum by @USMexicoFound, 9/28-30 Mexico City - Transmute’s Thursday afternoon panel "Trust and Transparency in Trade"

Identity Week USA (formerly Connect:ID) Washington DC 10/4-5

Infrachain Summit 10/4 about real business – not the next hype

Internet Identity Workshop #35 11/14-16, Mountain View CA

New Community Project

Where the W3C Verifiable Credentials Meet the ISO 18013–5 Mobile Driving License This project, produced by Kaliya’s consultancy with Lucy Yang and sponsored by Spruce, will be focused on:

Enabling a bigger ecosystem of players to take advantage of mobile driving licenses by making sure VCs are a viable alternative for implementers to the ISO 18013–5 standard, and are compatible to the fullest extent possible with the outputs of ISO working groups.

Giving both VC and mDL wallet vendors equal opportunity to compete, and providing users with options, by urging that privileged internal APIs be opened up to all wallet vendors and developers.

The first phase of the project will focus on gathering input from stakeholders who are working in the areas where these two standards overlap. The goal is to come up with a list of recommendations to foster greater alignment between the two efforts.

Growth

Decentralized Identifiers ExplodingTopics

Digital Notarization Can Kickstart Digital ID Ecosystems (with Dan Gisolfi) Northern Block

After leaving IBM, he is speaking publicly in his new role at Discover for the first time.

What is transitive trust? And how does it differ from how trust gets established otherwise (e.g., through backend API calls)?
The missing role in the trust triangle: The Examiner.
Using attestations from multiple issuers helps to create more trust.
How Issuance can become a business model for many trusted service providers.
Some challenges with the mDL (ISO/IEC 18013) standard.
The benefits of using a Microcredentials approach.

Courses

New Badged Open Course: Decentralising Education Using Blockchain Technology Alexander.Mikroyannidis (from CCG)

Available on the Open University’s OpenLearn Create platform and is licensed under CC BY-NC-SA 4.0. Upon completion of the course, learners earn a free statement of participation.

Getting Started with Self-Sovereign Identity  Kaliya & Lucy via Linux Foundation and EdX… coming soon

Gain a solid foundation on self-sovereign identity (SSI) with a 360 degree overview of its evolutionary journey, key concepts, standards, technological building blocks, use cases, real-world examples and implementation considerations.

Marketing

Web3, Web5 & SSI Timothy Ruff

Why the SSI community should escape Web3 and follow Jack Dorsey and Block into a Web5 big tent, with a common singular goal: the autonomous control of authentic data and relationships.

Public Sector

Jeremy Grant @jgrantindc

USCIS makes a public announcement about its plans to use Verifiable Credentials for immigration credentials.

Very interesting #FedID presentation on @USCIS plans for digital immigration credentials. Looking to use the @w3c Verifiable Credentials standard - this may be the first use of VCs at scale in the US government.

January Walker (UT04) on the Future of Self-Sovereign Identity Web3 Domains

January Walker is running for office in Utah’s 4th district

Policy

FTC on Commercial Surveillance and Data Security Rulemaking IdentityWoman

FTC begins a process around regulating Commercial Surveillance and ensuring Data Security

The comments on its 95 questions are due October 21. 

See the comments Kaliya made in her two minutes, along with links to the questions and more information about the process. 

Response to Kaliya’s post on AnonCreds

A response to Identity Woman's recent blog post about Anoncreds Kyle Den Hartog

It’s only when I started to take a step back that I realized that the architecture of Indy being a private, permissioned ledger leaves it heading in the same direction as many large corporations’ now-extinct browser and intranet projects, for many of the same reasons.

Moving Toward Identity Technology Ready for Mass Adoption

when we realized our customers were facing critical limitations caused by the underlying tech stack, we began developing an updated version of our platform that would reduce our dependency on these technologies and enable a better platform for our customers.

Response to Kaliya’s “Being Real” Post by Daniel Hardman

The post surfaces an important topic and contains some truth worth telling. I salute Kaliya for that, and for the honorable intentions behind her writing. However, the post also contains some factual errors, and its narrative both assumes and invites conclusions that I consider unjustified.

Kai @Kai_dentity Replying to @IdentityWoman

Great overview @IdentityWoman It matches with many conversations we had in the community in recent years, as well as observations we made ourselves at @GETJolocom. I hope it will help to make these issues more widely discussed and hopefully get them addressed.

Tim Bouma @trbouma Replying to @IdentityWoman

Excellent post. No matter how great a tech or framework is, I am always on the lookout for its Achilles’ heel.

Explainer

Making Identity Easy for Everyone - Heather Flanagan, Spherical Cow Consulting Ubisecure

how to explain digital identity to people outside of the identity industry, why is it important for everyone to understand, and what the industry can do to improve the understanding of identity for everyone.

Trust Registries in the Real World Continuum Loop

Trust Registries allow us to know that the various shared credentials (e.g. proof of insurance) are accurate. A Homeowner can ask their Digital Wallet to verify the Contractor’s insurance Credential.
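As a minimal sketch of that check (the DIDs and registry contents below are hypothetical, and a real registry would be consulted according to its governance framework), the wallet's job reduces to confirming the credential's issuer is authorized for that credential type:

```python
# Minimal sketch of a trust-registry lookup. The DIDs and registry data
# are invented for illustration; a real wallet would also verify the
# credential's cryptographic proof before trusting its contents.

TRUST_REGISTRY = {
    "InsuranceCredential": {"did:example:acme-insurance"},
}

def issuer_is_trusted(credential_type, issuer_did):
    """True if the issuer is authorized for this credential type."""
    return issuer_did in TRUST_REGISTRY.get(credential_type, set())

# A homeowner's wallet checking a contractor's proof of insurance:
ok = issuer_is_trusted("InsuranceCredential", "did:example:acme-insurance")
bad = issuer_is_trusted("InsuranceCredential", "did:example:unknown-issuer")
```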

Many articles are being written about SSI, not all so excellent John Philip

Most are probably not "written" in the normally understood sense of the word (auto-scraping and ctrl-c and ctrl-v isn't writing). Many are probably not written by a person at all.

Centralized\Federated vs Self Sovereign dominiumssi

GlobaliD 101: Bring Your Own Identity

There is a real opportunity to combine technology that is being developed with the concept of BYO Identity that will create a new identity framework where you own and control your data.

Part 1: What a smart and humanistic approach to digital identity would look like
Part 2: The Trust Triangle — the system of issuers, holders, and verifiers that illustrates how identity works (and how it’s broken, today)
Part 3: Why the ID Wallet is the first step toward achieving a new vision for digital identity
Part 4: Why every company is an identity company

Company updates

Spruce Developer Update #23

Updates on Sign in with Ethereum, Kepler, DIDKit, Rebase

IDENTOS puts developers first in its latest product release

FPX Junction is a cloud-based set of software products which provide fine-grained API authorization and user-centric identity management capabilities. 

Verifiable Credentials Support – Evolving to support Verifiable Credentials, FPX Junction includes an updated digital wallet API which enables the holding, receiving and presentation of W3C Verifiable Credentials using W3C Decentralized Identifiers. FPX Junction meets requirements for Aries Interop Profile 2.0 and DIDComm based credential exchanges.

Sign-in with Decentralized Identifiers with Dock Labs Auth0

The DID and VC W3C standards are core building blocks to enable Decentralized Identity scenarios. The Auth0 Lab team has been following the space closely, and we're excited to support Dock with this important work.

Microsoft Entra Verified ID now generally available

MSFT ENTRA now Generally Available!!! Congratulations to Ankur, Pam and Daniel (not there anymore but got MSFT to start working on it)

We believe an open standards-based Decentralized Identity system can unlock a new set of experiences that give users and organizations greater control over their data—and deliver a higher degree of trust and security for apps, devices, and service providers.  

Enterprise

OKTA Identity Cloud Integration with SSI agent @sethisaab 

You will be able to learn how we can integrate existing centralized IDM solutions like Oracle Identity Cloud Service, Okta Identity Management, SailPoint, or Saviynt with SSI solutions like Hyperledger Aries, Spherity, or Trinsic to issue Verifiable Credentials at the enterprise level as per business requirements.

Self-Sovereign Identity for the Enterprise with Switchboard Energy Web

This release includes major updates to the front-end Switchboard web application as well as the back-end libraries and components, giving companies access to the full suite features offered by legacy identity access management solutions in a decentralized architecture.

Recap

Takeaways from the Gartner IAM Summit 2022 RadiantLogic

It was mentioned in nearly every analyst session, and I couldn’t help but notice the number of vendors who have incorporated this concept into their marketing and their booth displays.

Research

NSSIA: A New Self-Sovereign Identity Scheme with Accountability

A few SSI schemes introduce accountability mechanisms, but they sacrifice users’ privacy. In addition, the digital identities (static strings or updatable chains) in the existing SSI schemes serve as inputs to a third-party executable program (mobile app, smart contract, etc.) to achieve identity reading, storing, and proving, and users’ self-sovereignty is weakened. To solve the above problems, we present a new self-sovereign identity scheme that strikes a balance between privacy and accountability.

Culture

Creating a culture of recognition

Pro-social behaviours are those intended to benefit others, or society as a whole — for example, helping, sharing, donating, co-operating, and volunteering. Within a community, they’re the behaviours that make it an attractive space to belong to, and which encourage its growth and/or development. It’s a central part of the value cycles that underpin the Communities of Practice model.

Web 3

Sanctions Should Target Bad Actors. Not Technology. Coinbase

Tl;dr: Coinbase is funding a lawsuit brought by six people challenging the US Treasury Department’s sanctions of the Tornado Cash smart contracts and asking the Court to remove them from the U.S. sanctions list. The lawsuit explains that OFAC exceeded its authority from Congress and the President in sanctioning open source technology, rather than sanctioning the bad actors who used it or the property of those bad actors.

Identity NOT SSI

Fido Passkey

Security pros say the cloud has increased the number of identities at their organizations

Experian Joins iProov and Deloitte in UK’s Digital ID Program

✨ Thanks for Reading! ✨

Subscribe \ Read More: newsletter.identosphere.net

Support this publication: patreon.com/identosphere

Contact \ Submission: newsletter [at] identosphere [dot] net


Northern Block

Northern Block is thrilled to be joining Accelerate@IATA 2022


Northern Block has been selected as one of 16 startups to join Accelerate@IATA’s 2022 cohort, a travel tech startup accelerator jointly run by the International Air Transport Association (IATA) and Plug and Play. 

The program is designed to give airlines and new entrants to the aviation industry an environment to connect and reach the pilot stage. Airline members and the value chain will have the opportunity to identify focus areas and select and engage with solutions providers/startups. 

Northern Block is part of the third theme of the program, “Customer as the Reference”. We will be participating with six other startups to help airlines create more personalized offers and orders. We are excited to be working among a great group of airlines such as Singapore Airlines, Emirates, Royal Air Maroc, Copa Airlines, Korean Air, International Airlines Group (IAG), and Japan Airlines.

As the aviation industry is in a post-COVID boom, there are many opportunities to leverage digital trust and self-sovereign identity to bring the industry to the next level.  

If you’re interested in exploring SSI use cases in Aviation and/or Transportation, schedule a call with us to discuss!

About Accelerate@IATA   

Accelerate@IATA is a travel tech startup accelerator designed to support airlines through open innovation. This program gives airlines and value chain partners access to startups and leading-edge technologies, to facilitate projects, pilots, and implementations between the partners and the startups.  

Accelerate@IATA plans to accelerate roughly 5 startups per batch, 20-30 startups per year. Every batch will be selected by IATA and member airlines based on industry priorities.

During the program, startups have access to IATA Subject Matter Experts and Plug and Play’s mentor network, giving them a true innovation mindset to help them improve their solutions and scale to meet industry demand. 

“We are super excited to join Accelerate@IATA,” said Khalid Maliki, COO at Northern Block. “IATA is spot on for our activities; it will help us access an incredible network of talented people within the aviation industry and two critical markets, Europe and MENA. We are looking to improve the overall travel experience through our SSI solutions.”  

About Northern Block

Northern Block is a fast-growing startup in the interoperable re-usable digital identity space, considered a global leader that facilitates digital transformation through self-sovereign identity (SSI) technologies, standards and principles.  

Through its leading Orbit Enterprise SSI Platform and Orbit Edge Mobile Wallet, Northern Block will make you successful in bridging traditional infrastructures to decentralized credential ecosystems with simple yet effective workflows and integrations.  

NB is involved in various facets within the three largest Canadian provinces’ digital trust infrastructure programs. Some of our work includes the deployment of the Pan-Canadian Verifiable Data Registry test network, development and open source contribution of multiple Aries RFCs and Indy-related open source code repositories, citizen wallet foundational work, citizen wallet design and implementation, Aries test harness and framework development, etc. 

What’s next

Northern Block will be participating in an internal PoC at IATA; more about this to come. Besides that, we are truly honoured to have been selected to pitch on stage at IATA’s World Financial Symposium (WFS), held in Doha, Qatar, from 19 to 22 September. This year’s theme is “Reshape Airline Resilience“.  

Media Contact

Daniela Gutiérrez            

Marketing and Communications Manager          

daniela@northernblock.io

The post Northern Block is thrilled to be joining Accelerate@IATA 2022 appeared first on Northern Block | Self Sovereign Identity Solution Provider.


auth0

Identity and Web3

How Auth0 is Investigating Decentralized Identity and Blockchains

Indicio

Decentralized Ecosystem Governance: Better, More Effective, and More Robust than Trust Registries

Trust registries are relics of centralized thinking that undo many of the benefits of decentralized identity. But there is a better way to implement governance in a verifiable credential ecosystem, and Decentralized Ecosystem Governance is on its way to becoming an open standard thanks to work at the Decentralized Identity Foundation.

By Sam Curren

Decentralized identity technologies are about making important data immediately actionable because that data can be trusted. The combination of decentralized identifiers (DIDs), credential schemas and definitions, cryptographic calculation, and tamper-proof distributed ledgers manages to add the missing verification layer to the internet and deliver the privacy and security features that are now essential in the marketplace.
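For readers new to these building blocks, a W3C Verifiable Credential is, at its core, a signed JSON document. A minimal, simplified example in the shape of the W3C data model (with placeholder DIDs and signature value, not real cryptographic material) looks like this:

```python
# Simplified W3C Verifiable Credential, shown as a Python dict. The DIDs
# and signature value are placeholders; a real proof is produced
# cryptographically with the issuer's key.

credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:issuer123",
    "issuanceDate": "2022-09-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder456",
        "degree": "BSc Computer Science",
    },
    "proof": {
        "type": "Ed25519Signature2020",
        "verificationMethod": "did:example:issuer123#key-1",
        "proofValue": "<signature bytes>",
    },
}

# A verifier reads the issuer DID, resolves it to the issuer's public key,
# and checks the proof against the credential contents.
issuer_did = credential["issuer"]
```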

But all these components do not fully negate the need for humans to make decisions about who can be trusted as an issuer of data—and how those decisions are implemented in the workflow of a decentralized, verifiable credential ecosystem.

Many have invoked the idea of a Trust Registry as a service to solve this problem. At Indicio, and now at the Decentralized Identity Foundation, we are turning this idea into a file — a serialized, downloadable collection of all the information relevant to an ecosystem.

The idea of a Trust Registry as a service raises many problems, both technical and governmental. Examples of the former include adding friction to a system of verification whose value is that it is maximally frictionless: If each verification needs to ping a Trust Registry and then wait for approval, the system slows down.

Then we have the issue of offline functionality due to poor internet speeds or outages — how does that work? And we have governance problems: Who governs the Trust Registry so that we can trust it to be inclusive? And how much will this cost? In sum, Trust Registries as a service create bottlenecks in identity governance and risk becoming toll booths on information flows.

These concerns were not theoretical. In the work Indicio did for SITA and the Government of Aruba, it was clear that organizations and governments want to control their verifiable credential ecosystems; sovereign authorities do not want services to do this that are outside their control.

Indicio developed Machine Readable Governance to manage this issue. In essence, keep the Trust Registry, but make it a file in a machine-readable format that is issued by a governance authority for a particular ecosystem and propagates through the agent software in that ecosystem. No bottleneck. No decrease in speed. It can be rapidly updated. And, because the file is cached, there is offline functionality.
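While the file format is still being standardized, the idea can be sketched in a few lines. The structure and field names below are illustrative assumptions, not the emerging DIF format: each agent caches the governance file published by the ecosystem authority and checks issuers locally, even offline, with no per-verification call to a central registry:

```python
# Illustrative sketch of machine-readable governance. The governance
# authority publishes a file; each agent caches it and checks issuers
# locally. Field names are invented for illustration only.

cached_governance = {
    "ecosystem": "example-travel-ecosystem",
    "version": 3,
    "participants": {
        "issuers": ["did:example:health-authority"],
        "verifiers": ["did:example:airline"],
    },
}

def trusted_issuer(governance, issuer_did):
    """Local, offline-capable check against the cached governance file."""
    return issuer_did in governance["participants"]["issuers"]

ok = trusted_issuer(cached_governance, "did:example:health-authority")
```

Updating the ecosystem's rules then means publishing a new version of the file, which propagates to agents as they come online.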

We weren’t the only ones thinking along the same lines. Other developers and organizations were working independently on the same solution, including Gabe Cohen with TBD and Daniel Buchner at Block. It made sense to collaborate and establish a community standard, and provide code in open source repositories.

Led by Indicio’s Mike Ebert, this is what is now happening in the Claims and Credentials working group at DIF. It’s called Decentralized Ecosystem Governance—and progress is rapid. As part of this work, we have developed functioning systems that use all the elements below and are involved in efforts to standardize this approach with other interested companies in the space.

Here is a quick summary of the work and status:

Participant Lists (nearly done): This is the portion of governance that designates the participants of an ecosystem — the Issuers and Verifiers involved in credential flows.

Delegated Participants (next under discussion): This allows the involvement of parties who are not explicitly listed as participants but instead present a credential issued by a listed participant. This enables credential-chain support and allows a variety of dynamic and large ecosystems to benefit from this technology.

Action / Workflows (not yet discussed): Enables specific interactions for smooth UX and confident interaction with credentials. Will work with Presentation Definitions and Credential Manifests.

Presentation Definitions (published): Enables the specification of schemas and other details of credential interactions.

Credential Manifest (published): Enables the specification of workflows related to credential issuance.

Summary: Decentralized Ecosystem Governance does five powerful things:

1. It provides a low-cost, efficient, and error-free way to choreograph the rules for interaction in a jurisdiction.

2. It makes a governance framework portable: instead of a centralized trust registry service verifying issuers in real time, a governance authority publishes a governance file that propagates to all the agents in an ecosystem.

3. It enables the appropriate governance authority, such as a government or health authority, to directly implement and easily update the applicable rules for how individual data is used.

4. It can manage interactions within and across jurisdictions and systems, allowing each governance authority to enact the rules it decides are important.

5. It is easily updated.

Decentralized Ecosystem Governance makes verifying data an easy-to-play game of red light/green light. And, importantly, it decentralizes governance to the appropriate authorities.

The post Decentralized Ecosystem Governance: Better, More Effective, and More Robust than Trust Registries appeared first on Indicio.


Spruce Systems

An Identity Wallet Bill of Rights - Starting With the Mobile Driver License

Spruce’s continued mission is to let users control their data across the web, whether it’s web2, web3, or beyond. This also applies to credentials issued by existing entities, such as the Mobile Driver License (mDL) issued by motor vehicle authorities across the world.


There is now a global standard, ISO 18013-5, that describes how we can represent a digital driver’s license on someone’s smartphone, and the whole end-to-end process for presenting it in person. However, the methods to provision them, refresh them, or send them online are still under active discussion within ISO working groups. In the US, TSA and other organizations have already committed to using this standard, so it impacts DMVs that want their mobile driver’s licenses to also be useful for travel at the airport. In other countries, it is beginning to be adopted as well.

The EFF, ACLU, and EPIC have recently called out several concerns regarding implementations of mDLs in their comments on a Department of Homeland Security request for comments, and as implementers of open-source digital identity software, we have additional concerns that we felt it necessary to call out too:

User choice for identity wallets may be restricted to just a few companies due to anti-competitive policies for device API access. Hardware manufacturers and operating system vendors use internal APIs that third-party developers cannot access, meaning wallets from app developers will not be able to offer a competitive user experience or security model on a level playing field. Practically, this could funnel users to just the operating-system-provided identity wallets and extinguish the possibility of increased user choice when handling their most critical data. Is a large tech company the right decision maker for how state-issued digital identities can be used on billions of devices, especially in light of many recent antitrust allegations? For example, gatekeeping of verifiers already occurs with Apple’s PassKit, which restricts the verifier set to Apple’s approved list of developers. In the physical world, we don’t have to ask Apple to inspect a driver’s license. This looks a lot like a policy decision being made on behalf of users who have mobile driver’s licenses. When was the last time your leather wallet told you that you couldn’t use something how you wanted?

The EFF has further recommended W3C Verifiable Credentials due to their history of being developed in public as an open standard. We agree. Vendors such as Microsoft, Ping Identity, Workday, and Spruce have already adopted them in pilots and production use cases. Furthermore, UL (Underwriters Laboratories), an author of ISO 18013-5, has described how to interoperate Verifiable Credentials with the ISO-described data model and protocols. We look forward to continuing our collaborations here, such as demonstrating interoperability with TBD:

Spruce and TBD Demonstrate Decentralized Identity Interoperability We have heard a lot about Decentralized Identity in the last few years. For both Spruce and TBD, Decentralized Identity and user control are at the center of both our missions, and we are excited to announce that we achieved another milestone towards interoperability.
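To make the interoperability idea concrete, here is a rough sketch of what ISO 18013-5 mDL data elements could look like when carried inside a W3C Verifiable Credential. The top-level field names follow the W3C VC Data Model and the data element identifiers use the ISO 18013-5 `org.iso.18013.5.1` namespace, but the credential type name, the DIDs, and all values are placeholders for illustration; a real credential would also carry a cryptographic proof generated by a signing library.

```python
# Illustrative shape only: an ISO 18013-5 mDL payload expressed as a
# W3C Verifiable Credential. Issuer/holder DIDs and the credential
# type name are hypothetical placeholders; no proof section is shown.
import json

mdl_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "Iso18013DriversLicenseCredential"],
    "issuer": "did:example:issuing-dmv",      # placeholder DID
    "issuanceDate": "2022-09-27T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:holder",           # placeholder DID
        # Data element identifiers from the ISO 18013-5 namespace:
        "org.iso.18013.5.1": {
            "family_name": "Doe",
            "given_name": "Jane",
            "birth_date": "1990-01-01",
            "document_number": "D1234567",
        },
    },
}

print(json.dumps(mdl_credential, indent=2))
```

The point of such a mapping is that a single issued credential could then be presented either through ISO 18013-5 in-person flows or through W3C VC web flows.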

We think that this can culminate in an Identity Wallet Bill of Rights that describes how a third-party wallet provider can participate on an even playing field, allowing for an entire ecosystem instead of 2-3 vendors and thereby increasing user choice. This way, users can enjoy a wider wallet selection built on open protocols and standards, ensuring that a market can form to cater to specific use cases. Identity is far too important and critical a component to leave to two or three companies to dominate.

When was the last time you had to ask a tech company for permission to use your passport, with the implication that they could lock you out entirely? If things continue without an open ecosystem for identity wallets, this could be the unfortunate reality. Digital identity is still in its infancy, and it’s not too late to start now.

In our efforts to help lead this charge, we are sponsoring a community project led by Kaliya Young, a longstanding member of the identity community, to help create a level playing field for identity wallets without compromising on security. At Spruce, we believe that users should have a choice in which wallet manages their most critical data, and that two or three companies should not dictate the most meaningful and high-stakes digital interactions for billions of people.

Towards this, we are working on a set of requirements that we believe would enable wallet providers to achieve the same levels of user experience, functionality, and security enjoyed by wallets with privileged internal APIs not currently available to third-party developers. We invite you to join this community effort so that together we can ensure that wallets compete on a level playing field and that users have true choice. If we get this right, we have a real opportunity to get it right for users across identity, payments, and beyond in the digital era.

We will continue to pursue our efforts in the decentralized identity ecosystem, and always champion architectures that put users first.

Check out this link for the full community project announcement:

Where the W3C Verifiable Credentials meets the ISO 18013-5 Mobile Driving License: New Community Project Sponsored by Spruce (Identity Woman in Business on Medium)

Spruce lets users control their data across the web. Spruce provides an ecosystem of open source tools for developers that let users collect their data in one place they control, and show their cards however they want.

If you're curious about integrating Spruce's technology into your project, come chat with us in our Discord:


Ocean Protocol

Ocean Protocol and Dimitra announce Phase 2 of the Data Challenge: Algorithms, Analytics…

Ocean Protocol and Dimitra announce Phase 2 of the Data Challenge: Algorithms, Analytics, Narratives & Reports

The strategic phase will incentivise data analysis and the building of algorithms to harness the power of agricultural data for global change.

Singapore, September 12th: Ocean Protocol, the Web3 platform to unlock data services for AI and business innovation, has launched the second phase of the Ocean and Dimitra Data Challenge — an initiative guided by the mission to gather data-driven insights and encourage algorithm building to solve complex challenges in agriculture. From a prize pool of $10,000, the rewards for this phase are: 1st place: $4,000, 2nd place: $3,000, 3rd place: $2,000, Community Award: $1,000 (payable in OCEAN + DMTR).

The Phase 2 datasets are live on the Ocean Market and the entry submission deadline is September 30 at midnight UTC. To enter the competition and access complete bounty details, go to the dedicated Questbook page.

Participants must at least perform the graphical methods outlined in the challenge description to be considered for the competition. They are free to augment the challenge dataset with any open-source real-world dataset of their choice. Bonus points will be awarded for publishing the referenced data on the Ocean Market.

The partnership between Dimitra’s agricultural software and Ocean’s Web3 data sharing and monetization capabilities is focused on driving the next generation of agricultural solutions using data and accelerating the adoption of Web3. In line with this vision, the Ocean and Dimitra Data Challenge was launched in July 2022 with the aim to maximize yields and mitigate operational losses for farmers across the globe with relevant, valuable data-driven insights and algorithms.

Ocean Founder Bruce Pon said:

“The global agriculture industry is facing its biggest changes and data-driven productivity improvements are the need of the hour. With our data challenge, we have all the tools in place to build relevant context around farming data, build models to improve outcomes and influence the direction of the agriculture sector for the better.”

Jon Trask, CEO of Dimitra, commented:

“As we enter Phase Two of our data science challenge we are looking forward to testing the data scientists’ use of a variety of techniques which may reveal important insights into the growth of soybeans based on ten years of historical data. I’m interested to understand the trends and changes that are affecting performance and possible insights into the nature or impact of recent climatic events that may influence crop performance. Hidden knowledge is disguised in the data and I hope to see how this group of data scientists reveal the opportunities hidden behind the mask.”

As part of the Ocean-Dimitra Data Challenge Phase 2: Algorithms, Analytics, Narratives & Reports, participants will be presented with a dataset containing the crop yield data for soybeans in the 46 districts of Madhya Pradesh state, India, and MODIS satellite data on normalized difference vegetation index, leaf area index, evapotranspiration, land surface temperature, and rainfall. The challenge is to study the target soybean yields against the satellite data, rank the factors that affect yields, and build a simple model to predict yields from those features. Participants will also identify and explain any trends in the yield data over a 10-year period. The goal of this phase is to gain insights into the nature and impact of recent climatic events that may have influenced crop performance.
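As a toy illustration of the kind of "simple model" the challenge asks for, the sketch below fits an ordinary least squares line relating one satellite feature (NDVI) to district yield and uses it to predict an unseen value. All numbers are invented for illustration; real submissions would use the actual challenge datasets and likely more than one feature.

```python
# Toy sketch: one-feature ordinary least squares (yield vs. NDVI).
# All data values below are made up for illustration only.

def fit_line(xs, ys):
    """Closed-form OLS for y = a*x + b with a single feature."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical district-level (NDVI, yield in t/ha) pairs.
ndvi = [0.42, 0.48, 0.51, 0.55, 0.60, 0.63]
soy_yield = [0.9, 1.1, 1.2, 1.3, 1.5, 1.6]

a, b = fit_line(ndvi, soy_yield)
predicted = a * 0.58 + b  # predict yield for an unseen NDVI value
print(f"slope={a:.2f}, intercept={b:.2f}, predicted={predicted:.2f}")
```

Ranking factors could then follow the same pattern: fit each candidate feature separately and compare how well each explains the yield variation.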

A panel of evaluators from Ocean and Dimitra will independently review and rank submission entries, selecting the 1st, 2nd, and 3rd place winners. The selection will be announced publicly, and the community will be invited to vote for the Community Choice award winner. All valid entries submitted will receive an award of 250 $OCEAN tokens.

About Ocean and Dimitra Data Challenge

The Ocean and Dimitra Data Challenge is part of the broader Ocean Data Bounty program — a strategic initiative incentivising data-driven insights and the building of algorithms to solve complex business challenges.

The Ideation phase focused on garnering valuable ideas on how agricultural datasets in the Ocean Market could be utilized to maximize yields and crop quality while minimizing risks for farmers in the state of Madhya Pradesh, India. A panel of team judges chose the top three winners for their submissions on crop yield prediction algorithms, and key factors influencing yield and profitability through data analysis. The Community Award and 5 honourable mentions were also conferred based on the level of innovation, the value of opportunity, feasibility, and completeness of ideas.

About Ocean Protocol

Ocean Protocol is a decentralized data exchange platform spearheading the movement to unlock a New Data Economy, break down data silos, and open access to quality data. Ocean’s intuitive marketplace technology allows data to be published, discovered, and consumed in a secure, privacy-preserving manner. By giving power back to data owners, Ocean resolves the tradeoff between using private data and the risks of exposing it.

About Dimitra

Dimitra’s mission is to place technology in the hands of millions of small farmers around the world. By doing so, Dimitra aims to enhance productivity and, therefore, the lives of farmers, improve food safety, and enable greater food security globally.

In 2021 it set itself two additional objectives. One is to advance AgTech innovation and the other is to secure operational grants for developing countries around the world to enable participation in the Dimitra ecosystem.

Ocean Protocol and Dimitra announce Phase 2 of the Data Challenge: Algorithms, Analytics… was originally published in Ocean Protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

Nov 22, 2022: You Deserve a Better Security Testing Experience

To remain competitive, businesses are embracing digital transformation, adopting cloud services and agile software development. But this is creating opportunities for attackers because most organizations lack the skills, knowledge, and expertise to match attackers’ ability to find and exploit vulnerabilities. There needs to be a shift in the way organizations conduct security testing.

Oct 13, 2022: Implementing Modern and Future-Proof PAM Solutions

Privileged Access Management (PAM) is changing, driven by the move of most businesses from on-prem IT applications and infrastructure to the cloud, resulting in multi-cloud, hybrid IT environments. This has resulted in a proliferation of privileged identities that need to be managed.

The Importance of Standards in the IT Security Industry


by Raj Hegde

Is the security sector served well by the standards, regulations, and frameworks we have?

The security industry has been around for a good few years, and we've understood the importance of standards. If you look at the way that standards, frameworks, and regulations work, it does take quite some time for them to come into place. In the early days, we had standards that may have been around specific technologies, but they were quite general; for example, the standards around wireless encryption. We have standards around technologies, standards around products, and standards around organizations. For example, BS 7799, a long time ago, was what eventually became known as the ISO 27001 series, or the start of it.

The standards came first; then, as time went on, we had more frameworks coming into play. For example, the Cloud Security Alliance has some frameworks, and ISACA itself has a couple of great frameworks. You also find that, over time, different regulatory bodies have created their own regulations; the finance sector has often had its own. Now, the challenge with all of these things is that because they are trying to cover so many things and be everything to many, many people, especially those who sit on these committees, it takes a long time to negotiate them. And because it takes a long time to negotiate them, it also takes a long time to agree on them and get them out there. There are a lot of standards out there, and that doesn't mean we've covered all the things that we need to cover in security. So they are evolving as time goes on, as the ISO 27000 series has shown, because you've now got different components of it. We are beginning to get to the point where some of the technologies are converging in a direction that's quite useful, and you're beginning to see some of those convergence aspects as well; IoT is a very good example of that.

I'd say, yes, there are a lot of standards, regulations, and frameworks out there, plenty to choose from, and the security industry is definitely well covered. But that doesn't mean that we've got everything covered. It's well covered in that we've got a good set of standards already that we can build on to meet the changing needs that we may have.

What role do standards and frameworks have in a world where threats and risks are forever changing?

We're in an environment where technology is changing very fast, and responses to technology are a little slow, because in many respects we can only really work in retrospect, looking at where the threats and risks are. We tend to think backward rather than forward quite often, partly because we have limited budgets: we can't go around trying to cover every risk, so we have to cover the biggest ones, looking at the likelihood that things are going to happen and the impact those things would have. Because there are so many variations in the threats out there, we have to figure out where we want to spend our money and think about the changes we need to respond to. Our budgets need to be forward-thinking, yet at the same time our technologies are always backward-looking: we're trying to think about what's happened, where it's happened, why it's happened, and what we need to do next. In terms of the standards, many of them came about some time ago, and they were almost made for yesteryear's threats. My session looks at what they are and why they were good for their time. Today we're in a much, much faster-changing world, and if our standards aren't changing just as fast as the threats, they begin to feel more like checklists. And as good as a checklist may well be, it's nothing more than a checkbox exercise.
And that's one of the things that many security professionals often say about standards: many organizations out there have been breached even though they comply with certain standards, frameworks, and regulations. That's partly because they're complying with things that were written five, six, seven, eight years ago in many cases. What we need to look at is how we can update them on shorter cycles to reflect some of the threats that are emerging out there.

What will you be covering in your session?

In that keynote session, I'll be giving a couple of examples of standards and frameworks and looking at how they meet today's environment, pulling them to pieces intentionally. I know some of the ones I've chosen are due to be changed or are in the process of being changed, but the fact remains that standards do take a long time to change. We need to figure out how to make the best use of them in a way that's more than just a checkbox exercise. In many respects, for organizations that may not have any security in place, they're a brilliant starting point. Having a good starting point is great, but if that starting point isn't going to take you anywhere further, you need to think about the sorts of threats your organization is likely to face. And I know that many organizations out there do have to comply in some respects with PCI DSS, HIPAA, ISO 27001, and so on, so I've been looking at those as examples of what they were created for and how quickly they went out of date.


Affinidi

What is the Affinidi Console?

A Complete Guide to the Affinidi Console

Affinidi Console — A Sneak Peek

Affinidi Console is a one-stop shop that provides a suite of tools that make it easy for builders to create personalized and privacy-preserving applications. It provides data control and ownership to end-users and empowers you to leverage the advantages of a decentralized data ecosystem.

With Affinidi Console, you can access many services that work on top of Affinidi’s APIs. Also, you can unlock and verify fully portable Verifiable Credentials (VCs) — tamper-proof and W3C-standard digital credentials that can be verified cryptographically. So far, Affinidi Console consists of nine services/components:

Schema Manager, Bulk Issuance, Registry, Cloud Wallet, Edge Wallet, Paper Check, Digital Check, Rules Engine, and Consent Manager.

Let’s take a detailed look at each of these components.

Schema Manager

Schema Manager is a tool to manage, create, clone, update, and reuse credential types and existing schemas.

It can come in handy for developers, product managers, or engineers who are tasked with building an SSI application and face the problem of ensuring trust between actors (mainly Issuers and Verifiers).

Bulk Issuance

Bulk Issuance is for you if you’re someone who issues VCs frequently and in large numbers. Our Bulk Issuance component allows you to issue VCs in batches, based on the contents of a CSV file and structured according to the selected VC schema.

Bulk issuance saves you a lot of time and effort.
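To illustrate the general batch pattern (this is not Affinidi's actual API; the CSV columns, schema URL, and DIDs below are invented), a bulk issuance step essentially turns each CSV row into the credential subject of one VC under a chosen schema:

```python
# Hypothetical sketch of batch credential issuance from CSV data,
# showing the general pattern only: one CSV row -> one credential.
import csv
import io

CSV_DATA = """name,email,course
Alice,alice@example.com,Physics
Bob,bob@example.com,Chemistry
"""

def issue_batch(csv_text, schema_url, issuer_did):
    credentials = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        credentials.append({
            "@context": ["https://www.w3.org/2018/credentials/v1"],
            "type": ["VerifiableCredential"],
            "credentialSchema": {"id": schema_url},  # the selected VC schema
            "issuer": issuer_did,
            "credentialSubject": dict(row),          # one CSV row per VC
        })
    return credentials

batch = issue_batch(
    CSV_DATA,
    "https://example.com/schemas/course-cert",  # hypothetical schema URL
    "did:example:issuer",                        # hypothetical issuer DID
)
print(len(batch), batch[0]["credentialSubject"]["name"])
```

A real service would additionally sign each credential and deliver it to the holder's wallet, but the row-to-credential mapping is the core of the batch step.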

Paper Check

Paper Check enables you to extract information from documents that cannot be digitally verified (paper certificates, PDF reports, etc.). This tool standardizes the extracted data into defined schemas and reviews the authenticity of the documents. (Note: We can’t fully guarantee the authenticity of paper documents).

Digital Check

Digital Check is similar to Paper Check but extracts information from digital documents. Specifically, you can use this product to extract information from digitally verifiable documents, standardize the extracted data into defined schemas, and verify the authenticity of these documents.

Rules Engine

Rules Engine is a service that allows customers to check the data/documents submitted by end-users against their business rules for further processing and decision-making. These business rules are configurable using a web-based frontend.
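The rules-engine idea can be sketched in a few lines: business rules expressed as data, applied to a submitted document to decide how processing should proceed. The rule format, field names, and outcome labels below are invented for illustration and are not Affinidi's actual configuration format.

```python
# Minimal sketch of a rules engine: rules as data, applied in order,
# returning the first failing rule's action or "approve" if all pass.

RULES = [
    {"field": "age", "op": "gte", "value": 18, "on_fail": "reject"},
    {"field": "country", "op": "in", "value": ["SG", "DE"], "on_fail": "manual_review"},
]

OPS = {
    "gte": lambda actual, expected: actual >= expected,
    "in": lambda actual, expected: actual in expected,
}

def evaluate(document, rules):
    """Return the first failing rule's action, or 'approve'."""
    for rule in rules:
        actual = document.get(rule["field"])
        if actual is None or not OPS[rule["op"]](actual, rule["value"]):
            return rule["on_fail"]
    return "approve"

print(evaluate({"age": 25, "country": "SG"}, RULES))  # approve
print(evaluate({"age": 16, "country": "SG"}, RULES))  # reject
```

Because the rules are plain data, a web-based frontend can edit them without code changes, which is the point of making rules configurable.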

Registry

Affinidi Registry is a secure place to store a list of trusted and verified data providers for your organization.

Cloud Wallet

Affinidi’s Cloud Wallet is a centralized storage solution for storing and sharing verifiable credentials. This end-to-end solution can also fit well into decentralized environments.

Edge Wallet

Edge Wallet is a decentralized wallet storage service that stores all data on a customer’s device.

Consent Manager

Affinidi’s Consent Manager is a privacy and data protection solution that enables application owners and organizations to manage end-customers’ consent through VCs. It is a plug-and-play solution that can integrate with your existing or future applications and scale well with your business growth.

Stay tuned to learn more about these products.

In the meantime, sign up for early access to Affinidi Console today to try these products.

For further questions, reach out to our Dev Evangelist, Marco Podien.

Join our community today

Get conversations going with #teamaffinidi and the community on Discord
Follow us on LinkedIn, Twitter, and Facebook
To get the latest updates on what’s new at Affinidi, join our mailing list
Interested in joining our team? Start building with the Affinidi tech stack now
For media inquiries, please get in touch with Affinidi’s PR team via pr@affinidi.com

What is the Affinidi Console? was originally published in Affinidi on Medium, where people are continuing the conversation by highlighting and responding to this story.

Sunday, 11. September 2022

Continuum Loop Inc.

Premature Standardization & Interoperability


TL;DR: While I applaud the efforts to create “interoperability” in the SSI, decentralized identity, authentic data, and whatever else you want to call this domain, it’s premature to think we have real “standards.” We have, instead, premature standardization and tiny pockets of very focused interoperability.

NOTE: This is an early document that will likely evolve, in that more depth may be provided. The content may be refactored enough that what this document looks like in mid-2022 could be wildly different a short time later.

Here’s my premise: we don’t have standards or interoperability, at least not as people really need them. We have been through a process that is powerful and good, but what we have is what I call “premature standardization.” It’s a great start, but nowhere near where things will end up.

There are many reasons we haven’t reached “interoperability,” and I believe we are far away from deep interop.

First, we have attempted an immaculate-conception approach to interoperability: we think we can define what the baby will be without doing the hard and messy work of really creating something.

You can’t jump the steps required to generate real interoperability. The reasons are deeper than I will go into right now, but interoperability is born out of the desire for systems to interact. Sure, we have that notionally: the various groups attacking this problem – the Trust Over IP Foundation, the Decentralized Identity Foundation, the W3C CCG, and other places – are working to create the multiple things that will be needed in the complex symphony of interoperability that we require.

Here’s the starting point for my premise – you can’t skip steps.

You need your industry to understand things at a very, very deep and intimate level: what is really needed, as opposed to proposing ideas that haven’t been used in earnest for long enough?

There are two key dimensions that you need to understand.

Industry/Ecosystems – Are you talking about a single ecosystem or moving between wildly different ecosystems?
Technical Stacks – Are you talking about interoperability between systems using the same stack or different stacks?

These two dimensions need to be handled separately.

Each dimension – the Ecosystems and the Stacks have different needs.

Different ecosystems will have competing requirements. Let’s use SSI as an example. Let’s compare two different industries to see their different needs.

Industry One:

– credentials need to be small, and the devices providing them have a severe limit on power draw

– the credentials are used in total (i.e. their full data payload is always at play)

– the credentials must work when major parts of the system are disconnected or offline for extended periods

Industry Two:

– high-value credentials that contain very private information are key

– verifiers of the information are legally required to ask only what is necessary for their query

– offline is essential and, in some cases, critical, but the system is not expected to run offline for weeks on end

This is a somewhat artificial but not unreal scenario. Compare the IoT industry with the travel industry; you will see that they are pretty distinct. There certainly are similar patterns, though.

Let’s similarly look at different stacks.

Stack One:

– used across various integration cases – data from databases, sensors, governments, etc.

– the system is message-centric

– the system is meant to run on mobile and low-power devices

Stack Two:

– broadly created to meet integration needs with IAM systems (e.g. OIDC)

– used for data sharing in an API web-centric context

Both stacks above are valid and meet a particular set of needs.

How can these Ecosystems and Stacks be interoperable?

They do this in stages – and IN ORDER.

They all start in a Single Ecosystem and Single Stack. We’ve seen many of these.

THEN they start crossing into the Multiple sides – but they really only add one of the dimensions at a time. A Single Stack & Single Ecosystem approach can do one of the following at a time:

Add Multiple Ecosystems (Path 1) – given various business reasons, using the same stack and applying the approach to multiple industries expands the breadth and, significantly, the depth of knowledge of where the stack meets needs or falls short.
Add Multiple Stacks (Path 2) – in the same ecosystem, adding other stacks into play teaches where different approaches work better and where there is no need to duplicate capabilities.

The following two paths are more about pressure from outside, and both get pulled into the broader Multiple Ecosystem + Multiple Stack world.

The critical thing to understand here is that an Interoperability Compliance Suite is required for either of the following paths to be viable. This compliance suite sets the conditions by which competitors can point to a reference as their definition of interoperability.

Depending on which path (1 or 2) was taken, we have two courses:

A Multiple Stack + Single Ecosystem approach adds support for Multiple Ecosystems (Path 3) – the compliance suites here are more about context, ecosystem-specific semantics, and syntax.
A Single Stack + Multiple Ecosystem approach adds support for Multiple Stacks (Path 4) – this path requires Interoperability Compliance Suites so that competing stacks can prove interoperability rather than resort to finger-pointing.

I am unaware of any real interoperable solutions that circumvented these paths. Further, each transition is expensive in terms of resources (people, time and funds), so the market pulls it. This is a “you can’t push a rope” scenario.

Visit our blog, follow us on social media, and subscribe to our newsletter to stay up-to-date and learn more.


The post Premature Standardization & Interoperability appeared first on Continuum Loop Inc..


KuppingerCole

Analyst Chat #140: Debunking the Myth of the Human Being the Biggest Risk in Cybersecurity

It is always easy to blame people, i.e. users, for data breaches and ransomware attacks. But is that really still true today? Martin Kuppinger and Matthias discuss this cybersecurity myth and finally defend users against unjustified accusations. Meet us at the Cybersecurity Leadership Summit!




Friday, 09. September 2022

auth0

Sign-in with Decentralized Identifiers with Dock Labs

Dock Labs unlocks Decentralized Identifiers and Verifiable Credentials for your Auth0-powered App.

1Kosmos BlockID

What Is Synthetic Identity Theft? New Path For Fraud?


Synthetic identities can easily be mistaken for real identities and can wreak havoc on companies that fall victim. So how can you differentiate between the two?

What is synthetic identity theft? Synthetic identity theft, or synthetic identity fraud, happens when a social security number is stolen and combined with fake personal information. The number is then used to open accounts, make purchases and steal money.

How Does Synthetic Identity Fraud Work?

Synthetic identity fraud (SIF) is a new form of identity theft that leverages modern technology and the realities of a data-driven society to take advantage of individuals and organizations unprepared to address the complex problems of data security. 

Synthetic identity theft tends to strike when we least expect it. While individual consumers are rarely the direct victims, synthetic identities can significantly impact the organizations that hackers deceive.

What Is a Synthetic Identity?

In modern cybersecurity, “digital identity” refers to the collective information used to represent an individual in online systems. Depending on the organization, service, and applications, this individual identity might include many different pieces of information–account numbers, phone numbers, address information, unique identification numbers, payment or credit information, Social Security Numbers, etc.

A synthetic identity, as the name suggests, is “artificial.” A hacker combines a piece of legitimate data with fabricated data to construct a synthetic identity that can be used for nefarious purposes.

While traditional forms of identity theft rely on hackers stealing partial or whole sets of information representing a real identity connected to a real person, synthetic identities are fake credentials anchored with a piece of accurate information to help bypass security.

These synthetic identities are unique in that they seem like real identities, and they may fool financial or other institutions that see the legitimate piece of information and assume the whole identity is real.

Generally speaking, there are two approaches hackers take in making synthetic identities:

Manipulated Synthetics: A hacker takes real information, most often a social security number, and builds a synthetic identity around it with real or altered elements such as phone numbers, addresses, and fake names. They may even make small changes to the SSN if there is any potential for reusing it elsewhere.

Manufactured Synthetics: If a hacker can get hold of fake or non-existent SSNs (for example, from a range of numbers used for randomly assigning new numbers), they can create a fully fabricated identity. 
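Neither approach is easy to spot from a single application in isolation; fraud teams typically look for the same SSN surfacing under conflicting identities. The article doesn't prescribe a detection method, but a minimal sketch of that heuristic, with hypothetical field names, might look like this:

```python
from collections import defaultdict

def flag_shared_ssns(applications):
    """Group applications by SSN and flag any SSN that appears with
    more than one distinct (name, date-of-birth) combination."""
    by_ssn = defaultdict(set)
    for app in applications:
        by_ssn[app["ssn"]].add((app["name"], app["dob"]))
    return {ssn for ssn, identities in by_ssn.items() if len(identities) > 1}

apps = [
    {"ssn": "123-45-6789", "name": "Jane Roe", "dob": "1990-01-01"},
    {"ssn": "123-45-6789", "name": "John Fake", "dob": "1975-06-15"},  # same SSN, different identity
    {"ssn": "987-65-4321", "name": "Alice Real", "dob": "1988-03-10"},
]
print(flag_shared_ssns(apps))  # {'123-45-6789'}
```

Real fraud systems combine many such signals (credit-file age, address velocity, issuance-date checks) rather than relying on one rule.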

The challenge of synthetic identities is that, for the most part, they are tough to track. Hackers using synthetic tactics may sign up for services (a credit card, a buyer’s account) and use them normally for years, building reputation and available credit. Then, once enough credit is available, they will spend the money, burn the account, and vanish. 

Unlike traditional fraud, where consumers are victims, financial institutions are usually the biggest targets for synthetic fraud. The institutions that find themselves victims of synthetic identity fraud may have little or no recourse. Unless the hackers left a paper trail of their activities, the synthetic identity is just a hollow figure in their system that was never real.

Why Are Synthetic Identity Threats More Common?

The emergence of synthetic identity threats isn’t a random phenomenon; rather, it is a natural attack vector that has come about due to how we approach digital information and security. 

Some factors that have played a role in the rise of synthetic identity hacks include:

Credit Card Security Improvements: It’s simply not as easy as it once was to steal credit card information. New developments in card protection, compliance requirements, and the ability to track and reverse fraudulent transactions have made credit card theft not impossible, but trickier than it once was. This is due, in part, to the many security measures that now protect digital identities.

Increased Online Activity for Payments and Benefits: Credit security has become so advanced due, in part, to the drastic expansion of online eCommerce and digital storefronts. Customers increasingly rely on digital-only or hybrid shopping experiences for goods and services, and these storefronts are almost exclusively Card Not Present (CNP) transactions, which makes them ripe for fraud.

How Can You Recognize Synthetic Identity Theft?

Synthetic identity theft is a long game: hackers will often wait years, building up identities to steal tens of thousands of dollars and leaving businesses and financial institutions holding the bag. 

Businesses and consumers/employees must stay vigilant in looking for potential synthetic fraud. 

Some tell-tale signs include:

Abnormal Credit Reports: If you work with an organization that checks consumer credit reports (or are a consumer looking at a credit report), you can see if any unusual activity has occurred related to your SSN. More importantly, fraudsters can steal SSNs belonging to children, who technically do not have credit, which means that a report could be tied to a synthetic account using an existing SSN connected to a minor. Coordinating credit checks for these numbers can help you see if something strange is going on.

Social Security Statements: Social Security Statements report when payments are made into the Social Security fund as part of withholding taxes. If a business or user notices discrepancies in payments to Social Security, it could mean a fraudster has used false credentials to gain employment.

Strange Bills and Multiple Addresses: If you check any official documentation, online bills with lenders and credit companies, or other organizations, you may find new, strange addresses included in those statements. You may even start receiving strange bills in the mail. These signs could mean someone is using some of your information to build a synthetic identity.

How Can You Prevent Synthetic Identity Threats?

The best defense against threats from synthetic identity fraud is, for the most part, tied to proactive cybersecurity approaches related to protecting user identity information during and after authentication and authorization. 

Some of these approaches include:

Strong Identity Management: Your organization should have tough identity management and authentication security. This includes using MFA for all authentication purposes, having superior security for stored authentication credentials and identity data, and avoiding the pitfalls of centralized identity management, like honeypot databases or a lack of proper data obfuscation.

Behavioral Biometrics: Agencies that manage identities can utilize behavioral biometrics to connect the dots between different pieces of those identities to determine potentially fraudulent activity. As their name suggests, behavioral biometrics identify patterns of behavior between pieces of information and tie them together with assurance measures like advanced biometrics for added, preventative security.

Holistic Cybersecurity: Identity management and data protection must extend across your organization. This means having a comprehensive understanding of data, identity management, authentication, and access management across all relevant systems and across customer and employee identity journeys.

Protect Employee IDs Against All Synthetic Threats with 1Kosmos

Identity fraud and phishing attacks are two of the biggest threats most enterprises face today. It’s increasingly common for attackers to launch mass email campaigns, steal the identities of team members, and use them to wreak havoc inside and outside an organization. 

With 1Kosmos BlockID, you can leverage decentralized and strongly-secured identity management to support authentication resistant to breaches and compliant with rigorous national identity and authentication standards. 

With 1Kosmos, you get the following identity protection and authentication features:

SIM Binding: The BlockID application uses SMS verification, identity proofing, and SIM card authentication to create solid, robust, and secure device authentication from any employee’s phone.
Identity-Based Authentication: We push biometrics and authentication into a new “who you are” paradigm. BlockID uses biometrics to identify individuals, not devices, through credential triangulation and identity verification.
Cloud-Native Architecture: Flexible and scalable cloud architecture makes it simple to build applications using our standard API and SDK.
Identity Proofing: BlockID verifies identity anywhere, anytime, and on any device with over 99% accuracy.
Privacy by Design: Embedding privacy into the design of our ecosystem is a core principle of 1Kosmos. We protect personally identifiable information in a distributed identity architecture, and the encrypted data is only accessible by the user.
Private and Permissioned Blockchain: 1Kosmos protects personally identifiable information in a private and permissioned blockchain, encrypts digital identities, and makes them accessible only by the user. The distributed properties ensure there are no databases to breach or honeypots for hackers to target.
Interoperability: BlockID can readily integrate with existing infrastructure through its 50+ out-of-the-box integrations or via API/SDK.

Watch our webinar: Techniques for Securing Transactions With Identity Verification and Verifiable Claims to learn more about identity verification.

The post What Is Synthetic Identity Theft? New Path For Fraud? appeared first on 1Kosmos.


Trinsic (was streetcred)

Moving Toward Identity Technology Ready for Mass Adoption


Kaliya Young, a respected leader in the decentralized identity community, produced a paper about Hyperledger Indy and Aries that some of our customers may have questions about. We’d like to publish our response in the open. While we won’t get into specifics in this post, we resonate with many of her points and have been evolving our platform to not only remain relevant, but innovate on what it means to be a decentralized identity platform.

Background on Trinsic and Hyperledger Identity Technology

The Hyperledger Indy + Aries bundle of technologies, complete with Anoncreds verifiable credential format, was the first production-ready suite of SSI technologies to hit the market. Trinsic’s founding team were key contributors in the beginnings of the Aries project. Shortly thereafter, Trinsic launched the world’s first full-stack SSI platform based on these technologies. Subsequently, that Aries-powered platform has been used in hundreds of proofs-of-concept, pilot projects, and production deployments. Those experiences produced mountains of learnings critical for application developers, infrastructure vendors such as ourselves, and the open source community in our shared goal of self-sovereign identity.

As more customers went into production, we gathered their feedback and heard them loud and clear. We made improvements to our platform to address their concerns where it was possible. However, when we realized our customers were facing critical limitations caused by the underlying tech stack, we began developing an updated version of our platform that would reduce our dependency on these technologies and enable a better platform for our customers.

Moving Forward with Trinsic Ecosystems

Trinsic’s goal is to be the infrastructure of choice for developers who are building identity wallets and ecosystems to provide more people access to re-usable and self-sovereign identity. We make complex technology easy to use through turnkey APIs and understandable documentation.

As an infrastructure provider, it is critical we build on modular technologies that are scalable and extensible so we can:

Support a variety of standards as they evolve, including verifiable credential formats like the W3C VC Data Model, BBS Signatures with VCs, KERI ACDCs, and ISO 18013-5 mDLs; communication protocols such as OIDC and DIDComm; and roots of trust such as ION, Ethereum, and traditional DNS infrastructure.
Scale to the volumes and metrics needed by our customers, in terms of throughput, response time, and reliability.
Build developer tools that fit our customers’ workflows, including wallets that are accessible from any device, available in multiple programming languages, and support built-in governance.


These are the main motivations behind our development of Trinsic Ecosystems, the next iteration of our platform. Trinsic Ecosystems is built with the lessons learned from production deployments with over 100,000 live, end-users across the world. In addition, Trinsic Ecosystems already has customers using it in production with extremely positive results.

While we are investing all our efforts into making Trinsic Ecosystems the best it can be, we will continue to support our existing platform for the foreseeable future. We have many customers whose projects are built on the Indy/Aries stack, and we have every intention of making each of them successful. Moving forward:

We will eventually sunset our existing platform based on Aries, although we do not have a specific date set for that. When we make that decision, we’ll communicate it publicly and work with each team with a production deployment on a migration plan.
Other than a few edge cases, Trinsic Ecosystems is far superior to Trinsic’s existing platform, and we will work closely with any of our customers who wish to migrate to ensure a smooth transition.
We are targeting for Trinsic Ecosystems to become the exclusive platform we provide to new customers by the end of the calendar year.


For more information about Trinsic Ecosystems and the specific technical differences relative to our existing platform, please see our Migration Guide in our documentation.

FAQ

How do I get started with Trinsic Ecosystems?

Trinsic Ecosystems is being used in production, and we (and our customers!) couldn’t be happier with the results. We’re working closely with an exclusive set of customers to ensure their feedback makes it into the product rapidly. If you’re interested in building on Trinsic Ecosystems, please reach out.

What about interoperability with Aries moving forward?

Interoperability is a keystone of SSI and something our customers almost universally need eventually. Today, Trinsic Ecosystems supports the W3C verifiable credential standard and OIDC-based verifications, which are interoperable in their respective domains, but Trinsic Ecosystems is not interoperable with Aries yet.


We intend to bring support for Aries interoperability to Trinsic Ecosystems over time. One of the most challenging issues with the Hyperledger Indy/Aries tech is how bundled the current implementations are. Trinsic Ecosystems is built in such a way that we can easily add support for components of the stack separately, and some of these components are on the roadmap already:

Indy: a legitimate option for anchoring public DIDs
Aries: messaging protocols and interoperability profiles for exchanging data between wallets
Anoncreds: a verifiable credential variant that is now beginning to be documented and standardized

What should I do if I’m currently using Aries?

Of course the answer is “it depends”. If you’re experimenting with a proof-of-concept, then you probably don’t have much to worry about. We wrote a document outlining who should migrate, and how, in our documentation. If you’re intending on scaling to users in production or have questions about migration, we suggest you get in touch with us to discuss your situation.

Why not fix Trinsic’s existing platform instead of building Trinsic Ecosystems?

We invested months of time and hundreds of thousands of dollars in improving Trinsic’s existing platform. But we found that some of the challenges with the underlying tech caused scaling and performance problems that couldn’t feasibly be overcome. The best option ahead of us was to develop something built from the ground up with the requirements of real-world, production usage in mind.

Why not wait for the community to fix the problems with Aries?

We’re focused on building the best products, period. We don’t have a “not built here” complex with regard to developments in the SSI space. Our customers and partners put their trust in us to be objective in our pursuit of the best possible platform. This is our only incentive: we don’t have consulting revenue or other forms of income to sustain us, so if we don’t create the best platform on earth, our business will be at risk.

So while we welcome developments of the open source ecosystem, we can’t sit around and wait for improvements like Anoncreds 2.0, which has been discussed for ~4 years, to be completed. If there are solutions that better solve our problems, we have a fiduciary duty to opt for those solutions.

Closing thoughts

We are all in this space because we envision a future where users have more control of their identity. At Trinsic, we’ll continue to support technologies that usher in that future as quickly as possible by helping developers build fantastic products that will be adopted. We encourage everyone to share their opinions and experiences using our technology. We hope Kaliya’s analysis will spur constructive dialog in the community that will drive improvements and innovation in SSI.

We are happy to engage with specific questions you have on our Community Slack channel or on Twitter and look forward to the discourse.

The post Moving Toward Identity Technology Ready for Mass Adoption appeared first on Trinsic.


PingTalk

How to Build the Business Case for Customer Identity | Ping Identity


If you value customer experience, privacy, and security, you probably don’t need convincing that customer identity should be a priority for your organization. You already know that a well-integrated customer identity and access management (CIAM) strategy is key to shaping digital experiences that keep your business secure and your customers satisfied. 


The thing is, building digital experiences that are both smooth and secure takes multiple teams working together across the entire organization. And getting all of these people on board and collaborating efficiently? That’s often easier said than done. 


How FIDO Passkeys Will Accelerate a Passwordless Future | Ping Identity


Passwords have been around since the invention of computers and are still the primary way of protecting a large percentage of our infrastructure today. However, despite being the go-to method for security, digital passwords are inherently problematic and will continue to be problematic as technology becomes more advanced.


Why?


Passwords are knowledge-based, meaning they can easily be guessed or stolen. They’re also a source of user frustration, which can negatively impact employee productivity, customer satisfaction, and ultimately revenue.


All that considered, how do you reconcile this when passwords—despite their shortcomings—are really the only digital security solution society has ever known?


Good question. The answer is going passwordless, which several leading enterprises across the industry are pursuing via FIDO passkeys. 


Ontology

OWN Insights 07: The Financialization of DAOs and DID

Enjoy the latest OWN Insights, by our Americas Ecosystem Lead, Erick Pinos.

Welcome to the OWN Insights series, where we invite industry leaders to take us on a thought-provoking journey through the Web3 space. Throughout this series, industry leaders share their thoughts and help us better understand the new iteration of the internet. Please enjoy this article written by Erick Pinos, Americas Ecosystem Lead at Ontology.

Web3 is an interconnected set of technologies, protocols, concepts, and frameworks that represent the next stage of a more decentralized Internet. These past few years, we saw glimpses of a Web3 enabled future through Decentralized Finance (DeFi) and Non-Fungible Tokens (NFTs). Both of these technologies enabled new kinds of online peer-to-peer interactions. DeFi allowed users access to financial instruments without meddling middlemen, and NFTs enabled provable ownership of virtual goods such as in-game assets, digital art, or intellectual property.

However, DeFi and NFTs are just two verticals of Web3. Other important verticals include Decentralized Autonomous Organizations (DAOs) for decentralized coordination in the creation of Web3 infrastructure, and Decentralized Identity (DID) for the self-sovereignty of one’s own data while interacting with Web3 services. We can predict what it will take for these remaining Web3 pillars to see similar adoption by studying the history of how the more well-known pillars saw rapid adoption.

DeFi

While DeFi became the most popular blockchain use case in late 2020, many of the most popular DeFi projects existed for years before. MakerDAO launched in 2017. Uniswap, Compound, and Aave launched in 2018. 1inch launched in 2019, and so on. You could use these services for their intended purposes, but it wasn’t until yield farming programs were put into place that usage took off.

Indeed, yield farming took the crypto world by storm in 2020. The prospect of earning tokens with your assets made many a millionaire and captured the attention of many new DeFi projects looking to grow a user base very quickly. As a result, a slew of projects launched in a short amount of time and aimed at capitalizing on this phenomenon. From decentralized exchanges and liquidity aggregators to decentralized lending and derivatives, almost any DeFi project could receive a massive boost in users and total value locked from offering token incentives to provide liquidity to their platform.

It’s no surprise that offering financial incentives will draw in a large user base, but what about retention? Unfortunately, many DeFi projects implemented yield farming unsustainably, issuing massive rewards through inflation of their own governance tokens, resulting in high but short-lived APYs at the expense of their project’s token holders.

NFTs

As the DeFi hype died down in 2021, attention moved to NFTs. It was possible to mint NFTs back in 2017, but it was technically challenging to do so. Even after no-code NFT minting tools became easier to use in 2020, pioneered by NFT marketplaces such as Rarible, it wasn’t until 2021, when prominent artists like Beeple repeatedly sold NFT art for tens of millions of dollars apiece, that content creators everywhere rushed to mint their art as NFTs in the hopes of earning an income through this new medium. PFP projects pushed the NFT hype forward even more. Why sell one or a few NFT art pieces when you can sell 10,000 programmatically generated variations of the same image?

For better or for worse, the financialization of a Web3 technology accelerates its adoption.

That leaves us with DAOs and DID. There are many Web3 projects that use these technologies today. But unlike DeFi and NFTs, DAOs and DID have yet to see adoption on a similar scale, and I believe this is mainly due to the lack of financialization of these technologies. Once it becomes clear how users can make money with this tech, we’ll see hype cycles and corresponding corrections like those of DeFi and NFTs.

Let’s see some examples of what such financialization would look like.

DAOs

There are many no-code tools that allow communities to spin up their own DAOs, such as Snapshot for decentralized proposal-making and voting, and Gnosis for multi-sig management of a group’s treasury. In today’s DAOs, community members vote on governance decisions, elect committees, serve as signers on multisigs, and more. But how are they compensated for their time and contributions? Making DAO participation financially sustainable is key for mass adoption.

One such way is through the standardization and automation of bounty programs. Manual bounty programs are cumbersome, with bounty creators needing to write out assignments and claim forms to manually review and disburse payments. If DeFi projects had to onboard liquidity providers one at a time, then manually sit down and disburse yield farming payments to each one, they would have never grown as big as they did as quickly as they did.

Similarly, bounty programs need to be re-invented like an assembly line. Bounties need to become standardized and easy to post, find, take, review, and integrate back into the main project. We also need to expand the idea of what constitutes a bounty to include tasks and micro-tasks. Designing a logo, voting in a proposal, reposting a tweet, and serving on a committee all should be valid bounties with different scopes and rewards. Once bounties become standardized and processes become automated, there could be as many bounty hunters as there were yield farmers.

Decentralized Identity

Currently, data brokers make billions of dollars off of people’s data, collected via the countless terms of service agreements and privacy policies signed every day. You see none of that revenue, and in some cases you can’t even opt out of having your data sold by these brokers.

DID is a system that attaches decentralized identifiers (DIDs) for people, entities, and objects to verifiable credentials (VCs) that store properties about those identities. This structure allows users to hide their data, selectively reveal it only when necessary, and even charge for access to it. This opens up a host of new monetization strategies: if you owned your own data, you could benefit from selling it directly, for example, earning revenue by consensually selling your data for surveys, targeted advertising, studies, and more.
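To make the DID/VC structure concrete, here is a minimal sketch of a W3C-style credential and the "selectively reveal" step, modeled as plain Python dictionaries. The issuer and holder DIDs and the claim names are made up for illustration, and real selective disclosure relies on cryptographic schemes (such as BBS+ signatures) rather than simply filtering fields:

```python
# A minimal sketch of a W3C-style verifiable credential as a plain dict.
# Field names follow the VC Data Model; the DIDs and claims are invented.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:insurer-123",
    "credentialSubject": {
        "id": "did:example:holder-456",
        "age_over_21": True,
        "home_address": "42 Example Lane",
    },
}

def selective_disclosure(vc, reveal):
    """Return a copy of the credential exposing only the requested claims."""
    subject = vc["credentialSubject"]
    revealed = {k: subject[k] for k in reveal if k in subject}
    revealed["id"] = subject["id"]  # the subject identifier stays
    return {**vc, "credentialSubject": revealed}

# The holder proves age without revealing their address.
presentation = selective_disclosure(credential, {"age_over_21"})
print(presentation["credentialSubject"])  # {'age_over_21': True, 'id': 'did:example:holder-456'}
```

The point of the sketch is only the data shape: the verifier sees the `age_over_21` claim but never receives `home_address`.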

Scalability comes through the automation of these opportunities and programmatic access to such trusted and verifiable data, with streaming payments sent to the provider on a per-use basis. Such a process is not possible without solid Web3 foundational infrastructure. It needs to become easier for projects to buy datasets directly from the community than from data brokers. And, on the user side, it needs to become easy to record and repeatedly sell one’s data to as many buyers as one would like.

Conclusion

There are a plethora of Web3 technologies and even more projects implementing them. Financialization of these technologies isn’t all that matters to drive mass adoption, but it is a powerful driving force. We must take great care in implementing it sustainably and with properly aligned incentives in place to ensure Web3 develops optimally.

We are in the early stages and are just beginning to see glimpses of what a Web3 enabled future looks like. But, using the recent past as an example, we can predict what it will look like and what we need to get there.

OWN (Ontology Web3 Network) Infrastructure is a series of blockchain protocols and products that provides the much needed tools to create an interconnected, interoperable global blockchain ecosystem. The infrastructure is bringing trust, privacy, and security to Web3 applications through decentralized identity and data solutions.

Aimed at allowing Web3 developers to quickly build Web3 applications, saving them from creating basic functions from scratch, OWN includes the Ontology blockchain, ONT ID framework and more. Individuals can also seamlessly and quickly access Web3 through products such as ONTO Wallet.

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHubDiscord

OWN Insights 07: The Financialization of DAOs and DID was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Thursday, 08. September 2022

Continuum Loop Inc.

Trust Registries in the Real World

In a world where we rely increasingly on digital information, we need to know that the credentials we share are accurate. Trust Registries (or Trust Hubs) will help us verify the authenticity of the information we share by storing digital certificates that can verify the identity of individuals and organizations. These certificates can authenticate online transactions, access sensitive information, or sign documents electronically. 

Trust Registries will help to reduce fraud and ensure that only accurate information is shared. They will also help build back confidence in the digital economy and enable new types of interactions that are impossible with traditional paper-based systems.

As we continue highlighting the importance of Trust Registries over the coming months, the following real-world example from the State of Digital Wallet Series’ “Further Research & Effort” gives us a great picture of how critical Trust Registries are within digital identity ecosystems. 

A Home Renovation.

Let’s imagine a home renovation project with three main groups to think about:

Homeowner
Contractor
City Inspector

Trust Registries allow us to know that the various shared credentials (e.g. proof of insurance) are accurate. A Homeowner can ask their Digital Wallet to verify the Contractor’s insurance Credential. Their Digital Wallet needs to prove a few things:

That the Contractor was the entity to which the Insurer (the Issuer) gave the Credential. This is done inside the Contractor’s Digital Wallet, which uses cryptography to prove that they still control the insurance Credential.
That the Insurer cryptographically signed the Credential, letting the Homeowner know that the information hasn’t been tampered with.
That the Insurer is a bona fide insurance company. For this to happen, there will need to be a Trust Hub that lists insurers for a particular area.

Digital Wallets can do all of the above in just a few seconds while doing other checks. Each party involved has pieces of information that they need to verify. 
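The wallet's checks can be sketched in a few lines of code. This is only an illustration: the issuer DID, registry contents, and claims are invented, and a shared-key HMAC stands in for the public-key signatures a real credential system would use, purely so the example is self-contained:

```python
import hashlib
import hmac
import json

# Hypothetical trust registry: DIDs of insurers accredited for a region.
TRUST_REGISTRY = {"did:example:acme-insurance"}

def sign(issuer_key: bytes, claims: dict) -> str:
    """Issuer signs the canonicalized claims (HMAC stands in for a real signature)."""
    payload = json.dumps(claims, sort_keys=True).encode()
    return hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()

def verify_credential(claims, signature, issuer_did, issuer_key):
    """Two of the wallet's three checks (proof that the holder controls the
    credential happens in the holder's wallet and is out of scope here):
    the signature over the claims verifies, so nothing was tampered with,
    and the issuer appears in the trust registry."""
    untampered = hmac.compare_digest(sign(issuer_key, claims), signature)
    accredited = issuer_did in TRUST_REGISTRY
    return untampered and accredited

key = b"insurer-secret"
claims = {"holder": "did:example:contractor", "policy": "GL-2022-001"}
sig = sign(key, claims)
print(verify_credential(claims, sig, "did:example:acme-insurance", key))  # True
```

Tampering with the claims or presenting an issuer absent from the registry makes the check fail, which is exactly the guarantee the Homeowner's wallet needs.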

The Homeowner wants to know that they have the proper permits and licences for the job and that their Contractor and employees are fully licensed, insured, and up to scratch.
The Contractor wants to know that the Homeowner has the proper permits, licence, inspections, and insurance.
The City Inspector wants to know that the Homeowner and Contractor have the required licenses, training, and other paperwork.

The paper-based verification process takes so long that few homeowners ever do the checks they should. With digital wallets and the ecosystems behind them, verifications can get done in seconds—while we greet each other in person.

A critical tool for re-establishing trust

Trust Registries will be a critical tool for re-establishing trust online. They will help users identify trustworthy websites and online services and help businesses build trust with their customers. 

Ecosystems are evolving quickly, and it’s time to start thinking about how we will manage trust and reputation, and how Trust Registries can anchor the confidence we need in our systems.

Visit our blog, follow us on social media, and subscribe to our newsletter to stay up-to-date on the latest Trust Registry news and updates.


The post Trust Registries in the Real World appeared first on Continuum Loop Inc..


IdRamp

IdRamp: Accelerating Innovation for the Nation at Defense TechConnect


Mike Vesey, CEO of IdRamp, will describe how military and private sector organizations can lower barriers to adoption of the mandatory zero trust federal security strategy with easy-to-deploy identity orchestration.

The post IdRamp: Accelerating Innovation for the Nation at Defense TechConnect first appeared on Decentralized Identity Orchestration.

Shyft Network

Why IMF Calls for Regulating Crypto?

The International Monetary Fund called for a globally coordinated, consistent, and comprehensive approach to crypto regulation in a recent publication.
The IMF noted that the divided approach to crypto regulation among countries (outright bans on one side, open-armed acceptance on the other) is resulting in a fragmented global response.
Although the IMF sees a global consensus on crypto regulation forming, albeit slowly, the agency is concerned that “the longer this takes, the more national authorities will get locked into differing regulatory frameworks.”

Cryptocurrencies have come a long way since Bitcoin, the world’s first cryptocurrency, was introduced in 2009. Growing from a market cap of a couple of billion dollars in 2013, when there were just seven cryptocurrencies in existence, to a whopping $3 trillion in 2021 tells a story.

Although phenomenal, this growth hasn’t been without its share of problems. For instance, with rising demand for crypto-related products, failed projects, hacks, high volatility, rug pulls, and pump-and-dump schemes have increased, too.

Such massive traction has also attracted the attention of many traditional financial institutions, leading to fears that the crypto market and the traditional financial ecosystem are becoming increasingly intertwined. Experts believe that any drastic fall in the crypto market could spill over into the broader financial industry.

Thus, crypto regulations must ensure consumer protection and keep the industry in check without stifling innovation.

Easier said than done. Without global consensus, crypto regulations will be the equivalent of a toothless tiger!

The International Monetary Fund, too, called for an internationally coordinated, consistent, and comprehensive approach to crypto regulation in its latest publication, noting that “the right rules could provide a safe space for innovation.”

The IMF also pointed out that “applying existing regulatory frameworks to crypto assets, or developing new ones, is challenging for several reasons.”

So, What Makes Crypto Regulation Complex?

There are several reasons: the rapidly evolving crypto market, lack of resources among regulators, patchy data, and an ever-growing number of cryptocurrencies. Thus, global coordination is a must.

Interestingly, we are also seeing coordination between various actors within the crypto market to make crypto compliance as smooth as possible. Recently, FIS Global joined hands with Shyft Network to enable companies to participate in the crypto landscape while complying with the complex and ever-changing world of crypto regulations.

No Unified Approach

Although the call for crypto regulation has gained momentum only recently, not all regulators and authorities have been sitting on their hands. The United States, the United Kingdom, Switzerland, and the European Union had introduced or amended legislation to rein in the crypto industry even before crypto assets attracted mainstream attention. International agencies, such as the Financial Action Task Force (FATF) and the International Organization of Securities Commissions (IOSCO), have also published guidelines for the crypto industry.

How effective all these guidelines and regulations have been is debatable, given that a critical piece was missing: global coordination. For instance, the FATF Travel Rule has been adopted by only a handful of countries, which has compelled the FATF to urge governments to expedite Travel Rule adoption.

The recent IMF publication also highlighted two vastly different approaches to crypto regulation among countries. On the one hand, some countries, like China, came down heavily on crypto assets by banning them outright. On the other hand, countries such as El Salvador have welcomed crypto companies with open arms. The result is a fragmented global response that defeats the purpose of regulating digital assets.

That is not to say a global consensus is out of the question: countries are discussing ways to regulate crypto assets in international forums such as the G20, as proposed by a G20 watchdog. So, it may just be a matter of time.


But the IMF is concerned that “the longer this takes, the more national authorities will get locked into differing regulatory frameworks.” Thus, the IMF is calling for a coordinated, consistent, and comprehensive global response.

______________________

VASPs need a Travel Rule Solution to begin complying with the FATF Travel Rule. So, have you zeroed in on one yet? We have the best solution to suggest: Veriscope! Veriscope is the only frictionless Crypto Travel Rule compliance solution.

Visit our website to read more: https://www.veriscope.network/ and contact our BizDev team for a discussion: https://www.veriscope.network/contact.

Also, follow us on Twitter, LinkedIn, Discord, Telegram, and Medium for up-to-date news from the world of crypto regulations.

Why IMF Calls for Regulating Crypto? was originally published in Shyft Network on Medium, where people are continuing the conversation by highlighting and responding to this story.


Dock

How to Prevent Supply Chain Fraud With Blockchain

Blockchain and Verifiable Credentials technology in a supply chain helps prevent supply chain fraud and provides complete visibility.

The integration of blockchain in a supply chain provides complete visibility of the entire supply chain for all stakeholders, reduces the need for intermediaries, reduces operational costs, and provides transparent product data for customers.

Read the full article here.


KILT

KILT in Action — The dena Blockchain Pilot Project

KILT in Action — The dena Blockchain Pilot Project

Introduction by Ingo Rübe, Founder of KILT Protocol

The project on machine identities led by dena, the German federal energy agency, was finished successfully. KILT served as a core partner in this project by supplying the machine identities and demonstrating the utility of a blockchain protocol in a highly regulated environment like the energy sector. This was an important milestone for us: DIDs and Verifiable Credentials and KILT in general are ready for large scale, institutional use cases that fulfil requirements from regulators. Blockchain technology is finding its way into industry!

Also of note: During the project we deployed a complete KILT node on a Raspberry Pi-based smart meter, demonstrating the efficiency of Parity Substrate-based blockchains.

Blog Post by Majella Horan, Lead Writer for KILT Protocol
dena has released a report on its pilot project on digital machine identities as the basic building block for an automated energy system. KILT was part of the project, providing decentralized machine identities registered on the KILT blockchain and showing how blockchain technology can be used inside a highly-regulated environment.

KILT collaborated on this initiative with industry leaders from the energy sector, the digital economy and science including Energy Web Foundation, Parity Technologies, EnBW, e-on, Riddle & Code, Oli-Systems, and 50 Hertz.

dena Project: Background

Government agencies and industry are increasingly looking to blockchain solutions to help achieve effective and compliant digitalization. The modern energy industry faces challenges in coordinating an increasing number of energy-producing and energy-consuming devices in the grid infrastructure, from energy plants to electric vehicles, heat pumps and storage systems.

dena launched the Future Energy Lab pilot project in 2019 to explore these issues. The goal was to create a Blockchain Machine Identity Ledger (BMIL), a decentralized digital registry that aimed to coordinate and connect devices and energy plants across the German energy system.

The project incorporated the current smart meter gateway technology, and required creating uniform digital identities for energy systems. It was designed as a three-pronged pilot project to demonstrate the technical, economic and regulatory feasibility of three variants:

Storing the identity characteristics directly at the smart meter gateway
Storing the identity characteristics directly on the device
Storing the identity characteristics in the cloud

All variants needed to meet governmental standards of data security and protection, scalability and privacy in a cost-effective way.

KILT Protocol was integrated for the first two variants, with the identities to be generated and stored directly on the devices.

Digital Machine Identities with KILT

As context for the KILT integration with dena, it’s important to understand KILT’s identity framework, which has two key components:

A decentralized identifier (DID), which uniquely identifies an entity or device (as a fingerprint does for humans), and
Verifiable, revocable credentials that are confirmed by trusted parties (“Attesters”).

In this way, a digital identity can be built for people, machines, services, and anything else that needs one.

Standardized digital identities for machines form the basic building block for an automated and reliable energy system. For this pilot project with dena, the machine identities each contained:

A decentralized identifier (DID), which uniquely identified the device, and
The characteristics, attributes or properties that made this machine different from others. These were attached to the DID in the form of digital certificates, or verifiable credentials (VCs). These credentials show the static and dynamic properties of the device.

KILT is pleased to be part of the dena Future Energy Lab project, making a practical and effective contribution to current energy challenges and showing how blockchain technology can be used in a highly regulated, real-world environment.

The following is a technical description of the project including the technical integration of KILT. The full report is published here in German; dena will publish an English version shortly.

dena Project: Summary

The BMIL pilot project aimed to close an important gap that is currently a barrier to implementing a real-time energy economy using the smart meter gateway and blockchain technology: the lack of digital identities for energy systems.

One of the key elements was to agree on and test uniform digital identity standards, namely DIDs and verifiable credentials.

The DID enables end-to-end encryption between digital agents, allowing data to be exchanged between participants and systems without the need for a centralized data silo. This is achieved by establishing a Blockchain Machine Identity Ledger (BMIL) that allows assets to register their identity (their unique DID and verifiable credentials).

These digital, self-sovereign and decentralized device identities were set up and either stored and anchored on the device itself, or via a digital twin in the cloud, depending on the variant.

Transmission of the digital identity was implemented in regular operation via the smart gateway infrastructure and could be linked to the BMIL through the installation of a BMIL-compatible smart meter.

dena Project: KILT Integration

The integration of the KILT Protocol into the BMIL project is summarized here, using the example of the device-centric identity management in conjunction with a dedicated Controllable Local System (CLS) device, the OLI Box. The integration was the same for both variants.

As the goal was to achieve a high degree of decentralization, the identities were stored directly on the respective devices and anchored on the KILT blockchain.

In the first step, the private/public key pair is generated on the box. The OLI box’s crypto chip provides the random input for generating the key pair to establish the necessary security and connect the keypair to the box.

Then the identifier or DID is generated on the device and anchored on the blockchain. The DID document is signed by the box in order to link it to the DID, and is then stored in the DID store.

The claimer application implements the credential workflow on the OLI box: claiming properties, requesting a credential, storing an issued credential on the box, and sharing credentials with verifiers. It can share either the whole credential or only single attributes via selective disclosure, if the verifier does not need to know the whole credential.

To ensure privacy, the credentials are not stored on the blockchain. Credentials remain with the device (the OLI box) and are only anchored on the blockchain as a hash representing the data by the issuer of the credential. The anchored credential on the blockchain does not allow any conclusions to be drawn about the content and owner of the credential. Via the hash, only verifiers to which the credentials were disclosed can check the validity of that credential on the blockchain.
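The hash-anchoring scheme described above can be sketched in a few lines. This is an illustrative sketch only, not KILT’s actual implementation; the credential fields, device DID, and in-memory "ledger" are hypothetical stand-ins:

```python
import hashlib
import json

def credential_hash(credential: dict) -> str:
    """Hash the credential deterministically; only this hash is anchored on-chain."""
    canonical = json.dumps(credential, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

# The issuer attests the credential and anchors only its hash on the ledger.
credential = {
    "id": "cred-001",                        # hypothetical identifiers
    "subject": "did:example:oli-box-42",
    "claims": {"deviceType": "smart-meter", "maxLoadKw": 11},
}
ledger = {credential_hash(credential)}       # stand-in for the blockchain anchor

# A verifier who was shown the full credential recomputes the hash and checks
# it against the ledger; the hash alone reveals nothing about content or owner.
assert credential_hash(credential) in ledger

# Any tampering changes the hash, so verification fails.
tampered = dict(credential, claims={"deviceType": "smart-meter", "maxLoadKw": 22})
assert credential_hash(tampered) not in ledger
```

The one-way hash is what preserves privacy here: the chain stores no credential data, yet any verifier holding the disclosed credential can confirm it was anchored by the issuer.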

The different actors (e.g., the Energy Web Foundation application, which prequalifies device identities for different use cases through VC-based role assignment) communicate with each other via the KILT messaging service using encrypted messages.

dena Project: Conclusions

Digital identities built around DIDs and verifiable credentials provide the new standards needed for identity for governmental projects going forward. This standardization ensures uniformity, contributes to efficiency and security, and allows system services to be provided and called up fully automatically.

Future research will build on these findings and create services and applications to increase energy efficiency.

KILT is a blockchain identity protocol for issuing self-sovereign, verifiable credentials and DIDs, providing practical, secure identity solutions for enterprises and consumers.

Discover more on the KILT website and blog, brainstorm KILT use cases in Discord, or follow KILT on Twitter and Telegram to keep up with the latest news.

KILT in Action — The dena Blockchain Pilot Project was originally published in kilt-protocol on Medium, where people are continuing the conversation by highlighting and responding to this story.


KuppingerCole

CYFIRMA DeCYFIR


by Osman Celik

Reactive security measures respond to cyberattacks that have already taken place. While spending on cybersecurity is rising, cyberattacks are also growing in number. To mitigate risks and prevent potential attacks, proactive measures must be combined with reactive ones. CYFIRMA offers an external threat landscape management solution with predictive, insight-based cybersecurity capabilities that provide visibility into the attacker’s perspective. In this report, we look at CYFIRMA’s product DeCYFIR.

OWI - State of Identity

The Power of Identity

How are organizations building technology that can help prevent fraud and automate KYC and compliance? State of Identity host Cameron D’Ambrosi and Gbenga Odegbami, CEO and co-founder of Youverify, take on the hot topic of closing the gaps between businesses and consumer identities.



Northern Block

Digital Notarization Can Kickstart Digital ID Ecosystems (with Dan Gisolfi)

The post Digital Notarization Can Kickstart Digital ID Ecosystems (with Dan Gisolfi) appeared first on Northern Block | Self Sovereign Identity Solution Provider.

>>>  Listen to this Episode On Spotify

>>>  Listen to this Episode On Apple Podcasts


Can Digital Notarization Solve Digital ID’s Chicken-and-egg Problem?

Which came first, the chicken or the egg?

Unlike a traditional, linear startup, a platform doesn’t need to acquire just one group of customers. At a minimum it needs two: its consumers and its producers.

But new platforms don’t initially create enough value to attract new users. It’s not economical for consumers to join the platform when there are no producers, and vice versa. This is called the chicken-and-egg problem.

This same problem exists in digital identity. Just switch out producers for issuers, consumers for verifiers, and platforms for ecosystems. And we have ourselves a chicken-and-egg problem.

The business value of credential exchange happens at verifications (e.g., lowered costs, lowered risk/fraud, lowered friction). And for this business value to be created, more credentials are needed in holders’ wallets.

Does the digital ID chicken-and-egg problem get solved by governments issuing ID credentials to their citizens? Must we wait on governments to become issuers to solve it?

What if organizations that currently conduct examinations (e.g., a financial institution does KYC during account opening) can extend their processes into the issuance of credentials?

Or what if digital trust providers (e.g., document authentication) extend their offering into the issuance of machine readable credentials?

Do issuers need to have access to source data (and uniqueness) to become attesters? 

In the physical world, notaries exist to attest to the validity of documents. And this practice is very much trusted by verifiers.

Can Digital Notarization solve Digital ID’s chicken-and-egg problem?


About Podcast Episode

Some of the key topics covered during this episode with Dan are:

How does the Chicken-and-egg Problem relate to digital identity?
Is there a dependency on Government IDs to seed the marketplace?
Are unique identifier databases required to become a credential issuer?
What is Transitive Trust? And how does it differ from how trust gets established otherwise (e.g., through backend API calls)?
The missing role in the trust triangle: The Examiner. Can Examiners become Digital Notaries?
Rethinking Authentication and Authorization – using attestations from multiple issuers helps to create more trust.
How issuance can become a business model for many trusted service providers.
Some Challenges with the mDL (ISO/IEC 18013) Standard.
The benefits of using a Microcredential approach in Issuance.
Misconceptions about becoming credential issuers (e.g., assuming liability, data minimization).

Note: During the podcast, Dan refers to a post titled ‘Decentralized Digital Post’, which can be found here.

About Guest

Dan Gisolfi is currently leading the delivery of innovation capabilities across Discover Financial Services (DFS), such as Hack-aaS, Patent Program, Design Thinking Services, and an Innovation Accelerator. Prior to joining DFS, he led an innovation team focused on the incubation of IBM Security’s Zero Trust Architecture in collaboration with internal labs, academic institutions and NIST.


Dan is a passionate industry advocate for repairing the internet’s missing identity layer and establishing an interoperable digital trust marketplace. As CTO for Decentralized Identity at IBM, he was a subject matter expert who worked with clients and industry advocates on maturing digital trust infrastructure in support of decentralized identity solutions. His activities ranged from co-founding the Trust over IP (ToIP) Foundation to convening and founding the Bedrock Business Utility, a first-of-its-kind utility network project within the Linux Foundation. His contributions to open standards and communities include participating in the development of the Hyperledger Indy DID Method Specification and co-chairing the ToIP Technical Stack and Utility Foundry working groups. He was also a maintainer for the ToIP GitHub organization and held steering committee seats on DIF, Sovrin and ToIP. Additionally, he has held several board and advisor seats for startups, including Bonifii, a trusted peer-to-peer services network of verifiable exchange for financial cooperatives.

Where to find Dan?

LinkedIn: https://www.linkedin.com/in/vinomaster/
Blogs: https://www.ibm.com/blogs/blockchain/author/dan-gisolfi/

Follow Mathieu Glaude

Twitter: https://twitter.com/mathieu_glaude
LinkedIn: https://www.linkedin.com/in/mathieuglaude/
Website: https://northernblock.io/

The post Digital Notarization Can Kickstart Digital ID Ecosystems (with Dan Gisolfi) appeared first on Northern Block | Self Sovereign Identity Solution Provider.


Dock

SEVENmile issues fraud-proof verifiable certificates using Dock

SEVENmile now issues digital graduation certificates that are fraud-proof and instantly verifiable using blockchain technology.


Zug, Switzerland – 8 September, 2022 – SEVENmile, an experiential learning program in Australia that helps high-school students learn real-world skills, today announced a strategic collaboration with Dock Labs to issue digital graduation certificates that are instantly and permanently verifiable using blockchain technology. SEVENmile now issues fraud-proof certificates, allowing students to prove their skills throughout their lives and ensure immediate trust with employers.

SEVENmile’s entrepreneurial training program pairs high school students with business owners to teach students problem-solving techniques and help them gain an understanding of real-life business issues. SEVENmile believes that in 2022, a graduation certificate in a paper format has no real-world value to a student. Printed certificates are difficult to verify and, because of that, do not help students prove their skills. Dock Labs’ verifiable credentials platform, Dock Certs, is now enabling SEVENmile to issue hundreds of fraud-proof digital certificates and credentials that students can store on a mobile phone. By scanning the QR Code on the certificate, a hiring company will have immediate certainty and trust that the certificate is authentic.
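The QR-code check described above boils down to verifying that the certificate data was signed by its issuer. The sketch below is purely illustrative and not Dock’s actual mechanism: it uses an HMAC with a shared key as a stand-in where a real system would use a public-key signature (e.g. Ed25519) verifiable against a key anchored on-chain, and all names and fields are hypothetical:

```python
import hashlib
import hmac
import json

# Stand-in for the issuer's signing key; real systems use asymmetric
# signatures so verifiers never hold a secret. Hypothetical value.
ISSUER_KEY = b"sevenmile-demo-key"

def sign_certificate(cert: dict) -> str:
    """Issuer side: sign a canonical serialization of the certificate."""
    payload = json.dumps(cert, sort_keys=True).encode()
    return hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()

def verify_certificate(cert: dict, signature: str) -> bool:
    """Verifier side: recompute and compare in constant time."""
    return hmac.compare_digest(sign_certificate(cert), signature)

# The QR code on the certificate would encode the data plus signature.
cert = {"student": "Jane Doe", "program": "SEVENmile 2022", "grade": "Pass"}
qr_payload = {"data": cert, "sig": sign_certificate(cert)}

assert verify_certificate(qr_payload["data"], qr_payload["sig"])             # authentic
assert not verify_certificate({**cert, "grade": "Fail"}, qr_payload["sig"])  # forged
```

Because the signature covers every field, changing even one value (the forged `grade` above) invalidates the certificate instantly, which is what makes such credentials fraud-proof in practice.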

Degree and certificate fraud is a billion-dollar industry and a growing problem worldwide. Dock Labs uses innovative verifiable credentials and blockchain technology to ensure that digital certificates and credentials cannot be forged and remain verifiable forever. SEVENmile’s graduates will now have a future-proof digital credential that will allow them to create a lifelong record of the skills they learnt and ensure they can always prove it. SEVENmile is collaborating with the New South Wales Education State Agency to expand the program to 1500 high schools by 2024.

“The evolution of Web1 to Web3 has occurred over a couple of decades. From viewing static HTML text content (Web1) to the democratization of personal data (Web3), it has been a wild ride at times. At SEVENmile Ltd, we believe that the move to ownership of our personal data is a vital platform that will help transform how the internet functions and how our data will be protected. We’re applying this philosophy by working with Dock Labs to secure the credentials of students we train.” said Greg Twemlow, SEVENmile’s CEO.

“Working with Greg and SEVENmile has not only opened our eyes to the incredible work they are doing, it’s also shown us the power of verifiable credentials in an educational setting, providing students with tamper proof digital certificates that are completely under their control, respect their privacy, and can be verified instantly” said Nick Lambert, Dock Labs’ CEO.

About SEVENmile

SEVENmile Ltd prepares students to become workforce-ready graduates or next-generation innovators. A skillful, confident mindset exposes opportunities, ignites ambition, and fosters the attitudes and skills that are the pathway to success. The SEVENmile programs are designed for 15-18 year old high school students and culminate in an event where students present their proposed solutions to the real-world problems of local business owners.
For more information, visit https://www.sevenmile.org.au

About Dock Labs

Dock Labs’ Verifiable Credentials Platform, Dock Certs, provides a highly secure and scalable solution for businesses to issue and verify digital credentials and certificates that are instantly verifiable using blockchain technology, enabling organizations and individuals to create and share verified data.
For more information, visit https://dock.io


Revoke

iOS 16 and Security: What’s New?

With every major Apple update comes a whole host of new features, and iOS 16 looks set to be no different. The post iOS 16 and Security: What’s New? appeared first on Revoke.



Ontology

Ontology Weekly Report (September 1–5, 2022)

Highlights

Odaily published the 5th OWN Insights “The Realization of Web3”, sharing the possible realization of Web3. The article suggests that the future will see Web3 combined with Web2, and iteratively upgrade the underlying technology of the blockchain.

Latest Developments

Development

We are 100% done with the Rollup VM design. The White Paper will be published soon.
We are 98% done with the Rollup RISCV EVM actuator.
We are 83% done with the Rollup L1<->L2 cross-layer communication.
We are 85% done with the Rollup L1<->L2 Token Bridge.
We are 99% done with the L1 data synchronization server.
We are 92% done with the L2 Rollup Node.
We are 54% done with the L2 blockchain browser.

Product Development

ONTO App v4.3.5 released, bringing support for Aurora Chain and BitTorrent Chain. It integrated the Go+ Security Detection feature and added a section for the NFT Marketplace.
ONTO hosted an online activity with XDC Network. Follow the @ONTO Wallet Official Announcement in Telegram for more details!

On-Chain Activity

154 total dApps on MainNet as of September 6th, 2022.
7,153,641 total dApp-related transactions on MainNet, an increase of 10,849 from last week.
17,920,476 total transactions on MainNet, an increase of 30,194 from last week.

Community Growth

We held our weekly Community Call, focusing on the topic “SBTs”. Ontology community members discussed “Whether SBTs can be used as DID identity credentials” to explore the connection and cooperation between SBTs and DIDs.
We held our Telegram weekly Community Discussion led by Benny, an Asian community Harbinger, exploring the collision between DID and NFT with community members. NFTs can be associated with DIDs, and a DID can also use an NFT as a trusted credential for login and verification, providing users with multiple trust guarantees.
As always, we’re active on Twitter and Telegram, where you can keep up with our latest developments and community updates.

Global News

Ontology cooperated with OKLink Audit, which aims to improve users’ interaction experience in the blockchain world, and work together to create a safe and harmonious Web3 ecological atmosphere.
Ontology’s Business Development Manager Li Ge was invited to participate in the “Fenbushi Talk” hosted by Fenbushi Capital, discussing “Web3 Recruitment/Job Application Requirements” with senior Web3 practitioners, and provided guidance for people who are struggling to find jobs in the current environment.
MetaPath newly supported Ontology EVM assets; global users can now securely swap WONT and WONG in MetaPath with faster exchange and lower rates.

Ontology in the Media

Odaily – The Realization of Web3

“In my opinion, the final outcome of Web3 and Web2 is coexistence, and the proportion of both is not small. They will probably exist in a state where there is a bit of them in each other, and Web3 may account for 20%, 50% or 80% of them. The process must be a two-way journey between Web2 and Web3. Web3 will be a tool that some people use when dealing with some things and some relationships.”

Follow us on social media!

Ontology website / ONTO website / OWallet (GitHub)

Twitter / Reddit / Facebook / LinkedIn / YouTube / NaverBlog / Forklog

Telegram Announcement / Telegram English / GitHub / Discord

Ontology Weekly Report (September 1–5, 2022) was originally published in OntologyNetwork on Medium, where people are continuing the conversation by highlighting and responding to this story.

Wednesday, 07. September 2022

Radiant Logic

Takeaways from the Gartner IAM Summit 2022

The emerging and dominant trends coming out of the Gartner IAM Summit this year, as seen by our Director of Communications Heather MacKenzie. The post Takeaways from the Gartner IAM Summit 2022 appeared first on Radiant Logic.

Indicio

Newsletter Vol 35

The post Newsletter Vol 35 appeared first on Indicio.

Now on LinkedIn! Subscribe here

Indicio Identity Community Meetup: DIDComm V2: Implications on the future of the internet

Indicio recently hosted a discussion with Daniel Hardman, CTO of Provent, and Sam Curren, Deputy CTO of Indicio, to get their expert opinions on how the recent approval of the DIDComm V2 specification by the Decentralized Identity Foundation (DIF) will impact the everyday interactions people have on the internet. 

Watch the recording

Solving Market Problems With Open Source Verifiable Credentials

Heather Dahl, CEO of Indicio, will be speaking with Mike Vesey, CEO of IdRamp, on the real world use of open source verifiable credential technology for business. The focus of the presentation will be how to make the technology work in practice and how open source projects can accelerate and amplify commercial solutions.

Learn more

Identity Insights – Verio and Bullet ID

Christian Talle, CTO of Bullet ID joins us in this episode to discuss how verifiable credentials can be used to track ammunition and help solve crime. He shares some info on the technologies used in this solution, as well as where you can go to see it in action.

Watch the interview

Market Signals – Why Should You Care about Digital Identity?

We take a look at some of the best arguments for why strong identity systems will be critical as the internet evolves. Topics include how digital ID will change the nature of reputation, how to prepare for digital identity challenges, and the versatility of decentralized identity.

Read the article

Identity Insights – The Implications of DIDComm V2 on the Future of the Internet

We selected some of the highlights of our recent Meetup conversation on the implications of DIDComm for the future of the internet. This is the ideal video for those who want an overview of the main points but don’t have time for the full, hour-long conversation.

Watch the video News from around the community:

GlobaliD 101: Bring Your Own Identity

Liquid Avatar Technologies and Aftermath Islands Metaverse introduce Meta Park Pass to verify real users in the Web 3 universe

Anonyome weighs in on What Meta’s Profit Drop Might Say About Consumer Sentiment on Data Privacy

Upcoming Events

 

Here are a few events in the decentralized identity space to look out for.

Cardea Community Meeting 9/8
Identity Implementors Working Group 9/8
DIF DIDComm Working Group 9/12
Aries Bifold User Group 9/13
ToIP Working Group 9/13
Hyperledger Aries Working Group 9/14

The post Newsletter Vol 35 appeared first on Indicio.


Infocert

InfoCert invites you to join “Rebooting the Web of Trust” the online event held by the DizmeID Foundation


On Thursday, September 15th, 2022, starting at 5:00 PM CET, InfoCert will be one of the speakers at "Rebooting the Web of Trust," an online event organized by the DizmeID Foundation.

Digital identity, decentralized identity, decentralized finance, decentralized organizations, and the crypto space will be among the main topics addressed during the virtual round tables, thanks to the contributions of several important representatives from companies and institutions that are part of the DizmeID Foundation.

Also relevant is the concept of Self-Sovereign Identity (SSI), which aims to give citizens control over their personal and financial identity while preserving anonymity where possible.

Decentralization in the virtual space has given rise to many interlinking ecosystems of risk and regulation. The goal of the event is to understand the key challenges and the existing responses to them, as well as to explore new avenues for resolution and progress.

Register Now

AGENDA:

“eIDAS 2.0: evolution and perspectives of digital identity” | Igor Marcolongo (InfoCert), Daniele Citterio (InfoCert)

“Proposal for a decentralized trust framework enabling the regulation of crypto assets in DeFi applications” | Egidio Casati (Nymlab)

Roundtable with the experts: Andrea Carmignani (Keyless), Pietro Grassano (Algorand), Igor Marcolongo (InfoCert), Daniele Citterio (InfoCert), Egidio Casati (Nymlab)

Q&A session

Register Now

The post InfoCert invites you to join “Rebooting the Web of Trust” the online event held by the DizmeID Foundation appeared first on InfoCert.


auth0

Custom Log Streaming Integrations